A View From HETI

Tackling methane in the energy transition: Takeaways from Global Methane Hub and HETI

Leaders from across the energy value chain gathered in Houston for a roundtable to discuss tackling methane. Photo via Canva

Leaders from across the energy value chain gathered in Houston for a roundtable hosted by the Global Methane Hub (GMH) and the Houston Energy Transition Initiative (HETI). The session underscored continued progress on reducing methane emissions as the energy industry addresses the dual challenge of producing the additional energy the world demands while simultaneously reducing emissions.

The Industry’s Shared Commitment and Challenge

There’s broad recognition across the industry that methane emissions must be tackled with urgency, especially as natural gas demand is projected to grow 30–50% by 2050. This growth makes reducing methane leakage more than a sustainability issue; it is also a matter of global market access and investor confidence.

Solving this issue, however, requires overcoming technical challenges that span infrastructure, data acquisition, measurement precision, and regulatory alignment.

Getting the Data Right: Top-Down vs. Bottom-Up

Accurate methane leak monitoring and quantification is the cornerstone of any effective mitigation strategy. A key point of discussion was the differentiation between top-down and bottom-up measurement approaches.

Top-down methods such as satellite and aerial monitoring offer broad-area coverage and can identify large emission plumes. Technologies such as satellite-based remote sensing (e.g., using high-resolution imagery) or airborne methane surveys (using aircraft equipped with tunable diode laser absorption spectroscopy) are commonly used for wide-area detection. While these methods are efficient for identifying large-scale emission hotspots, their accuracy is lower when it comes to quantifying emissions at the source, detecting smaller, diffuse leaks, and providing continuous monitoring.

In contrast, bottom-up methods focus on direct, on-site detection at the equipment level, providing more granular and precise measurements. Technologies used here include optical gas imaging (OGI) cameras, flame ionization detectors (FID), and infrared sensors, which can directly detect methane at the point of release. These methods are more accurate but can be resource and infrastructure intensive, requiring frequent manual inspections or continuous monitoring installations, which can be costly and technically challenging in certain environments.

The challenge lies in combining both methods: top-down for large-scale monitoring and bottom-up for detailed, accurate measurements. No single technology is perfect or all-inclusive. An integrated approach that uses both datasets will help to create a more comprehensive picture of emissions and improve mitigation efforts.
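As an illustration of what integrating the two datasets can mean in practice, the sketch below reconciles a hypothetical top-down, basin-wide estimate with bottom-up, site-level measurements by scaling the site rates so their sum matches the aerial total while preserving each site's relative contribution. All site names, rates, and the reconciliation approach itself are illustrative assumptions, not details from the roundtable.

```python
# Illustrative sketch only: one simple way to reconcile a top-down
# (e.g., aerial survey) methane estimate with bottom-up, site-level
# sensor readings. All numbers and site names are hypothetical.

def reconcile(top_down_total_kgph, bottom_up_sites):
    """Scale bottom-up site rates (kg/h) so their sum matches the
    top-down total, preserving each site's relative share."""
    bottom_up_total = sum(bottom_up_sites.values())
    if bottom_up_total == 0:
        raise ValueError("bottom-up inventory is empty")
    scale = top_down_total_kgph / bottom_up_total
    return {site: rate * scale for site, rate in bottom_up_sites.items()}

# Hypothetical inputs: an aerial survey attributes 120 kg/h to the area,
# while on-site sensors at three facilities sum to only 80 kg/h.
sites = {"pad_A": 50.0, "pad_B": 20.0, "pad_C": 10.0}
reconciled = reconcile(120.0, sites)
# Each site is scaled by 120 / 80 = 1.5, so the two views now agree.
```

Real reconciliation methods are far more sophisticated, accounting for measurement uncertainty, detection limits, and temporal mismatch between snapshot flyovers and continuous sensors, but the basic idea of using each dataset to constrain the other is the same.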

From Detection to Action: Bridging the Gap

Data collection is just the first step; effective action must follow. Operators are increasingly focused on real-time detection and mitigation. However, operational realities present obstacles. For example, real-time leak detection and repair (LDAR) systems, particularly those for continuous monitoring, face challenges due to infrastructure limitations. Remote locations like the Permian Basin may lack the stable power sources needed to run continuous monitoring equipment at individual assets.

Policy, Incentives, and Regulatory Alignment

Another critical aspect of the conversation was the need for policy incentives that both promote best practices and accommodate operational constraints. Methane fees, introduced to penalize emissions, have faced widespread resistance because of design flaws that in many cases discourage, rather than encourage, methane emissions reductions. Industry stakeholders are advocating for better alignment between policy frameworks and operational capabilities.

In the United States, the Subpart W rule, for example, mandates methane reporting for certain facilities, but its implementation has raised concerns about the accuracy of some of the new reporting requirements. Many in the industry continue to work with the EPA to update these regulations so that implementation aligns with legislative intent.

The EU’s demand for quantified methane emissions for imported natural gas is another driving force, prompting a shift toward more detailed emissions accounting and better data transparency. Technologies that provide continuous, real-time monitoring and automated reporting will be crucial in meeting these international standards.

Looking Ahead: Innovation and Collaboration

The roundtable highlighted the critical importance of advancing methane detection and mitigation technologies and integrating them into broader emissions reduction strategies. The United States’ 45V tax policy, which incentivizes production of low-carbon-intensity hydrogen, often made via reforming of natural gas, illustrates the growing momentum toward science-based accounting and transparent data management. To qualify for 45V incentives, operators can differentiate their lower-emissions-intensity natural gas by providing the EPA with precise, auditable foreground data, which is essential for meeting both environmental and regulatory expectations. Ultimately, the success of methane reduction strategies depends on collaboration between the energy industry, technology providers, and regulators.

The roundtable underscored that while significant progress has been made in addressing methane emissions, technical, regulatory, and operational challenges remain. Collaboration across industry, government, and technology providers is essential to overcoming these barriers. With better data, regulatory alignment, and investments in new technologies, the energy sector can continue to reduce methane emissions while supporting global energy demands.

———

HETI thanks Chris Duffy, Baytown Blue Hydrogen Venture Executive, ExxonMobil; Cody Johnson, CEO, SCS Technologies; and Nishadi Davis, Head of Carbon Advisory Americas, Wood plc, for their participation in this event.

This article originally appeared on the Greater Houston Partnership's Houston Energy Transition Initiative blog. HETI exists to support Houston's future as an energy leader. For more information about the Houston Energy Transition Initiative, EnergyCapitalHTX's presenting sponsor, visit htxenergytransition.org.

A View From HETI

Houston's data center scene has received its latest bullish forecast. Photo via serverfarmllc.com

The Houston market could more than double its data center capacity by the end of 2028, a new report indicates.

The report, published by commercial real estate services provider CBRE, says greater demand for data center capacity in the Houston area is being fueled by energy companies, along with large-scale cloud services and AI-driven tenants.

In the second half of 2025, the Houston market had 154 megawatts of data center capacity, which was on par with capacity in the second half of 2024. Another 28.5 megawatts of capacity was under construction during that period.

“Multiple providers are advancing new builds and redevelopments, including significant power upgrades to recently purchased buildings, underscoring long-term confidence even as the market works through elevated vacancy and uneven absorption,” CBRE says of Houston’s data center presence.

One project alone promises to significantly boost the Houston market’s data center capacity. Data center developer Serverfarm plans to use part of a $3 billion credit facility to build a 250-acre, AI-ready data center campus near Houston with a potential capacity of more than 500 megawatts. The Houston campus and two other Serverfarm projects are already leased to unidentified tenants, according to CoStar.

A 60-megawatt, AI-ready Serverfarm data center is under construction in Houston. The $137 million, 438,000-square-foot project, located near the former headquarters of computer manufacturer Compaq, is supposed to be completed in the third quarter of 2027.

Data Center Map identifies 59 data centers in the Houston area managed by 36 operators, including DataBank, Data Foundry, Digital Realty, IBM, Logix Fiber Networks, Lumen and TRG Datacenters. That compares with more than 180 data centers in Dallas-Fort Worth, more than 50 in the San Antonio area and 40 in the Austin area.

Texas is home to more than 400 data centers, according to Data Center Map.

In November, Google said it’s investing $40 billion to build AI data centers in West Texas and the Texas Panhandle.

“This is a Texas-sized investment in the future of our great state,” Gov. Greg Abbott said when Google’s commitment was announced. “Texas is the epicenter of AI development, where companies can pair innovation with expanding energy. Google's $40 billion investment makes Texas Google's largest investment in any state in the country and supports energy efficiency and workforce development in our state.”