The data shows the biggest leaks are in the Permian basin of Texas and New Mexico. Photo via Getty Images

American oil and natural gas wells, pipelines and compressors are spewing three times as much of the potent heat-trapping gas methane as the government estimates, causing $9.3 billion in yearly climate damage, a comprehensive new study calculates.

But because more than half of these methane emissions come from a tiny fraction of oil and gas sites, 1% or less, the problem is both worse than the government thought and fairly fixable, said the lead author of the study, published Wednesday in the journal Nature.

The same issue is happening globally. Large methane emission events around the world detected by satellites grew 50% in 2023 compared to 2022, with more than 5 million metric tons spotted in major fossil fuel leaks, the International Energy Agency reported Wednesday in its Global Methane Tracker 2024. World methane emissions rose slightly in 2023 to 120 million metric tons, the report said.

“This is really an opportunity to cut emissions quite rapidly with targeted efforts at these highest emitting sites,” said lead author Evan Sherwin, an energy and policy analyst at the U.S. Department of Energy's Lawrence Berkeley National Lab who wrote the study while at Stanford University. “If we can get this roughly 1% of sites under control, then we're halfway there because that's about half of the emissions in most cases.”

Sherwin said the fugitive emissions occur throughout the oil and gas production and delivery system, starting with gas flaring. That's when firms vent natural gas into the air or burn it instead of capturing the gas that comes out of energy extraction. There are also substantial leaks throughout the rest of the system, including tanks, compressors and pipelines, he said.

“It's actually straightforward to fix,” Sherwin said.

In general, about 3% of the gas produced in the U.S. is wasted into the air, compared with the Environmental Protection Agency's figure of about 1%, the study found. Sherwin said that's a substantial amount, about 6.2 million tons per year, based on leaks measured during the daytime. The rate could be lower at night, but those measurements aren't available.

The study gets that figure using one million anonymized measurements from airplanes that flew over 52% of American oil wells and 29% of gas production and delivery system sites over a decade. Sherwin said the 3% leak figure is the average for the six regions they looked at and they did not calculate a national average.

Over a two-decade period, methane traps about 80 times more heat than carbon dioxide, but it lasts in the atmosphere for only about a decade, instead of hundreds of years like carbon dioxide, according to the EPA.
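To make that warming factor concrete, here is a minimal back-of-envelope sketch, assuming the study's roughly 6.2 million tons of annual leakage and the 20-year factor of about 80 cited above; the script itself is illustrative and not part of the study.

```python
# Back-of-envelope sketch (not from the study): converting the leaked methane
# mass to CO2-equivalent using the ~80x 20-year warming factor cited above.
GWP20_METHANE = 80       # methane traps ~80x more heat than CO2 over 20 years
methane_tons = 6.2e6     # annual U.S. oil-and-gas methane leakage, per the study

co2_equivalent_tons = methane_tons * GWP20_METHANE
print(f"{methane_tons:,.0f} t CH4 ~ {co2_equivalent_tons:,.0f} t CO2e (20-yr basis)")
# -> roughly 496 million tons of CO2-equivalent per year
```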

About 30% of the world's warming since pre-industrial times comes from methane emissions, said IEA energy supply unit head Christophe McGlade. The United States is the top emitter of methane from oil and gas production, while China emits even more methane from its coal sector, he said.

Last December, the Biden administration issued a new rule forcing the U.S. oil and natural gas industry to cut its methane emissions. At the same time at the United Nations climate negotiations in Dubai, 50 oil companies around the world pledged to reach near zero methane emissions and end routine flaring in operations by 2030. That Dubai agreement would trim about one-tenth of a degree Celsius, nearly two-tenths of a degree Fahrenheit, from future warming, a prominent climate scientist told The Associated Press.

Monitoring methane from above, instead of at the sites or relying on company estimates, is a growing trend. Earlier this month the market-based Environmental Defense Fund and others launched the MethaneSAT satellite into orbit. For energy companies, the lost methane has real value; Sherwin's study estimates it is worth about $1 billion a year.
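As a rough plausibility check on that $1 billion figure, the sketch below prices the leaked gas using an assumed energy content of about 52 MMBtu per metric ton of methane and an assumed benchmark price near $3 per MMBtu; neither assumption comes from Sherwin's study.

```python
# Hedged sketch of why ~6 million tons of leaked methane is worth on the
# order of $1 billion a year. Energy content and price are our assumptions.
methane_tons = 6.2e6     # leaked methane per year (from the study)
mmbtu_per_ton = 52       # approx. energy content of methane (assumption)
price_per_mmbtu = 3.0    # rough Henry Hub benchmark price, $/MMBtu (assumption)

lost_value = methane_tons * mmbtu_per_ton * price_per_mmbtu
print(f"Lost gas worth ~${lost_value / 1e9:.1f} billion per year")
# -> ~$1.0 billion, consistent with the study's estimate
```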

About 40% of the global methane emissions from oil, gas and coal could have been avoided at no extra cost, which is “a massive missed opportunity,” IEA's McGlade said. The IEA report said that if countries do what they promised in Dubai, they could cut half of global methane pollution by 2030; the actions put in place so far, however, would trim only about 20%, “a very large gap between emissions and actions,” McGlade said.

“It is critical to reduce methane emissions if the world is to meet climate targets,” said Cornell University methane researcher Robert Howarth, who wasn't part of Sherwin's study.

“Their analysis makes sense and is the most comprehensive study by far out there on the topic,” said Howarth, who is updating figures in a forthcoming study to incorporate the new data.

The overflight data shows the biggest leaks are in the Permian basin of Texas and New Mexico.

“It's a region of rapid growth, primarily driven by oil production,” Sherwin said. “So when the drilling happens, both oil and gas comes out, but the main thing that the companies want to sell in most cases was the oil. And there wasn't enough pipeline capacity to take the gas away” so it spewed into the air instead.

Contrast that with the tiny leak rates found in drilling around Denver and in Pennsylvania. Denver's leaks are so low because of strictly enforced local regulations, and Pennsylvania's production is more gas-oriented, Sherwin said.

This shows a real problem with what National Oceanic and Atmospheric Administration methane-monitoring scientist Gabrielle Petron calls “super-emitters.”

“Reliably detecting and fixing super-emitters is a low hanging fruit to reduce real life greenhouse gas emissions,” Petron, who wasn't part of Sherwin's study, said. “This is very important because these super-emitter emissions are ignored by most ‘official’ accounting.”

Stanford University climate scientist Rob Jackson, who also wasn't part of the study, said, “a few facilities are poisoning the air for everyone.”

“For more than a decade, we’ve been showing that the industry emits far more methane than they or government agencies admit," Jackson said. “This study is capstone evidence. And yet nothing changes.”



UH's $44 million mass timber building slashed energy use in first year

building up

The University of Houston recently completed assessments of year one of the first mass timber project on campus, and the results show major reductions in energy use and carbon footprint.

Known as the Retail, Auxiliary, and Dining Center, or RAD Center, the $44 million building showed an 84 percent reduction in predicted energy use intensity (a measure of how much energy a building uses relative to its size) compared with similar buildings. Its Global Warming Potential rating, a metric defined by the Intergovernmental Panel on Climate Change, shows a 39 percent reduction compared with the benchmark for buildings of its type.
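For readers unfamiliar with the metric, here is a minimal sketch of how energy use intensity works; the baseline value below is hypothetical, and only the 84 percent reduction comes from the reporting above.

```python
# Minimal sketch of the "energy use intensity" (EUI) metric: annual energy
# used per square foot of floor area. The baseline figure is hypothetical;
# only the reported 84% reduction comes from the article.
def eui(annual_kbtu: float, floor_area_sqft: float) -> float:
    """Energy use intensity in kBtu per square foot per year."""
    return annual_kbtu / floor_area_sqft

baseline_eui = 100.0                         # hypothetical benchmark for similar buildings
rad_center_eui = baseline_eui * (1 - 0.84)   # 84% reduction reported for the RAD Center

print(f"Benchmark: {baseline_eui:.0f} kBtu/sqft/yr, RAD Center: {rad_center_eui:.0f} kBtu/sqft/yr")
```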

In comparison to similar structures, the RAD Center saved the equivalent of taking 472 gasoline-powered cars driven for one year off the road, according to architecture firm Perkins & Will.

The RAD Center was created in alignment with the AIA 2030 Commitment to carbon-neutral buildings, designed by Perkins & Will and constructed by Houston-based general contractor Turner Construction.

Perkins & Will’s work reduced the building's carbon footprint by incorporating lighter mass timber structural systems, which allowed the RAD Center to reuse the foundation, columns and beams of the building it replaced. Reused elements account for 45 percent of the RAD Center’s total mass, according to Perkins & Will.

Mass timber is considered a sustainable alternative to steel and concrete construction. The RAD Center, a 41,000-square-foot development, replaced the once popular Satellite, which was a food, retail and hangout center for students on UH’s campus near the Science & Research Building 2 and the Jack J. Valenti School of Communication.

The RAD Center uses more than a million pounds of timber, which can store over 650 metric tons of CO2. Aesthetically, the building complements the surrounding campus woodlands and offers students a view both inside and out.
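That CO2 figure follows from standard forestry arithmetic, sketched below with our own assumptions about moisture and carbon content; only the million-pound timber mass comes from the article.

```python
# Rough sketch of standard timber carbon-storage arithmetic: dry wood is
# roughly half carbon by mass, and each ton of carbon corresponds to 44/12
# tons of CO2. Moisture and carbon fractions are assumptions.
timber_lbs = 1_000_000
timber_tonnes = timber_lbs * 0.4536 / 1000   # pounds -> metric tons (~454 t)
dry_fraction = 0.88                          # assumed dry-mass fraction (12% moisture)
carbon_fraction = 0.5                        # typical carbon share of dry wood
co2_per_carbon = 44 / 12                     # molecular-weight ratio of CO2 to carbon

stored_co2 = timber_tonnes * dry_fraction * carbon_fraction * co2_per_carbon
print(f"~{stored_co2:.0f} metric tons of CO2 stored")
# -> ~730 t, the same order as the >650 t cited above
```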

“Spaces are designed to create a sense of serenity and calm in an ecologically-minded environment,” Diego Rozo, a senior project manager and associate principal at Perkins & Will, said in a news release. “They were conceptually inspired by the notion of ‘unleashing the senses’ – the design celebrating different sights, sounds, smells and tastes alongside the tactile nature of the timber.”

In addition to its mass timber design, the building was also part of an Energy Use Intensity (EUI) reduction effort. It features high-performance insulation and barriers, natural light to illuminate the building's interior, efficient indoor lighting fixtures, and optimized equipment, including HVAC systems.

The RAD Center officially opened Phase I in Spring 2024. The third and final phase of construction is scheduled for this summer, with a planned opening set for the fall.

Experts on U.S. energy infrastructure, sustainability, and the future of data

Guest column

Digital infrastructure is the dominant theme in energy and infrastructure, real estate and technology markets.

Data, the byproduct and primary value generated by digital infrastructure, is referred to as “the fifth utility,” along with water, gas, electricity and telecommunications. Data is created, aggregated, stored, transmitted, shared, traded and sold. Data requires data centers. Data centers require energy. The United States is home to approximately 40% of the world's data centers. The U.S. is set to lead the world in digital infrastructure advancement and has an opportunity to lead on energy for a very long time.

Data centers consume vast amounts of electricity due to their computational and cooling requirements. According to the United States Department of Energy, data centers consume “10 to 50 times the energy per floor space of a typical commercial office building.” Lawrence Berkeley National Laboratory issued a report in December 2024 stating that U.S. data center energy use reached 176 TWh by 2023, “representing 4.4% of total U.S. electricity consumption.” This percentage will increase significantly with near-term investment into high performance computing (HPC) and artificial intelligence (AI). The markets recognize the need for digital infrastructure build-out, and developers, engineers, investors and asset owners are responding at an incredible clip.
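Those two LBNL numbers can be cross-checked with one line of arithmetic, as in the sketch below.

```python
# Quick consistency check on the LBNL figures quoted above: 176 TWh at 4.4%
# of total consumption implies overall U.S. electricity use of roughly 4,000 TWh.
data_center_twh = 176
share = 0.044

total_us_twh = data_center_twh / share
print(f"Implied total U.S. consumption: ~{total_us_twh:,.0f} TWh")  # -> ~4,000 TWh
```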

However, the energy demands required to meet this digital load growth pose significant challenges to the U.S. power grid. Reliability and cost-efficiency have been, and will continue to be, two non-negotiable priorities of the legal, regulatory and quasi-regulatory regime overlaying the U.S. power grid.

Maintaining and improving reliability requires physical solutions. The grid must be perfectly balanced, with neither too little nor too much electricity at any given time. Specifically, new-build, physical power generation and transmission (a topic worthy of another article) projects must be built. To be sure, innovative financial products such as virtual power purchase agreements (VPPAs), hedges, environmental attributes, and other offtake strategies have been, and will continue to be, critical to growing the U.S. renewable energy markets and facilitating the energy transition, but the U.S. electrical grid needs to generate and move significantly more electrons to support the digital infrastructure transformation.

But there is now a third permanent priority: sustainability. New power generation over the next decade will include a mix of solar (large and small scale, offsite and onsite), wind and natural gas resources, with existing nuclear power, hydro, biomass, and geothermal remaining important in their respective regions.

Solar, in particular, will grow as a percentage of U.S. grid generation. The Solar Energy Industries Association (SEIA) reported that solar added 50 gigawatts of new capacity to the U.S. grid in 2024, “the largest single year of new capacity added to the grid by an energy technology in over two decades.” Solar is leading because it can be flexibly sized and sited.
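To put 50 gigawatts of capacity in annual energy terms, the sketch below applies a fleet-average capacity factor of about 25 percent; that factor is our assumption, not SEIA's.

```python
# Illustrative sketch (our assumptions, not SEIA's): annual energy from
# 50 GW of new solar capacity at a typical ~25% capacity factor.
new_capacity_gw = 50      # 2024 U.S. solar additions, per SEIA
capacity_factor = 0.25    # assumed fleet-average solar capacity factor
hours_per_year = 8760

annual_twh = new_capacity_gw * capacity_factor * hours_per_year / 1000
print(f"~{annual_twh:.0f} TWh of generation per year")  # -> ~110 TWh
```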

Under-utilized technologies such as carbon capture, utilization and storage (CCUS) will become more prominent. Hydrogen may be a game-changer in the medium-to-long term. Further, a nuclear power renaissance (conventional and small modular reactor (SMR) technologies) appears to be real, with recent commitments from some of the largest companies in the world, led by technology companies. Nuclear is poised to be part of a “net-zero” future in the United States, also in the medium-to-long term.

The transition from fossil fuels to zero carbon renewable energy is well on its way – this is undeniable – and will continue, regardless of U.S. political and market cycles. Along with reliability and cost efficiency, sustainability has become a permanent third leg of the U.S. power grid stool.

Sustainability is now non-negotiable. Corporate renewable and low carbon energy procurement is strong. State renewable portfolio standards (RPS) and clean energy standards (CES) have established aggressive goals. Domestic manufacturing of the equipment deployed in the U.S. is growing meaningfully, and in politically diverse regions of the country. Solar, wind and batteries are becoming less expensive. Perhaps more importantly, the grid needs as much renewable and low carbon power generation as possible, not in lieu of gas generation but as a growing complement to gas and other technologies. This is not an “R” or “D” issue (as we say in Washington), nor an either/or issue; it's good business and a physical necessity.

As a result, solar, wind and battery storage deployment, in particular, will continue to accelerate in the U.S. These clean technologies will inevitably become more efficient as the buildout in the U.S. increases, investments continue and technology advances.

At some point in the future (it won’t be in the 2020s, it could be in the 2030s, but, more realistically, in the 2040s), the U.S. will have achieved the remarkable – a truly modern (if not entirely overhauled) grid dependent largely on a mix of zero and low carbon power generation and storage technology. And when this happens, it will have been due in large part to the clean technology deployment and advances over the next 10 to 15 years resulting from the current digital infrastructure boom.

---

Hans Dyke and Gabbie Hindera are lawyers at Bracewell. Dyke's experience includes transactions in the electric power and oil and gas midstream space, as well as transactions involving energy intensive industries such as data storage. Hindera focuses on mergers and acquisitions, joint ventures, and public and private capital market offerings.

Rice researchers' quantum breakthrough could pave the way for next-gen superconductors

new findings

A new study from researchers at Rice University, published in Nature Communications, could lead to future advances in superconductors with the potential to transform energy use.

The study revealed that electrons in strange metals, which exhibit unusual resistance to electricity and behave strangely at low temperatures, become more entangled at a specific tipping point, shedding new light on these materials.

A team led by Rice’s Qimiao Si, the Harry C. and Olga K. Wiess Professor of Physics and Astronomy, used quantum Fisher information (QFI), a concept from quantum metrology, to measure how electron interactions evolve under extreme conditions. The research team also included Rice’s Yuan Fang, Yiming Wang, Mounica Mahankali and Lei Chen along with Haoyu Hu of the Donostia International Physics Center and Silke Paschen of the Vienna University of Technology. Their work showed that the quantum phenomenon of electron entanglement peaks at a quantum critical point, which is the transition between two states of matter.

“Our findings reveal that strange metals exhibit a unique entanglement pattern, which offers a new lens to understand their exotic behavior,” Si said in a news release. “By leveraging quantum information theory, we are uncovering deep quantum correlations that were previously inaccessible.”

The researchers examined a theoretical framework known as the Kondo lattice, which explains how magnetic moments interact with surrounding electrons. At a critical transition point, these interactions intensify to the extent that the quasiparticles—key to understanding electrical behavior—disappear. Using QFI, the team traced this loss of quasiparticles to the growing entanglement of electron spins, which peaks precisely at the quantum critical point.
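For readers curious how quantum Fisher information can witness entanglement, the toy calculation below evaluates the textbook pure-state formula (QFI equals four times the variance of an observable) for a two-qubit Bell state; it is a generic illustration, not the Rice team's Kondo-lattice computation.

```python
# Toy illustration (not the Rice team's calculation): for a pure quantum
# state, the QFI with respect to an observable O is 4 * variance(O).
# Entangled states can push this value above the bound attainable by any
# unentangled state, which is how QFI acts as an entanglement witness.
import numpy as np

# Two-qubit Bell state: (|00> + |11>) / sqrt(2)
psi = np.zeros(4)
psi[0] = psi[3] = 1 / np.sqrt(2)

# Collective spin observable O = (Z1 + Z2) / 2
Z = np.diag([1.0, -1.0])
I = np.eye(2)
O = (np.kron(Z, I) + np.kron(I, Z)) / 2

mean = psi @ O @ psi
mean_sq = psi @ (O @ O) @ psi
qfi = 4 * (mean_sq - mean**2)
print(f"QFI = {qfi:.1f}")  # -> 4.0, above the separable-state bound of 2 for two qubits
```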

In terms of future use, the materials share a close connection with high-temperature superconductors, which have the potential to transmit electricity without energy loss, according to the researchers. By unlocking these properties, researchers believe they could revolutionize power grids and make energy transmission more efficient.

The team also found that quantum information tools can be applied to other “exotic materials” and quantum technologies.

“By integrating quantum information science with condensed matter physics, we are pivoting in a new direction in materials research,” Si said in the release.