
Did you catch those images of idle generators that CenterPoint had on standby during Hurricane Beryl? With over 2 million people in the Houston area left in the dark, many were wondering: If the generators were ready, why didn't they get used? It seems like power outages are becoming just as common as the severe storms themselves.

But as Ken Medlock, Senior Director of the Baker Institute Center for Energy Studies (CES), explains, it's not a simple fix. The outages during Hurricane Beryl were different from what we saw during Winter Storm Uri. This time, with so many poles and wires down, those generators couldn’t be put to use. It’s a reminder that each storm brings its own set of challenges, and there’s no one-size-fits-all solution when it comes to keeping the lights on. And while extreme weather is one of the leading threats to our electric grid, it's certainly not the only thing adding strain to our power infrastructure.

The rapid rise of artificial intelligence (AI) and electric vehicles (EVs) is transforming the way we live, work, and move. Beneath the surface of these technological marvels lies a challenge that could define the future of our energy infrastructure: they all depend on our electrical grid. As AI-powered data centers and a growing fleet of EVs demand more power than ever before, our grid—already under pressure from extreme weather events and an increasing reliance on renewable energy—faces a critical test. The question is not just whether our grid can keep up, but how we can ensure it evolves to support the innovations of tomorrow without compromising reliability today. The intersection of these emerging technologies with our aging energy infrastructure poses a dilemma that policymakers, industry leaders, and consumers must address.

Julie Cohn, Nonresident Fellow at the Center for Energy Studies at the Baker Institute for Public Policy, presents several key findings and recommendations to address concerns about the reliability of the Texas energy grid in her Energy Insight. She suggests there are at least six developments unfolding that will affect the reliability of the Texas Interconnected System, operated by the Electric Reliability Council of Texas (ERCOT), and the regional distribution networks operated by regulated utilities.

Let’s dig deeper into some of these issues:

AI

AI requires substantial computational power, particularly in data centers that house servers processing vast amounts of data. These data centers consume large amounts of electricity, putting additional strain on the grid.

According to McKinsey & Company, a single hyperscale data center can consume as much electricity as 80,000 homes combined. In 2022, data centers consumed about 200 terawatt-hours (TWh), close to 4 percent of the total electricity used in the United States, and approximately 460 TWh globally. That global figure is nearly the consumption of the entire State of Texas, which used approximately 475.4 TWh of electricity in the same year. And the share is expected to increase significantly as demand for data processing and storage continues to grow: by 2026, data centers are expected to account for about 6 percent of total U.S. electricity demand, almost 260 TWh.
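For readers who want to sanity-check those figures, here is a minimal back-of-the-envelope sketch in Python, using only the rounded estimates quoted above (they are the article's approximations, not authoritative statistics):

```python
# A rough consistency check of the consumption figures cited above.
# All inputs are the article's approximations, not authoritative statistics.

us_dc_2022_twh = 200        # US data center use, 2022 (TWh)
us_dc_2026_twh = 260        # projected US data center use, 2026 (TWh)
global_dc_2022_twh = 460    # global data center use, 2022 (TWh)
texas_2022_twh = 475.4      # total Texas electricity use, 2022 (TWh)

# Global data center consumption relative to all of Texas
print(f"Global data centers vs. Texas: {global_dc_2022_twh / texas_2022_twh:.0%}")
# -> ~97%, i.e., "nearly the consumption of the entire state"

# Implied growth in US data center demand between 2022 and 2026
print(f"US growth, 2022-2026: {us_dc_2026_twh / us_dc_2022_twh - 1:.0%}")
# -> roughly 30% more demand in four years
```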

EVs

According to the Texas Department of Motor Vehicles, approximately 170,000 EVs have been registered across the state of Texas as of 2023, with Texas receiving $408 million in funding to expand its EV charging network. As Cohn suggests, a central question remains: Where will these emerging economic drivers for Texas, such as EVs and AI, obtain their electric power?

EVs draw power from the grid every time they’re plugged in to charge. This may come as a shock to some, but “the thing that’s recharging EV batteries in ERCOT right now, is natural gas,” says Medlock. And as McKinsey & Company explains, the impact of switching to EVs on reducing greenhouse gas (GHG) emissions will largely depend on how much GHG is produced by the electricity used to charge them. This adds a layer of complexity as regulators look to decarbonize the power sector.

Depending on the model, a single EV fast charger can draw anywhere from 50 kW to 350 kW of power while charging. Now, factor in the constant energy drain from data centers, our growing population using power for homes and businesses, and then account for the sudden impact of severe environmental events—which have increased in frequency and intensity—and it’s clear: Houston… we have a problem.
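To put those charger ratings in grid terms, here is an illustrative sketch. The 50 kW to 350 kW range comes from the paragraph above; the fleet size and utilization are hypothetical assumptions chosen only to show the scale of the arithmetic:

```python
# Illustrative estimate of how fast-charging load scales.
# Charger ratings are from the text; fleet size and utilization are
# hypothetical assumptions, not actual Texas statistics.

charger_kw_low, charger_kw_high = 50, 350   # per-charger draw while charging (kW)
chargers = 10_000                            # hypothetical number of fast chargers statewide
utilization = 0.25                           # hypothetical share in use at any moment

low_mw = chargers * utilization * charger_kw_low / 1_000
high_mw = chargers * utilization * charger_kw_high / 1_000
print(f"Coincident fast-charging load: {low_mw:,.0f} to {high_mw:,.0f} MW")
# -> 125 to 875 MW under these assumptions

# A single 350 kW charger delivers roughly 87.5 kWh in a 15-minute session:
print(f"Energy per 15-min session at 350 kW: {350 * 0.25:.1f} kWh")
```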

The Weather Wildcard

Texas is gearing up for its 2025 legislative session, which convenes January 14, and the state's electricity grid once again stands at the forefront of political discussions. The question is not just whether our power will stay on during the next winter storm or scorching summer heatwave, but whether our approach to grid management is sustainable in the face of mounting challenges. The events of recent years, from Winter Storm Uri to unprecedented heatwaves, have exposed significant vulnerabilities in the Texas electricity grid, and while legislative measures have been taken, they have been largely patchwork solutions.

Winter Storm Uri in 2021 was a wake-up call, but it wasn’t the first or last extreme weather event to test the Texas grid. With deep freezes, scorching summers, and unpredictable storms becoming the norm rather than the exception, it is clear that the grid’s current state is not capable of withstanding these extremes. The measures passed in 2021 and 2023 were steps in the right direction, but they were reactive, not proactive. They focused on strengthening the grid against cold weather, yet extreme heat, a more consistent challenge in Texas, remains a less-addressed threat. The upcoming legislative session must prioritize comprehensive climate resilience strategies that go beyond cold weather prep.

“The planners for the Texas grid have important questions to address regarding anticipated weather extremes: Will there be enough energy? Will power be available when and where it is needed? Is the state prepared for extreme weather events? Are regional distribution utilities prepared for extreme weather events? Texas is not alone in facing these challenges as other states have likewise experienced extremely hot and dry summers, wildfires, polar vortexes, and other weather conditions that have tested their regional power systems,” writes Cohn.

Renewable Energy and Transmission

Texas leads the nation in wind and solar capacity (Map: Energy, Environment, and Policy in the US); however, the complexity lies in getting that energy from where it’s produced to where it’s needed. Transmission lines are feeling the pressure, and the grid is struggling to keep pace with the rapid expansion of renewables. In 2005, the Competitive Renewable Energy Zones (CREZ) initiative showed that state intervention could significantly accelerate grid expansion. With renewables continuing to grow, the big question now is whether the state will step up again or risk allowing progress to stall because the infrastructure in place is inadequate. The legislature has a choice to make: take the lead in this energy transition or face the consequences of not keeping up with the pace of change.

Conclusion

The electrical grid continues to face serious challenges, especially as demand is expected to rise. There is hope, however, as regulators are fully aware of the strain. While our grid may be showing its age, this is the perfect time to shift from reacting to problems to getting ahead of them.

As Cohn puts it, “In the end, successful resolution of the various issues will carry significant benefits for existing Texas industrial, commercial, and residential consumers and have implications for the longer-term economic attractiveness of Texas. Suffice it to say, eyes will be, and should be, on the Texas legislature in the coming session.”

------------

Scott Nyquist is a senior advisor at McKinsey & Company and vice chairman, Houston Energy Transition Initiative of the Greater Houston Partnership. The views expressed herein are Nyquist's own and not those of McKinsey & Company or of the Greater Houston Partnership. This article originally ran on LinkedIn on September 11, 2024.


Rice University releases data, analysis on future of global energy

eyes on insights

The Center for Energy Studies at Rice University’s Baker Institute for Public Policy has released a collection of articles addressing the most pressing policy issues in global energy.

The inaugural Energy Insights was supported by ongoing research at CES, with a goal of better understanding the energy landscape over the next few years.

“While no one can predict exactly what comes next, if we are paying attention, the road we travel provides plenty of signposts that can be used to understand the challenges and opportunities ahead,” wrote CES Senior Director Kenneth Medlock.

The articles, which are available online in a 120-page packet, focus on a wide variety of key issues — Texas electricity policy, energy and geopolitics in Eurasia, how the energy transition will affect the Middle East, the growing necessity of minerals and materials, and more.

All in all, the new Energy Insights series examines an ever-changing energy landscape.

“Industrialization, improved living standards, technological and process innovation, and increased mobility of people and goods, to name a few, are all hallmarks of continual energy transition,” Medlock adds. “The process is not done. The past lives on through long-lived legacy infrastructures, and the future evolves most rapidly when it can leverage that legacy. Exactly how though, remains an elusive topic.”

Contributors to the publication include: Medlock, Julie Cohn, Gabe Collins, Ted Loch-Temzelides, Jim Krane, Osamah Alsayegh, Francisco Monaldi, Tilsa Oré Mónago, Michelle Michot Foss, Steven Miles, Mark Finley, Mahmoud El-Gamal, Chris Bronk, Rachel Meidl and Ed Emmett.


Rice University to target Argentina energy sector with new initiative

headed south

A new program at Rice University is targeting the Argentine energy sector through reports, workshops, and conferences.

The Baker Institute for Public Policy has announced a new initiative, the Baker Institute’s Argentina Energy Sector Initiatives, which will launch in September.

The initiative plans to bring together leading experts and policymakers to study the Argentine energy sector, including oil and natural gas exploration and production, energy infrastructure (e.g., pipelines, electricity transmission, and LNG export terminals), and the role of the mining sector in the renewable energy transition. The initiative will produce written reports and hold in-person conferences and workshops in Houston and Buenos Aires. There will also be a monthly online seminar series.

Fellows from the institute’s Center for Energy Studies will collaborate on the initiative with Argentine policymakers and technical experts. Argentina holds the world’s second largest unconventional natural gas reserves and fourth largest unconventional petroleum reserves, concentrated in the Vaca Muerta shale formation.

The institute's Center for Energy Studies, which will house the Argentina program, has been ranked the top energy think tank in the world.

September’s formal launch will take place at the Baker Institute in Houston and will be open to the public and live-streamed. The event will feature the participation of Baker Institute fellows, Argentina Program non-resident fellows, Argentine elected officials, and others.



Rice research team's study keeps CO2-to-fuel devices running 50 times longer

new findings

In a new study published in the journal Science, a team of Rice University researchers shared findings on how acid bubbles can improve the stability of electrochemical devices that convert carbon dioxide into useful fuels and chemicals.

The team, led by Rice associate professor Haotian Wang, addressed an issue in the performance and stability of CO2 reduction systems. The gas flow channels in these systems often clog due to salt buildup, reducing efficiency and causing the devices to fail prematurely after about 80 hours of operation.

“Salt precipitation blocks CO2 transport and floods the gas diffusion electrode, which leads to performance failure,” Wang said in a news release. “This typically happens within a few hundred hours, which is far from commercial viability.”

By using an acid-humidified CO2 technique, the team was able to extend the operational life of a CO2 reduction system more than 50-fold, demonstrating more than 4,500 hours of stable operation in a scaled-up reactor.

The Rice team made a simple swap with a significant impact. Instead of using water to humidify the CO2 gas fed into the reactor, the team bubbled the gas through an acid solution such as hydrochloric, formic or acetic acid. This produced more soluble salts that did not crystallize or block the channels.

The process has major implications for an emerging green technology known as electrochemical CO2 reduction, or CO2RR, that transforms climate-warming CO2 into products like carbon monoxide, ethylene, or alcohols. The products can be further refined into fuels or feedstocks.

“Using the traditional method of water-humidified CO2 could lead to salt formation in the cathode gas flow channels,” Shaoyun Hao, postdoctoral research associate in chemical and biomolecular engineering at Rice and co-first author, explained in the news release. “We hypothesized — and confirmed — that acid vapor could dissolve the salt and convert the low solubility KHCO3 into salt with higher solubility, thus shifting the solubility balance just enough to avoid clogging without affecting catalyst performance.”
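To make the chemistry concrete, take the case where the humidifying acid is hydrochloric acid, one of the acids named above (an illustrative choice; the team also used formic and acetic acid). The potassium bicarbonate that would otherwise crystallize in the channels is converted to a more soluble chloride salt:

KHCO3 + HCl → KCl + H2O + CO2

Potassium chloride is considerably more soluble in water than potassium bicarbonate, so it stays dissolved rather than clogging the gas flow channels; with formic or acetic acid, the corresponding potassium formate or acetate salts are more soluble still.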

The Rice team believes the work can lead to more scalable CO2 electrolyzers, which is vital if the technology is to be deployed at industrial scales as part of carbon capture and utilization strategies. Since the approach itself is relatively simple, it could lead to a more cost-effective and efficient solution. It also worked well with multiple catalyst types, including zinc oxide, copper oxide and bismuth oxide, which are all used to target different CO2RR products.

“Our method addresses a long-standing obstacle with a low-cost, easily implementable solution,” Ahmad Elgazzar, co-first author and graduate student in chemical and biomolecular engineering at Rice, added in the release. “It’s a step toward making carbon utilization technologies more commercially viable and more sustainable.”

A team led by Wang, in collaboration with researchers from the University of Houston, also shared findings on salt precipitation buildup and CO2RR in a recent edition of the journal Nature Energy.

The case for smarter CUI inspections in the energy sector

Guest Column

Corrosion under insulation (CUI) accounts for roughly 60% of pipeline leaks in the U.S. oil and gas sector. Yet many operators still rely on outdated inspection methods that are slow, risky, and economically unsustainable.

This year, widespread budget cuts and layoffs across the sector are forcing refineries to do more with less. Efficiency is no longer a goal; it’s a mandate. The challenge: how to maintain safety and reliability without overextending resources?

Fortunately, a new generation of technologies is gaining traction in the oil and gas industry, offering operators faster, safer, and more cost-effective ways to identify and mitigate CUI.

Hidden cost of corrosion

Corrosion is a pervasive threat, with CUI posing the greatest risk to refinery operations. Insulation conceals damage until it becomes severe, making detection difficult and ultimately leading to failure. NACE International estimates the annual cost of corrosion in the U.S. at $276 billion.

Compounding the issue is aging infrastructure: roughly half of the nation’s 2.6 million miles of pipeline are over 50 years old. Aging infrastructure increases the urgency and the cost of inspections.

So, the question is: Are we at a breaking point or an inflection point? The answer depends largely on how quickly the industry can move beyond inspection methods that no longer match today's operational or economic realities.

Legacy methods such as insulation stripping, scaffolding, and manual NDT are slow, hazardous, and offer incomplete coverage. With maintenance budgets tightening, these methods are no longer viable.

Why traditional inspection falls short

Without question, what worked 50 years ago no longer works today. Traditional inspection methods are slow, siloed, and dangerously incomplete.

Insulation removal:

  • Disruptive and expensive.
  • Labor-intensive and time-consuming, with a high risk of process upsets and insulation damage.
  • Limited coverage. Often targets a small percentage of piping, leaving large areas unchecked.
  • Health risks. Exposes workers to hazardous materials such as asbestos or fiberglass.

Rope access and scaffolding:

  • Safety hazards. Falls from height remain a leading cause of injury.
  • Restricted time and access. Weather, fatigue, and complex layouts limit coverage and effectiveness.
  • High coordination costs. Multiple contractors, complex scheduling, and oversight requirements (continuous monitoring, documentation, and compliance assurance across vendors and protocols) drive up costs.

Spot checks:

  • Low detection probability. Random sampling often fails to detect localized corrosion.
  • Data gaps. Paper records and inconsistent methods hinder lifecycle asset planning.
  • Reactive, not proactive. Problems are often discovered late, after damage has already occurred.

A smarter way forward

While traditional NDT methods for CUI, such as Pulsed Eddy Current (PEC) and Real-Time Radiography (RTR), remain valuable, robotic systems, sensors, and AI are transforming how CUI inspections are conducted.

These technologies reduce reliance on manual labor and enable broader, data-rich asset visibility for better planning and decision-making.

ARIX Technologies, for example, introduced pipe-climbing robotic systems capable of full-coverage inspections of insulated pipes without the need for insulation removal. Venus, ARIX’s pipe-climbing robot, delivers full 360° CUI data across both vertical and horizontal pipe circuits — without magnets, scaffolding, or insulation removal. It captures high-resolution visuals and Pulsed Eddy Current (PEC) data simultaneously, allowing operators to review inspection video and analyze corrosion insights in one integrated workflow. This streamlines data collection, speeds up analysis, and keeps personnel out of hazardous zones — making inspections faster, safer, and far more actionable.

These integrated technology platforms are driving measurable gains:

  • Autonomous grid scanning: Delivers structured, repeatable coverage across pipe surfaces for greater inspection consistency.
  • Integrated inspection portal: Combines PEC, RTR, and video into a unified 3D visualization, streamlining analysis across inspection teams.
  • Actionable insights: Enables more confident planning and risk forecasting through digital, shareable data—not siloed or static.

Real-world results

Petromax Refining adopted ARIX’s robotic inspection systems to modernize its CUI inspections, and its results were substantial and measurable:

  • Inspection time dropped from nine months to 39 days.
  • Costs were cut by 63% compared to traditional methods.
  • Scaffolding was reduced by 99%, cutting hazardous risks and labor demands.
  • Data accuracy improved, supporting more innovative maintenance planning.

Why the time is now

Energy operators face mounting pressure from all sides: aging infrastructure, constrained budgets, rising safety risks, and growing ESG expectations.

In the U.S., downstream operators are increasingly piloting drone and crawler solutions to automate inspection rounds in refineries, tank farms, and pipelines. Over 92% of oil and gas companies report that they are investing in AI or robotic technologies or have plans to invest soon to modernize operations.

The tools are here. The data is here. Smarter inspection is no longer aspirational — it’s operational. Petromax and others are showing what’s possible, and the case has been made: adopting smarter inspection is no longer a leap but a step forward.

---

Tyler Flanagan is director of service & operations at Houston-based ARIX Technologies.


Scientists warn greenhouse gas accumulation is accelerating and more extreme weather will come

Climate Report

Humans are on track to release so much greenhouse gas in less than three years that a key threshold for limiting global warming will be nearly unavoidable, according to a study released June 19.

The report predicts that society will have emitted enough carbon dioxide by early 2028 that crossing an important long-term temperature boundary will be more likely than not. The scientists calculate that by that point there will be enough of the heat-trapping gas in the atmosphere to create a 50-50 chance or greater that the world will be locked in to 1.5 degrees Celsius (2.7 degrees Fahrenheit) of long-term warming since preindustrial times. That level of gas accumulation, which comes from the burning of fuels like gasoline, oil and coal, is sooner than the same group of 60 international scientists calculated in a study last year.

“Things aren’t just getting worse. They’re getting worse faster,” said study co-author Zeke Hausfather of the tech firm Stripe and the climate monitoring group Berkeley Earth. “We’re actively moving in the wrong direction in a critical period of time that we would need to meet our most ambitious climate goals. Some reports, there’s a silver lining. I don’t think there really is one in this one.”

That 1.5 goal, first set in the 2015 Paris agreement, has been a cornerstone of international efforts to curb worsening climate change. Scientists say crossing that limit would mean worse heat waves and droughts, bigger storms and sea-level rise that could imperil small island nations. Over the last 150 years, scientists have established a direct correlation between the release of certain levels of carbon dioxide, along with other greenhouse gases like methane, and specific increases in global temperatures.

In Thursday's Indicators of Global Climate Change report, researchers calculated that society can spew only 143 billion more tons (130 billion metric tons) of carbon dioxide before the 1.5 limit becomes technically inevitable. The world is producing 46 billion tons (42 billion metric tons) a year, so that inevitability should hit around February 2028 because the report is measured from the start of this year, the scientists wrote. The world now stands at about 1.24 degrees Celsius (2.23 degrees Fahrenheit) of long-term warming since preindustrial times, the report said.
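The timing arithmetic is straightforward to reproduce. Here is a minimal sketch using the metric-ton figures quoted above, with the clock starting at the beginning of 2025 as the report assumes:

```python
from datetime import date, timedelta

# Back-of-the-envelope reproduction of the report's timing, using the
# figures quoted above (billion metric tons). Not the authors' own code.
remaining_budget_gt = 130   # remaining CO2 budget for a 50-50 chance of 1.5 C
annual_emissions_gt = 42    # current global CO2 emissions per year

years_left = remaining_budget_gt / annual_emissions_gt
exhaustion = date(2025, 1, 1) + timedelta(days=years_left * 365.25)
print(f"Budget lasts ~{years_left:.1f} years, exhausted around {exhaustion:%B %Y}")
# -> ~3.1 years, i.e., roughly February 2028, matching the report's estimate
```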

Earth's energy imbalance

The report, which was published in the journal Earth System Science Data, shows that the rate of human-caused warming has increased to nearly half a degree Fahrenheit (0.27 degrees Celsius) per decade, Hausfather said. And the imbalance between the heat Earth absorbs from the sun and the amount it radiates out to space, a key climate change signal, is accelerating, the report said.

“It's quite a depressing picture unfortunately, where if you look across the indicators, we find that records are really being broken everywhere,” said lead author Piers Forster, director of the Priestley Centre for Climate Futures at the University of Leeds in England. “I can't conceive of a situation where we can really avoid passing 1.5 degrees of very long-term temperature change.”

The increase in emissions from fossil-fuel burning is the main driver. But reduced particle pollution, which includes soot and smog, is another factor because those particles had a cooling effect that masked even more warming from appearing, scientists said. Changes in clouds also factor in. That all shows up in Earth’s energy imbalance, which is now 25% higher than it was just a decade or so ago, Forster said.

Earth’s energy imbalance “is the most important measure of the amount of heat being trapped in the system,” Hausfather said.

Earth keeps absorbing more and more heat than it releases. “It is very clearly accelerating. It’s worrisome,” he said.

Crossing the temperature limit

The planet temporarily passed the key 1.5 limit last year. The world hit 1.52 degrees Celsius (2.74 degrees Fahrenheit) of warming since preindustrial times for an entire year in 2024, but the Paris threshold is meant to be measured over a longer period, usually considered 20 years. Still, the globe could reach that long-term threshold in the next few years even if individual years haven't consistently hit that mark, because of how the Earth's carbon cycle works.

That 1.5 is “a clear limit, a political limit for which countries have decided that beyond which the impact of climate change would be unacceptable to their societies,” said study co-author Joeri Rogelj, a climate scientist at Imperial College London.

The mark is so important because once it is crossed, many small island nations could eventually disappear because of sea level rise, and scientific evidence shows that the impacts become particularly extreme beyond that level, especially hurting poor and vulnerable populations, he said. He added that efforts to curb emissions and the impacts of climate change must continue even if the 1.5 degree threshold is exceeded.

Crossing the threshold "means increasingly more frequent and severe climate extremes of the type we are now seeing all too often in the U.S. and around the world — unprecedented heat waves, extreme hot drought, extreme rainfall events, and bigger storms,” said University of Michigan environment school dean Jonathan Overpeck, who wasn't part of the study.

Andrew Dessler, a Texas A&M University climate scientist who wasn't part of the study, said the 1.5 goal was aspirational and not realistic, so people shouldn’t focus on that particular threshold.

“Missing it does not mean the end of the world,” Dessler said in an email, though he agreed that “each tenth of a degree of warming will bring increasingly worse impacts.”