What is the future of "the fifth utility"? Getty Images

Digital infrastructure is the dominant theme in energy and infrastructure, real estate and technology markets.

Data, the byproduct and primary value generated by digital infrastructure, is referred to as “the fifth utility,” along with water, gas, electricity and telecommunications. Data is created, aggregated, stored, transmitted, shared, traded and sold. Data requires data centers. Data centers require energy. The United States is home to approximately 40% of the world's data centers. The U.S. is set to lead the world in digital infrastructure advancement and has an opportunity to lead on energy for a very long time.

Data centers consume vast amounts of electricity due to their computational and cooling requirements. According to the United States Department of Energy, data centers consume “10 to 50 times the energy per floor space of a typical commercial office building.” Lawrence Berkeley National Laboratory issued a report in December 2024 stating that U.S. data center energy use reached 176 TWh by 2023, “representing 4.4% of total U.S. electricity consumption.” This percentage will increase significantly with near-term investment into high performance computing (HPC) and artificial intelligence (AI). The markets recognize the need for digital infrastructure build-out, and developers, engineers, investors and asset owners are responding at an incredible clip.
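
A quick back-of-the-envelope check (my own arithmetic, not LBNL's) shows those two figures are consistent with total U.S. consumption of roughly 4,000 TWh per year:

# Rough consistency check of the cited LBNL numbers (illustrative arithmetic only).
data_center_use_twh = 176      # reported 2023 U.S. data center consumption
reported_share = 0.044         # reported share of total U.S. electricity consumption
implied_total_twh = data_center_use_twh / reported_share
print(f"Implied total U.S. consumption: {implied_total_twh:,.0f} TWh")  # ~4,000 TWh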

However, the energy demands required to meet this digital load growth pose significant challenges to the U.S. power grid. Reliability and cost-efficiency have been, and will continue to be, two non-negotiable priorities of the legal, regulatory and quasi-regulatory regime overlaying the U.S. power grid.

Maintaining and improving reliability requires physical solutions. The grid must be perfectly balanced, with neither too little nor too much electricity at any given time. Specifically, new physical power generation and transmission projects (transmission being a topic worthy of its own article) must be built. To be sure, innovative financial products such as virtual power purchase agreements (VPPAs), hedges, environmental attributes, and other offtake strategies have been, and will continue to be, critical to growing the U.S. renewable energy markets and facilitating the energy transition, but the U.S. electrical grid needs to generate and move significantly more electrons to support the digital infrastructure transformation.

But there is now a third permanent priority: sustainability. New power generation over the next decade will include a mix of solar (large and small scale, offsite and onsite), wind and natural gas resources, with existing nuclear power, hydro, biomass, and geothermal remaining important in their respective regions.

Solar, in particular, will grow as a percentage of U.S. grid generation. The Solar Energy Industries Association (SEIA) reported that solar added 50 gigawatts of new capacity to the U.S. grid in 2024, “the largest single year of new capacity added to the grid by an energy technology in over two decades.” Solar is leading, as it can be flexibly sized and sited.

Under-utilized technology such as carbon capture, utilization and storage (CCUS) will become more prominent. Hydrogen may be a potential game-changer in the medium-to-long-term. Further, a nuclear power renaissance (conventional and small modular reactor (SMR) technologies) appears to be real, with recent commitments from some of the largest companies in the world, led by technology companies. Nuclear is poised to be a part of a “net-zero” future in the United States, also in the medium-to-long term.

The transition from fossil fuels to zero carbon renewable energy is well on its way – this is undeniable – and will continue, regardless of U.S. political and market cycles. Along with reliability and cost efficiency, sustainability has become a permanent third leg of the U.S. power grid stool.

Sustainability is now non-negotiable. Corporate renewable and low carbon energy procurement is strong. State renewable portfolio standards (RPS) and clean energy standards (CES) have established aggressive goals. Domestic manufacturing of the equipment deployed in the U.S. is growing meaningfully and in politically diverse regions of the country. Solar, wind and batteries are becoming less expensive. But, perhaps more importantly, the grid needs as much renewable and low carbon power generation as possible - not in lieu of gas generation, but as a growing pairing with gas and other technologies. This is not an “R” or “D” issue (as we say in Washington), and it's not an “either/or” issue; it's good business and a physical necessity.

As a result, solar, wind and battery storage deployment, in particular, will continue to accelerate in the U.S. These clean technologies will inevitably become more efficient as the buildout in the U.S. increases, investments continue and technology advances.

At some point in the future (it won’t be in the 2020s, it could be in the 2030s, but, more realistically, in the 2040s), the U.S. will have achieved the remarkable – a truly modern (if not entirely overhauled) grid dependent largely on a mix of zero and low carbon power generation and storage technology. And when this happens, it will have been due in large part to the clean technology deployment and advances over the next 10 to 15 years resulting from the current digital infrastructure boom.

---

Hans Dyke and Gabbie Hindera are lawyers at Bracewell. Dyke's experience includes transactions in the electric power and oil and gas midstream space, as well as transactions involving energy intensive industries such as data storage. Hindera focuses on mergers and acquisitions, joint ventures, and public and private capital market offerings.

Texans are facing extreme weather at every turn — can the grid withstand these events? Photo via heimdallpower.com

Can the Texas grid handle extreme weather conditions across regions?

Guest Column

From raging wildfires to dangerous dust storms and fierce tornadoes, Texans are facing extreme weather conditions at every turn across the state. Recently, thousands in the Texas Panhandle-South Plains lost power as strong winds ranging from 35 to 45 mph with gusts upwards of 65 mph blew through. Meanwhile, many North Texas communities are still reeling from tornadoes, thunderstorms, and damaging winds that occurred earlier this month.

A report from the National Oceanic and Atmospheric Administration found that Texas led the nation with the most billion-dollar weather and climate disasters in 2023, while a report from Texas A&M University researchers indicates Texas will experience twice as many 100-degree days, 30-50% more urban flooding and more intense droughts 15 years from now if present climate trends persist.

With extreme weather conditions increasing in Texas and nationally, recovering from these disasters will only become harder and costlier. When it comes to examining the grid’s capacity to withstand these volatile changes, we’re past due. As of now, the grid likely isn’t resilient enough to cope, but there is hope.

Where does the grid stand now?

Investments from utility companies have resulted in significant improvements, but ongoing challenges remain, especially as extreme weather events become more frequent. While the immediate fixes have improved reliability for the time being, they won't be enough to withstand continuous extreme weather events. Grid resiliency will require ongoing efforts rather than one-time Band-Aid approaches.

What can be done?

Transmission and distribution infrastructure improvements must vary geographically because each region of Texas faces a different set of hazards. This makes a one-size-fits-all solution impossible. We’re already seeing planning and investment in various regions, but sweeping action needs to happen responsibly and quickly to protect our power needs.

After investigators determined that the 2024 Smokehouse Creek fire (the largest wildfire in Texas history) was caused by a decayed utility pole breaking, questions arose about whether the Panhandle should invest more in wrapping poles with fire-retardant material or covering wires so they are less likely to spark.

In response, Xcel Energy (the Panhandle’s version of CenterPoint) filed its initial System Resiliency Plan with the Public Utility Commission of Texas, with proposed investments to upgrade and strengthen the electric grid and ensure electricity for about 280,000 homes and businesses in Texas. Tailored to the needs of the Texas Panhandle and South Plains, the $539 million resiliency plan will upgrade equipment’s fire resistance to better stand up to extreme weather and wildfires.

Oncor, whose territories include Dallas-Fort Worth and Midland-Odessa, analyzed more than two decades of weather damage data and the impact on customers to identify the priorities and investments needed across its service area. In response, it proposed investing nearly $3 billion to harden poles, replace old cables, install underground wires, and expand the company's vegetation management program.

What about Houston?

While installing underground wires in a city like Dallas makes for a good investment in grid resiliency, this is not a practical option in the more flood-prone areas of Southeast Texas like Houston. Burying power lines is incredibly expensive, and extended exposure to water from flood surges can still cause damage. Flood surges are also likely to seriously damage substations and transformers. When those components fail, there’s no power to run through the lines, buried or otherwise.

As part of its resiliency plan for the Houston metro area, CenterPoint Energy plans to invest $5.75 billion to strengthen the power grid against extreme weather. It represents the largest single grid resiliency investment in CenterPoint’s history and is currently the most expensive resiliency plan filed by a Texas electric utility. The proposal calls for wooden transmission structures to be replaced with steel or concrete. It aims to replace or strengthen 5,000 wooden distribution poles per year until 2027.

While some of our neighboring regions focus on fire resistance, others must invest heavily in strengthening power lines and replacing wooden poles. These solutions aim to address the same critical and urgent goal: creating a resilient grid that is capable of withstanding the increasingly frequent and severe weather events that Texans are facing.

The immediate problem at hand? These solutions take time, meaning we’re likely to encounter further grid instability in the near future.

---

Sam Luna is director at BKV Energy, where he oversees brand and go-to-market strategy, customer experience, marketing execution, and more.

Georg Rute, CEO of Gridraven, discusses the potential of AI and DLR. Photo via Getty Images

Energy expert: Unlocking the potential of the Texas grid with AI & DLR

guest column

From bitter cold and flash flooding to wildfire threats, Texas is no stranger to extreme weather, bringing up concerns about the reliability of its grid. Since the winter freeze of 2021, the state’s leaders and lawmakers have more urgently wrestled with how to strengthen the resilience of the grid while also supporting immense load growth.

As Maeve Allsup at Latitude Media pointed out, many of today’s most pressing energy trends are converging in Texas. In fact, a recent ERCOT report estimates that power demand will nearly double by 2030. This spike is the result of many large industries, including AI data centers, looking for power. To meet this growing demand, Texas has abundant natural gas, solar and wind resources, making it a focal point for the future of energy.

Several new initiatives are underway to modernize the grid, but the problem is that they take a long time to complete. While building new power generation facilities and transmission lines is necessary, these processes can take 10-plus years to finish. None of these approaches enables both significantly expanded power and the transmission capacity needed to deliver it in the near future.

Beyond “curtailment-enabled headroom”

A study released by Duke University highlighted the “extensive untapped potential” in U.S. power plants for powering up to 100 gigawatts of large loads “while mitigating the need for costly system upgrades.” In a nutshell: There’s enough generating capacity to meet peak demand, so it’s possible to add new loads as long as they’re not adding to the peak. New data centers must connect flexibly with limited on-site generation or storage to cover those few peak hours. This is what the authors mean by “load flexibility” and “curtailment-enabled headroom.”
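
A minimal numerical sketch of that idea, using made-up hourly load data rather than the Duke study's actual data or methodology, might look like the following:

import numpy as np

# Illustrative sketch of "curtailment-enabled headroom" (my reading of the idea,
# not the Duke study's methodology): how much constant new load could be served
# if that load is allowed to curtail during a small share of peak hours?

rng = np.random.default_rng(0)
hours = 8760
seasonal = 60_000 + 15_000 * np.sin(np.linspace(0, 2 * np.pi, hours))  # hypothetical MW profile
load = seasonal + rng.normal(0, 3_000, hours)                          # weather noise (made up)

peak_capacity = load.max() * 1.02     # assume the fleet is sized just above today's peak
curtail_share = 0.005                 # new load may curtail 0.5% of hours (the threshold cited above)

firm_headroom = peak_capacity - load.max()                 # if the new load can never curtail
threshold_load = np.quantile(load, 1 - curtail_share)      # load level exceeded only 0.5% of hours
flexible_headroom = peak_capacity - threshold_load         # if the new load curtails above that level

print(f"Firm headroom:     {firm_headroom:,.0f} MW")
print(f"Flexible headroom: {flexible_headroom:,.0f} MW (curtailing {curtail_share:.1%} of hours)")

Sizing the new load against the level exceeded only 0.5 percent of hours, rather than against the absolute peak, is what opens up the extra room; the transmission caveat that follows still applies.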

As I shared with POWER Magazine, while power plants do have significant untapped capacity, the transmission grid might not. The study doesn’t address transmission constraints that can limit power delivery where it’s needed. Congestion is a real problem already without the extra load and could easily wipe out a majority of that additional capacity.

To illustrate this point, think about where you would build a large data center. Next to a nuclear plant? A nuclear plant will already operate flat out and will not have any extra capacity. The “headroom” is available on average in the whole system, not at any single power plant. A peaking gas plant might indeed be idle most of the time, but not 99.5% of the time, the threshold the Duke authors highlight. Your data center would need to take the extra capacity from a number of plants, which may be hundreds of miles apart. The transmission grid might not be able to cope with it.

However, there is also additional headroom or untapped potential in the transmission grid itself that has not been used so far. Grid operators have not been able to maximize their grids because the technology has not existed to do so.

The problem with existing grid management and static line ratings

Traditionally, power lines are given a static rating throughout the year, which is calculated by assuming the worst possible cooling conditions of a hot summer day with no wind. This method leads to conservative capacity estimates and does not account for environmental factors that can impact how much power can actually flow through a line.

Take the wind-cooling effect, for example. Wind cools down power lines and can significantly increase the capacity of the grid. Even a slight wind blowing around four miles per hour can increase transmission line capacity by 30 percent through cooling.

That’s why dynamic line ratings (DLR) are such a useful tool for grid operators. DLR enables the assessment of individual spans of transmission lines to determine how much capacity they can carry under current conditions. On average, DLR increases capacity by a third, helping utilities sell more power while bringing down energy prices for consumers.
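
To make the wind-cooling effect concrete, here is a toy steady-state heat balance for a single conductor. The constants are illustrative choices of mine, not the IEEE 738 calculation that utilities and DLR vendors actually use; the point is only that the allowable current must balance net cooling, and convective cooling rises with wind speed.

import math

# Toy steady-state heat balance for one conductor span. Constants are illustrative
# assumptions, not real conductor or IEEE 738 parameters.

def ampacity(t_conductor_c, t_ambient_c, wind_ms, diameter_m=0.028, r_ohm_per_m=7e-5):
    dt = t_conductor_c - t_ambient_c                  # allowed temperature rise, deg C
    h = 10.0 + 4.0 * math.sqrt(wind_ms)               # rough convective film coefficient, W/m^2/K
    q_convective = h * math.pi * diameter_m * dt      # convective cooling, W per metre
    q_radiative = 2.0 * math.pi * diameter_m * dt     # linearized radiative cooling, W per metre
    q_solar = 15.0                                    # assumed constant solar heating, W per metre
    net_cooling = max(q_convective + q_radiative - q_solar, 0.0)
    return math.sqrt(net_cooling / r_ohm_per_m)       # current at which I^2 * R balances net cooling

static_style = ampacity(75, 40, wind_ms=0.0)   # static-rating assumption: hot day, no wind
light_breeze = ampacity(75, 40, wind_ms=1.8)   # roughly a 4 mph wind
print(f"No wind: {static_style:,.0f} A")
print(f"4 mph wind: {light_breeze:,.0f} A (+{light_breeze / static_style - 1:.0%})")

With these made-up constants, the light breeze lifts the toy rating by roughly a third, which is the scale of effect described above; real DLR systems do this span by span using measured or forecast weather.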

However, DLR is not yet widely used. The core problem is that weather models are not accurate enough for grid operators. Wind is very dependent on the detailed landscape, such as forests or hills, surrounding the power line. A typical weather forecast will tell you the average conditions in the 10 square miles around you, not the wind speed in the forest where the power line is. Without accurate wind data at every section, even a small portion of the line risks overheating unless the line is managed conservatively.

DLR solutions have been forced to rely on sensors installed on transmission lines to collect real-time weather measurements, which are then used to estimate line ratings. However, installing and maintaining hundreds of thousands of sensors is extremely time-consuming, if not practically infeasible.

The Elering case study

Last year, my company, Gridraven, tested our machine learning-powered DLR system, which uses an AI-enabled weather model, on 3,100 miles of 110-kilovolt and 330-kilovolt lines operated by Elering, Estonia’s transmission system operator, predicting ratings in 15,000 individual locations. The power lines run through forests and hills, where conventional forecasting systems cannot predict conditions with precision.

From September to November 2024, our average wind forecast accuracy saw a 60 percent improvement over existing technology, resulting in a 40 percent capacity increase compared to the traditional seasonal rating. These results were further validated against actual measurements on transmission towers.

This pilot not only demonstrated the power of AI solutions against traditional DLR systems but also their reliability in challenging conditions and terrain.

---

Georg Rute is the CEO of Gridraven, a software provider for Dynamic Line Ratings based on precision weather forecasting available globally. Prior to Gridraven, Rute founded Sympower, a virtual power plant, and was the head of smart grid development at Elering, Estonia's Transmission System Operator. Rute will be onsite at CERAWeek in Houston, March 10-14.

The views expressed herein are Rute's own. A version of this article originally appeared on LinkedIn.

Texas energy experts look ahead to what's in store for oil and gas in 2025. Photo via Getty Images

Experts reveal top 6 predictions for oil and gas industry in 2025

guest column

If you tune in to the popular national narrative, 2025 will be the year the oil and gas industry receives a big, shiny gift in the form of the U.S. presidential election.

President Donald Trump’s vocal support for the industry throughout his campaign has casual observers betting on a blissful new era for oil and gas. Already there are plans to lift the pause on LNG export permits and remove tons of regulatory red tape; the nomination of Chris Wright, chief executive of Liberty Energy, to lead the Department of Energy; and the new administration’s reported wide-ranging energy plan to boost gas exports and drilling — the list goes on.

While the outlook is positive in many of these areas, the perception of a “drill, baby, drill” bonanza masks a much more complicated reality. Oil and gas operators are facing a growing number of challenges, including intense pressure to reduce costs and boost productivity, and uncertainty caused by geopolitical factors such as the ongoing conflicts in the Middle East and Russia-Ukraine.

From our vantage point working with many of the country’s biggest operators and suppliers, we’re seeing activity that will have major implications for the industry — including the many companies based in and operating around Texas — in the coming year. Let’s dig in.

1. The industry’s cost crunch will continue — and intensify.
In 2024, oil and gas company leaders reported that rising costs and pressure to cut costs were two of the top three challenges they faced, according to a national Workrise-Newton X study that surveyed decision makers from operators and suppliers of all sizes. Respondents reported being asked to find an astonishing 40% to 60% reduction in supply chain-related costs across categories, on average.

Given the seemingly endless stream of geopolitical uncertainty (an expanded war in the Middle East, continued conflict after Russia’s invasion of Ukraine, and China’s flailing economy, for starters), energy companies are between a rock and a hard place when it comes to achieving cost savings from suppliers.

With lower average oil prices expected in 2025, expect the cost crunch to continue. That’s because today’s operators have only two levers they can rely on to drive an increase in shareholder returns: reducing costs and increasing well productivity. Historically, the industry could rely on a third lever: an increase in oil demand, which, combined with limited ability to meet that demand with supply, led to steadily increasing oil prices over time. But that is no longer the case.

2. The consolidation trend in oil and gas will continue, but its shape will change.
In the wake of the great oil and gas M&A wave of 2024, the number of deals will decrease — but the number of dollars spent will not. Fewer, larger transactions will be the face of consolidation in the coming year. Expect newly merged entities to spin off non-core assets, which will create opportunities for private equity to return to the space.

This will be the year the oil and gas industry becomes investable again, with potential for multiple expansions across the entire value chain — both the E&P and the service side. From what we’re hearing in the industry, expect twice as many startups in 2025 as there were this year.

With roughly the same dollar value of deals next year spread across fewer total transactions, there will be more scale, and more pressure from the top to push down service costs. This will lead to better service providers. But there will also be losers: the service providers that cannot scale with their large clients.

3. Refilling SPR will become a national priority.
The outgoing administration pulled about 300 million barrels out of the country’s Strategic Petroleum Reserve (SPR) during the early stages of the Russia-Ukraine conflict. In the coming year, replenishing those stores will be crucial.

There will be a steady buyer — the U.S. government — and it will reload the SPR to 600-plus million barrels. The government will be opportunistic, targeting the lowest price while taking care not to create too much imbalance in the supply-demand curve. A priority of the new administration will be to ensure they don’t create demand shocks, driving up prices for consumers while absorbing temporary oversupply that may occur due to seasonality (i.e. reduced demand in spring and fall).

The nation’s SPR was created following the 1973 oil embargo so that the U.S. has a cushion when there’s a supply disruption. With the current conflict in the Middle East continuing to intensify, the lessons learned in 1973 will be top of mind.

If OPEC+ moves from defending prices to defending market share, we can expect their temporary production cuts to come back on market over time, causing oversupply and a resulting dramatic drop in oil prices. The U.S. government could absorb the balance, defending U.S. exploration and production companies while defending our country's interest in energy security. Refilling the SPR could create a hedge, protecting the American worker from this oversupply scenario.

4. The environment and emissions will remain a priority, and the economic viability of carbon capture will take center stage.
Despite speculation to the contrary, there will be a continuation of conservation efforts and emissions reduction among the biggest operators. The industry is not going to say, “Things have changed in Washington, so we no longer care about the environment.”

But there will be a shift in focus from energy alternatives that have a high degree of difficulty and cost keeping pace with increasing energy demand (think solar and wind) to technologies that are adjacent to the oil and gas industry’s core competencies. This means the industry will go all in on carbon capture and storage (CCS) technologies, driven by both environmental concerns and operational benefits. This is already in motion with major players (EQT, Exxon, Chevron, Conoco and more) investing heavily in CCS capabilities.

As the world races to reach net-zero emissions by 2050, there will be a push for carbon capture to be economical and scalable — in part because of the need for CO2 for operations in the business. In the not-so-distant future, we believe some operators will be able to capture as much carbon as they're extracting from the earth.

5. The sharp rise in electricity demand to power AI data centers will rely heavily on natural gas.
Growth in technologies like generative AI and edge computing is expected to propel U.S. electricity demand to hit record highs in 2025 after staying flat for about two decades. This is a big national priority — President Trump has said we’ll need to more than double our electricity supply to lead the globe in artificial intelligence capabilities — and the urgent need for power will bring more investment in new natural gas infrastructure.

Natural gas is seen as a crucial “bridge fuel” in the energy transition. The U.S. became the world's top exporter of LNG in 2023 — and in the year ahead, brace for a huge push for pipeline infrastructure development, in the range of 10-15 Bcf per day of new pipeline capacity in the next two to three years. (Translation: development on a massive scale, akin to railway construction during the Industrial Revolution.)

Big operators have already been working on deals to use natural gas and carbon capture to power the tech industry; given the significant increase in the electricity transmission capabilities needed to support fast-growing technologies, there will continue to be big opportunities behind the meter.

6. Regulatory processes will become more efficient, not less stringent.
This year will bring a focus on streamlining and aligning regulations, rather than on wholesale rollbacks. It’s not carte blanche for the industry to do whatever it wants, but rather a very aggressive challenge to the things that are holding operators back.

Historically, authorities have stacked regulation upon regulation and, as new problems arise, added even more regulations on top. There will be a very deliberate effort this year to challenge the regulations currently in place, to make sure they are aligned and not just stacked.

The new administration is signaling that it will be deliberate about regulation matching intent. It will examine whether particular policies should be retained, reconfigured, or realigned with the industry to enable growth while still protecting the environment.

Easing the regulatory environment will enable savings, lower project costs and faster timelines for bringing projects online. Another benefit of regulatory certainty: it will make large capital project financing more readily available. We’ve seen major gridlock in large project financing due to a lack of trust in the regulatory environment and the potential for rules to change mid-project (see: Keystone XL). If they are certain the new administration will be supportive of projects that are viable and meet regulatory requirements, companies will once again be able to obtain the financing needed to accelerate development and commissioning of those projects.

But we shouldn’t mistake a new era of regulatory certainty for a regulatory free-for-all. Take LNG permits. They should be accelerated — but don’t expect a reduction in the actual level of environmental protection as a result. It currently takes 18 months to get a single permit to drill a well on federal land. It should take three weeks. Before 2020, it took about a month to obtain a federal permit.

2025 will be the year we begin to return to regulatory efficiency without sacrificing the protections the rules and policies set out to accomplish in the first place.

---

Adam Hirschfeld and Jacob Gritte are executives at Austin-based Workrise, the leading labor provider and source-to-pay solution for energy companies throughout Texas and beyond.


The insurance crisis is reverberating across the nation. Photograph by Geoffrey George/Getty Images

Capitalism and climate: How financial shifts will shape our behavior

guest column

I never imagined I would see Los Angeles engulfed in flames in this way in my lifetime. As someone who has devoted years to studying climate science and advocating for climate technology solutions, I'm still caught off guard by the immediacy of these disasters. A part of me wants to believe the intensifying hurricanes, floods, and wildfires are merely an unfortunate string of bad luck. Whether through misplaced optimism or a subconscious shield of denial, I hadn't fully processed that these weren't just harbingers of a distant future, but our present reality. The recent fires have shattered that denial, bringing to mind the haunting prescience of the movie Don't Look Up. Perhaps we aren't as wise as we fancy ourselves to be.

The LA fires aren't an isolated incident. They're part of a terrifying pattern: the Canadian wildfires that darkened our skies, the devastating floods in Spain and Pakistan, and the increasingly powerful hurricanes in the Gulf. A stark new reality is emerging for climate-vulnerable cities, and whether we acknowledge the underlying crisis or not, climate change is making its presence felt – not just in death and destruction, but in our wallets.

The insurance industry, with its cold actuarial logic, is already responding. Even before the recent LA fires, major insurers like State Farm and Allstate had stopped writing new home policies in California, citing unmanageable wildfire risks. In the devastated Palisades area, 70% of homes had lost their insurance coverage before disaster struck. While some homeowners may have enrolled in California's limited FAIR plan, others likely went without coverage. Now, the FAIR plan faces $5.9 billion in potential claims, far exceeding its reinsurance backup – a shortfall that promises delayed payments and costlier coverage.

The insurance crisis is reverberating across the nation, and Houston sits squarely in its path. As a city all too familiar with the destructive power of extreme weather, we're experiencing our own reckoning. The Houston Chronicle recently reported that local homeowners are paying $3,740 annually for insurance – nearly triple the national average and 60% higher than the Texas state average. Our region isn't just listed among the most expensive areas for home insurance; it's identified as one of the most vulnerable to climate hazards.

For Houston homeowners, Hurricane Harvey taught us a harsh lesson: flood zones are merely suggestions, not guarantees. The next major hurricane won't respect the city's floodplain designations. This reality poses a sobering question: Would you risk having your largest asset – your home – uninsured when flooding becomes increasingly likely in the next decade or two?

For most Americans, home equity represents one of the largest components of household wealth, a crucial stepping stone to financial security and generational advancement. Insurance isn't just about protecting physical property; it's about preserving the foundation of middle-class economic stability. When insurance becomes unavailable or unaffordable, it threatens the very basis of financial security for millions of families.

The insurance industry's retreat from vulnerable markets – as evidenced by Progressive and Foremost Insurance's withdrawal from writing new policies in Texas – is more than a business decision. It's a market signal. These companies are essentially pricing in the reality of climate change, whether we choose to call it that or not.

What we're witnessing is the market beginning to price us out of areas where we've either built unsustainably or perhaps should never have built at all. This isn't just about insurance rates; it's about the future viability of entire communities and regional economies. The invisible hand of the market is doing what political will has failed to do: forcing us to confront the true costs of our choices in a warming world.

Insurance companies aren't the only ones sounding the alarm. Lenders and investors are quietly rewriting the rules of capital access based on climate risk. Banks are adjusting mortgage terms and raising borrowing costs in vulnerable areas, while major investment firms are factoring carbon intensity into their lending decisions. Companies with higher environmental risks have faced higher loan spreads and borrowing costs – a trend that's accelerating as climate impacts intensify. This financial reckoning is creating a new economic geography, where access to capital increasingly depends on climate resilience.

The insurance crisis is the canary in the coal mine, warning us of the systemic risks ahead. As actuaries and risk managers factor climate risks into their models, we're seeing the beginning of a profound economic shift that will ripple far beyond housing, affecting businesses, agriculture, and entire regional economies. The question isn't whether we'll adapt to this new reality, but how much it will cost us – in both financial and human terms – before we finally act.

---

Nada Ahmed is the founding partner at Houston-based Energy Tech Nexus.

How has the Texas grid improved since Winter Storm Uri in 2021? Getty Images

Being prepared: Has the Texas grid been adequately winterized?

Winter in Texas

Houstonians may feel anxious as the city and state brace for additional freezing temperatures this winter. Every year since 2021’s Winter Storm Uri, Texans wonder whether the grid will keep them safe in the face of another winter weather event. The record-breaking cold temperatures of Uri exposed a crucial vulnerability in the state’s power and water infrastructure.

ERCOT’s six-day supply and demand forecast from January 3, 2025, projected ample generation capacity to meet the needs of Texans during the most recent period of colder weather. So why did the grid fail so spectacularly in 2021?

  1. Demand for electricity surged as millions of people tried to heat their homes.
  2. ERCOT was simply not prepared, despite previous winter storms of similar intensity that could have offered lessons.
  3. The state was highly dependent on un-winterized natural gas power plants for electricity.
  4. The Texas grid is isolated from other states.
  5. Failures of communication and coordination between ERCOT, state officials, utility companies, gas suppliers, electricity providers, and power plants contributed to the devastating outages.

The domino effect resulted in power outages for millions of Texans, hundreds of deaths, and billions of dollars in damages, with some households going nearly a week without heat, power, and water. This catastrophe highlighted the need for swift and sweeping upgrades and protections against future extreme weather events.

Texas State Legislature Responds

Within weeks of the storm, Texas lawmakers introduced and passed legislation aimed at upgrading the state’s power infrastructure and preventing repeated failures. Senate Bill 3 (SB3) measures included:

  • Requirements to weatherize gas supply chain and pipeline facilities that sell electric energy within ERCOT.
  • The ability to impose penalties of up to $1 million for violation of these requirements.
  • Requirement for ERCOT to procure new power sources to ensure grid reliability during extreme heat and extreme cold.
  • Designation of specific natural gas facilities that are critical for power delivery during energy emergencies.
  • Development of an alert system that is to be activated when supply may not be able to meet demand.
  • Requirement for the Public Utility Commission of Texas, or PUCT, to establish an emergency wholesale electricity pricing program.

Texas Weatherization by Natural Gas Plants

A Railroad Commission of Texas document published in May 2024 and geared toward gas supply chain and pipeline facilities outlined dozens of weatherization best practices and approaches intended to prevent another crisis driven by severe winter weather.

Some solutions included:

  • Installation of insulation on critical components of a facility.
  • Construction of permanent or temporary windbreaks, housing, or barriers around critical equipment to reduce the impact of windchill.
  • Guidelines for the removal of ice and snow from critical equipment.
  • Instructions for the use of temporary heating systems, such as heating blankets, catalytic heaters, or fuel line heaters, on localized freezing problems.

According to Daniel Cohan, professor of environmental engineering at Rice University, power plants across Texas have installed hundreds of millions of dollars worth of weatherization upgrades to their facilities. In ERCOT’s January 2022 winterization report, it stated that 321 out of 324 electricity generation units and transmission facilities fully passed the new regulations.

Is the Texas Grid Adequately Winterized?

Utilities, power generators, ERCOT, and the PUCT have all made changes to their operations and facilities since 2021 to be better prepared for extreme winter weather. Are these changes enough? Has the Texas grid officially been winterized?

This season, as winter weather tests Texans, residents may experience localized outages. When tree branches cannot support the weight of the ice, they can snap and knock out power lines to neighborhoods across the state. In the event of a downed power line, we must rely on regional utilities to act quickly to restore power.

The legislation enacted by the Texas state government in response to the 2021 disaster holds the relevant parties responsible for doing their part to winterize the Texas grid.

---

Sam Luna is director at BKV Energy, where he oversees brand and go-to-market strategy, customer experience, marketing execution, and more.

This article first appeared on our sister site, InnovationMap.com.


UH's $44 million mass timber building slashed energy use in first year

building up

The University of Houston recently completed assessments on year one of the first mass timber project on campus, and the results show it has had a major impact.

Known as the Retail, Auxiliary, and Dining Center, or RAD Center, the $44 million building showed an 84 percent reduction in predicted energy use intensity, a measure of how much energy a building uses relative to its size, compared to similar buildings. Its Global Warming Potential rating, a ratio determined by the Intergovernmental Panel on Climate Change, shows a 39 percent reduction compared to the benchmark for other buildings of its type.
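
For reference, EUI is simply annual energy use divided by gross floor area; the short sketch below shows what an 84 percent reduction means in those terms, using a hypothetical benchmark rather than UH's or Perkins & Will's actual figures.

# Hypothetical illustration of an EUI comparison; the benchmark value is made up.
floor_area_sqft = 41_000                      # RAD Center size, from the article
benchmark_eui = 120.0                         # hypothetical benchmark, kBtu per sq ft per year
rad_center_eui = benchmark_eui * (1 - 0.84)   # the reported 84% reduction applied to that benchmark
annual_energy_kbtu = rad_center_eui * floor_area_sqft
print(f"Predicted EUI: {rad_center_eui:.1f} kBtu/sq ft/yr")
print(f"Implied annual energy use: {annual_energy_kbtu:,.0f} kBtu")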

In comparison to similar structures, the RAD Center saved the equivalent of taking 472 gasoline-powered cars driven for one year off the road, according to architecture firm Perkins & Will.

The RAD Center was created in alignment with the AIA 2030 Commitment to carbon-neutral buildings, designed by Perkins & Will and constructed by Houston-based general contractor Turner Construction.

Perkins & Will’s work reduced the building's carbon footprint by incorporating lighter mass timber structural systems, which allowed the RAD Center to reuse the foundation, columns and beams of the building it replaced. Reused elements account for 45 percent of the RAD Center’s total mass, according to Perkins & Will.

Mass timber is considered a sustainable alternative to steel and concrete construction. The RAD Center, a 41,000-square-foot development, replaced the once popular Satellite, which was a food, retail and hangout center for students on UH’s campus near the Science & Research Building 2 and the Jack J. Valenti School of Communication.

The RAD Center uses more than a million pounds of timber, which can store over 650 metric tons of CO2. Aesthetically, the building complements the surrounding campus woodlands and offers students a view both inside and out.
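
That storage figure is roughly consistent with generic wood-carbon factors; here is a back-of-the-envelope check using approximate conversion factors rather than project data.

# Rough check of the CO2 figure, using generic wood-carbon factors (not project data).
timber_lb = 1_000_000
timber_tonnes = timber_lb * 0.4536 / 1000     # pounds to metric tons
carbon_fraction = 0.5                         # dry wood is roughly half carbon by mass
co2_per_tonne_c = 44 / 12                     # molecular-weight ratio of CO2 to carbon
co2_tonnes = timber_tonnes * carbon_fraction * co2_per_tonne_c
print(f"Implied CO2 stored: ~{co2_tonnes:,.0f} metric tons")  # ~830 t on a dry basis; "over 650" leaves room for moisture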

“Spaces are designed to create a sense of serenity and calm in an ecologically-minded environment,” Diego Rozo, a senior project manager and associate principal at Perkins & Will, said in a news release. “They were conceptually inspired by the notion of ‘unleashing the senses’ – the design celebrating different sights, sounds, smells and tastes alongside the tactile nature of the timber.”

In addition to its mass timber design, the building was also part of an Energy Use Intensity (EUI) reduction effort. It features high-performance insulation and barriers, natural light to illuminate a building's interior, efficient indoor lighting fixtures, and optimized equipment, including HVAC systems.

The RAD Center officially opened Phase I in Spring 2024. The third and final phase of construction is scheduled for this summer, with a planned opening set for the fall.

Rice researchers' quantum breakthrough could pave the way for next-gen superconductors

new findings

A new study from researchers at Rice University, published in Nature Communications, could lead to future advances in superconductors with the potential to transform energy use.

The study revealed that electrons in strange metals, which exhibit unusual resistance to electricity and behave strangely at low temperatures, become more entangled at a specific tipping point, shedding new light on these materials.

A team led by Rice’s Qimiao Si, the Harry C. and Olga K. Wiess Professor of Physics and Astronomy, used quantum Fisher information (QFI), a concept from quantum metrology, to measure how electron interactions evolve under extreme conditions. The research team also included Rice’s Yuan Fang, Yiming Wang, Mounica Mahankali and Lei Chen along with Haoyu Hu of the Donostia International Physics Center and Silke Paschen of the Vienna University of Technology. Their work showed that the quantum phenomenon of electron entanglement peaks at a quantum critical point, which is the transition between two states of matter.

“Our findings reveal that strange metals exhibit a unique entanglement pattern, which offers a new lens to understand their exotic behavior,” Si said in a news release. “By leveraging quantum information theory, we are uncovering deep quantum correlations that were previously inaccessible.”

The researchers examined a theoretical framework known as the Kondo lattice, which explains how magnetic moments interact with surrounding electrons. At a critical transition point, these interactions intensify to the extent that the quasiparticles—key to understanding electrical behavior—disappear. Using QFI, the team traced this loss of quasiparticles to the growing entanglement of electron spins, which peaks precisely at the quantum critical point.
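
For context, work in this area typically relates the quantum Fisher information of a spin operator to the dynamical spin susceptibility through the standard quantum-metrology relation (a result from the broader literature, cited here as background rather than quoted from the Rice paper):

f_Q(T) = \frac{8}{\pi} \int_0^{\infty} \mathrm{d}\omega \, \tanh\!\left(\frac{\hbar\omega}{2 k_B T}\right) \chi''(\omega, T)

where \chi''(\omega, T) is the dissipative part of the dynamical susceptibility of the probed operator; QFI values exceeding certain bounds witness multipartite entanglement among the electron spins.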

In terms of future use, the materials share a close connection with high-temperature superconductors, which have the potential to transmit electricity without energy loss, according to the researchers. By unlocking their properties, researchers believe this could revolutionize power grids and make energy transmission more efficient.

The team also found that quantum information tools can be applied to other “exotic materials” and quantum technologies.

“By integrating quantum information science with condensed matter physics, we are pivoting in a new direction in materials research,” Si said in the release.

Oxy subsidiary granted landmark EPA permits for carbon capture facility

making progress

Houston’s Occidental Petroleum Corp., or Oxy, and its subsidiary 1PointFive announced that the U.S. Environmental Protection Agency approved its Class VI permits to sequester carbon dioxide captured from its STRATOS Direct Air Capture (DAC) facility near Odessa. These are the first such permits issued for a DAC project, according to a news release.

The $1.3 billion STRATOS project, which 1PointFive is developing through a joint venture with investment manager BlackRock, is designed to capture up to 500,000 metric tons of CO2 annually and is expected to begin commercial operations this year. DAC technology pulls CO2 from the air at any location, not just where carbon dioxide is emitted. Major companies, such as Microsoft and AT&T, have secured carbon removal credit agreements through the project.

The permits are issued under the Safe Drinking Water Act's Underground Injection Control program. The captured CO2 will be stored in geologic formations more than a mile underground, meeting the EPA’s review standards.

“This is a significant milestone for the company as we are continuing to develop vital infrastructure that will help the United States achieve energy security,” Vicki Hollub, Oxy president and CEO, said in a news release. “The permits are a catalyst to unlock value from carbon dioxide and advance Direct Air Capture technology as a solution to help organizations address their emissions or produce vital resources and fuels.”

Additionally, Oxy and 1PointFive announced the signing of a 25-year offtake agreement for 2.3 million metric tons of CO2 per year from CF Industries’ upcoming Bluepoint low-carbon ammonia facility in Ascension Parish, Louisiana.

The captured CO2 will be transported to and stored at 1PointFive’s Pelican Sequestration Hub, which is currently under development. Eventually, 1PointFive’s Pelican hub in Louisiana will include infrastructure to safely and economically sequester industrial emissions in underground geologic formations, similar to the STRATOS project.

“CF Industries’ and its partners' confidence in our Pelican Sequestration Hub is a validation of our expertise managing carbon dioxide and how we collaborate with industrial organizations to become their commercial sequestration partner,” Jeff Alvarez, President of 1PointFive Sequestration, said in a news release.

1PointFive aims to store up to 20 million tons of CO2 per year, according to the company.

“By working together, we can unlock the potential of American manufacturing and energy production, while advancing industries that deliver high-quality jobs and economic growth,” Alvarez said in a news release.