Researchers have secured $3.3 million in funding to develop an AI-powered subsurface sensing system aimed at improving the safety and efficiency of underground power line installation. Photo via Getty Images

Researchers from the University of Houston — along with a Hawaii-based company — have received $3.3 million in funding to explore an artificial intelligence-backed subsurface sensing system for safe and efficient underground power line installation.

Houston's power lines are above ground, but studies show underground power is more reliable. Installing underground power lines is costly and disruptive, but the U.S. Department of Energy, in an effort to find a solution, has put $34 million into its new GOPHURRS program, which stands for Grid Overhaul with Proactive, High-speed Undergrounding for Reliability, Resilience, and Security. The funding has been distributed across 12 projects in 11 states.

“Modernizing our nation’s power grid is essential to building a clean energy future that lowers energy costs for working Americans and strengthens our national security,” U.S. Secretary of Energy Jennifer M. Granholm says in a DOE press release.

UH and Hawaii-based Oceanit are behind one of the funded projects, entitled “Artificial Intelligence and Unmanned Aerial Vehicle Real-Time Advanced Look-Ahead Subsurface Sensor.”

The researchers are looking at developing a subsurface sensing system for underground power line installation, potentially using machine learning, electromagnetic resistivity well logging, and drone technology to predict and sense obstacles to installation.

Jiefu Chen, associate professor of electrical and computer engineering at UH, is a key collaborator on the project, focusing on electromagnetic antennas installed on UAVs and the HDD drill string. He's working with Yueqin Huang, assistant professor of information science technology, who leads the geophysical signal processing, and Xuqing Wu, associate professor of computer information systems, who is responsible for integrating machine learning.

“Advanced subsurface sensing and characterization technologies are essential for the undergrounding of power lines,” says Chen in the release. “This initiative can enhance the grid's resilience against natural hazards such as wildfires and hurricanes.”

“If proven successful, our proposed look-ahead subsurface sensing system could significantly reduce the costs of horizontal directional drilling for installing underground utilities,” Chen continues. “Promoting HDD offers environmental advantages over traditional trenching methods and enhances the power grid’s resilience.”
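
For readers curious what "look-ahead" sensing involves in practice, here is a deliberately simplified sketch: it scans a synthetic resistivity log along a planned bore path and flags anomalies that could indicate a buried obstacle. The synthetic log, the obstacle location, and the simple threshold rule are all invented for illustration; the UH-Oceanit system, which pairs UAV- and drill-string-mounted electromagnetic antennas with machine learning, is far more sophisticated.

```python
# Illustrative only: a toy "look-ahead" obstacle flagger on a synthetic
# apparent-resistivity log. All values below are hypothetical.
import numpy as np

rng = np.random.default_rng(1)

# Synthetic apparent-resistivity readings (ohm-m) along the planned bore path,
# with a buried conductive object (e.g., a metal pipe) around 60-63 m.
distance_m = np.arange(0, 100, 0.5)
resistivity = 80.0 + rng.normal(0.0, 2.0, distance_m.size)
resistivity[(distance_m > 60) & (distance_m < 63)] = 15.0  # hypothetical obstacle

def flag_obstacles(log, window=20, z_threshold=4.0):
    """Flag samples that deviate strongly from the trailing background."""
    flags = np.zeros(log.size, dtype=bool)
    for i in range(window, log.size):
        background = log[i - window:i]
        z = abs(log[i] - background.mean()) / (background.std() + 1e-9)
        flags[i] = z > z_threshold
    return flags

hits = distance_m[flag_obstacles(resistivity)]
print("Possible obstacles near (m):", np.unique(np.round(hits)))
```

In a real deployment, flagged locations would be reported to the drilling crew in real time so the bore path could be adjusted before the obstacle is reached.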

University of Houston professor Xiaonan Shan and the rest of his research team are celebrating fresh funding from a federal grant. Photo via UH.edu

Houston scientists land $1M NSF funding for AI-powered clean energy project

A team of scientists from the University of Houston, in collaboration with Howard University in Washington D.C., has received a $1 million award from the National Science Foundation for a project that aims to automate the discovery of new clean-energy catalysts.

The project, dubbed "Multidisciplinary High-Performance Computing and Artificial Intelligence Enabled Catalyst Design for Micro-Plasma Technologies in Clean Energy Transition," aims to use machine learning and AI to improve the efficiency of catalysts in hydrogen generation, carbon capture and energy storage, according to UH.

“This research directly contributes to these global challenges,” Jiefu Chen, the principal investigator of the project and associate professor of electrical and computer engineering, said in a statement. “This interdisciplinary effort ensures comprehensive and innovative solutions to complex problems.”

Chen is joined by Lars Grabow, professor of chemical and biomolecular engineering; Xiaonan Shan, associate professor of electrical and computer engineering; and Xuqing Wu, associate professor of information science technology. Su Yan, an associate professor of electrical engineering and computer science at Howard University, is collaborating on the project.

The University of Houston team: Xiaonan Shan, associate professor of electrical and computer engineering; Jiefu Chen, associate professor of electrical and computer engineering; Lars Grabow, professor of chemical and biomolecular engineering; and Xuqing Wu, associate professor of information science technology. Photo via UH.edu

The team will create a robotic synthesis and testing facility that automates the experimental testing and verification steps of catalyst design, a process that has traditionally been slow. The facility will apply AI and advanced unsupervised machine learning techniques, with a special focus on plasma reactions (a brief illustrative sketch of this kind of ML-guided screening follows the list below).

The project has four main focuses, according to UH.

  1. Using machine learning to discover materials for plasma-assisted catalytic reactions
  2. Developing a model to simulate complex interactions to better understand microwave-plasma-assisted heating
  3. Designing catalyst supports for efficient microwave-assisted reactions
  4. Developing a bench-scale reactor to demonstrate the efficiency of the catalyst support system
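
To make the automated-discovery idea more concrete, below is a hypothetical sketch of a closed-loop, ML-guided catalyst search written as a simple Bayesian-optimization-style active-learning loop. The two descriptors, the robot_test stub standing in for a robotic synthesis-and-test cycle, and every number are invented for illustration; this is not the UH-Howard team's workflow.

```python
# Hypothetical sketch of ML-guided, closed-loop catalyst screening.
# Descriptors, the robot_test stub, and all numbers are invented.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(42)

# Candidate catalysts described by two toy descriptors
# (say, a binding-energy proxy and a support property), scaled to [0, 1].
candidates = rng.uniform(0.0, 1.0, size=(200, 2))

def robot_test(descriptor):
    """Stand-in for one automated synthesis-and-test cycle: returns a
    made-up activity score for a plasma-assisted reaction."""
    x, s = descriptor
    return float(np.exp(-((x - 0.6) ** 2 + (s - 0.3) ** 2) / 0.05)
                 + rng.normal(0.0, 0.02))

# Seed the loop with a few random experiments, then let a surrogate model
# pick which candidate to test next (simple explore/exploit acquisition).
tested = list(rng.choice(len(candidates), size=5, replace=False))
scores = [robot_test(candidates[i]) for i in tested]

surrogate = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
for _ in range(20):
    surrogate.fit(candidates[tested], scores)
    mean, std = surrogate.predict(candidates, return_std=True)
    mean[tested] = -np.inf                       # don't repeat experiments
    next_idx = int(np.argmax(mean + 0.5 * std))  # favor promising + uncertain
    tested.append(next_idx)
    scores.append(robot_test(candidates[next_idx]))

best = tested[int(np.argmax(scores))]
print("Best candidate descriptor:", candidates[best], "score:", max(scores))
```

The appeal of closing the loop this way is that each robotic experiment is chosen by the model rather than by an exhaustive sweep, which is what makes traditionally slow testing campaigns faster.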

Additionally, the team will put the funding toward the development of a multidisciplinary research and education program that will train students on using machine learning for topics like computational catalysis, applied electromagnetics and material synthesis. The team is also looking to partner with industry on related projects.

“This project will help create a knowledgeable and skilled workforce capable of addressing critical challenges in the clean energy transition,” Grabow added in a statement. “Moreover, this interdisciplinary project is going to be transformative in that it advances insights and knowledge that will lead to tangible economic impact in the not-too-far future.”

This spring, UH launched a new micro-credential course focused on other applications for AI and robotics in the energy industry.

Around the same time, renowned Microsoft co-founder Bill Gates spoke at CERAWeek to a standing-room-only crowd on the future of the industry. Gates, who also founded Breakthrough Energy, addressed the topic of AI.

ExxonMobil and Intel are working to design, test, research and develop new liquid cooling technologies to optimize data center performance and help customers meet their sustainability goals. Photo via Getty Images

ExxonMobil, Intel eye sustainable solutions within data center innovation

the view from heti

Two multinational corporations have announced a new collaboration to create energy-efficient and sustainable solutions for data centers as the market experiences significant growth.

ExxonMobil and Intel are working to design, test, research and develop new liquid cooling technologies to optimize data center performance and help customers meet their sustainability goals. Liquid cooling solutions serve as an alternative to traditional air-cooling methods in data centers.

“Our partnership with ExxonMobil to co-develop turnkey solutions for liquid cooling will enable significant energy and water savings for data center and network deployments,” said Jen Huffstetler, Chief Product Sustainability Officer, Intel.

According to consulting firm McKinsey, “a hyperscaler’s data center can use as much power as 80,000 households do,” and that demand is expected to keep surging. Power consumption by the U.S. data center market is forecast “to reach 35 gigawatts (GW) by 2030, up from 17 GW in 2022,” according to a McKinsey analysis. Artificial intelligence, machine learning, and other advanced computing techniques are increasing computational workloads and, in turn, electricity demand. Therefore, companies are searching for solutions to support this growth.

ExxonMobil launched its full portfolio of data center immersion fluid products last year. The partnership with Intel will allow the company to further advance its efforts in this market.

“By integrating ExxonMobil’s proven expertise in liquid cooling technologies with Intel’s long legacy of industry leadership in world-changing computing technologies, together we will further the industry’s adoption and acceptance as it transitions to liquid cooling technologies,” said Sarah Horne, Vice President, ExxonMobil.

Learn more about this collaboration here.

———

This article originally ran on the Greater Houston Partnership's Houston Energy Transition Initiative blog. HETI exists to support Houston's future as an energy leader. For more information about the Houston Energy Transition Initiative, EnergyCapitalHTX's presenting sponsor, visit htxenergytransition.org.

The UH team is developing ways to use machine learning to ensure that power systems can continue to run efficiently when pulling their energy from wind and solar sources. Photo via Getty Images

Houston researcher wins competitive NSF award for work tying machine learning to the power grid

grant funding

An assistant professor at the University of Houston received the highly competitive National Science Foundation CAREER Award earlier this month for a proposal focused on integrating renewable resources to improve power grids.

The award grants more than $500,000 to Xingpeng Li, assistant professor of electrical and computer engineering and leader of the Renewable Power Grid Lab at UH, to continue his work on developing ways to use machine learning to ensure that power systems can continue to run efficiently when pulling their energy from wind and solar sources, according to a statement from UH. This work has applications in the event of large disturbances to the grid.

Li explains that power grids currently ride through disturbances on stored kinetic energy that is converted into electrical energy.

"For example, when the grid experiences sudden large generation losses or increased electrical loads, the stored kinetic energy [is] immediately converted to electrical energy and address[es] the temporary shortfall in generation,” Li said in a statement. “However, as the proportion of wind and solar power increases in the grid, we want to maximize their use since their marginal costs are zero and they provide clean energy. Since we reduce the use of those traditional generators, we also reduce the power system inertia (or stored kinetic energy) substantially.”
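
A small worked example shows why that lost inertia matters. For an aggregate system, the swing equation implies that immediately after a generation loss the frequency falls at a rate of roughly f0·ΔP/(2H), where H is the system's inertia constant. The snippet below uses hypothetical numbers (not UH's models) to compare that initial rate of change of frequency at different inertia levels.

```python
# Illustrative arithmetic only (hypothetical numbers, not UH's model):
# initial rate of change of frequency (ROCOF) after a sudden generation loss,
# from the aggregate swing equation  2H * d(delta_f / f0)/dt = -delta_P.
def initial_rocof(inertia_h_s: float, power_loss_pu: float, f0: float = 60.0) -> float:
    """ROCOF in Hz/s right after losing power_loss_pu (per unit of system load)."""
    return -f0 * power_loss_pu / (2.0 * inertia_h_s)

for h in (5.0, 3.0, 1.5):  # aggregate inertia constant H, in seconds
    print(f"H = {h:.1f} s -> ROCOF = {initial_rocof(h, 0.03):.2f} Hz/s")
# Lower H (less stored kinetic energy) means frequency falls faster after the
# same 3% loss, leaving less time for reserves and controls to respond.
```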

Li plans to use machine learning to create more streamlined models that can be incorporated into the day-ahead scheduling applications that grid operators currently use.

“With the proposed new modeling and computational approaches, we can better manage grids and ensure it can supply continuous quality power to all the consumers," he said.
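
As a purely illustrative sketch of how a learned model might slot into day-ahead scheduling (not Li's actual formulation), a common pattern is to train a fast surrogate that maps features of a candidate dispatch to a stability metric, such as the post-contingency frequency nadir, and then use that surrogate as a quick screening check inside the scheduling loop instead of a full dynamic simulation. The features, the toy labels, and the nadir limit below are all made up.

```python
# Sketch of the general idea only: learn a fast surrogate that maps a
# candidate day-ahead dispatch to a stability metric, then use it to screen
# schedules. Features, labels, and limits are invented for illustration.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Hypothetical training data: per-dispatch features (aggregate inertia in s,
# renewable share, largest contingency in per unit) and a frequency nadir in
# Hz that would normally come from offline dynamic simulations.
X = rng.uniform([2.0, 0.1, 0.01], [6.0, 0.7, 0.05], size=(500, 3))
y = 60.0 - 20.0 * X[:, 2] * (1.0 + X[:, 1]) / X[:, 0] + rng.normal(0, 0.01, 500)

surrogate = GradientBoostingRegressor().fit(X, y)

def dispatch_is_secure(features, nadir_limit_hz=59.7):
    """Fast check usable inside a day-ahead scheduling loop."""
    predicted_nadir = surrogate.predict(np.atleast_2d(features))[0]
    return predicted_nadir >= nadir_limit_hz

print(dispatch_is_secure([3.0, 0.6, 0.04]))  # low inertia, large contingency
print(dispatch_is_secure([5.5, 0.3, 0.02]))  # high inertia, small contingency
```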

In addition to supporting Li's research and model creations, the funds will also go toward Li and his team's creation of a free, open-source tool for students from kindergarten up through their graduate studies. They are also developing an “Applied Machine Learning in Power Systems” course. Li says the course will help meet workforce needs.

The CAREER Award recognizes early-career faculty members who “have the potential to serve as academic role models in research and education and to lead advances in the mission of their department or organization,” according to the NSF. It's given to about 500 researchers each year.

Earlier this year, Rice assistant professor Amanda Marciel was also granted an NSF CAREER Award to continue her research in designing branch elastomers that return to their original shape after being stretched. The research has applications in stretchable electronics and biomimetic tissues.
The new course will provide participants with insights on how to use robotics to enhance efficiency in data collection, AI data analysis tools for industry, risk management with AI, and more. Photo courtesy of UH

Houston university launches latest micro-credential course focused on AI, robotics for the energy industry

coming soon

The University of Houston will launch its latest micro-credential course next month that focuses on how AI and robotics can be used in inspection processes for the energy industry.

Running from March 22 through April 22, the course is open to "engineers, technicians and industry professionals with advanced knowledge in the dynamic fields of robotics and AI," according to a statement from UH. It will combine weekly online lectures and in-person hands-on demonstrations and provide participants with insights on how to use robotics to enhance efficiency in data collection, AI data analysis tools for industry, risk management with AI, and more.

“By blending theoretical knowledge with practical applications and hands-on experience, the course aims to empower participants with the skills needed to evaluate and adopt these advanced technologies to address real-world challenges in asset management,” Vedhus Hoskere, assistant professor at the UH Cullen College of Engineering, said in a statement. “We hope that upskilling and knowledge gained from this course will help accelerate the adoption of AI and robotics and contribute to the advancement of safer and more resource-efficient energy infrastructure systems.”

Hoskere will teach the course module titled “Computer Vision and Deep Learning for Inspections.” He also recently received a $500,000 grant from the Texas Department of Transportation (TxDOT) to study how drones, cameras, sensors and AI can support Texas' bridge maintenance programs.
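
As a rough illustration of the kind of workflow that module covers, the sketch below fine-tunes a small pretrained CNN to label drone image patches as "defect" or "no defect." The folder layout, class names, and training settings are placeholders, not course materials or the TxDOT project's methodology.

```python
# Hypothetical fine-tuning sketch for inspection-image patch classification.
# The "patches" directory and its defect / no_defect subfolders are placeholders.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import models, transforms
from torchvision.datasets import ImageFolder

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
dataset = ImageFolder("patches", transform=transform)  # patches/defect, patches/no_defect
loader = DataLoader(dataset, batch_size=16, shuffle=True)

model = models.resnet18(weights="IMAGENET1K_V1")       # pretrained backbone
model.fc = nn.Linear(model.fc.in_features, 2)          # two classes

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(3):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: last batch loss {loss.item():.3f}")
```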

Other leaders of the UH Energy course will include:

  • Kimberley Hayes, founder of Valkim Technologies: Lead speaker who will provide an overview and introduction of AI applications, standards and certification
  • Gangbing Song, Moores Professor of Mechanical Engineering at UH: Machine learning hands-on exercises
  • Pete Peterson, head of product management and marketing with XaaS Lab: Computer vision technology in the oil and gas industry
  • Matthew Alberts, head of project management with Future Technologies Venture, LLC: Use cases, workflow and optimizing inspections with AI and drones
  • Suchet Bargoti, chief technology officer at Abyss Solutions: AI and robots for integrity management.

Registration is accepted up to the first day of the course and can be completed online.

The world can't keep on with what it's doing and expect to reach its goals when it comes to climate change. Radical innovations are needed at this point, writes Scott Nyquist. Photo via Getty Images

Only radical innovation can get the world to its climate goals, says this Houston expert

guest column

Almost three years ago, McKinsey published a report arguing that limiting global temperature rises to 1.5 degrees Celsius above pre-industrial levels was “technically achievable,” but that the “math is daunting.” Indeed, when the 1.5°C figure was agreed to at the 2015 Paris climate conference, the assumption was that emissions would peak before 2025, and then fall 43 percent by 2030.

Given that 2022 saw the highest emissions ever—36.8 gigatons—the math is now more daunting still: cuts would need to be greater, and faster, than envisioned in Paris. Perhaps that is why the Intergovernmental Panel on Climate Change (IPCC) noted March 20 (with “high confidence”) that it was “likely that warming will exceed 1.5°C during the 21st century.”

I agree with that gloomy assessment. Given the rate of progress so far, 1.5°C looks all but impossible. That puts me in the company of people like Bill Gates, the Economist, the Australian Academy of Science, and apparently many IPCC scientists. McKinsey has estimated that even if all countries deliver on their net zero commitments, temperatures will likely be 1.7°C higher in 2100.

In October, the UN Environment Program argued that there was “no credible pathway to 1.5°C in place” and called for “an urgent system-wide transformation” to change the trajectory. Among the changes it considers necessary: carbon taxes, land use reform, dietary changes in which individuals “consume food for environmental sustainability and carbon reduction,” investment of $4 trillion to $6 trillion a year; applying current technology to all new buildings; no new fossil fuel infrastructure. And so on.

Let’s assume that the UNEP is right. What are the chances of all this happening in the next few years? Or, indeed, any of it? President Obama’s former science adviser, Daniel Schrag, put it this way: “Who believes that we can halve global emissions by 2030?... It’s so far from reality that it’s kind of absurd.”

Having a goal is useful, concentrating minds and organizing effort. And I think that has been the case with 1.5°C, or recent commitments to get to net zero. Targets create a sense of urgency that has led to real progress on decarbonization.

The 2020 McKinsey report set out how to get on the 1.5°C pathway, and was careful to note that this was not a description of probability or reality but “a picture of a world that could be.” Three years later, that “world that could be” looks even more remote.

Consider the United States, the world’s second-largest emitter. In 2021, 79 percent of primary energy demand (see chart) was met by fossil fuels, about the same as a decade before. Globally, the figures are similar, with renewables accounting for just 12.5 percent of consumption and low-emissions nuclear another 4 percent. Those numbers would have to basically reverse in the next decade or so to get on track. I don’t see how that can happen.

[Chart: U.S. primary energy consumption by source. Credit: Energy Information Administration]

But even if 1.5°C is improbable in the short term, that doesn’t mean that missing the target won’t have consequences. And it certainly doesn’t mean giving up on addressing climate change. And in fact, there are some positive trends. Many companies are developing comprehensive plans for achieving net-zero emissions and are making those plans part of their long-term strategy. Moreover, while global emissions grew 0.9 percent in 2022, that was much less than GDP growth (3.2 percent). It’s worth noting, too, that much of the increase came from switching from gas to coal in response to the Russian invasion of Ukraine; that is the kind of supply shock that can be reversed. The point is that growth and emissions no longer move in lockstep; rather the opposite. That is critical because poorer countries are never going to take serious climate action if they believe it threatens their future prosperity.

Another implication is that limiting emissions means addressing the use of fossil fuels. As noted, even with the substantial rise in the use of renewables, coal, gas, and oil are still the core of the global energy system. They cannot be wished away. Perhaps it is time to think differently—that is, making fossil fuels more emissions efficient, by using carbon capture or other technologies; cutting methane emissions; and electrifying oil and gas operations. This is not popular among many climate advocates, who would prefer to see fossil fuels “stay in the ground.” That just isn’t happening. The much likelier scenario is that they are gradually displaced. McKinsey projects peak oil demand later this decade, for example, and for gas, maybe sometime in the late 2030s. Even after the peak, though, oil and gas will still be important for decades.

Second, in the longer term, it may be possible to get back onto 1.5°C if, in addition to reducing emissions, we actually remove them from the atmosphere, in the form of “negative emissions,” such as direct air capture and bioenergy with carbon capture and storage in power and heavy industry. The IPCC itself assumed negative emissions would play a major role in reaching the 1.5°C target; in fact, because of cost and deployment problems, their contribution so far has been tiny.

Finally, as I have argued before, it’s hard to see how we limit warming even to 2°C without more nuclear power, which can provide low-emissions energy 24/7, and is the largest single source of such power right now.

None of these things is particularly popular; none get the publicity of things like a cool new electric truck or an offshore wind farm (of which two are operating now in the United States, generating enough power for about 20,000 homes; another 40 are in development). And we cannot assume fast development of offshore wind. NIMBY concerns have already derailed some high-profile projects, and are also emerging in regard to land-based wind farms.

Carbon capture, negative emissions, and nuclear will have to face NIMBY, too. But they all have the potential to move the needle on emissions. Think of the potential if fast-growing India and China, for example, were to develop an assembly line of small nuclear reactors. Of course, the economics have to make sense—something that is true for all climate-change technologies.

And as the UN points out, there needs to be progress on other issues, such as food, buildings, and finance. I don’t think we can assume that such progress will happen on a massive scale in the next few years; the actual record since Paris demonstrates the opposite. That is troubling: the IPCC notes that the risks of abrupt and damaging impacts, such as flooding and falling crop yields, rise “with every increment of global warming.” But it is the reality.

There is one way to get us to 1.5°C, although not in the Paris timeframe: a radical acceleration of innovation. The approaches being scaled now, such as wind, solar, and batteries, are the same ideas that were being discussed 30 years ago. We are benefiting from long-term, incremental improvements, not disruptive innovation. To move the ball down the field quickly, though, we need to complete a Hail Mary pass.

It’s a long shot. But we’re entering an era of accelerated innovation, driven by advanced computing, artificial intelligence, and machine learning that could narrow the odds. For example, could carbon nanotubes displace demand for high-emissions steel? Might it be possible to store carbon deep in the ocean? Could geo-engineering bend the curve?

I believe that, on the whole, the world is serious about climate change. I am certain that the energy transition is happening. But I don’t think we are anywhere near to being on track to hit the 1.5°C target. And I don’t see how doing more of the same will get us there.

------

Scott Nyquist is a senior advisor at McKinsey & Company and vice chairman, Houston Energy Transition Initiative of the Greater Houston Partnership. The views expressed herein are Nyquist's own and not those of McKinsey & Company or of the Greater Houston Partnership. This article originally ran on LinkedIn.

Houston companies scoop up $31 million in funds from DOE, EPA methane emissions program

fresh funds

The U.S. Department of Energy and the U.S. Environmental Protection Agency announced the selection of seven projects from Houston companies to receive funding through the Methane Emissions Reduction Program.

The Houston projects are among 43 selected nationwide, including 12 from Texas, that aim to reduce, monitor, measure, and quantify methane emissions from the oil and gas sector. The DOE and EPA awarded $850 million in total through the program.

The Houston companies picked up $31.7 million in federal funding through the program in addition to more than $9.5 million in non-federal dollars.

“I’m excited about the opportunities these will create internally but even more so the creation of jobs and training opportunities for the communities in which we work,” Scott McCurdy, Encino Environmental Services CEO, said in a news release. His company received awards for two projects.

“These projects will allow us to further support and strengthen the U.S. Energy industry’s ability to deliver clean, reliable, and affordable energy globally,” he added.

The Houston-area awards included:

DaphneTech USA LLC

Total funding: $5.8 million (approximately $4.5 million in federal, $1.3 million in non-federal)

The award was granted for the company’s Daphne and Williams Methane Slip Abatement Plasma-Catalyst Scale-Up project. Daphne will study how its SlipPure technology, a novel exhaust gas cleaning system that abates methane and exhaust gas pollution from natural gas-fueled engines, can be economically viable across multiple engine types and operating conditions.

Baker Hughes Energy Transition LLC 

Total funding: $7.47 million (approximately $6 million in federal, $1.5 million in non-federal)

The award was granted for the company’s Advancing Low Cost CH4 Emissions Reduction from Flares through Large Scale Deployment of Retrofittable and Adaptive Technology project. The project aims to develop a scalable, integrated methane emissions reduction system for flares based on optical gas imaging and estimation algorithms.

Encino Environmental Services

Total funding: $15.17 million (approximately $11 million in federal, $4.17 million in non-federal)

The award was granted for two projects. The Advanced Methane Reduction System: Integrating Infrared and Visual Imaging to Assess Net Heating Value at the Combustion Zone and Determine Combustion Efficiency to Enhance Flaring Performance project aims to develop and deploy an advanced continuous emissions monitoring system. Its Advancing Methane Emissions Reduction through Innovative Technology project will develop and deploy a technology using sensors and composite materials to address emissions originating in storage tanks.

Envana Software Solutions

Total funding: $5.26 million (approximately $4.2 million in federal, $1 million in non-federal)

The award was granted for the company’s Leak Detection and Reduction Software to Identify Methane Emissions and Trigger Mitigation at Oil and Gas Production Facilities Based on SCADA Data project. It aims to improve its Recon software for monitoring methane emissions and develop partnerships with local universities and organizations.

Capwell Services Inc.

Total funding: $4.19 million (approximately $3.3 million in federal, $837,000 in non-federal)

The award was granted for its Methane Emissions Abatement Technology for Low-Flow and Intermittent Emission Sources project. It aims to deploy and field-test a methane abatement unit, improve air quality and health outcomes for communities near production facilities, and establish field technician internships for local residents.

Blue Sky Measurements 

Total funding: $3.41 million (approximately $2.7 million in federal, $683,000 in non-federal)

The award was granted for its Field Validation of Novel Fixed Position Optical Sensor for Fugitive Methane Emission Detection Quantification and Location with Real-Time Notification for Rapid Mitigation project. It aims to field test an optical sensing technology at six well sites in the Permian Basin.

Southern Methodist University, The University of Texas at Austin, Texas A&M Engineering Experiment Station and Hyliion Inc. were other Texas-based organizations to earn awards. See the full list of projects here.

Texas university's 'WaterHub' will dramatically reduce water usage by 40%

Sustainable Move

A major advancement in sustainability is coming to one Texas university. A new UT WaterHub at the University of Texas at Austin will be the largest facility of its kind in the U.S. and will transform how the university manages its water resources.

It's designed to work with natural processes instead of against them for water savings of an estimated 40 percent. It's slated for completion in late 2027.

The university has had an active water recovery program since the 1980s. Still, water is becoming an increasing concern in Austin. According to Texas Living Waters, a coalition of conservation groups, Texas loses enough water annually to fill Lady Bird Lake roughly 89 times over.

As Austin continues to expand and face water shortages, the region's water supply faces increased pressure. The UT WaterHub plans to address this challenge by recycling water for campus energy operations, helping preserve water resources for both the university and local communities.

The 9,600-square-foot water treatment facility will use an innovative filtration approach. To reduce reliance on expensive machinery and chemicals, the system uses plants to naturally filter water and gravity to pull it in the direction it needs to go. Used water will be gathered from a new collection point near the Darrell K Royal Texas Memorial Stadium and transported to the WaterHub, located in the heart of the engineering district. The facility's design includes a greenhouse viewable to the public, serving as an interactive learning space.

Beyond water conservation, the facility is designed to protect the university against extreme weather events like winter storms. This new initiative will create a reliable backup water supply while decreasing university water usage, and will even reduce wastewater sent to the city by up to 70 percent.

H2O Innovation, UT’s collaborator in this project, specializes in water solutions, helping organizations manage their water efficiently.

"By combining cutting-edge technology with our innovative financing approach, we’re making it easier for organizations to adopt sustainable water practices that benefit both their bottom line and the environment, paving a step forward in water positivity,” said H2O Innovation president and CEO Frédéric Dugré in a press release.

The university expects significant cost savings with this project, since it won't have to spend as much on buying water from the city or paying fees to dispose of used water. Over the next several years, this could add up to millions of dollars.

---

A version of this story originally appeared on our sister site, CultureMap Austin.

Report: Texas solar power, battery storage helped stabilize grid in summer 2024, but challenges remain

by the numbers

Research from the Federal Reserve Bank of Dallas shows that solar power and battery storage capacity helped stabilize Texas’ electric grid last summer.

Between June 1 and Aug. 31, solar power met nearly 25 percent of midday electricity demand within the Electric Reliability Council of Texas (ERCOT) power grid. Rising solar and battery output in ERCOT assisted Texans during a summer of triple-digit heat and record load demands, but the report warns that the state’s grid will be “pushed to its limits” soon.

The report examined how the grid performed during more demanding hours. At peak times, between 11 a.m. and 2 p.m. in the summer of 2024, solar output averaged nearly 17,000 megawatts compared with 12,000 megawatts during those hours in the previous year. Between 6 p.m. and 9 p.m., discharge from battery facilities averaged 714 megawatts in 2024 after averaging 238 megawatts for those hours in 2023. Solar and battery output have continued to grow since then, according to the report.

“Batteries made a meaningful contribution to what those shoulder periods look like and how much scarcity we get into during these peak events,” ERCOT CEO Pablo Vegas said at a board of directors conference call.

Increases in capacity from solar and battery-storage power in 2024 also eclipsed those of 2023. In 2023, ERCOT added 4,570 megawatts of solar, compared with nearly 9,700 megawatts added in 2024. Growth in battery storage capacity also increased, from about 1,500 megawatts added in 2023 to more than 4,000 megawatts added in 2024. Natural gas capacity additions also increased, while wind capacity additions dropped by about 50 percent.

Texas’ installation of utility-scale solar surpassed California’s in the spring of last year and jumped from 1,900 megawatts in 2019 to over 20,000 megawatts in 2024, with solar meeting about 50 percent of Texas' peak power demand on some days.

While the numbers are encouraging, the report states that there could be future challenges, as more generating capacity will be required due to data center construction and broader electrification trends. According to the report, developing that capacity will depend on multiple factors: price signals and market conditions that attract more baseload and dispatchable generation, including longer-duration batteries, as well as investment in power purchase agreements and other power arrangements by large-scale consumers.

Additionally, peak demand during winter freezes presents challenges not seen in the summer. In colder months, for example, peak electricity demand often occurs in the early morning before solar energy is available, and the report predicts that current battery storage may be insufficient to meet that demand. The analysis indicated a 50 percent chance of rolling outages during a cold snap similar to December 2022, and an 80 percent chance if conditions mirror the February 2021 deep freeze, given the grid’s current state.

The report also claimed that ERCOT’s energy-only market design and new incentive structures, such as the Texas Energy Fund, do not appear to be enough to meet the predicted future magnitude and speed of load growth.

Read the full report here.