Researchers have secured $3.3 million in funding to develop an AI-powered subsurface sensing system aimed at improving the safety and efficiency of underground power line installation. Photo via Getty Images

Researchers from the University of Houston — along with a Hawaiian company — have received $3.3 million in funding to explore an artificial intelligence-backed subsurface sensing system for safe and efficient underground power line installation.

Houston's power lines are above ground, but studies show underground power is more reliable. Installing underground power lines is costly and disruptive, but the U.S. Department of Energy, in an effort to find a solution, has put $34 million into its new GOPHURRS program, which stands for Grid Overhaul with Proactive, High-speed Undergrounding for Reliability, Resilience, and Security. The funding has been distributed across 12 projects in 11 states.

“Modernizing our nation’s power grid is essential to building a clean energy future that lowers energy costs for working Americans and strengthens our national security,” U.S. Secretary of Energy Jennifer M. Granholm says in a DOE press release.

UH and Hawaii-based Oceanit are behind one of the funded projects, entitled “Artificial Intelligence and Unmanned Aerial Vehicle Real-Time Advanced Look-Ahead Subsurface Sensor.”

The researchers are looking at developing a subsurface sensing system for underground power line installation, potentially using machine learning, electromagnetic resistivity well logging, and drone technology to predict and sense obstacles to installation.
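For a rough sense of what "look-ahead" obstacle detection can mean, here is a minimal, hypothetical sketch that flags abrupt jumps in a simulated resistivity log. The readings, window size, and threshold are illustrative assumptions only; the actual UH-Oceanit system combines electromagnetic logging, UAV surveys, and machine learning and is not reproduced here.

```python
# Purely illustrative: screen a stream of look-ahead resistivity readings for
# abrupt deviations that could indicate an obstacle (a boulder, an existing
# utility line) ahead of the drill head. All values are hypothetical.
from statistics import mean, stdev

def flag_lookahead_anomalies(resistivity_ohm_m: list[float],
                             window: int = 10, z_thresh: float = 3.0) -> list[int]:
    """Return indices where a reading deviates sharply from the trailing window."""
    flags = []
    for i in range(window, len(resistivity_ohm_m)):
        trailing = resistivity_ohm_m[i - window:i]
        mu, sigma = mean(trailing), stdev(trailing)
        if sigma > 0 and abs(resistivity_ohm_m[i] - mu) / sigma > z_thresh:
            flags.append(i)
    return flags

# Simulated bore-path log: fairly uniform soil (~50 ohm-m) with a sharp jump at reading 30.
readings = [50.0 + 0.5 * (i % 3) for i in range(30)] + [400.0] + [50.0] * 10
print(flag_lookahead_anomalies(readings))  # -> [30]
```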

Jiefu Chen, associate professor of electrical and computer engineering at UH, is a key collaborator on the project, focusing on electromagnetic antennas installed on the UAV and the horizontal directional drilling (HDD) string. He's working with Yueqin Huang, assistant professor of information science technology, who leads the geophysical signal processing, and Xuqing Wu, associate professor of computer information systems, who is responsible for integrating machine learning.

“Advanced subsurface sensing and characterization technologies are essential for the undergrounding of power lines,” says Chen in the release. “This initiative can enhance the grid's resilience against natural hazards such as wildfires and hurricanes.”

“If proven successful, our proposed look-ahead subsurface sensing system could significantly reduce the costs of horizontal directional drilling for installing underground utilities,” Chen continues. “Promoting HDD offers environmental advantages over traditional trenching methods and enhances the power grid’s resilience.”

University of Houston professor Xiaonan Shan and the rest of his research team are celebrating fresh funding from a federal grant. Photo via UH.edu

Houston scientists land $1M NSF funding for AI-powered clean energy project

A team of scientists from the University of Houston, in collaboration with Howard University in Washington, D.C., has received a $1 million award from the National Science Foundation for a project that aims to automate the discovery of new clean-energy catalysts.

The project, dubbed "Multidisciplinary High-Performance Computing and Artificial Intelligence Enabled Catalyst Design for Micro-Plasma Technologies in Clean Energy Transition," aims to use machine learning and AI to improve the efficiency of catalysts in hydrogen generation, carbon capture and energy storage, according to UH.

“This research directly contributes to these global challenges,” Jiefu Chen, the principal investigator of the project and associate professor of electrical and computer engineering, said in a statement. “This interdisciplinary effort ensures comprehensive and innovative solutions to complex problems.”

Chen is joined by Lars Grabow, professor of chemical and biomolecular engineering; Xiaonan Shan, associate professor of electrical and computer engineering; and Xuqing Wu, associate professor of information science technology. Su Yan, an associate professor of electrical engineering and computer science at Howard University, is collaborating on the project.

The University of Houston team: Xiaonan Shan, associate professor of electrical and computer engineering; Jiefu Chen, associate professor of electrical and computer engineering; Lars Grabow, professor of chemical and biomolecular engineering; and Xuqing Wu, associate professor of information science technology. Photo via UH.edu

The team will create a robotic synthesis and testing facility that automates the experimental testing and verification stage of catalyst design, a process that is traditionally slow. The facility will implement AI and advanced unsupervised machine learning techniques, with a special focus on plasma reactions.

The project has four main focuses, according to UH.

  1. Using machine learning to discover materials for plasma-assisted catalytic reactions
  2. Developing a model to simulate complex interactions to better understand microwave-plasma-assisted heating
  3. Designing catalyst supports for efficient microwave-assisted reactions
  4. Developing a bench-scale reactor to demonstrate the efficiency of the catalyst support system
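As a rough illustration of how an automated, closed-loop discovery process like this can work, here is a minimal sketch in which a toy surrogate model proposes the next candidate to test and a stand-in function plays the role of the robotic synthesis and testing facility. The descriptors, the nearest-neighbor surrogate, and the evaluate_candidate() function are hypothetical and do not represent the team's methods.

```python
# Hypothetical sketch of an ML-guided screening loop: propose, test, learn, repeat.
import numpy as np

rng = np.random.default_rng(0)
candidates = rng.uniform(0.0, 1.0, size=(200, 3))   # made-up descriptors per catalyst

def evaluate_candidate(x: np.ndarray) -> float:
    """Stand-in for a robotic synthesis-and-test run returning a measured activity."""
    return float(np.exp(-np.sum((x - 0.6) ** 2)) + rng.normal(0.0, 0.01))

# Seed the loop with a few random experiments.
tested_x, tested_y = [], []
for i in rng.choice(len(candidates), size=5, replace=False):
    tested_x.append(candidates[i]); tested_y.append(evaluate_candidate(candidates[i]))

for _ in range(20):                                   # budget of automated experiments
    X, y = np.array(tested_x), np.array(tested_y)
    # Toy surrogate: predict each candidate's activity as that of its nearest tested point.
    dists = np.linalg.norm(candidates[:, None, :] - X[None, :, :], axis=2)
    predicted = y[np.argmin(dists, axis=1)]
    predicted[np.min(dists, axis=1) < 1e-9] = -np.inf  # skip already-tested points
    best = int(np.argmax(predicted))
    tested_x.append(candidates[best]); tested_y.append(evaluate_candidate(candidates[best]))

print(f"Best measured activity after {len(tested_y)} experiments: {max(tested_y):.3f}")
```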

Additionally, the team will put the funding toward the development of a multidisciplinary research and education program that will train students on using machine learning for topics like computational catalysis, applied electromagnetics and material synthesis. The team is also looking to partner with industry on related projects.

“This project will help create a knowledgeable and skilled workforce capable of addressing critical challenges in the clean energy transition,” Grabow added in a statement. “Moreover, this interdisciplinary project is going to be transformative in that it advances insights and knowledge that will lead to tangible economic impact in the not-too-far future.”

This spring, UH launched a new micro-credential course focused on other applications for AI and robotics in the energy industry.

Around the same time, Microsoft co-founder Bill Gates spoke at CERAWeek to a standing-room-only crowd on the future of the industry. Gates, who also founded Breakthrough Energy, addressed the topic of AI.

ExxonMobil and Intel are working to design, test, research and develop new liquid cooling technologies to optimize data center performance and help customers meet their sustainability goals. Photo via Getty Images

ExxonMobil, Intel eye sustainable solutions within data center innovation

the view from heti

Two multinational corporations have announced a new collaboration to create energy-efficient and sustainable solutions for data centers as the market experiences significant growth.

ExxonMobil and Intel are working to design, test, research and develop new liquid cooling technologies to optimize data center performance and help customers meet their sustainability goals. Liquid cooling solutions serve as an alternative to traditional air-cooling methods in data centers.

“Our partnership with ExxonMobil to co-develop turnkey solutions for liquid cooling will enable significant energy and water savings for data center and network deployments,” said Jen Huffstetler, Chief Product Sustainability Officer, Intel.

According to consulting firm McKinsey, “a hyperscaler’s data center can use as much power as 80,000 households do,” and that demand is expected to keep surging. Power consumption by the U.S. data center market is forecast “to reach 35 gigawatts (GW) by 2030, up from 17 GW in 2022,” according to a McKinsey analysis. Artificial intelligence, machine learning, and other advanced computing techniques are increasing computational workloads and, in turn, electricity demand. Companies are therefore searching for solutions to support this growth.

ExxonMobil launched its full portfolio of data center immersion fluid products last year. The partnership with Intel will allow the company to further advance its efforts in this market.

“By integrating ExxonMobil’s proven expertise in liquid cooling technologies with Intel’s long legacy of industry leadership in world-changing computing technologies, together we will further the industry’s adoption and acceptance as it transitions to liquid cooling technologies,” said Sarah Horne, Vice President, ExxonMobil.

———

This article originally ran on the Greater Houston Partnership's Houston Energy Transition Initiative blog. HETI exists to support Houston's future as an energy leader. For more information about the Houston Energy Transition Initiative, EnergyCapitalHTX's presenting sponsor, visit htxenergytransition.org.

The UH team is developing ways to use machine learning to ensure that power systems can continue to run efficiently when pulling their energy from wind and solar sources. Photo via Getty Images

Houston researcher wins competitive NSF award for work tying machine learning to the power grid

grant funding

An assistant professor at the University of Houston received the highly competitive National Science Foundation CAREER Award earlier this month for a proposal focused on integrating renewable resources to improve power grids.

The award grants more than $500,000 to Xingpeng Li, assistant professor of electrical and computer engineering and leader of the Renewable Power Grid Lab at UH, to continue his work on using machine learning to ensure that power systems can keep running efficiently when pulling their energy from wind and solar sources, according to a statement from UH. This work has applications in the event of large disturbances to the grid.

Li explains that power grids currently ride through disturbances by converting stored kinetic energy into electrical energy.

"For example, when the grid experiences sudden large generation losses or increased electrical loads, the stored kinetic energy immediately converted to electrical energy and addressed the temporary shortfall in generation,” Li said in a statement. “However, as the proportion of wind and solar power increases in the grid, we want to maximize their use since their marginal costs are zero and they provide clean energy. Since we reduce the use of those traditional generators, we also reduce the power system inertia (or stored kinetic energy) substantially.”
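To make the physics in Li's explanation concrete, here is a minimal sketch, assuming a hypothetical system size, inertia values, and outage, of how lower system inertia speeds up the initial frequency decline after a sudden generation loss. It is a textbook swing-equation approximation, not Li's model.

```python
# Illustration only: how stored kinetic energy (the inertia constant H) limits the
# initial rate of frequency decline after a large generation loss. Numbers are hypothetical.

F_NOMINAL_HZ = 60.0      # nominal grid frequency
S_SYSTEM_MVA = 80_000.0  # assumed total rating of synchronized generation

def initial_rocof(power_lost_mw: float, inertia_h_s: float) -> float:
    """Initial rate of change of frequency (Hz/s): df/dt = -dP * f0 / (2 * H * S)."""
    return -power_lost_mw * F_NOMINAL_HZ / (2.0 * inertia_h_s * S_SYSTEM_MVA)

# Losing a 1,000 MW plant: a high-inertia grid (H = 5 s) sees frequency fall far more
# slowly than a low-inertia, renewables-heavy grid (H = 2 s), which is the shortfall
# that better scheduling models aim to anticipate.
for h in (5.0, 2.0):
    print(f"H = {h:.0f} s -> initial rate of frequency change = {initial_rocof(1000.0, h):.3f} Hz/s")
```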

Li plans to use machine learning to create more streamlined models that can be implemented into day-ahead scheduling applications that grid operators currently use.

“With the proposed new modeling and computational approaches, we can better manage grids and ensure it can supply continuous quality power to all the consumers," he said.

In addition to supporting Li's research and model development, the funds will go toward the team's creation of a free, open-source tool for students from kindergarten through graduate school. The team is also developing an “Applied Machine Learning in Power Systems” course, which Li says will help meet workforce needs.

The CAREER Award recognizes early-career faculty members who “have the potential to serve as academic role models in research and education and to lead advances in the mission of their department or organization,” according to the NSF. It's given to about 500 researchers each year.

Earlier this year, Rice assistant professor Amanda Marciel was also granted an NSF CAREER Award to continue her research in designing branched elastomers that return to their original shape after being stretched. The research has applications in stretchable electronics and biomimetic tissues.

The new course will provide participants with insights on how to use robotics to enhance efficiency in data collection, AI data analysis tools for industry, risk management with AI, and more. Photo courtesy of UH

Houston university launches latest micro-credential course focused on AI, robotics for the energy industry

coming soon

The University of Houston will launch its latest micro-credential course next month that focuses on how AI and robotics can be used in inspection processes for the energy industry.

Running from March 22 through April 22, the course is open to "engineers, technicians and industry professionals with advanced knowledge in the dynamic fields of robotics and AI," according to a statement from UH. It will combine weekly online lectures and in-person hands-on demonstrations and provide participants with insights on how to use robotics to enhance efficiency in data collection, AI data analysis tools for industry, risk management with AI, and more.

“By blending theoretical knowledge with practical applications and hands-on experience, the course aims to empower participants with the skills needed to evaluate and adopt these advanced technologies to address real-world challenges in asset management,” Vedhus Hoskere, assistant professor at the UH Cullen College of Engineering, said in a statement. “We hope that upskilling and knowledge gained from this course will help accelerate the adoption of AI and robotics and contribute to the advancement of safer and more resource-efficient energy infrastructure systems.”

Hoskere will teach the course module titled “Computer Vision and Deep Learning for Inspections.” He also recently received a $500,000 grant from the Texas Department of Transportation (TxDOT) to look at how to use drones, cameras, sensors and AI to support Texas' bridge maintenance programs.

Other leaders of the UH Energy course will include:

  • Kimberley Hayes, founder of Valkim Technologies: Lead speaker who will provide an overview and introduction of AI applications, standards and certification
  • Gangbing Song, Moores Professor of Mechanical Engineering at UH: Machine learning hands-on exercises
  • Pete Peterson, head of product management and marketing with XaaS Lab: Computer vision technology in the oil and gas industry
  • Matthew Alberts, head of project management with Future Technologies Venture LLC: Use cases, workflow and optimizing inspections with AI and drones
  • Suchet Bargoti, chief technology officer at Abyss Solutions: AI and robots for integrity management.

Registration is accepted up to the first day of the course and can be completed online.

The world can't keep on with what it's doing and expect to reach its goals when it comes to climate change. Radical innovations are needed at this point, writes Scott Nyquist. Photo via Getty Images

Only radical innovation can get the world to its climate goals, says this Houston expert

guest column

Almost three years ago, McKinsey published a report arguing that limiting global temperature rises to 1.5 degrees Celsius above pre-industrial levels was “technically achievable,” but that the “math is daunting.” Indeed, when the 1.5°C figure was agreed to at the 2015 Paris climate conference, the assumption was that emissions would peak before 2025, and then fall 43 percent by 2030.

Given that 2022 saw the highest emissions ever—36.8 gigatons—the math is now more daunting still: cuts would need to be greater, and faster, than envisioned in Paris. Perhaps that is why the Intergovernmental Panel on Climate Change (IPCC) noted March 20 (with “high confidence”) that it was “likely that warming will exceed 1.5°C during the 21st century.”

I agree with that gloomy assessment. Given the rate of progress so far, 1.5°C looks all but impossible. That puts me in the company of people like Bill Gates, the Economist, the Australian Academy of Science, and apparently many IPCC scientists. McKinsey has estimated that even if all countries deliver on their net zero commitments, temperatures will likely be 1.7°C higher in 2100.

In October, the UN Environment Program argued that there was “no credible pathway to 1.5°C in place” and called for “an urgent system-wide transformation” to change the trajectory. Among the changes it considers necessary: carbon taxes, land use reform, dietary changes in which individuals “consume food for environmental sustainability and carbon reduction,” investment of $4 trillion to $6 trillion a year; applying current technology to all new buildings; no new fossil fuel infrastructure. And so on.

Let’s assume that the UNEP is right. What are the chances of all this happening in the next few years? Or, indeed, any of it? President Obama’s former science adviser, Daniel Schrag, put it this way: “Who believes that we can halve global emissions by 2030?... It’s so far from reality that it’s kind of absurd.”

Having a goal is useful, concentrating minds and organizing effort. And I think that has been the case with 1.5°C, or recent commitments to get to net zero. Targets create a sense of urgency that has led to real progress on decarbonization.

The 2020 McKinsey report set out how to get on the 1.5°C pathway, and was careful to note that this was not a description of probability or reality but “a picture of a world that could be.” Three years later, that “world that could be” looks even more remote.

Consider the United States, the world’s second-largest emitter. In 2021, 79 percent of primary energy demand (see chart) was met by fossil fuels, about the same as a decade before. Globally, the figures are similar, with renewables accounting for just 12.5 percent of consumption and low-emissions nuclear another 4 percent. Those numbers would have to basically reverse in the next decade or so to get on track. I don’t see how that can happen.

[Chart omitted: U.S. primary energy demand by source. Credit: Energy Information Administration]

But even if 1.5°C is improbable in the short term, that doesn’t mean that missing the target won’t have consequences. And it certainly doesn’t mean giving up on addressing climate change. And in fact, there are some positive trends. Many companies are developing comprehensive plans for achieving net-zero emissions and are making those plans part of their long-term strategy. Moreover, while global emissions grew 0.9 percent in 2022, that was much less than GDP growth (3.2 percent). It’s worth noting, too, that much of the increase came from switching from gas to coal in response to the Russian invasion of Ukraine; that is the kind of supply shock that can be reversed. The point is that growth and emissions no longer move in lockstep; rather the opposite. That is critical because poorer countries are never going to take serious climate action if they believe it threatens their future prosperity.

Another implication is that limiting emissions means addressing the use of fossil fuels. As noted, even with the substantial rise in the use of renewables, coal, gas, and oil are still the core of the global energy system. They cannot be wished away. Perhaps it is time to think differently—that is, making fossil fuels more emissions efficient, by using carbon capture or other technologies; cutting methane emissions; and electrifying oil and gas operations. This is not popular among many climate advocates, who would prefer to see fossil fuels “stay in the ground.” That just isn’t happening. The much likelier scenario is that they are gradually displaced. McKinsey projects peak oil demand later this decade, for example, and for gas, maybe sometime in the late 2030s. Even after the peak, though, oil and gas will still be important for decades.

Second, in the longer term, it may be possible to get back onto 1.5°C if, in addition to reducing emissions, we actually remove them from the atmosphere, in the form of “negative emissions,” such as direct air capture and bioenergy with carbon capture and storage in power and heavy industry. The IPCC itself assumed negative emissions would play a major role in reaching the 1.5°C target; in practice, because of cost and deployment problems, their contribution has been tiny.

Finally, as I have argued before, it’s hard to see how we limit warming even to 2°C without more nuclear power, which can provide low-emissions energy 24/7, and is the largest single source of such power right now.

None of these things is particularly popular; none get the publicity of things like a cool new electric truck or an offshore wind farm (of which two are operating now in the United States, generating enough power for about 20,000 homes; another 40 are in development). And we cannot assume fast development of offshore wind. NIMBY concerns have already derailed some high-profile projects, and are also emerging in regard to land-based wind farms.

Carbon capture, negative emissions, and nuclear will have to face NIMBY, too. But they all have the potential to move the needle on emissions. Think of the potential if fast-growing India and China, for example, were to develop an assembly line of small nuclear reactors. Of course, the economics have to make sense—something that is true for all climate-change technologies.

And as the UN points out, there needs to be progress on other issues, such as food, buildings, and finance. I don’t think we can assume that such progress will happen on a massive scale in the next few years; the actual record since Paris demonstrates the opposite. That is troubling: the IPCC notes that the risks of abrupt and damaging impacts, such as flooding and reduced crop yields, rise “with every increment of global warming.” But it is the reality.

There is one way to get us to 1.5°C, although not in the Paris timeframe: a radical acceleration of innovation. The approaches being scaled now, such as wind, solar, and batteries, are the same ideas that were being discussed 30 years ago. We are benefiting from long-term, incremental improvements, not disruptive innovation. To move the ball down the field quickly, though, we need to complete a Hail Mary pass.

It’s a long shot. But we’re entering an era of accelerated innovation, driven by advanced computing, artificial intelligence, and machine learning that could narrow the odds. For example, could carbon nanotubes displace demand for high-emissions steel? Might it be possible to store carbon deep in the ocean? Could geo-engineering bend the curve?

I believe that, on the whole, the world is serious about climate change. I am certain that the energy transition is happening. But I don’t think we are anywhere near to being on track to hit the 1.5°C target. And I don’t see how doing more of the same will get us there.

------

Scott Nyquist is a senior advisor at McKinsey & Company and vice chairman, Houston Energy Transition Initiative of the Greater Houston Partnership. The views expressed herein are Nyquist's own and not those of McKinsey & Company or of the Greater Houston Partnership. This article originally ran on LinkedIn.

Chevron and ExxonMobil feed the need for gas-powered data centers

data center demand

Two of the Houston area’s oil and gas goliaths, Chevron and ExxonMobil, are duking it out in the emerging market for natural gas-powered data centers—centers that would ease the burden on electric grids.

Chevron said it’s negotiating with an unnamed company to supply natural gas-generated power for the data center industry, whose energy consumption is soaring mostly due to AI. The power would come from a 2.5-gigawatt plant that Chevron plans to build in West Texas. The company says the plant could eventually accommodate 5 gigawatts of power generation.

The Chevron plant is expected to come online in 2027. A final decision on investing in the plant will be made next year, Jeff Gustavson, vice president of Chevron’s low-carbon energy business, said at a recent gathering for investors.

“Demand for gas is expected to grow even faster than for oil, including the critical role gas will play [in] providing the energy backbone for data centers and advanced computing,” Gustavson said.

In January, the company’s Chevron USA subsidiary unveiled a partnership with investment firm Engine No. 1 and energy equipment manufacturer GE Vernova to develop large-scale natural gas power plants co-located with data centers.

The plants will feature behind-the-meter energy generation and storage systems on the customer side of the electricity meter, meaning they supply power directly to a customer without being connected to an electric grid. The venture is expected to start delivering power by the end of 2027.

Chevron rival ExxonMobil is focusing on data centers in a slightly different way.

ExxonMobil Chairman and CEO Darren Woods said the company aims to enable the capture of more than 90 percent of emissions from data centers. The company would achieve this by building natural gas plants that incorporate carbon capture and storage technology. These plants would “bring a unique advantage” to the power market for data centers, Woods said.

“In the near to medium term, we are probably the only realistic game in town to accomplish that,” he said during ExxonMobil’s third-quarter earnings call. “I think we can do it pretty effectively.”

Woods said ExxonMobil is in advanced talks with hyperscalers, or large-scale providers of cloud computing services, to equip their data centers with low-carbon energy.

“We will see what gets translated into actual contracts and then into construction,” he said.

Houston company wins contract to operate South Texas wind farm

wind deal

Houston-based Consolidated Asset Management Services (CAMS), which provides services for owners of energy infrastructure, has added the owner of a South Texas wind power project to its customer list.

The new customer, InfraRed Capital Partners, owns the 202-megawatt Mesteño Wind Project in the Rio Grande Valley. InfraRed bought the wind farm from Charlotte, North Carolina-based power provider Duke Energy in 2024. CAMS will provide asset management, remote operations, maintenance, compliance and IT services for the Mesteño project.

Mesteño began generating power in 2019. The wind farm is connected to the electric grid operated by the Electric Reliability Council of Texas (ERCOT).

With the addition of Mesteño, CAMS now manages wind energy projects with generation capacity of more than 2,500 megawatts.

Mesteño features one of the tallest wind turbine installations in the U.S., with towers reaching 590.5 feet. Located near Rio Grande City, the project produces enough clean energy to power about 60,000 average homes.

In June, CAMS was named to the Financial Times’ list of the 300 fastest-growing companies in North and South America. The company’s revenue grew more than 70 percent from 2020 to 2023.

Earlier this year, CAMS jumped into the super-hot data center sector with the rollout of services designed to help deliver reliable, cost-effective power to energy-hungry data centers. The initiative focuses on supplying renewable energy and natural gas.

Google's $40B investment in Texas data centers includes energy infrastructure

The future of data

Google is investing a huge chunk of money in Texas: According to a release, the company will invest $40 billion in cloud and artificial intelligence (AI) infrastructure, including the development of new data centers in Armstrong and Haskell counties.

The company announced its intentions at a meeting on November 14 attended by federal, state, and local leaders, including Gov. Greg Abbott, who called it "a Texas-sized investment."

Google will open two new data center campuses in Haskell County and a data center campus in Armstrong County.

Additionally, the first building at the company’s Red Oak campus in Ellis County is now operational. Google is continuing to invest in its existing Midlothian campus and Dallas cloud region, which are part of the company’s global network of 42 cloud regions that deliver high-performance, low-latency services that businesses and organizations use to build and scale their own AI-powered solutions.

Energy demands

Google is committed to responsibly growing its infrastructure by bringing new energy resources onto the grid, paying for costs associated with its operations, and supporting community energy efficiency initiatives.

One of the new Haskell data centers will be co-located with — or built directly alongside — a new solar and battery energy storage plant, creating the first industrial park to be developed through Google’s partnership with Intersect and TPG Rise Climate announced last year.

Google has contracted to add more than 6,200 megawatts (MW) of net new energy generation and capacity to the Texas electricity grid through power purchase agreements (PPAs) with energy developers such as AES Corporation, Enel North America, Intersect, Clearway, ENGIE, SB Energy, Ørsted, and X-Elio.

Water demands

Google’s three new facilities in Armstrong and Haskell counties will use air-cooling technology, limiting water use to site operations like kitchens. The company is also contributing $2.6 million to help Texas Water Trade create and enhance up to 1,000 acres of wetlands along the Trinity-San Jacinto Estuary. Google is also sponsoring a regenerative agriculture program with Indigo Ag in the Dallas-Fort Worth area and an irrigation efficiency project with N-Drip in the Texas High Plains.

In addition to the data centers, Google is committing $7 million in grants to support AI-related initiatives in healthcare, energy, and education across the state. This includes helping CareMessage enhance rural healthcare access; enabling the University of Texas at Austin and Texas Tech University to address energy challenges that will arise with AI; and expanding AI training for Texas educators and students through support to Houston City College.

---

This article originally appeared on CultureMap.com.