Two multinational corporations have announced a new collaboration to create energy-efficient and sustainable solutions for data centers as the market experiences significant growth.

ExxonMobil and Intel are working together to research, develop and test new liquid cooling technologies to optimize data center performance and help customers meet their sustainability goals. Liquid cooling solutions serve as an alternative to traditional air-cooling methods in data centers.

“Our partnership with ExxonMobil to co-develop turnkey solutions for liquid cooling will enable significant energy and water savings for data center and network deployments,” said Jen Huffstetler, chief product sustainability officer at Intel.

According to consulting firm McKinsey, “a hyperscaler’s data center can use as much power as 80,000 households do,” and that demand is expected to keep surging. Power consumption by the U.S. data center market is forecasted “to reach 35 gigawatts (GW) by 2030, up from 17 GW in 2022,” according to a McKinsey analysis. Artificial intelligence, machine learning, and other advanced computing techniques are increasing computational workloads and, in turn, electricity demand. Companies are therefore searching for solutions to support this growth.

ExxonMobil launched its full portfolio of data center immersion fluid products last year. The partnership with Intel will allow the company to further advance its efforts in this market.

“By integrating ExxonMobil’s proven expertise in liquid cooling technologies with Intel’s long legacy of industry leadership in world-changing computing technologies, together we will further the industry’s adoption and acceptance as it transitions to liquid cooling technologies,” said Sarah Horne, Vice President, ExxonMobil.

Learn more about this collaboration here.

———

This article originally ran on the Greater Houston Partnership's Houston Energy Transition Initiative blog. HETI exists to support Houston's future as an energy leader. For more information about the Houston Energy Transition Initiative, EnergyCapitalHTX's presenting sponsor, visit htxenergytransition.org.

Houston researcher wins competitive NSF award for work tying machine learning to the power grid

grant funding

An assistant professor at the University of Houston received the highly competitive National Science Foundation CAREER Award earlier this month for a proposal focused on integrating renewable resources to improve power grids.

The award grants more than $500,000 to Xingpeng Li, assistant professor of electrical and computer engineering and leader of the Renewable Power Grid Lab at UH, to continue developing machine learning methods that keep power systems running efficiently when drawing their energy from wind and solar sources, according to a statement from UH. The work has applications in the event of large disturbances to the grid.

Li explains that power grids currently rely on stored kinetic energy, converted to electrical energy, to ride through grid disturbances.

"For example, when the grid experiences sudden large generation losses or increased electrical loads, the stored kinetic energy immediately converted to electrical energy and addressed the temporary shortfall in generation,” Li said in a statement. “However, as the proportion of wind and solar power increases in the grid, we want to maximize their use since their marginal costs are zero and they provide clean energy. Since we reduce the use of those traditional generators, we also reduce the power system inertia (or stored kinetic energy) substantially.”

Li plans to use machine learning to create more streamlined models that can be implemented into day-ahead scheduling applications that grid operators currently use.

“With the proposed new modeling and computational approaches, we can better manage grids and ensure it can supply continuous quality power to all the consumers," he said.

In addition to supporting Li's research and modeling work, the funds will go toward Li and his team's creation of a free, open-source tool for students from kindergarten through graduate school. They are also developing an “Applied Machine Learning in Power Systems” course, which Li says will help meet workforce needs.

The CAREER Award recognizes early-career faculty members who “have the potential to serve as academic role models in research and education and to lead advances in the mission of their department or organization,” according to the NSF. It's given to about 500 researchers each year.

Earlier this year, Rice assistant professor Amanda Marciel was also granted an NSF CAREER Award to continue her research in designing branched elastomers that return to their original shape after being stretched. The research has applications in stretchable electronics and biomimetic tissues.

Houston university launches latest micro-credential course focused on AI, robotics for the energy industry

coming soon

The University of Houston will launch its latest micro-credential course next month that focuses on how AI and robotics can be used in inspection processes for the energy industry.

Running from March 22 through April 22, the course is open to "engineers, technicians and industry professionals with advanced knowledge in the dynamic fields of robotics and AI," according to a statement from UH. It will combine weekly online lectures with in-person, hands-on demonstrations, providing participants with insights on how to use robotics to enhance efficiency in data collection, AI data analysis tools for industry, risk management with AI, and more.

“By blending theoretical knowledge with practical applications and hands-on experience, the course aims to empower participants with the skills needed to evaluate and adopt these advanced technologies to address real-world challenges in asset management,” Vedhus Hoskere, assistant professor at the UH Cullen College of Engineering, said in a statement. “We hope that upskilling and knowledge gained from this course will help accelerate the adoption of AI and robotics and contribute to the advancement of safer and more resource-efficient energy infrastructure systems.”

Hoskere will teach the course module titled “Computer Vision and Deep Learning for Inspections.” He also recently received a $500,000 grant from the Texas Department of Transportation (TxDOT) to look at how to use drones, cameras, sensors and AI to support Texas' bridge maintenance programs.

Other leaders of the UH Energy course will include:

  • Kimberley Hayes, founder of Valkim Technologies: Lead speaker who will provide an overview and introduction of AI applications, standards and certification
  • Gangbing Song, Moores Professor of Mechanical Engineering at UH: Machine learning hands-on exercises
  • Pete Peterson, head of product management and marketing with XaaS Lab: Computer vision technology in the oil and gas industry
  • Matthew Alberts, head of project management with Future Technologies Venture LLC: Use cases, workflow and optimizing inspections with AI and drones
  • Suchet Bargoti, chief technology officer at Abyss Solutions: AI and robots for integrity management

Registration is accepted up to the first day of the course and can be completed online.

Only radical innovation can get the world to its climate goals, says this Houston expert

guest column

Almost three years ago, McKinsey published a report arguing that limiting global temperature rises to 1.5 degrees Celsius above pre-industrial levels was “technically achievable,” but that the “math is daunting.” Indeed, when the 1.5°C figure was agreed to at the 2015 Paris climate conference, the assumption was that emissions would peak before 2025, and then fall 43 percent by 2030.

Given that 2022 saw the highest emissions ever—36.8 gigatons—the math is now more daunting still: cuts would need to be greater, and faster, than envisioned in Paris. Perhaps that is why the Intergovernmental Panel on Climate Change (IPCC) noted March 20 (with “high confidence”) that it was “likely that warming will exceed 1.5°C during the 21st century.”

I agree with that gloomy assessment. Given the rate of progress so far, 1.5°C looks all but impossible. That puts me in the company of people like Bill Gates, the Economist, the Australian Academy of Science, and apparently many IPCC scientists. McKinsey has estimated that even if all countries deliver on their net zero commitments, temperatures will likely be 1.7°C higher in 2100.

In October, the UN Environment Program argued that there was “no credible pathway to 1.5°C in place” and called for “an urgent system-wide transformation” to change the trajectory. Among the changes it considers necessary: carbon taxes, land use reform, dietary changes in which individuals “consume food for environmental sustainability and carbon reduction,” investment of $4 trillion to $6 trillion a year; applying current technology to all new buildings; no new fossil fuel infrastructure. And so on.

Let’s assume that the UNEP is right. What are the chances of all this happening in the next few years? Or, indeed, any of it? President Obama’s former science adviser, Daniel Schrag, put it this way: “Who believes that we can halve global emissions by 2030?... It’s so far from reality that it’s kind of absurd.”

Having a goal is useful, concentrating minds and organizing effort. And I think that has been the case with 1.5°C, or recent commitments to get to net zero. Targets create a sense of urgency that has led to real progress on decarbonization.

The 2020 McKinsey report set out how to get on the 1.5°C pathway, and was careful to note that this was not a description of probability or reality but “a picture of a world that could be.” Three years later, that “world that could be” looks even more remote.

Consider the United States, the world’s second-largest emitter. In 2021, 79 percent of primary energy demand (see chart) was met by fossil fuels, about the same as a decade before. Globally, the figures are similar, with renewables accounting for just 12.5 percent of consumption and low-emissions nuclear another 4 percent. Those numbers would have to basically reverse in the next decade or so to get on track. I don’t see how that can happen.

[Chart: U.S. primary energy demand by source. Credit: Energy Information Administration]

But even if 1.5°C is improbable in the short term, that doesn’t mean that missing the target won’t have consequences. And it certainly doesn’t mean giving up on addressing climate change. And in fact, there are some positive trends. Many companies are developing comprehensive plans for achieving net-zero emissions and are making those plans part of their long-term strategy. Moreover, while global emissions grew 0.9 percent in 2022, that was much less than GDP growth (3.2 percent). It’s worth noting, too, that much of the increase came from switching from gas to coal in response to the Russian invasion of Ukraine; that is the kind of supply shock that can be reversed. The point is that growth and emissions no longer move in lockstep; rather the opposite. That is critical because poorer countries are never going to take serious climate action if they believe it threatens their future prosperity.

One implication is that limiting emissions means addressing the use of fossil fuels. As noted, even with the substantial rise in the use of renewables, coal, gas, and oil are still the core of the global energy system. They cannot be wished away. Perhaps it is time to think differently—that is, making fossil fuels more emissions efficient, by using carbon capture or other technologies; cutting methane emissions; and electrifying oil and gas operations. This is not popular among many climate advocates, who would prefer to see fossil fuels “stay in the ground.” That just isn’t happening. The much likelier scenario is that they are gradually displaced. McKinsey projects peak oil demand later this decade, for example, and for gas, maybe sometime in the late 2030s. Even after the peak, though, oil and gas will still be important for decades.

Second, in the longer term, it may be possible to get back onto 1.5°C if, in addition to reducing emissions, we actually remove them from the atmosphere, in the form of “negative emissions,” such as direct air capture and bioenergy with carbon capture and storage in power and heavy industry. The IPCC itself assumed negative emissions would play a major role in reaching the 1.5°C target; in fact, because of cost and deployment problems, their contribution has been tiny.

Finally, as I have argued before, it’s hard to see how we limit warming even to 2°C without more nuclear power, which can provide low-emissions energy 24/7, and is the largest single source of such power right now.

None of these things is particularly popular; none get the publicity of things like a cool new electric truck or an offshore wind farm (of which two are operating now in the United States, generating enough power for about 20,000 homes; another 40 are in development). And we cannot assume fast development of offshore wind. NIMBY concerns have already derailed some high-profile projects, and are also emerging in regard to land-based wind farms.

Carbon capture, negative emissions, and nuclear will have to face NIMBY, too. But they all have the potential to move the needle on emissions. Think of the potential if fast-growing India and China, for example, were to develop an assembly line of small nuclear reactors. Of course, the economics have to make sense—something that is true for all climate-change technologies.

And as the UN points out, there needs to be progress on other issues, such as food, buildings, and finance. I don’t think we can assume that such progress will happen on a massive scale in the next few years; the actual record since Paris demonstrates the opposite. That is troubling: the IPCC notes that the risks of abrupt and damaging impacts, such as flooding and crop failures, rise “with every increment of global warming.” But it is the reality.

There is one way to get us to 1.5°C, although not in the Paris timeframe: a radical acceleration of innovation. The approaches being scaled now, such as wind, solar, and batteries, are the same ideas that were being discussed 30 years ago. We are benefiting from long-term, incremental improvements, not disruptive innovation. To move the ball down the field quickly, though, we need to complete a Hail Mary pass.

It’s a long shot. But we’re entering an era of accelerated innovation, driven by advanced computing, artificial intelligence, and machine learning that could narrow the odds. For example, could carbon nanotubes displace demand for high-emissions steel? Might it be possible to store carbon deep in the ocean? Could geo-engineering bend the curve?

I believe that, on the whole, the world is serious about climate change. I am certain that the energy transition is happening. But I don’t think we are anywhere near to being on track to hit the 1.5°C target. And I don’t see how doing more of the same will get us there.

------

Scott Nyquist is a senior advisor at McKinsey & Company and vice chairman, Houston Energy Transition Initiative of the Greater Houston Partnership. The views expressed herein are Nyquist's own and not those of McKinsey & Company or of the Greater Houston Partnership. This article originally ran on LinkedIn.

Texas Gov. Greg Abbott demands answers from Houston power company following Beryl

investigation incoming

With around 270,000 homes and businesses still without power in the Houston area almost a week after Hurricane Beryl hit Texas, Gov. Greg Abbott on Sunday said he's demanding an investigation into the response of the utility that serves the area as well as answers about its preparations for upcoming storms.

“Power companies along the Gulf Coast must be prepared to deal with hurricanes, to state the obvious,” Abbott said at his first news conference about Beryl since returning to the state from an economic development trip to Asia.

While CenterPoint Energy has restored power to about 2 million customers since the storm hit on July 8, the slow pace of recovery has put the utility, which provides electricity to the nation’s fourth-largest city, under mounting scrutiny over whether it was sufficiently prepared for the storm that left people without air conditioning in the searing summer heat.

Abbott said he was sending a letter to the Public Utility Commission of Texas requiring it to investigate why restoration has taken so long and what must be done to fix it. In the Houston area, Beryl toppled transmission lines, uprooted trees and snapped branches that crashed into power lines.

With months of hurricane season left, Abbott said he's giving CenterPoint until the end of the month to specify what it'll be doing to reduce or eliminate power outages in the event of another storm. He said that will include the company providing detailed plans to remove vegetation that still threatens power lines.

Abbott also said that CenterPoint didn't have “an adequate number of workers pre-staged" before the storm hit.

Following Abbott's news conference, CenterPoint said its top priority was restoring “power to the remaining impacted customers as safely and quickly as possible,” adding that on Monday, the utility expects to have restored power to 90% of its customers. CenterPoint said it was committed to working with state and local leaders and to doing a “thorough review of our response.”

CenterPoint also said Sunday that it’s been “investing for years” to strengthen the area’s resilience to such storms.

The utility has defended its preparation for the storm and said that it has brought in about 12,000 additional workers from outside Houston. It has said it would have been unsafe to preposition those workers inside the predicted storm impact area before Beryl made landfall.

Brad Tutunjian, vice president for regulatory policy for CenterPoint Energy, said last week that the extensive damage to trees and power poles hampered the ability to restore power quickly.

A post Sunday on CenterPoint's website from its president and CEO, Jason Wells, said that over 2,100 utility poles were damaged during the storm and over 18,600 trees had to be removed from power lines, which impacted over 75% of the utility's distribution circuits.

Things to know: Beryl in the rearview, Devon Energy's big deal, and events not to miss

taking notes

Editor's note: Dive headfirst into the new week with three quick things to catch up on in Houston's energy transition.

Hurricane Beryl's big impact

Hundreds of thousands of people in the Houston area likely won’t have power restored until this week, as the city swelters in the aftermath of Hurricane Beryl.

The storm slammed into Texas on July 8, knocking out power to nearly 2.7 million homes and businesses and leaving huge swaths of the region in the dark and without air conditioning in the searing summer heat.

Although repairs have restored power to nearly 1.4 million customers, the scale of the damage and the slow pace of recovery have put CenterPoint Energy, which provides electricity to the nation's fourth-largest city, under mounting scrutiny over whether it was sufficiently prepared for the storm and is doing enough now to make things right.

Some frustrated residents have also questioned why a part of the country that is all too familiar with major storms has been hobbled by a Category 1 hurricane, the weakest kind. But a storm's wind speed alone doesn't determine how dangerous it can be. Click here to continue reading this article from the AP.

Big deal: Devon Energy to acquire Houston exploration, production biz in $5B deal

Devon Energy is buying Grayson Mill Energy's Williston Basin business in a cash-and-stock deal valued at $5 billion as consolidation in the oil and gas sector ramps up.

The transaction includes $3.25 billion in cash and $1.75 billion in stock.

Grayson Mill Energy, based in Houston, is an oil and gas exploration company that received an initial investment from private equity firm EnCap Investments in 2016.

EnCap appears to be stepping back from the energy sector as it sells off assets. Last month EnCap-backed XCL Resources sold its Uinta Basin oil and gas assets to SM Energy Co. and Northern Oil and Gas in a transaction totaling $2.55 billion. EnCap had another deal in June as well, selling some assets to Matador Resources for nearly $2 billion. Click here to continue reading.

Events not to miss

Put these Houston-area energy-related events on your calendar.

  • 2024 Young Leaders Institute: Renewable Energy and Climate Solutions is taking place July 15 to July 19 at Asia Society of Texas. Register now.
  • CCS/Decarbonization Project Development, Finance and Investment, taking place July 23 to 25, is the deepest dive into the economic and regulatory factors driving the success of the CCS/CCUS project development landscape. Register now.
  • The 5th Texas Energy Forum 2024, organized by U.S. Energy Stream, will take place on August 21 and 22 at the Petroleum Club of Houston. Register now.

Growing Houston biotech company expands leadership as it commercializes sustainable products

onboarding

Houston-based biotech company Cemvita recently tapped two executives to help commercialize its sustainable fuel made from carbon waste.

Nádia Skorupa Parachin came aboard as vice president of industrial biotechnology, and Phil Garcia was promoted to vice president of commercialization.

Parachin most recently oversaw several projects at Boston-based biotech company Ginkgo Bioworks. She previously co-founded Brazilian biotech startup Integra Bioprocessos.

Parachin will lead the Cemvita team that’s developing technology for production of bio-manufactured oil.

“It’s a fantastic moment, as we’re poised to take our prototyping to the next level, and all under the innovative direction of our co-founder Tara Karimi,” Parachin says in a news release. “We will be bringing something truly remarkable to market and ensuring it’s cost-effective.”

Moji Karimi, co-founder and CEO of Cemvita, says the hiring of Parachin represents “the natural next step” toward commercializing the startup’s carbon-to-oil process.

“Her background prepared her to bring the best out of the scientists at the inflection point of commercialization — really bringing things to life,” says Moji Karimi, Tara’s brother.

Parachin joins Garcia on Cemvita’s executive team.

Before being promoted to vice president of commercialization, Garcia was the startup’s commercial director and business development manager. He has a background in engineering and business development.

Founded in 2017, Cemvita recently announced a breakthrough that enables production of large quantities of oil derived from carbon waste.

In 2023, United Airlines agreed to buy up to one billion gallons of sustainable aviation fuel from Cemvita’s first full-scale plant over the course of 20 years.

Cemvita’s investors include the UAV Sustainable Flight Fund, an investment arm of Chicago-based United; Oxy Low Carbon Ventures, an investment arm of Houston-based energy company Occidental Petroleum; and Japanese equipment and machinery manufacturer Mitsubishi Heavy Industries.