Hadi Ghasemi, a University of Houston professor, has uncovered a method to release heat from data centers and electronics at record performance. Photo courtesy UH.

A University of Houston professor has developed a new cooling method that can remove heat at least three times more effectively from AI data centers than current technologies.

Hadi Ghasemi, a distinguished professor of Mechanical & Aerospace Engineering at UH, published his findings in two articles in the International Journal of Heat and Mass Transfer. The findings solve a critical issue in the growing AI sector, according to UH.

High-powered AI data centers generate huge amounts of heat because the GPUs and supporting systems they run operate at extreme power densities, which introduces complex thermal challenges. Traditional cooling methods, such as microchannel, flow and spray cooling, have had limitations when exposed to extreme heat flux, according to UH.

Ghasemi’s research, however, found a more effective way to design thin-film evaporation structures to release heat from data centers and electronics at record performance.

Ghasemi’s solution coupled topology optimization with AI modeling to determine the best shapes for thin-film efficiency, ultimately landing on a branch-like structure that resembles a tree.

The model found that the “branches” needed to be about 50 percent solid and 50 percent empty space for optimum efficiency, and that they could sustain high heat fluxes with minimal thermal resistance.

“These structures could achieve high critical heat flux at much lower superheat compared to traditionally studied structures,” Ghasemi said in a news release. “The new structures can remove heat without having to get as hot as previous removal systems.”

Ghasemi’s doctoral candidates, Amirmohammad Jahanbakhsh and Saber Badkoobeh Hezave, also worked on the project. The team believes their results show the impact of a physics-aware, AI design and can help ensure reliability, longevity and stability of AI data centers.

“Beyond achieving record performance, these new findings provide fundamental insight into the governing heat-transfer physics and establish a rational pathway toward even higher thermal dissipation capacities,” Ghasemi added in the release.

Merab Momen, founder of AI CTO Services. Courtesy Photo

How this Houston expert helps startups turn AI hype into real impact

now streaming

Artificial intelligence is now everywhere. It is mentioned in every startup pitch deck, and every corporate roadmap claims to use it. However, many early-stage businesses struggle with the simple question, “What does AI actually mean for my business?”

In a recent podcast episode of EnergyTech Startups, Merab Momen, founder of AI CTO Services and a longtime AI practitioner, explains why most founders misunderstand AI, how startups can practically apply it, and why Houston is quietly becoming a serious hub for AI-driven innovation.

Filling the AI Leadership Gap

Merab’s career has spanned decades of technology transitions. He worked on neural networks in the 1990s, built computer vision systems long before they were common, and helped deploy AI solutions inside huge industrial companies. When generative AI exploded into the mainstream, however, he noticed a recurring problem: founders needed a real partner for AI integration, but they could neither justify a full-time CTO nor rely on project-based consultants.

“I really needed something which is much more engaging where I can give that partner-level advice to the founders,” he said. By giving firms on-demand access to high-level AI expertise, his approach lets them evaluate tools, steer clear of costly blunders and eventually transition to a permanent technology leader when the time is right.

AI is Older than Most People Think

Despite its recent surge in popularity, AI is nothing new; the field dates back to the 1950s. In the conversation, Merab explained that he worked on his first AI project in 1996. It worked perfectly, but the processing power just wasn’t there to make it practical. He went on to describe how he used swarm-intelligence models to optimize supply chains, work that today falls under MLOps and data engineering.

From Language Models to the Physical World

Much of the public conversation about AI revolves around chatbots and text generation. But Merab sees far greater potential in AI’s interaction with the physical world, especially in industrial settings. He pointed to edge computing and vision language models (VLMs) as significant advances for manufacturing and energy. This shift is opening new opportunities in robotics, automated inspections, and industrial safety applications. Merab added that Houston is uniquely positioned for the transition.

Why Houston has an AI Advantage

Silicon Valley may dominate the AI headlines, but Merab believes Houston’s advantage lies beneath the surface. The city doesn’t lag in AI adoption; it simply operates in industries where the results show up differently.

Machine learning isn’t new to Houston’s core industries. Energy companies, manufacturers, logistics providers, and healthcare systems have been using advanced analytics for decades. The difference is that they innovate in industrial sectors rather than consumer technology.

What’s Next

With AI CTO Services growing, Merab is working with startups across industries to deploy AI in practical, business-first ways.

He is more interested in helping founders answer critical questions than in chasing new trends.

For Houston’s energy and climate tech community, the challenge is to turn AI enthusiasm into real-world impact.

Listen to the full conversation with Merab Momen on the Energy Tech Startups Podcast to learn more.

---

Energy Tech Startups Podcast is hosted by Jason Ethier and Nada Ahmed. It delves into Houston's pivotal role in the energy transition, spotlighting entrepreneurs and industry leaders shaping a low-carbon future.


A new report shows the role Texas could play as the data-center sector enters "hyperdrive." Photo via JLL.com.

Texas could topple Virginia as biggest data-center market by 2030, JLL report says

data analysis

Everything’s bigger in Texas, they say—and that phrase now applies to the state’s growing data-center presence.

A new report from commercial real estate services provider JLL says Texas could overtake Northern Virginia as the world’s largest data-center market by 2030. Northern Virginia is a longtime holder of that title.

What’s driving Texas’ increasingly larger role in the data-center market? The key factor is artificial intelligence.

Companies like Google and Microsoft need more energy-hungry data centers to power AI innovations. In a 2023 article, Forbes explained that AI models consume a lot of energy because of the massive amount of data used to train them, as well as the complexity of those models and the rising volume of tasks assigned to AI.

“The data-center sector has officially entered hyperdrive,” Andy Cvengros, executive managing director at JLL and co-leader of its U.S. data-center business, said in the report. “Record-low vacancy sustained over two consecutive years provides compelling evidence against bubble concerns, especially when nearly all our massive construction pipeline is already pre-committed by investment-grade tenants.”

Dallas-Fort Worth has long dominated the Texas data-center market. But in recent years, West Texas has emerged as a popular territory for building data-center campuses, thanks in large part to an abundance of land and energy. Nearly two-thirds of data-center construction underway now is happening in “frontier markets” like West Texas, Ohio, Tennessee and Wisconsin, the JLL report says.

Northern Virginia, the current data-center champ in the U.S., boasted a data-center market with 6,315 megawatts of capacity at the end of 2025, the report says. That compares with 2,423 megawatts in Dallas-Fort Worth, 1,700 megawatts in the Austin-San Antonio corridor, 200 megawatts in West Texas, and 164 megawatts in Houston.

Google is investing in Texas. Courtesy of Google

Google's $40B investment in Texas data centers includes energy infrastructure

The future of data

Google is investing a huge chunk of money in Texas: According to a release, the company will invest $40 billion in cloud and artificial intelligence (AI) infrastructure, including the development of new data centers in Armstrong and Haskell counties.

The company announced its intentions at a meeting on November 14 attended by federal, state, and local leaders, including Gov. Greg Abbott, who called it "a Texas-sized investment."

Google will open two new data center campuses in Haskell County and a data center campus in Armstrong County.

Additionally, the first building at the company’s Red Oak campus in Ellis County is now operational. Google is continuing to invest in its existing Midlothian campus and Dallas cloud region, which are part of the company’s global network of 42 cloud regions that deliver high-performance, low-latency services that businesses and organizations use to build and scale their own AI-powered solutions.

Energy demands

Google is committed to responsibly growing its infrastructure by bringing new energy resources onto the grid, paying for costs associated with its operations, and supporting community energy efficiency initiatives.

One of the new Haskell data centers will be co-located with — or built directly alongside — a new solar and battery energy storage plant, creating the first industrial park to be developed through Google’s partnership with Intersect and TPG Rise Climate announced last year.

Google has contracted to add more than 6,200 megawatts (MW) of net new energy generation and capacity to the Texas electricity grid through power purchase agreements (PPAs) with energy developers such as AES Corporation, Enel North America, Intersect, Clearway, ENGIE, SB Energy, Ørsted, and X-Elio.

Water demands

Google’s three new facilities in Armstrong and Haskell counties will use air-cooling technology, limiting water use to site operations like kitchens. The company is also contributing $2.6 million to help Texas Water Trade create and enhance up to 1,000 acres of wetlands along the Trinity-San Jacinto Estuary. Google is also sponsoring a regenerative agriculture program with Indigo Ag in the Dallas-Fort Worth area and an irrigation efficiency project with N-Drip in the Texas High Plains.

In addition to the data centers, Google is committing $7 million in grants to support AI-related initiatives in healthcare, energy, and education across the state. This includes helping CareMessage enhance rural healthcare access; enabling the University of Texas at Austin and Texas Tech University to address the energy challenges that will arise with AI; and expanding AI training for Texas educators and students through support to Houston City College.

---

This article originally appeared on CultureMap.com.

Deloitte predicts AI will represent 57 percent of IT spending by U.S. oil and gas companies in 2029. Photo via Unsplash.

Energy sector AI spending is set to soar to $13B, report says

eyes on ai

Get ready for a massive increase in the amount of AI spending by oil and gas companies in the Houston area and around the country.

A new report from professional services firm Deloitte predicts AI will represent 57 percent of IT spending by U.S. oil and gas companies in 2029. That’s up from the estimated share of 23 percent in 2025.

According to the analysis, the amount of AI spending in the oil and gas industry will jump from an estimated $4 billion in 2025 to an estimated $13.4 billion in 2029—an increase of 235 percent.
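Those two figures are internally consistent with the stated percentage; a quick back-of-the-envelope check (illustrative only, not part of the Deloitte report):

```python
# Back-of-the-envelope check of the Deloitte spending figures cited above.
start_spend = 4.0    # estimated AI spending, 2025 ($ billions)
end_spend = 13.4     # estimated AI spending, 2029 ($ billions)

increase_pct = (end_spend - start_spend) / start_spend * 100
print(f"{increase_pct:.0f}% increase")  # 235% increase
```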

Almost half of AI spending by U.S. oil and gas companies targets process optimization, according to Deloitte’s analysis of data from market research companies IDC and Gartner. “AI-driven analytics adjust drilling parameters and production rates in real time, improving yield and decision-making,” says the Deloitte report.

Other uses for AI in the oil and gas industry cited by Deloitte include:

  • Integrating infrastructure used by shale producers
  • Monitoring pipelines, drilling platforms, refineries, and other assets
  • Upskilling workers through AI-powered platforms
  • Connecting workers on offshore rigs via high-speed, real-time internet access supplied by satellites
  • Detecting and reporting leaks

The report says a new generation of technology, including AI and real-time analytics, is transforming office and on-site operations at oil and gas companies. The Trump administration’s “focus on AI innovation through supportive policies and investments could further accelerate large-scale adoption and digital transformation,” the report adds.

Chevron and ExxonMobil, the two biggest oil and gas companies based in the Houston area, continue to dive deeper into AI.

Chevron is taking advantage of AI to squeeze more insights from enormous datasets, VentureBeat reported.

“AI is a perfect match for the established, large-scale enterprise with huge datasets—that is exactly the tool we need,” Bill Braun, the company’s now-retired chief information officer, said at a VentureBeat event in May.

Meanwhile, AI enables ExxonMobil to conduct autonomous drilling in the waters off the coast of Guyana. ExxonMobil says its proprietary system improves drilling safety, boosts efficiency, and eliminates repetitive tasks performed by rig workers.

ExxonMobil is also relying on AI to help cut $15 billion in operating costs by 2027.

“There is a concerted effort to make sure that we’re really working hard to apply that new technology … to drive effectiveness and efficiency,” Darren Woods, executive chairman and CEO of ExxonMobil, said during a 2024 earnings call.

ExxonMobil is on Fortune's first-ever AIQ ranking. Getty Images

2 Houston energy giants appear on Fortune’s inaugural AI ranking

AI Leaders

Two Houston-area energy leaders appear on Fortune’s inaugural list of the top adopters of AI among Fortune 500 companies.

They are:

  • No. 7 energy company ExxonMobil, based in Spring
  • No. 47 energy company Chevron, based in Houston

They are joined by Spring-based tech company Hewlett Packard Enterprise, No. 19.

All three companies have taken a big dive into the AI pool.

In 2024, ExxonMobil’s executive chairman and CEO, Darren Woods, explained that AI would play a key role in achieving a $15 billion reduction in operating costs by 2027.

“There is a concerted effort to make sure that we're really working hard to apply that new technology to the opportunity set within the company to drive effectiveness and efficiency,” Woods told Wall Street analysts.

At Chevron, AI tools are being used to quickly analyze data and extract insights from it, according to tech news website VentureBeat. Also, Chevron employs advanced AI systems known as large language models (LLMs) to create engineering standards, specifications and safety alerts. AI is even being put to work in Chevron’s exploration initiatives.

Bill Braun, Chevron’s chief information officer, said at a VentureBeat-sponsored event in 2024 that AI-savvy data scientists, or “digital scholars,” are always embedded within workplace teams “to act as a catalyst for working differently.”

The Fortune AIQ 50 ranking is based on ServiceNow’s Enterprise AI Maturity Index, an annual measurement of how prepared organizations are to adopt and scale AI. To evaluate how Fortune 500 companies are rolling out AI and how much they value AI investments, Fortune teamed up with Enterprise Technology Research. The results went into computing an AIQ score for each company.

At the top of the ranking is Alphabet (owner of Google and YouTube), followed by Visa, JPMorgan Chase, Nvidia and Mastercard. Aside from ExxonMobil, Hewlett Packard Enterprise, and Chevron, two other Texas companies made the list: Arlington-based homebuilder D.R. Horton (No. 29) and Austin-based software company Oracle (No. 37).

“The Fortune AIQ 50 demonstrates how companies across industry sectors are beginning to find real value from the deployment of AI technology,” Jeremy Kahn, Fortune’s AI editor, said in a news release. “Clearly, some sectors, such as tech and finance, are pulling ahead of others, but even in so-called 'old economy' industries like mining and transport, there are a few companies that are pulling away from their peers in the successful use of AI.”


---

This article originally appeared on InnovationMap.com.


Texas data center proposed by U.S. Army could use more power than El Paso

Big Data

The U.S. Army is proposing to develop a gargantuan, 3-gigawatt data center complex on Fort Bliss property that within a few years would consume more electricity than all of El Paso Electric’s 460,000 customers combined – even as questions about its development, water usage and air pollution remain unanswered.

If built, it would be the third major data center project in the El Paso region, along with Meta Platforms’ $10 billion facility in Northeast El Paso and the $165 billion Project Jupiter campus that Oracle and OpenAI are building in Santa Teresa, New Mexico. The combined scale of the three facilities could quickly transform the Borderland into one of the nation’s core hubs of power generation and AI infrastructure.

The publicly-traded investment firm Carlyle Group would pay to build and operate the Fort Bliss data center – one of several planned in a national rollout under President Donald Trump’s administration to rapidly increase artificial intelligence technology for the Department of Defense.

At Fort Bliss, the Army is “targeting an initial operating capacity of about 100 megawatts on the compute side” by next year, David Fitzgerald, deputy undersecretary of the Army, said during a meeting with reporters April 22. An official estimated cost for the project has yet to be released.

By 2029, the complex on military land in far East El Paso would require 3 gigawatts of electricity, Fitzgerald said. By comparison, El Paso Electric currently maintains about 2.9 gigawatts of generation capacity across its entire system that spans from Hatch, New Mexico, to Van Horn, Texas. The highest customer demand the power company has ever seen was just over 2.3 gigawatts during the summer of 2023.

And whether most El Pasoans are on board with the rapid buildout of another data center here is not a question that Army leadership is asking at this point.

“What we’re trying to do is find where are the common interests, common ground that we can solve for?” Fitzgerald said, referring to coordinating with El Paso city leaders on the data center project.

“The state of modern warfare and future warfare is largely going to depend on the ability to capture, process and utilize massive amounts of data,” he said. “So, the reality is, this is a strategic priority, not just for the Army, but for the entire Department of War. So, we need these capabilities, and we need to put them somewhere.”

Combined-cycle natural gas turbines are the “most likely” source of electricity generation for the facility, said Jeff Waksman, an assistant secretary of the Army and former member of Trump’s first administration.

Waksman said the facility would undergo environmental review before construction starts.

Still, there are far more outstanding questions than answers about the proposed Fort Bliss data center.

It’s unclear if the facility would connect to El Paso Water’s water system. The city-owned water utility pointed out that Fort Bliss Water provides water service for the installation. However, El Paso Water can provide “backup” service to the base, according to the project solicitation documents.

“EPWater was just recently brought into the discussion, and we only have preliminary information,” El Paso Water said in a statement. “The construction and water use would be entirely on federal property.”

El Paso Electric said it’s also uncertain whether the data center will connect to the utility’s power grid and will figure that out in the future. To date, the Army hasn’t made a formal request for service from El Paso Electric.

Officials from the U.S. Army “confirmed that questions regarding the power source and whether it will be connected to the regional grid remain under review and have plans to establish a data center with a projected demand of 3 gigawatts,” El Paso Electric said in a statement. “Ultimately, decisions about these matters will be made by Fort Bliss leadership, and we defer to them for further comment.”

A representative with Carlyle Group at a recent community meeting didn’t answer questions or provide details about the proposed data center facility and the related power generation source.

Carlyle Group did not respond to a request for comment.

Army officials said they don’t yet have a definitive agreement in place with Carlyle, which was conditionally selected to enter into exclusive negotiations, so few details are finalized.

However, the Army has set a short timeline to start operating by late 2027. That means construction will have to start soon, Fitzgerald said.

“The ideal endstate is that we would be at least (operational) by the end of ’27, which is moving pretty quick,” Fitzgerald said. “That would mean construction would need to begin in the not-so-distant future.”

Water, electricity concerns

Meeting three gigawatts of electricity demand with natural gas-fired turbines – cited by Army officials as the most likely power source – would likely produce huge amounts of greenhouse gases in a central area of El Paso, such as carbon dioxide, as well as other harmful pollutants including particulate matter.

And even if the data center doesn’t take service from El Paso Water and instead receives water from wells managed by Fort Bliss, it would rely on groundwater pumped out of the Hueco Bolson aquifer, the city’s main source of water.

The solicitation issued by the Army cites water risk for El Paso as “extremely high” and notes that most of Fort Bliss’ water supply comes from wells within the installation.

Fitzgerald said the Army is aware of the public’s concern that the data center could unsustainably guzzle El Paso’s groundwater to cool the data center’s computer servers. He said the facility will be “water neutral.”

It’s also not clear how the project could replace the same amount of water that it consumes.

It’s possible the Kay Bailey Hutchison Desalination Plant – co-owned by El Paso Water and the U.S. Army – could play a role in making the data center water neutral. But El Paso Water said it has no details about how the data center facility could achieve water neutrality.

El Paso Water is “more than willing to continue to share ideas for best practices in sustainability to help protect our regional water resources,” the utility said in its statement.

As far as electricity generation, Army officials said they don’t know if El Paso Electric would build a new power plant to serve the data center. It’s also possible that Carlyle Group or another private company could build its own power generation source for the data center that’s isolated from the power grid El Pasoans use every day.

“We have to decide whether El Paso Electric is going to be the ones building whatever is coming, or if this is going to be some independent power producer,” Waksman said.

El Paso Electric is planning to develop a 366-megawatt power plant made up of more than 800 small gas generators to power Meta’s data center. The utility will build more generation in the coming years to meet 1 gigawatt of total demand from Meta’s facility. Meanwhile, as the technology giant Oracle develops Project Jupiter, the company said Monday it is seeking to power the campus with 2.45 gigawatts of fuel cell power systems provided by the company Bloom Energy.

For perspective, 3.45 gigawatts – the combined projected demand of those two major data centers – is enough electricity to power as many as a million homes, depending on the time of day and weather.
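That homes estimate roughly checks out; a minimal sketch, assuming about 3.5 kilowatts of peak demand per home (an illustrative assumption, not a figure from the utilities):

```python
# Rough sanity check: how many homes could 3.45 GW of demand serve?
# The ~3.5 kW peak-demand-per-home value is an assumption for illustration only.
combined_gw = 3.45            # Meta + Project Jupiter projected demand (GW)
kw_per_home = 3.5             # assumed peak demand per home (kW)

homes = combined_gw * 1_000_000 / kw_per_home  # convert GW to kW, then divide
print(f"~{homes:,.0f} homes")  # ~985,714 homes
```

At lower per-home demand (mild weather, off-peak hours), the same capacity stretches to well over a million homes, which is why the figure is quoted as a range.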

The Fort Bliss project would have to meet environmental regulatory requirements, and the developer needs to include a plan for providing utilities and infrastructure needs such as access to the facility, according to a request for proposals issued by the Army in December 2025. Army officials emphasized the project would not impact El Pasoans’ water or electric bills.

Who is Carlyle Group?

Carlyle Group is a global investment management firm that oversees $477 billion of assets from entities such as pension funds.

The company invests that money by buying businesses ranging from wine producers to Asian telecommunications companies, or by developing infrastructure projects such as renewable energy generation and data centers. The company in 2025 posted distributable earnings of nearly $1.7 billion on $4.8 billion in revenue.

The Army wants to build the facility at Fort Bliss in partnership with Carlyle because the installation has a large amount of available, unused land and because of the water and electricity infrastructure already in place in El Paso, Fitzgerald said.

The Carlyle data center planned for El Paso is part of a wider U.S. military effort to quickly build infrastructure that supports the use of artificial intelligence — both on the battlefield and in running its day-to-day operations, according to government documents.

Army officials nodded to the use of AI in drone warfare and targeting systems. And a hyperscale data center facility can also securely house information such as the military’s cloud database that details pay and entitlements for every U.S. soldier, said Maj. Gen. Curtis Taylor, commanding general of the 1st Armored Division and Fort Bliss.

Data centers are “essential parts of power projection,” Taylor said. “But we have to protect those servers. And that’s why there’s great utility in building that infrastructure on military installations.”

The Fort Bliss facility would be located on a plot of land near the intersection of Loop 375 and Montana Avenue. The site is just east of the Camp East Montana immigrant detention facility, and near El Paso Electric’s gas-fired Montana power station.

The plan is for Carlyle to utilize the majority of the data center’s capacity for its business needs, and the military would have access to a more secure portion of the data center for its own uses.

The Army is developing another similar data center project in Dugway, Utah. Other Army bases identified as potential sites include Fort Hood in Texas and Fort Bragg in North Carolina.

The U.S. Air Force in October issued a solicitation saying it is “accepting proposals for the development of Artificial Intelligence data centers,” on unused land at different bases, including in California, Georgia, Arizona and Tennessee. The push was enabled by executive orders signed by Trump that seek to speed up permitting and development timelines for AI data centers.

Would the Fort Bliss data center pay taxes?

A privately-financed data center on Fort Bliss would likely have to pay some taxes – unlike on-base government facilities – but there’s a lot of uncertainty.

Carlyle Group is leasing the land for the data center under an “enhanced use lease” that allows branches of the military to rent under-used land on bases.

Land on federal installations is not subject to state or local taxes. However, the statute that authorizes the U.S. military to lease excess land to private entities says that “the interest of a lessee of property leased under this section may be taxed by State or local governments.”

So, while the land the data center is built on would not be subject to taxation, the structures housing the data center could be subject to local property taxes.

But it depends on how the deal is structured, including factors such as whether Carlyle or the Army ultimately takes ownership of the buildings.

The Army in January awarded a contract to Korean-owned Hanwha Defense USA, which will invest $1.3 billion to develop a munitions factory at a base in Pine Bluff, Arkansas, using an enhanced use lease.

Fitzgerald, the Army undersecretary, acknowledged the public pushback to other data centers such as Meta and Project Jupiter. But he said the Army wants to ensure the project is developed “the right way.”

“There are always elements that will kind of make this an ‘us versus them’ sort of a construct, but I don’t think we view it that way from the Army,” he said. “I think there’s a path here that will benefit not just the installation, but the community as well.”

CenterPoint launches real-time tracker to map Houston’s power grid upgrades

resiliency plan

Houstonians can now track electric infrastructure improvements via CenterPoint’s new Community Progress Tracker, part of the company’s ongoing Greater Houston Resiliency Initiative.

The tracker allows users to search by zip code and see completed work in real time, as well as updates on upcoming projects that highlight infrastructure improvements and efforts to strengthen the power grid in the face of extreme weather. Users can view icons on a map that track automation and intelligence projects, storm-resilient pole and equipment installations, undergrounding work and tree trimmings.

CenterPoint had installed 10,000 storm-resilient poles, cleared 1,600 miles of higher-risk vegetation, completed 99 miles of power line undergrounding and hardened 220 miles of power lines by the end of Q1 2026, according to the company.

For the rest of 2026, CenterPoint aims to install 35,000 stronger, storm-resilient poles, clear high-risk vegetation from 8,000 miles of power lines and harden 500 transmission structures against storms.

Via centerpointenergy.com

“We are proud of the progress made in 2025, which helped deliver more than 100 million fewer outage minutes when compared to 2024, and we are determined to make even more progress in 2026 as we work toward our defining goal: building the nation's most resilient coastal grid,” Nathan Brownell, CenterPoint's vice president of resilience and capital delivery, said in a news release. “To date, we are ahead of schedule in making critical 2026 GHRI improvements, and we will continue to build the stronger, smarter infrastructure necessary to further improve systemwide reliability and strengthen resiliency, reducing the likelihood and impact of outages for our customers.”

Woodlands-based company signs deal to develop 200 MW battery storage project

power deal

The Woodlands-based Plus Power announced this month that it has entered into a 20-year energy storage agreement with Tennessee Valley Authority (TVA), one of the largest public energy providers in the U.S.

Through the agreement, Plus Power and TVA will develop the Crawfish Creek Energy Storage project, a 200-megawatt / 800-megawatt-hour utility-scale battery energy storage facility in Jackson County, Alabama.

Construction on Crawfish Creek Energy Storage is expected to begin in 2028, and commercial operation is planned for the summer of 2029. The project will store electricity when demand is low and release it during peak periods, helping improve grid reliability, affordability, and energy security, according to a news release.

"Battery storage is essential to protecting the reliable, affordable electricity our region depends on to power next-generation technologies," Monika Beckner, TVA vice president, power supply & fuels, said in the release. "Projects like Crawfish Creek strengthen the Valley's energy security, improve our ability to manage extreme conditions, and help unleash American energy."

TVA selected Plus Power for the project in 2025 via a request for proposal to supply new capacity resources needed across the region. Plus Power currently owns and operates nine facilities that provide enhanced power reliability to Arizona, Hawaii, Maine, Massachusetts and Texas, totaling 1,650 megawatts/4,150 megawatt-hours. With this deal, Plus Power is entering its seventh state market and expanding into the Southeast.

"Plus Power is proud to support energy resilience in Jackson County and the Tennessee Valley, a key region for America's military, aerospace, and nuclear innovation," Brian Duncan, chief commercial officer at Plus Power, said in a news release. "Battery energy storage systems are flexible and millisecond-fast, making Crawfish Creek uniquely suited to meet the region's evolving needs. We are excited to partner with TVA to deliver a resource that supports economic expansion while strengthening American energy dominance and security.”