Houston's data center scene has received its latest bullish forecast. Photo via serverfarmllc.com

The Houston market could more than double its data center capacity by the end of 2028, a new report indicates.

The report, published by commercial real estate services provider CBRE, says greater demand for data center capacity in the Houston area is being fueled by energy companies, along with large-scale cloud services and AI-driven tenants.

In the second half of 2025, the Houston market had 154 megawatts of data center capacity, which was on par with capacity in the second half of 2024. Another 28.5 megawatts of capacity was under construction during that period.

“Multiple providers are advancing new builds and redevelopments, including significant power upgrades to recently purchased buildings, underscoring long-term confidence even as the market works through elevated vacancy and uneven absorption,” CBRE says of Houston’s data center presence.

One project alone promises to significantly boost the Houston market’s data center capacity. Data center developer Serverfarm plans to use part of a $3 billion credit facility to build a 250-acre, AI-ready data center campus near Houston with a potential capacity of more than 500 megawatts. The Houston campus and two other Serverfarm projects are already leased to unidentified tenants, according to CoStar.

A 60-megawatt, AI-ready Serverfarm data center is under construction in Houston. The $137 million, 438,000-square-foot project, located near the former headquarters of computer manufacturer Compaq, is expected to be completed in the third quarter of 2027.
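
As a rough sanity check on the doubling projection, the figures cited above can be tallied directly. This is a back-of-the-envelope sketch, not CBRE's methodology: the 500-megawatt campus is potential capacity, and some of the announced figures may overlap.

```python
# Back-of-the-envelope tally of Houston-area data center capacity,
# using only the figures cited above (a rough sketch, not CBRE's model;
# the 500 MW campus is "potential" capacity and figures may overlap).
current_mw = 154.0              # installed capacity, H2 2025 (CBRE)
under_construction_mw = 28.5    # under construction, H2 2025 (CBRE)
serverfarm_campus_mw = 500.0    # potential capacity of planned AI-ready campus
serverfarm_project_mw = 60.0    # AI-ready project near the former Compaq HQ

possible_mw = current_mw + under_construction_mw + serverfarm_campus_mw + serverfarm_project_mw
print(f"Current capacity: {current_mw:.0f} MW")
print(f"If every announced project is built: {possible_mw:.0f} MW "
      f"({possible_mw / current_mw:.1f}x current)")
# -> roughly 740 MW, comfortably more than double today's 154 MW
```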

Data Center Map identifies 59 data centers in the Houston area managed by 36 operators, including DataBank, Data Foundry, Digital Realty, IBM, Logix Fiber Networks, Lumen and TRG Datacenters. That compares with more than 180 data centers in Dallas-Fort Worth, more than 50 in the San Antonio area and 40 in the Austin area.

Texas is home to more than 400 data centers, according to Data Center Map.

In November, Google said it’s investing $40 billion to build AI data centers in West Texas and the Texas Panhandle.

“This is a Texas-sized investment in the future of our great state,” Gov. Greg Abbott said when Google’s commitment was announced. “Texas is the epicenter of AI development, where companies can pair innovation with expanding energy. Google's $40 billion investment makes Texas Google's largest investment in any state in the country and supports energy efficiency and workforce development in our state.”

A new report shows that Texas data centers used 25 billion gallons of water in 2025. Photo via HARC report.

Texas data center boom could strain water supply, new report warns

thirst for data

As data centers continue to boom throughout Texas, a new report from the Houston Advanced Research Center (HARC) warns that the trend could strain the state’s water supply.

HARC estimates Texas data centers used 25 billion gallons of water in 2025—and that demand will continue to rise to meet the needs of the 464 data centers already operating in Texas, as well as 70 additional sites under development.

In the report, titled “Thirsty Data and the Lone Star State: The Impact of Data Center Growth on Texas’ Water Supply,” The Woodlands-based nonprofit says that, nationally, water use for cooling data centers is expected to double or triple by 2028. If projections hold, total annual water use by Texas data centers will climb to between 29 billion and 161 billion gallons by 2030, or roughly 0.5 percent to 2.7 percent of the state’s total water use.

Data centers often use water for cooling, though demand depends on the type of cooling system used, as well as the size and type of the data center. Although used water can be recycled, some new withdrawals are always needed to replace water lost to evaporation and other system losses. Water is also used to cool the power plants that generate the electricity data centers consume.

The HARC report offers guidance for addressing data centers’ water demands, including:

  • Adopting dry cooling methods
  • Increasing reliance on wind and solar energy sources
  • Using alternative water supplies, such as treated wastewater or brackish water, for cooling
  • Adjusting operating schedules to manage water usage
  • Partnering with local companies on projects that reduce water leaks
  • Making their own investments in water infrastructure

The report goes on to explain that the Texas State Water Plan, produced by the Texas Water Development Board, projects shortages of 1.6 trillion gallons by 2030 and 2.3 trillion gallons by 2070. HARC posits that the recent surge in water demand from AI data centers is not fully reflected in those projections.

"Texas water plans always look backward, not forward," the report reads. "That means the 2027 water plan, which is in development now, will be based on 2026 regional water plans that do not include forecasted data center water use. Data centers that began operation in 2025 will not be added to the State Water Plan until 2032."

Currently, there are no state regulations that require data centers to report how much water they use. However, the Public Utility Commission of Texas (PUC) plans to survey operators of data centers and cryptocurrency mining facilities this spring about their water consumption, cooling methods and electricity sources. Companies will have six weeks to respond, and the commission expects to release the results by the end of the year. The Texas Water Development Board will assist the PUC with the survey questions.

“I think we all recognize the importance of data centers and the technology they support and what they give to our modern-day life,” PUC Commissioner Courtney Hjaltman said during the last commission meeting. “Texans, regulators and the legislature really need that understanding of data centers, really need to understand the water they’re using so that we can plan and create the Texas we want.”

See the full HARC report here.

Hadi Ghasemi, a University of Houston professor, has uncovered a method to release heat from data centers and electronics at record performance. Photo courtesy UH.

Houston researcher develops efficient method to cool AI data centers

cool findings

A University of Houston professor has developed a new cooling method that can remove heat at least three times more effectively from AI data centers than current technologies.

Hadi Ghasemi, a distinguished professor of Mechanical & Aerospace Engineering at UH, published his findings in two articles in the International Journal of Heat and Mass Transfer. The findings solve a critical issue in the growing AI sector, according to UH.

High-powered AI data centers generate huge amounts of heat because the GPUs and supporting systems they run operate at extreme power densities, which introduces complex thermal challenges. Traditional cooling methods, such as microchannel flow and spray cooling, have had limitations when exposed to extreme heat flux, according to UH.

Ghasemi’s research, however, found a more effective way to design thin-film evaporation structures to release heat from data centers and electronics at record performance.

Ghasemi’s solution coupled topology optimization and AI modeling to determine the best shapes for thin-film efficiency, ultimately landing on a branch-like structure resembling a tree.

The model found that the “branches” needed to be about 50 percent solid and 50 percent empty space for optimum efficiency, and that they could sustain high heat fluxes with minimal thermal resistance.
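
The 50/50 result has an intuitive explanation: solid material is needed to conduct heat into the liquid film, while open space is needed to supply liquid and let vapor escape. The toy model below, an illustration only and not Ghasemi's actual topology-optimization formulation, shows how a simple trade-off of that kind peaks at a solid fraction of about one half.

```python
import numpy as np

# Toy trade-off (illustration only, not the paper's actual model):
# heat delivery into the film scales with the solid fraction phi,
# while liquid supply and vapor escape scale with the open fraction (1 - phi).
phi = np.linspace(0.01, 0.99, 99)      # solid fraction of the evaporator structure
toy_heat_flux = phi * (1.0 - phi)      # normalized combined capacity

best_phi = phi[np.argmax(toy_heat_flux)]
print(f"Toy-model optimum solid fraction: {best_phi:.2f}")  # ~0.50
```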

“These structures could achieve high critical heat flux at much lower superheat compared to traditionally studied structures,” Ghasemi said in a news release. “The new structures can remove heat without having to get as hot as previous removal systems.”

Ghasemi’s doctoral candidates, Amirmohammad Jahanbakhsh and Saber Badkoobeh Hezave, also worked on the project. The team believes the results demonstrate the impact of physics-aware AI design and can help ensure the reliability, longevity and stability of AI data centers.

“Beyond achieving record performance, these new findings provide fundamental insight into the governing heat-transfer physics and establishes a rational pathway toward even higher thermal dissipation capacities,” Ghasemi added in the release.

A new report shows the role Texas could play as the data-center sector enters "hyperdrive." Photo via JLL.com.

Texas could topple Virginia as biggest data-center market by 2030, JLL report says

data analysis

Everything’s bigger in Texas, they say—and that phrase now applies to the state’s growing data-center presence.

A new report from commercial real estate services provider JLL says Texas could overtake Northern Virginia as the world’s largest data-center market by 2030. Northern Virginia is a longtime holder of that title.

What’s driving Texas’ increasingly larger role in the data-center market? The key factor is artificial intelligence.

Companies like Google and Microsoft need more energy-hungry data centers to power AI innovations. In a 2023 article, Forbes explained that AI models consume a lot of energy because of the massive amount of data used to train them, as well as the complexity of those models and the rising volume of tasks assigned to AI.

“The data-center sector has officially entered hyperdrive,” Andy Cvengros, executive managing director at JLL and co-leader of its U.S. data-center business, said in the report. “Record-low vacancy sustained over two consecutive years provides compelling evidence against bubble concerns, especially when nearly all our massive construction pipeline is already pre-committed by investment-grade tenants.”

Dallas-Fort Worth has long dominated the Texas data-center market. But in recent years, West Texas has emerged as a popular territory for building data-center campuses, thanks in large part to an abundance of land and energy. Nearly two-thirds of data-center construction underway now is happening in “frontier markets” like West Texas, Ohio, Tennessee and Wisconsin, the JLL report says.

Northern Virginia, the current data-center champ in the U.S., boasted a data-center market with 6,315 megawatts of capacity at the end of 2025, the report says. That compares with 2,423 megawatts in Dallas-Fort Worth, 1,700 megawatts in the Austin-San Antonio corridor, 200 megawatts in West Texas, and 164 megawatts in Houston.

Musk has vowed to upend another industry. Photo via Getty Images

Elon Musk vows to put data centers in space and run them on solar power

Outer Space

Elon Musk vowed this week to upend another industry just as he did with cars and rockets — and once again he's taking on long odds.

The world's richest man said he wants to put as many as a million satellites into orbit to form vast, solar-powered data centers in space — a move to allow expanded use of artificial intelligence and chatbots without triggering blackouts and sending utility bills soaring.

To finance that effort, Musk combined SpaceX with his AI business on Monday, February 2, and plans a big initial public offering of the combined company.

“Space-based AI is obviously the only way to scale,” Musk wrote on SpaceX’s website, adding about his solar ambitions, “It’s always sunny in space!”

But scientists and industry experts say even Musk — who outsmarted Detroit to turn Tesla into the world’s most valuable automaker — faces formidable technical, financial and environmental obstacles.

Feeling the heat

Capturing the sun’s energy from space to run chatbots and other AI tools would ease pressure on power grids and cut demand for sprawling computing warehouses that are consuming farms and forests and vast amounts of water to cool.

But space presents its own set of problems.

Data centers generate enormous heat. Space seems to offer a solution because it is cold. But it is also a vacuum, trapping heat inside objects in the same way that a Thermos keeps coffee hot using double walls with no air between them.

“An uncooled computer chip in space would overheat and melt much faster than one on Earth,” said Josep Jornet, a computer and electrical engineering professor at Northeastern University.

One fix is to build giant radiator panels that glow in infrared light to push the heat “out into the dark void,” says Jornet, noting that the technology has worked on a small scale, including on the International Space Station. But for Musk's data centers, he says, it would require an array of “massive, fragile structures that have never been built before.”
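
The scale of those panels follows from the Stefan-Boltzmann law, which sets how much heat a surface can radiate at a given temperature. The sketch below uses assumed values (emissivity, radiator temperature, a 1-megawatt heat load) purely for illustration; it is not a figure from Jornet or SpaceX.

```python
# Rough radiator sizing from the Stefan-Boltzmann law: P = eps * sigma * A * T^4.
# All inputs are illustrative assumptions; real radiators also absorb sunlight
# and Earth-shine, which pushes the required area higher.
SIGMA = 5.670e-8           # Stefan-Boltzmann constant, W/(m^2 K^4)
emissivity = 0.9           # assumed emissivity of the radiator surface
radiator_temp_k = 300.0    # assumed radiator temperature (~27 C)
heat_load_w = 1.0e6        # assumed 1 MW of waste heat from the satellite's chips

area_m2 = heat_load_w / (emissivity * SIGMA * radiator_temp_k**4)
print(f"One-sided radiating area for 1 MW: about {area_m2:,.0f} m^2")
# ~2,400 m^2; panels radiating from both faces would need roughly half that
```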

Floating debris

Then there is space junk.

A single satellite malfunctioning or losing orbit could trigger a cascade of collisions, potentially disrupting emergency communications, weather forecasting and other services.

Musk noted in a recent regulatory filing that he has had only one “low-velocity debris generating event” in seven years running Starlink, his satellite communications network. Starlink has operated about 10,000 satellites — but that's a fraction of the million or so he now plans to put in space.

“We could reach a tipping point where the chance of collision is going to be too great," said University at Buffalo's John Crassidis, a former NASA engineer. “And these objects are going fast -- 17,500 miles per hour. There could be very violent collisions."
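
The 17,500-mph figure is roughly the circular orbital speed in low Earth orbit, which follows from Newtonian gravity. Here is a quick check, assuming a Starlink-like altitude of 550 kilometers for illustration:

```python
import math

# Circular orbital speed: v = sqrt(mu_Earth / r)
MU_EARTH = 3.986e14         # Earth's gravitational parameter, m^3/s^2
EARTH_RADIUS_M = 6.371e6    # mean Earth radius, m
altitude_m = 550e3          # assumed Starlink-like altitude

r = EARTH_RADIUS_M + altitude_m
v_ms = math.sqrt(MU_EARTH / r)
print(f"Orbital speed: {v_ms / 1000:.1f} km/s (~{v_ms * 2.23694:,.0f} mph)")
# ~7.6 km/s, about 17,000 mph; crossing-path collisions can close even faster
```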

No repair crews

Even without collisions, satellites fail, chips degrade, parts break.

Special GPU graphics chips used by AI companies, for instance, can become damaged and need to be replaced.

“On Earth, what you would do is send someone down to the data center," said Baiju Bhatt, CEO of Aetherflux, a space-based solar energy company. "You replace the server, you replace the GPU, you’d do some surgery on that thing and you’d slide it back in.”

But no such repair crew exists in orbit, and those GPUs in space could get damaged due to their exposure to high-energy particles from the sun.

Bhatt says one workaround is to overprovision the satellite with extra chips to replace the ones that fail. But that’s an expensive proposition given they are likely to cost tens of thousands of dollars each, and current Starlink satellites only have a lifespan of about five years.
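
Overprovisioning is essentially a redundancy calculation: carry enough spare chips that the satellite still has a working complement at end of life. A minimal sketch, using an entirely hypothetical failure rate and GPU count rather than Starlink or Aetherflux data:

```python
from math import comb

# Toy redundancy sketch with hypothetical numbers: how many GPUs must a
# satellite carry so that at least `needed` still work after `years` years?
annual_failure_prob = 0.05                      # assumed per-GPU failure rate per year
years = 5                                       # assumed satellite lifespan
p_survive = (1 - annual_failure_prob) ** years  # ~0.77 chance a GPU lasts the mission

def prob_at_least(k: int, n: int, p: float) -> float:
    """Probability that at least k of n independent GPUs survive."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

needed = 8      # GPUs required for useful work (assumption)
n = needed
while prob_at_least(needed, n, p_survive) < 0.95:
    n += 1
print(f"Carry {n} GPUs to keep {needed} working for {years} years with 95% confidence")
# with these toy numbers the loop lands on 14, i.e. 6 spares
```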

Competition — and leverage

Musk is not alone in trying to solve these problems.

Starcloud, a company in Redmond, Washington, launched a satellite in November carrying a single Nvidia-made AI chip to test how it would fare in space. Google is exploring orbital data centers in a venture it calls Project Suncatcher. And Jeff Bezos’ Blue Origin announced plans in January for a constellation of more than 5,000 satellites to start launching late next year, though its focus has been more on communications than AI.

Still, Musk has an edge: He's got rockets.

Starcloud had to use one of his Falcon rockets to put its chip in space last year. Aetherflux plans to send a set of chips it calls a Galactic Brain to space on a SpaceX rocket later this year. And Google may also need to turn to Musk to get its first two planned prototype satellites off the ground by early next year.

Pierre Lionnet, a research director at the trade association Eurospace, says Musk routinely charges rivals far more than he charges himself: as much as $20,000 per kilo of payload versus $2,000 internally.

He said Musk’s announcements this week signal that he plans to use that advantage to win this new space race.

“When he says we are going to put these data centers in space, it’s a way of telling the others we will keep these low launch costs for myself,” said Lionnet. “It’s a kind of power play.”


Fervo Energy leads Time’s top green tech companies of 2026

top spot

The accolades keep coming for Houston-based geothermal energy company Fervo Energy.

Fervo sits atop Time magazine’s and Statista’s 2026 list of America’s Top GreenTech Companies. Fervo ranked No. 6 on the list last year.

The ranking honors 250 companies in the U.S. based on their environmental impact, innovation and financial strength. Fervo joins five other Houston-area companies on the list.

  • No. 49 Quaise Energy, an MIT Energy Initiative spinout that’s developing a drilling system designed to convert existing power stations for geothermal power production
  • No. 71 Plus Power, which develops, owns and operates battery energy storage systems
  • No. 98 Utility Global, whose technology enables industrial decarbonization
  • No. 199 Solugen, whose technology converts plant-based feedstocks into carbon-negative chemicals
  • No. 215 Noodoe, which specializes in EV charging stations and software

Fervo says its approach to enhanced geothermal systems (EGS)—including horizontal drilling, AI-enabled drilling and exploration, advanced reservoir engineering, and fiber-optic sensing—demonstrates how validated technology can help deliver reliable zero-emission power.

“By applying drilling technology from the oil and gas industry, we have proven that we can produce 24/7 carbon-free energy resources in new geographies across the world,” Fervo co-founder and CEO Tim Latimer said last year.

Other recent recognitions for Fervo include:

  • The 2025 Houston Innovation Awards named it Scaleup of the Year
  • MIT Technology Review put Fervo on its 2025 list of the 10 global climatech companies to watch
  • Time named Fervo one of the 100 Most Influential Companies of 2025
  • Fervo was hailed as Cleantech Group’s Global Cleantech 100 North American Company of the Year
  • Fervo was among Congruent Ventures’ and Silicon Valley Bank’s 50 by 2050 companies, all of which are poised to advance global decarbonization over a 25-year span

Just last month, Fervo secured $421 million in debt financing for the construction of its 500-megawatt Cape Station geothermal project in Utah. And in December, the company landed an oversubscribed $462 million Series E round of funding, pushing its valuation to an estimated $1.4 billion. Fervo filed for an IPO earlier this year.

3 strategies to strengthen the Gulf Coast as a global energy hub

The View from HETI

The Texas-Louisiana Gulf Coast is the backbone of America’s energy and chemical economy. Texas produces roughly 43% of U.S. crude oil and 28% of natural gas, while Texas and Louisiana together account for about half of the nation’s refining capacity, processing 9.3 million barrels of crude per day across 50 refineries. The region also produces approximately 80% of the nation’s primary petrochemicals and ships more than $117 billion in chemical products annually from Texas alone.

This unmatched concentration of refining, petrochemical manufacturing, pipelines, ports, and technical talent makes the Gulf Coast one of the most critical energy hubs in the world. But maintaining that leadership in a rapidly evolving global market will require intentional collaboration, faster technology commercialization, and strengthened supply chain resilience.

In fall 2025, the Greater Houston Partnership’s Houston Energy Transition Initiative (HETI) convened national laboratories, Gulf Coast universities, and industry leaders to examine how to reinforce the region’s long-term competitiveness. Participants included Argonne, Oak Ridge, Lawrence Berkeley, the National Energy Technology Laboratory (NETL), and the National Laboratory of the Rockies, alongside Gulf Coast academic institutions and energy and chemical companies. Here are the key findings and takeaways from the workshop.

1. Supply Chain Resilience Requires Structured Industry–Lab Collaboration

Resilience—diversity of supply, operational flexibility, and rapid recovery—was a recurring theme. Recent disruptions exposed vulnerabilities in tightly interconnected energy and manufacturing systems.

National laboratories provide capabilities that complement Gulf Coast industrial scale, particularly at early and mid technology readiness levels (TRLs 1–7), before full commercial deployment. Examples include:

  • Advanced manufacturing and AI-enabled validation of critical components (Oak Ridge).
  • Materials scale-up and techno-economic modeling to move from lab discovery to industrial relevance (Argonne).
  • Pilot-scale testing for severe-service alloys, chemical conversion, and process innovation (NETL).
  • Integrated energy systems modeling to assess grid resilience and system disruptions (National Laboratory of the Rockies).

Recommendation: Organize targeted Gulf Coast industry missions to national laboratories focused on critical supply chains—power equipment, high-heat industrial processes, novel catalysts, refining, and grid infrastructure—to identify joint development opportunities and reduce time to commercialization.

2. Modeling, AI, and Open-Access Platforms Can Bridge the Technology Gap

A persistent barrier to innovation is the gap between scientific discovery, applied development, and commercial deployment. Universities often operate at TRLs 1–3, national labs at 1–7, and industry at 7–9. Bridging these silos requires shared modeling tools, high-performance computing, and structured feedback loops.

National labs maintain open-access platforms capable of:

  • Simulating grid expansion, investment, and dispatch decisions.
  • Modeling cradle-to-gate industrial material flows.
  • Optimizing complex energy and chemical systems.
  • De-risking carbon capture, critical mineral recovery, and advanced manufacturing integration.

Recommendation: HETI should convene structured training and feedback sessions on these public modeling platforms—ensuring Gulf Coast industry can apply, improve, and help guide further development of tools critical to regional competitiveness. Federal initiatives such as the Genesis Mission, focused on AI-accelerated scientific discovery, further expand opportunities for Gulf Coast participation.

3. Time to Commercialization Is the Ultimate Competitive Metric

The lithium-ion battery is a cautionary example: while pioneered in U.S. labs, large-scale manufacturing leadership shifted overseas. Without strategic intervention, U.S. firms are projected to capture less than 30% of domestic lithium battery cell value by 2030.

Successful DOE-backed consortium models show that mission-aligned, multi-partner collaboration reduces development timelines and strengthens domestic manufacturing know-how. However, public–private partnership mechanisms such as CRADAs and Strategic Partnership Projects can be time-intensive.

Recommendation: The Gulf Coast should actively engage DOE and national laboratories to streamline public–private partnership pathways, improve intellectual property clarity, and expand industry access to laboratory infrastructure.

The Path Forward: A Gulf Coast Consortium Model

The workshop’s central conclusion was clear: the Gulf Coast should formalize collaboration through a regional industry–academia–laboratory consortium.

Such a model could:

  • Co-locate national lab researchers within the region.
  • Share modeling data and analytical capabilities.
  • Establish open-access pilot facilities that complement lab infrastructure.
  • Harmonize IP frameworks to accelerate licensing and deployment.

With its dense industrial ecosystem, technical workforce, and decision-making concentration, the Gulf Coast is uniquely positioned to serve as a national demonstration hub for advanced energy and chemical manufacturing.

If industry, universities, and national laboratories align around a shared regional strategy, the Gulf Coast can:

  • Accelerate commercialization timelines.
  • Strengthen critical supply chains.
  • Unleash a world-class technical workforce.
  • Reinforce U.S. leadership in strategic energy and chemical sectors.

———

This article originally appeared on the Greater Houston Partnership's Houston Energy Transition Initiative blog. A full report on the key learnings and recommendations from the workshop can be found here: https://bit.ly/4uEDEqk.

Houston cleantech company closes $12M seed round

fresh funding

Houston-based Helix Earth Technologies has closed a $12 million Seed 2 funding round to scale manufacturing of its energy-efficient commercial HVAC add-on technology.

Veriten, a Houston-based energy investment firm, led the round. Rua Ventures, Carnrite Ventures, Skywriter LLC and Textbook Ventures also participated.

Helix Earth—which was founded based on NASA technology, spun out of Rice University and has been incubated at Greentown Labs—is developing high-efficiency retrofit dehumidification systems that aim to reduce the energy consumption of commercial HVAC units. The company reports that its technology can lead to "healthier indoor air, lower energy bills, reduced building maintenance, and more comfortable spaces for building owners and occupants."

"Building owners are dealing with rising energy costs, uncontrolled humidity, and aging infrastructure with no viable, cost-effective path forward. We are in the field today solving these problems for commercial customers, and this capital puts us on an aggressive path to scale,” Rawand Rasheed, Helix Earth co-founder and CEO, said in a news release.

“The strength of this round reinforces our team's conviction that we can transform innovation-starved sectors with transformational solutions that deliver order-of-magnitude improvements to owners and operators, for both their bottom line and the environment,” Rasheed added.

Maynard Holt, Veriten’s founder and CEO, said that the investment firm is tripling its investment in Helix Earth.

"The team has built breakthrough technology with real applicability across multiple industries,” Holt said in the release. “Their first product will have an immediate and measurable impact on our energy system, and they are already pursuing adjacent innovations to help heavy industries operate more efficiently and with less waste. This is a well-rounded team with a proven track record of strong execution and disciplined capital management.”

Helix Earth also closed a $5.6 million seed funding round in 2024, led by Veriten.

Last year, the company secured a $1.2 million Small Business Innovation Research (SBIR) Phase II grant and won in the Smart Cities, Transportation & Sustainability contest at the 2025 SXSW Pitch Showcase. Rasheed was also named to the Forbes 30 Under 30 Energy and Green Tech list for 2025.