Collide has rolled out RIGGS, a large language model for energy professionals. Photo via Getty Images

Houston-based Collide is looking to solve AI issues in the energy industry from within.

Co-founded by former oil roughneck Collin McLelland, the company has developed AI software for operators and field teams, shaped by firsthand oilfield experience. Its AI-native platform “retrieves and synthesizes data from authoritative sources to deliver accurate, cited, and energy-focused insights to oil and gas professionals,” according to the company.

“Oil and gas has a graveyard full of technology that was technically impressive and operationally useless,” McLelland tells Energy Capital. “The reason is almost always the same: the people who built it didn't understand what they were actually solving for. When you're an outsider, you see workflows and try to automate them. When you're an insider, you understand why those workflows exist—the regulatory constraints, the physical realities, the liability concerns, the trust dynamics between operators and service companies.”

Collide’s large language model, known as RIGGS, performed well in recent benchmarking on a Society of Petroleum Engineers (SPE) petroleum engineering exam, the company reports. The exam assesses understanding ranging from conceptual terminology to complex mathematical problem-solving.

According to Collide, RIGGS achieved a score of 67.5 percent on a 40-question subset of the SPE petroleum engineering exam, outperforming other large language models like Grok 4 (62.5 percent), Claude Sonnet 4.5 (52.5 percent) and GPT 5.1 (4 percent).

RIGGS completed the test in 15 minutes, while Grok took two hours. Collide hopes RIGGS will reach 75 to 80 percent accuracy over the next few months.

The software could help oil and gas companies produce accurate outputs and automate routine workflows, freeing up valuable time for engineers and teams to work on more pressing matters, according to McLelland.

“Collide exists because we sat in those seats — we were the engineers, the operators, the field guys,” he says. “RIGGS scoring higher on the PE exam versus the frontier labs isn't a party trick. It's evidence that the model understands petroleum engineering the way a petroleum engineer does, because it was built by people who do.”

RIGGS was trained on Collide’s Spindletop hardware and is supported by a vast library of information, as well as a reasoning engine and validation layer that uses logic to solve problems.

“Longer term, we see RIGGS as the intelligence layer that sits underneath every operator's workflow — not a chatbot you open in a browser, but something embedded in the tools engineers already use,” McLelland says. “The goal is to give every engineer the knowledge and pattern recognition of a 30-year veteran, on demand."

According to McLelland, Collide is already building toward near-term use cases including reservoir analysis and production optimization, automated regulatory compliance (Railroad Commission filings, W-10s, G-10s), workover report generation, and in-field engineering decision support. In March, Collide and Texas-based oil and gas operator Winn Resources announced a collaboration to automate the time-intensive process of filing monthly W-10 and G-10 forms with the Texas Railroad Commission, completing what’s normally a multi-hour task in under 30 minutes. Collide reports that Winn’s infrastructure now automates regulatory filings and provides real-time visibility into data gaps, reducing processing time by over 95 percent.

“Before Collide, I'd spend hours manually keying in filings,” Buck Crum, director of operations, said in a news release. “(In March), we had 50 wells to file and I was done in 20 minutes. It does the majority of the heavy lifting while keeping me in control. That human-in-the-loop approach saves meaningful time and gives us greater confidence in our compliance and reporting.”

Collide was originally launched by Houston media organization Digital Wildcatters as “a professional network and digital community for technical discussions and knowledge sharing.” After raising $5 million in seed funding led by Houston’s Mercury Fund last year, the company said it would shift its focus to rolling out its enterprise-level, AI-enabled solution.
Merab Momen, founder of AI CTO Services. Courtesy Photo

How this Houston expert helps startups turn AI hype into real impact

now streaming

Artificial intelligence is now everywhere. It is mentioned in every startup pitch deck, and every corporate roadmap claims to use it. However, many early-stage businesses struggle with the simple question, “What does AI actually mean for my business?”

In a recent episode of the EnergyTech Startups podcast, Merab Momen, founder of AI CTO Services and a longtime AI practitioner, explains why most founders misunderstand AI, how startups can apply it practically, and why Houston is quietly becoming a serious hub for AI-driven innovation.

Filling the AI Leadership Gap

Merab’s career has spanned decades of technology transitions. He worked on neural networks in the 1990s, built computer vision systems long before they were common, and helped deploy AI solutions inside huge industrial companies. But when generative AI exploded into the mainstream, he noticed a gap: founders needed a real partner for AI integration, yet most couldn’t justify a full-time CTO and couldn’t get that depth from project-based consultants.

“I really needed something which is much more engaging where I can give that partner-level advice to the founders,” he said. By giving firms on-demand access to high-level AI expertise, his approach enables them to evaluate tools, avoid costly mistakes and eventually transition to a permanent technology leader when the time is right.

AI is Older than Most People Think

Despite its recent surge in popularity, AI is nothing new; the field dates to the 1950s. In the conversation, Merab explained that he worked on his first AI project back in 1996. It worked, but the processing power just wasn’t there to make it practical. He also described using swarm intelligence models to optimize supply chains, work that today would fall under MLOps and data engineering.

From Language Models to the Physical World

Much of the public conversation about AI revolves around chatbots and text generation. But Merab sees far greater potential in AI’s interaction with the physical world, especially in industrial settings. He emphasized edge computing and vision language models (VLMs) as significant advances for manufacturing and energy. This shift toward the physical world is opening new opportunities for robotics, automated inspections, and industrial safety applications. Merab added that Houston is uniquely positioned for this transition.

Why Houston has an AI Advantage

Silicon Valley may dominate the AI headlines, but Merab believes Houston’s advantage lies beneath the surface. The city doesn’t lag in AI utilization; it just operates in industries where results show differently.

Machine learning isn’t new to Houston’s core industries. Energy companies, manufacturers, logistics providers, and healthcare systems have been using advanced analytics for decades. The difference is that they innovate in industrial sectors rather than consumer technology.

What’s Next

With AI CTO Services growing, Merab is working with startups across industries to deploy AI in practical, business-first ways.

He is more interested in helping founders answer critical questions than in chasing new trends.

For Houston’s energy and climate tech community, the challenge is turning AI enthusiasm into real-world impact.

Listen to the full conversation with Merab Momen on the Energy Tech Startups Podcast to learn more.

---

Energy Tech Startups Podcast is hosted by Jason Ethier and Nada Ahmed. It delves into Houston's pivotal role in the energy transition, spotlighting entrepreneurs and industry leaders shaping a low-carbon future.


A new report shows the role Texas could play as the data-center sector enters "hyperdrive." Photo via JLL.com.

Texas could topple Virginia as biggest data-center market by 2030, JLL report says

data analysis

Everything’s bigger in Texas, they say—and that phrase now applies to the state’s growing data-center presence.

A new report from commercial real estate services provider JLL says Texas could overtake Northern Virginia as the world’s largest data-center market by 2030. Northern Virginia is a longtime holder of that title.

What’s driving Texas’ increasingly larger role in the data-center market? The key factor is artificial intelligence.

Companies like Google and Microsoft need more energy-hungry data centers to power AI innovations. In a 2023 article, Forbes explained that AI models consume a lot of energy because of the massive amount of data used to train them, as well as the complexity of those models and the rising volume of tasks assigned to AI.

“The data-center sector has officially entered hyperdrive,” Andy Cvengros, executive managing director at JLL and co-leader of its U.S. data-center business, said in the report. “Record-low vacancy sustained over two consecutive years provides compelling evidence against bubble concerns, especially when nearly all our massive construction pipeline is already pre-committed by investment-grade tenants.”

Dallas-Fort Worth has long dominated the Texas data-center market. But in recent years, West Texas has emerged as a popular territory for building data-center campuses, thanks in large part to an abundance of land and energy. Nearly two-thirds of data-center construction underway now is happening in “frontier markets” like West Texas, Ohio, Tennessee and Wisconsin, the JLL report says.

Northern Virginia, the current data-center champ in the U.S., boasted a data-center market with 6,315 megawatts of capacity at the end of 2025, the report says. That compares with 2,423 megawatts in Dallas-Fort Worth, 1,700 megawatts in the Austin-San Antonio corridor, 200 megawatts in West Texas, and 164 megawatts in Houston.
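Summing the Texas figures above gives a sense of the gap the state would need to close by 2030. A quick sketch using only the end-of-2025 capacities reported here:

```python
# Combined Texas data-center capacity vs. Northern Virginia (end of 2025, MW),
# using the figures reported in the JLL numbers above.
texas_markets = {
    "Dallas-Fort Worth": 2423,
    "Austin-San Antonio": 1700,
    "West Texas": 200,
    "Houston": 164,
}
northern_virginia_mw = 6315

texas_total = sum(texas_markets.values())
print(f"Texas combined: {texas_total} MW")  # Texas combined: 4487 MW
print(f"Gap to Northern Virginia: {northern_virginia_mw - texas_total} MW")  # 1828 MW
```

Even combined, Texas markets trail Northern Virginia by roughly 1.8 gigawatts today, which is why the construction pipeline in "frontier markets" like West Texas matters to JLL's 2030 projection.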

UH researchers have developed a thin film that could allow AI chips to run cooler and faster. Photo courtesy University of Houston.

Houston researchers develop energy-efficient film for AI chips

AI research

A team of researchers at the University of Houston has developed an innovative thin-film material that they believe will make AI devices faster and more energy efficient.

AI data centers consume massive amounts of electricity and require large cooling systems to operate, adding to overall energy demand.

“AI has made our energy needs explode,” Alamgir Karim, Dow Chair and Welch Foundation Professor at the William A. Brookshire Department of Chemical and Biomolecular Engineering at UH, explained in a news release. “Many AI data centers employ vast cooling systems that consume large amounts of electricity to keep the thousands of servers with integrated circuit chips running optimally at low temperatures to maintain high data processing speed, have shorter response time and extend chip lifetime.”

In a report recently published in ACS Nano, Karim and a team of researchers introduced a specialized two-dimensional thin-film dielectric, or electric insulator. The film, which does not conduct electricity, could replace traditional, heat-generating components in integrated circuit chips, the essential hardware powering AI.

The thin-film material aims to reduce the significant energy cost and heat produced by the high-performance computing necessary for AI.

Karim and his former doctoral student, Maninderjeet Singh, used Nobel Prize-winning organic framework materials to develop the film. Singh, now a postdoctoral researcher at Columbia University, developed the materials during his doctoral training at UH, along with Devin Shaffer, a UH professor of civil engineering, and doctoral student Erin Schroeder.

Their study shows that dielectrics with high permittivity (high-k) store more electrical energy and dissipate more of it as heat than low-k materials do. Karim focused on low-k materials made from light elements, like carbon, that would allow chips to run cooler and faster.

The team then created new materials with carbon and other light elements, forming covalently bonded sheetlike films with highly porous crystalline structures using a process known as synthetic interfacial polymerization. Then they studied their electronic properties and applications in devices.

According to the report, the film was suitable for high-voltage, high-power devices while maintaining thermal stability at elevated operating temperatures.

“These next-generation materials are expected to boost the performance of AI and conventional electronics devices significantly,” Singh added in the release.

---

This article originally appeared on our sister site, InnovationMap.

ExxonMobil is on Fortune's first-ever AIQ ranking. Getty Images

2 Houston energy giants appear on Fortune’s inaugural AI ranking

AI Leaders

Two Houston-area energy leaders appear on Fortune’s inaugural list of the top adopters of AI among Fortune 500 companies.

They are:

  • No. 7 energy company ExxonMobil, based in Spring
  • No. 47 energy company Chevron, based in Houston

They are joined by Spring-based tech company Hewlett Packard Enterprise, No. 19.

All three companies have taken a big dive into the AI pool.

In 2024, ExxonMobil’s executive chairman and CEO, Darren Woods, explained that AI would play a key role in achieving a $15 billion reduction in operating costs by 2027.

“There is a concerted effort to make sure that we're really working hard to apply that new technology to the opportunity set within the company to drive effectiveness and efficiency,” Woods told Wall Street analysts.

At Chevron, AI tools are being used to quickly analyze data and extract insights from it, according to tech news website VentureBeat. Also, Chevron employs advanced AI systems known as large language models (LLMs) to create engineering standards, specifications and safety alerts. AI is even being put to work in Chevron’s exploration initiatives.

Bill Braun, Chevron’s chief information officer, said at a VentureBeat-sponsored event in 2024 that AI-savvy data scientists, or “digital scholars,” are always embedded within workplace teams “to act as a catalyst for working differently.”

The Fortune AIQ 50 ranking is based on ServiceNow’s Enterprise AI Maturity Index, an annual measurement of how prepared organizations are to adopt and scale AI. To evaluate how Fortune 500 companies are rolling out AI and how much they value AI investments, Fortune teamed up with Enterprise Technology Research. The results went into computing an AIQ score for each company.

At the top of the ranking is Alphabet (owner of Google and YouTube), followed by Visa, JPMorgan Chase, Nvidia and Mastercard. Aside from ExxonMobil, Hewlett Packard Enterprise, and Chevron, two other Texas companies made the list: Arlington-based homebuilder D.R. Horton (No. 29) and Austin-based software company Oracle (No. 37).

“The Fortune AIQ 50 demonstrates how companies across industry sectors are beginning to find real value from the deployment of AI technology,” Jeremy Kahn, Fortune’s AI editor, said in a news release. “Clearly, some sectors, such as tech and finance, are pulling ahead of others, but even in so-called 'old economy' industries like mining and transport, there are a few companies that are pulling away from their peers in the successful use of AI.”


---

This article originally appeared on InnovationMap.com.


Oxy officially announces CEO transition, names successor

new leader

Houston-based Occidental (Oxy) has officially announced its longtime CEO's retirement and her successor.

Oxy shared last week that Vicki Hollub will retire June 1. Reuters first reported Hollub's plan to retire in March, but a firm date had not been set. Hollub will remain on Oxy's board of directors.

Richard Jackson, who currently serves as Oxy's COO, will replace Hollub in the CEO role.

“It has been a privilege to lead Occidental and work alongside such a talented team for more than 40 years," Hollub shared in a news release. "Following the recently completed decade-long transformation of the company, we now have the best portfolio and the best technical expertise in Occidental’s history. With this strong foundation in place, a clear path forward and a leader like Richard, who has the experience and vision to elevate Occidental, now is the right time for this transition. I look forward to supporting Richard and the Board through my continued role as a director.”

Hollub has held the top leadership position at Oxy since 2016 and has been with the energy giant for more than 40 years. Before being named CEO, she served as COO and senior executive vice president at the company. She led strategic acquisitions of Anadarko Petroleum in 2019 and CrownRock in 2024, and was the first woman selected to lead a major U.S. oil and gas company.

Hollub also played a key role in leading Oxy's future as a "carbon management company."

Jackson has been with Oxy since 2003. He has held numerous leadership positions, including president of U.S. onshore oil and gas, president of low carbon integrated technologies, general manager of the Permian Delaware Basin and enhanced oil recovery oil and gas, vice president of investor relations, and vice president of drilling Americas.

He was instrumental in launching Oxy Low Carbon Ventures, which focuses on direct air capture (DAC), carbon sequestration and low-carbon fuels through businesses like 1PointFive, TerraLithium and others, according to the company. He also serves on the Oil and Gas Climate Initiative’s Climate Investment Board and the American Petroleum Institute’s Upstream Committee. He holds a bachelor's degree in petroleum engineering from Texas A&M University.

Jackson was named COO of Oxy in October 2025. In his new role as CEO, he will also join the board of directors, effective June 1.

“I am grateful to be appointed President and CEO of Occidental and excited about the opportunity to execute from the strong position and capabilities that we built under Vicki’s leadership,” Jackson added in the release. “It means a lot to me personally to be a part of our Occidental team. I am committed to delivering value from our significant and high-quality resource base. We have a tremendous opportunity to focus on organic improvement and execution to deliver meaningful value for our employees, shareholders and partners.”

Texas data center proposed by U.S. Army could use more power than El Paso

Big Data

The U.S. Army is proposing developing a gargantuan, 3-gigawatt data center complex on Fort Bliss property that within a few years would consume more electricity than all of El Paso Electric’s 460,000 customers combined – even as questions about its development, water usage and air pollution remain unanswered.

If built, it would be the third major data center project in the El Paso region, along with Meta Platforms’ $10 billion facility in Northeast El Paso and the $165 billion Project Jupiter campus that Oracle and OpenAI are building in Santa Teresa, New Mexico. The combined scale of the three facilities could quickly transform the Borderland into one of the nation’s core hubs of power generation and AI infrastructure.

The publicly-traded investment firm Carlyle Group would pay to build and operate the Fort Bliss data center – one of several planned in a national rollout under President Donald Trump’s administration to rapidly increase artificial intelligence technology for the Department of Defense.

At Fort Bliss, the Army is “targeting an initial operating capacity of about 100 megawatts on the compute side” by next year, David Fitzgerald, deputy undersecretary of the Army, said during a meeting with reporters April 22. An official estimated cost for the project has yet to be released.

By 2029, the complex on military land in far East El Paso would require 3 gigawatts of electricity, Fitzgerald said. By comparison, El Paso Electric currently maintains about 2.9 gigawatts of generation capacity across its entire system that spans from Hatch, New Mexico, to Van Horn, Texas. The highest customer demand the power company has ever seen was just over 2.3 gigawatts during the summer of 2023.

And whether most El Pasoans are on board with the rapid buildout of another data center here is not a question that Army leadership is asking at this point.

“What we’re trying to do is find where are the common interests, common ground that we can solve for?” Fitzgerald said, referring to coordinating with El Paso city leaders on the data center project.

“The state of modern warfare and future warfare is largely going to depend on the ability to capture, process and utilize massive amounts of data,” he said. “So, the reality is, this is a strategic priority, not just for the Army, but for the entire Department of War. So, we need these capabilities, and we need to put them somewhere.”

Combined-cycle natural gas turbines are the “most likely” source of electricity generation for the facility, said Jeff Waksman, an assistant secretary of the Army and former member of Trump’s first administration.

Waksman said the facility would undergo environmental review before construction starts.

Still, there are far more outstanding questions than answers about the proposed Fort Bliss data center.

It’s unclear if the facility would connect to El Paso Water’s water system. The city-owned water utility pointed out that Fort Bliss Water provides water service for the installation. However, El Paso Water can provide “backup” service to the base, according to the project solicitation documents.

“EPWater was just recently brought into the discussion, and we only have preliminary information,” El Paso Water said in a statement. “The construction and water use would be entirely on federal property.”

El Paso Electric said it’s also uncertain whether the data center will connect to the utility’s power grid and will figure that out in the future. To date, the Army hasn’t made a formal request for service from El Paso Electric.

Officials from the U.S. Army “confirmed that questions regarding the power source and whether it will be connected to the regional grid remain under review and have plans to establish a data center with a projected demand of 3 gigawatts,” El Paso Electric said in a statement. “Ultimately, decisions about these matters will be made by Fort Bliss leadership, and we defer to them for further comment.”

A representative with Carlyle Group at a recent community meeting didn’t answer questions or provide details about the proposed data center facility and the related power generation source.

Carlyle Group did not respond to a request for comment.

Army officials said they don’t yet have a definitive agreement in place with Carlyle, which was conditionally selected to enter into exclusive negotiations, so few details are finalized.

However, the Army has set a short timeline to start operating by late 2027. That means construction will have to start soon, Fitzgerald said.

“The ideal endstate is that we would be at least (operational) by the end of ’27, which is moving pretty quick,” Fitzgerald said. “That would mean construction would need to begin in the not-so-distant future.”

Water, electricity concerns

Meeting three gigawatts of electricity demand with natural gas-fired turbines – cited by Army officials as the most likely power source – would likely produce huge amounts of greenhouse gases such as carbon dioxide, as well as other harmful pollutants including particulate matter, in a central area of El Paso.

And even if the data center doesn’t take service from El Paso Water and instead receives water from wells managed by Fort Bliss, it would rely on groundwater pumped out of the Hueco Bolson aquifer, the city’s main source of water.

The solicitation issued by the Army cites water risk for El Paso as “extremely high” and notes that most of Fort Bliss’ water supply comes from wells within the installation.

Fitzgerald said the Army is aware of the public’s concern that the data center could unsustainably guzzle El Paso’s groundwater to cool its computer servers. He said the facility will be “water neutral.”

It’s also not clear how the project could replace the same amount of water that it consumes.

It’s possible the Kay Bailey Hutchison Desalination Plant – co-owned by El Paso Water and the U.S. Army – could play a role in making the data center water neutral. But El Paso Water said it has no details about how the data center facility could achieve water neutrality.

El Paso Water is “more than willing to continue to share ideas for best practices in sustainability to help protect our regional water resources,” the utility said in its statement.

As far as electricity generation, Army officials said they don’t know if El Paso Electric would build a new power plant to serve the data center. It’s also possible that Carlyle Group or another private company could build its own power generation source for the data center that’s isolated from the power grid El Pasoans use every day.

“We have to decide whether El Paso Electric is going to be the ones building whatever is coming, or if this is going to be some independent power producer,” Waksman said.

El Paso Electric is planning to develop a 366-megawatt power plant made up of over 800 small gas generators to power Meta’s data center. The utility will build more generation in the coming years to meet 1 gigawatt of total demand from Meta’s facility. Meanwhile, as the technology giant Oracle develops Project Jupiter, the company said Monday it is seeking to power the campus using 2.45 gigawatts of fuel cell power systems provided by the company Bloom Energy.

For perspective, 3.45 gigawatts – the combined projected demand of those two major data centers – is enough electricity to power as many as a million homes, depending on the time of day and weather.
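The homes-powered estimate can be sanity-checked with simple arithmetic. Assuming an average household draw of about 3.5 kilowatts at times of high demand (a figure not given in the article; actual draw varies with time of day and weather):

```python
# Back-of-the-envelope: how many homes could 3.45 GW serve?
# Assumed (not from the article): ~3.5 kW average draw per home.
total_demand_w = 3.45e9   # combined projected demand of the two data centers
watts_per_home = 3_500    # assumed average household draw

homes = total_demand_w / watts_per_home
print(f"~{homes:,.0f} homes")  # ~985,714 homes
```

Under that assumption, 3.45 gigawatts lands right around the million-home figure cited above.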

The Fort Bliss project would have to meet environmental regulatory requirements, and the developer needs to include a plan for providing utilities and infrastructure needs such as access to the facility, according to a request for proposals issued by the Army in December 2025. Army officials emphasized the project would not impact El Pasoans’ water or electric bills.

Who is Carlyle Group?

Carlyle Group is a global investment management firm that oversees $477 billion of assets from entities such as pension funds.

The company invests that money by buying businesses ranging from wine producers to Asian telecommunications companies, or by developing infrastructure projects such as renewable energy generation and data centers. The company in 2025 posted distributable earnings of nearly $1.7 billion on $4.8 billion in revenue.

The Army wants to build the facility at Fort Bliss in partnership with Carlyle because the installation has a large amount of available, unused land and because of the water and electricity infrastructure already in place in El Paso, Fitzgerald said.

The Carlyle data center planned for El Paso is part of a wider U.S. military effort to quickly build infrastructure that supports the use of artificial intelligence — both on the battlefield and in running its day-to-day operations, according to government documents.

Army officials nodded to the use of AI in drone warfare and targeting systems. And a hyperscale data center facility can also securely house information such as the military’s cloud database that details pay and entitlements for every U.S. soldier, said Maj. Gen. Curtis Taylor, commanding general of the 1st Armored Division and Fort Bliss.

Data centers are “essential parts of power projection,” Taylor said. “But we have to protect those servers. And that’s why there’s great utility in building that infrastructure on military installations.”

The Fort Bliss facility would be located on a plot of land near the intersection of Loop 375 and Montana Avenue. The site is just east of the Camp East Montana immigrant detention facility, and near El Paso Electric’s gas-fired Montana power station.

The plan is for Carlyle to utilize the majority of the data center’s capacity for its business needs, and the military would have access to a more secure portion of the data center for its own uses.

The Army is developing another similar data center project in Dugway, Utah. Other Army bases identified as potential sites include Fort Hood in Texas and Fort Bragg in North Carolina.

The U.S. Air Force in October issued a solicitation saying it is “accepting proposals for the development of Artificial Intelligence data centers,” on unused land at different bases, including in California, Georgia, Arizona and Tennessee. The push was enabled by executive orders signed by Trump that seek to speed up permitting and development timelines for AI data centers.

Would the Fort Bliss data center pay taxes?

A privately-financed data center on Fort Bliss would likely have to pay some taxes – unlike on-base government facilities – but there’s a lot of uncertainty.

Carlyle Group is leasing the land for the data center under an “enhanced use lease” that allows branches of the military to rent under-used land on bases.

Land on federal installations is not subject to state or local taxes. However, the statute that authorizes the U.S. military to lease excess land to private entities says that “the interest of a lessee of property leased under this section may be taxed by State or local governments.”

So, while the land the data center is built on would not be subject to taxation, the structures housing the data center could be subject to local property taxes.

But it depends on how the deal is structured, including factors such as whether Carlyle or the Army ultimately takes ownership of the buildings.

The Army in January awarded a contract to Korean-owned Hanwha Defense USA, which will invest $1.3 billion to develop a munitions factory at a base in Pine Bluff, Arkansas, using an enhanced use lease.

Fitzgerald, the Army undersecretary, acknowledged the public pushback to other data centers such as Meta and Project Jupiter. But he said the Army wants to ensure the project is developed “the right way.”

“There are always elements that will kind of make this an ‘us versus them’ sort of a construct, but I don’t think we view it that way from the Army,” he said. “I think there’s a path here that will benefit not just the installation, but the community as well.”

CenterPoint launches real-time tracker to map Houston’s power grid upgrades

resiliency plan

Houstonians can now track electric infrastructure improvements via CenterPoint’s new Community Progress Tracker, part of the company’s ongoing Greater Houston Resiliency Initiative.

The tracker allows users to search by zip code and see completed work in real time, as well as updates on upcoming projects that highlight infrastructure improvements and efforts to strengthen the power grid in the face of extreme weather. Users can view icons on a map that track automation and intelligence projects, storm-resilient pole and equipment installations, undergrounding work and tree trimmings.

CenterPoint had installed 10,000 storm-resilient poles, cleared 1,600 miles of higher-risk vegetation, completed 99 miles of power line undergrounding and hardened 220 miles of power lines by the end of Q1 2026, according to the company.

For the rest of 2026, CenterPoint aims to install 35,000 stronger, storm-resilient poles, clear high-risk vegetation from 8,000 miles of power lines and harden 500 transmission structures against storms.


“We are proud of the progress made in 2025, which helped deliver more than 100 million fewer outage minutes when compared to 2024, and we are determined to make even more progress in 2026 as we work toward our defining goal: building the nation's most resilient coastal grid,” Nathan Brownell, CenterPoint's vice president of resilience and capital delivery, said in a news release. “To date, we are ahead of schedule in making critical 2026 GHRI improvements, and we will continue to build the stronger, smarter infrastructure necessary to further improve systemwide reliability and strengthen resiliency, reducing the likelihood and impact of outages for our customers.”