The five-month program establishes a significant relationship between the 20 selected startups and NOV, beginning with paid pilot programs. Photo via NOV.com

Houston-based NOV is launching a new growth-stage startup accelerator focused on the upstream oil and gas industry.

NOV, a provider of equipment and technology for oil and gas drilling and production operations, has announced its new NOV Supernova Accelerator in collaboration with VentureBuilder, a consulting firm, investor, and accelerator program operator led by a group of Houston innovators.

Applications to the program are open online, and the deadline to apply is July 7. Specifically, NOV is looking for companies working on solutions in data management and analytics, operational efficiency, HSE monitoring, predictive maintenance, and digital twins.

The five-month program establishes a significant relationship between the 20 selected startups and NOV, beginning with paid pilot programs.

"This is not a traditional startup accelerator. This is often a first-client relationship to help disruptive startups refine product-market fit and creatively solve our pressing enterprise problems," reads the program's website.

Selected startups will have direct access to NOV's team and resources. The program will require companies to spend one week per month in person at NOV headquarters in Houston and will provide support surrounding several themes, including go-to-market strategy, pitch practice, and more.

“The NOV Supernova Accelerator offers a strategic approach where the company collaborates with startups in a vendor-client relationship to address specific business needs,” says Billy Grandy, general partner of VentureBuilder.vc, in a statement. “Unlike mergers and acquisitions, the venture client model allows corporations like NOV to quickly test and implement new technologies without committing to an acquisition or risking significant investment.”

UH Professor Vedhus Hoskere received a three-year, $505,286 grant from TxDOT for a bridge digitization project. Photo via uh.edu

Houston researcher earns $500,000 grant to tap into digital twin tech for bridge safety

transportation

A University of Houston professor has received a grant from the Texas Department of Transportation (TxDOT) to improve the efficiency and effectiveness of how bridges are inspected in the state.

The $505,286 grant will support the project of Vedhus Hoskere, assistant professor in the Civil and Environmental Engineering Department, over three years. The project, “Development of Digital Twins for Texas Bridges,” will look at how to use drones, cameras, sensors and AI to support Texas' bridge maintenance programs.

“To put this data in context, we create a 3D digital representation of these bridges, called digital twins,” Hoskere said in a statement. “Then, we use artificial intelligence methods to help us find and quantify problems to be concerned about. We’re particularly interested in any structural problems that we can identify. These digital twins help us monitor changes over time and keep a close eye on the bridge. The digital twins can be tremendously useful for the planning and management of our aging bridge infrastructure so that limited taxpayer resources are properly utilized.”
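As a loose illustration of the monitoring idea Hoskere describes, here is a minimal sketch of one common change-detection step: comparing two aligned point-cloud scans of the same bridge, captured months apart, and flagging points that appear to have moved. This is an assumption-laden sketch in Python, not the UH team's actual pipeline; the data, tolerance, and use of SciPy are illustrative.

    # Minimal change-detection sketch for two point-cloud scans of a bridge.
    # Assumes both scans are already registered (aligned in one coordinate
    # frame). Illustrative only; not the UH/TxDOT project's pipeline.
    import numpy as np
    from scipy.spatial import cKDTree

    def flag_changes(scan_before, scan_after, tol_m=0.005):
        """Return a boolean mask over scan_after: True where the nearest
        point in the earlier scan is farther away than tol_m meters."""
        tree = cKDTree(scan_before)        # spatial index over the older scan
        dists, _ = tree.query(scan_after)  # nearest-neighbor distance per point
        return dists > tol_m

    rng = np.random.default_rng(0)
    scan_2024 = rng.uniform(0, 10, size=(100_000, 3))  # stand-in for real data
    scan_2025 = scan_2024.copy()
    scan_2025[:500] += 0.02                # shift 500 points by 2 cm per axis
    changed = flag_changes(scan_2024, scan_2025)
    print(f"{changed.sum()} of {len(changed)} points flagged for review")

In a real digital twin workflow, flagged regions would then be quantified and tracked across inspection cycles rather than simply counted.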

The project began in September and will continue through August 2026. Hoskere is joined on the project by Craig Glennie, the Hugh Roy and Lillie Cranz Cullen Distinguished Chair at Cullen College and director of the National Center for Airborne Laser Mapping, as the project’s co-principal investigator.

According to Hoskere, the project will have implications for Texas's 55,000 bridges (more than twice as many as any other state in the country), which need to be inspected every two years.

Outside of Texas, Hoskere says the project will have international impact on digital twin research. Hoskere chairs a sub-task group of the International Association for Bridge and Structural Engineering (IABSE).

“Our international efforts align closely with this project’s goals, and the insights gained globally will enhance our work in Texas while our research at UH contributes to advancing bridge digitization worldwide,” he said. “We have been researching the development of digital twins for the inspection and management of various infrastructure assets over the past eight years. This project provides us an opportunity to leverage our expertise to help TxDOT achieve their goals while also advancing the science and practice of better developing these digital twins.”

Last year another UH team earned a $750,000 grant from the National Science Foundation for a practical, Texas-focused project that uses AI. The team was backed by the NSF's Convergence Accelerator for its project to help food-insecure Texans and eliminate inefficiencies within the food charity system.

———

This article originally ran on InnovationMap.
Nick Purday, IT director of emerging digital technology for ConocoPhillips, presented at the Reuters Events Data-Driven Oil and Gas Conference 2023 to help dispel any myths about digital twins. Photo courtesy of Shutterstock.

The secret to unlocking efficiency for the energy transition? Data management

SAVING THE BEST FOR LAST

As Nick Purday, IT director of emerging digital technology for ConocoPhillips, began his presentation at the Reuters Events Data-Driven Oil and Gas Conference 2023 in Houston yesterday, he lamented missing the opportunity to dispel any myths about digital twins, given his second-to-last time slot of the conference.

He may have sold himself short.

A hush fell over the crowd as Purday described one of the more challenging applications of digital twins his team tackled late last year. Purday explained, “The large diagram [up there], that’s two trains from our LNG facility. How long did that take to build? We built that one in a month.”

It’s been years since an upstream oil and gas audience gasped, but Purday won the crowd’s admiration for the swift, arduous task his team had undertaken.

He then addressed the well-known balance of good/fast/cheap in a rare glimpse under the hood of project planning for such novel technology. “As soon as you move into remote visualization applications – think Alaska, think Norway – then you’re going to get a pretty good return on your investment. Think 3-to-1,” Purday explained. “As you would expect, those simulation digital twins, those are the ones where you get huge value. Optimizing the energy requirements of an LNG facility – huge value associated with that.

“Independently, Forrester did some work recently and came up with a 4-to-1 return, so that fits exactly with our data set,” Purday continued before casually bringing up the foundation for their successful effort.

“If you’ve got good data, then it doesn’t take that long and you can do these pretty effectively,” Purday stated plainly.

Another wave of awe rippled across the room.

In an earlier panel session, Nathan McMahan, data strategy chief at ConocoPhillips, commented on the shared responsibility model for data in the industry. “When I talked to a lot of people across the organization, three common themes filtered up: What’s the visibility, access, and trust of data?” McMahan observed.

Strong data governance stretches across the organization, but the Wells team, responsible for drilling and completions activity, stood out to McMahan with its approach to data governance.

“They had taken ownership of [the] data and partnered with business units across the globe to standardize best practices between some of the tools and data ingestion methods, even work with suppliers and contractors, [to demonstrate] our expectations for how we take data,” McMahan explained. “They even went a step further to bring an IT resource onto their floor and start to create roles of the owners and the stewards and the custodians of the data. They really laid that good foundation and built upon that with some of the outcomes they wanted to achieve with machine learning techniques and those sorts of things.“

The key, McMahan concluded, is making the “janitorial effort [of] cleaning up data sustainable… and fun.”

The sentiment of fun continued in Purday's late afternoon presentation as he explained how the application went viral after being shared with just one or two testers, crashing the inbox of the lead developer responsible for managing the model as questions and kudos flooded in.

Digital twin applications significantly reduce the carbon footprint created by sending personnel to triage onsite concerns at LNG, upstream, and refining facilities, in addition to streamlining processes and enabling tremendous savings. The application Purday described allowed his team to discover an issue that previously could be resolved only by flying someone to a remote location, where they would likely spend days testing and analyzing the area to diagnose the problem.

The digital twin found the issue in 10 minutes, and the on-site team resolved the problem within the day.

The LNG operations team now consistently starts their day with a bit of a spark, using the digital twin during morning meetings to help with planning and predictive maintenance.


Rice research team's study keeps CO2-to-fuel devices running 50 times longer

new findings

In a new study published in the journal Science, a team of Rice University researchers shared findings on how acid bubbles can improve the stability of electrochemical devices that convert carbon dioxide into useful fuels and chemicals.

The team led by Rice associate professor Haotian Wang addressed an issue in the performance and stability of CO2 reduction systems. The gas flow channels in the systems often clog due to salt buildup, reducing efficiency and causing the devices to fail prematurely after about 80 hours of operation.

“Salt precipitation blocks CO2 transport and floods the gas diffusion electrode, which leads to performance failure,” Wang said in a news release. “This typically happens within a few hundred hours, which is far from commercial viability.”

By using an acid-humidified CO2 technique, the team was able to extend the operational life of a CO2 reduction system more than 50-fold, demonstrating more than 4,500 hours of stable operation in a scaled-up reactor.

The Rice team made a simple swap with a significant impact. Instead of using water to humidify the CO2 gas fed into the reactor, the team bubbled the gas through an acid solution such as hydrochloric, formic or acetic acid. This produced more soluble salts that did not crystallize and block the channels.
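For readers tracking the chemistry: assuming hydrochloric acid as the humidifying acid (an illustrative case; formic or acetic acid would yield formate or acetate salts instead), the acid vapor converts the poorly soluble potassium bicarbonate that clogs the channels into a more soluble salt, roughly:

    KHCO3 + HCl → KCl + H2O + CO2

Potassium chloride dissolves considerably more readily in water than potassium bicarbonate, which is consistent with the solubility shift the team describes below.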

The process has major implications for an emerging green technology known as electrochemical CO2 reduction, or CO2RR, that transforms climate-warming CO2 into products like carbon monoxide, ethylene, or alcohols. The products can be further refined into fuels or feedstocks.

“Using the traditional method of water-humidified CO2 could lead to salt formation in the cathode gas flow channels,” Shaoyun Hao, postdoctoral research associate in chemical and biomolecular engineering at Rice and co-first author, explained in the news release. “We hypothesized — and confirmed — that acid vapor could dissolve the salt and convert the low solubility KHCO3 into salt with higher solubility, thus shifting the solubility balance just enough to avoid clogging without affecting catalyst performance.”

The Rice team believes the work can lead to more scalable CO2 electrolyzers, which is vital if the technology is to be deployed at industrial scale as part of carbon capture and utilization strategies. Since the approach itself is relatively simple, it could lead to a more cost-effective and efficient solution. It also worked well with multiple catalyst types, including zinc oxide, copper oxide and bismuth oxide, which are all used to target different CO2RR products.

“Our method addresses a long-standing obstacle with a low-cost, easily implementable solution,” Ahmad Elgazzar, co-first author and graduate student in chemical and biomolecular engineering at Rice, added in the release. “It’s a step toward making carbon utilization technologies more commercially viable and more sustainable.”

A team led by Wang, in collaboration with researchers from the University of Houston, also shared findings on salt precipitation buildup and CO2RR in a recent edition of the journal Nature Energy.

The case for smarter CUI inspections in the energy sector

Guest Column

Corrosion under insulation (CUI) accounts for roughly 60% of pipeline leaks in the U.S. oil and gas sector. Yet many operators still rely on outdated inspection methods that are slow, risky, and economically unsustainable.

This year, widespread budget cuts and layoffs across the sector are forcing refineries to do more with less. Efficiency is no longer a goal; it’s a mandate. The challenge: how to maintain safety and reliability without overextending resources?

Fortunately, a new generation of technologies is gaining traction in the oil and gas industry, offering operators faster, safer, and more cost-effective ways to identify and mitigate CUI.

Hidden cost of corrosion

Corrosion is a pervasive threat, with CUI posing the greatest risk to refinery operations. Insulation conceals damage until it becomes severe, making detection difficult and ultimately leading to failure. NACE International estimates the annual cost of corrosion in the U.S. at $276 billion.

Compounding the issue is aging infrastructure: roughly half of the nation’s 2.6 million miles of pipeline are more than 50 years old, which increases both the urgency and the cost of inspections.

So, the question is: Are we at a breaking point or an inflection point? The answer depends largely on how quickly the industry can move beyond inspection methods that no longer match today's operational or economic realities.

Legacy methods such as insulation stripping, scaffolding, and manual nondestructive testing (NDT) are slow, hazardous, and offer incomplete coverage. With maintenance budgets tightening, these methods are no longer viable.

Why traditional inspection falls short

Without question, what worked 50 years ago no longer works today. Traditional inspection methods are slow, siloed, and dangerously incomplete.

Insulation removal:

  • Disruptive and expensive.
  • Labor-intensive and time-consuming, with a high risk of process upsets and insulation damage.
  • Limited coverage. Often targets a small percentage of piping, leaving large areas unchecked.
  • Health risks. Exposes workers to hazardous materials such as asbestos or fiberglass.

Rope access and scaffolding:

  • Safety hazards. Falls from height remain a leading cause of injury.
  • Restricted time and access. Weather, fatigue, and complex layouts limit coverage and effectiveness.
  • High coordination costs. Coordinating multiple contractors, complex schedules, and oversight, which requires continuous monitoring, documentation, and compliance assurance across vendors and protocols, drives up costs.

Spot checks:

  • Low detection probability. Random sampling often fails to detect localized corrosion.
  • Data gaps. Paper records and inconsistent methods hinder lifecycle asset planning.
  • Reactive, not proactive. Problems are often discovered late, after damage has already occurred.

A smarter way forward

While traditional NDT methods for CUI like Pulsed Eddy Current (PEC) and Real-Time Radiography (RTR) remain valuable, robotic systems, sensors, and AI are reshaping how CUI inspections are conducted, reducing reliance on manual labor and enabling broader, data-rich asset visibility for better planning and decision-making.

ARIX Technologies, for example, has introduced pipe-climbing robotic systems capable of full-coverage inspections of insulated pipes. Venus, ARIX’s pipe-climbing robot, delivers full 360° CUI data across both vertical and horizontal pipe circuits without magnets, scaffolding, or insulation removal. It captures high-resolution visuals and PEC data simultaneously, allowing operators to review inspection video and analyze corrosion insights in one integrated workflow. This streamlines data collection, speeds up analysis, and keeps personnel out of hazardous zones, making inspections faster, safer, and far more actionable.

These integrated technology platforms are driving measurable gains:

  • Autonomous grid scanning: Delivers structured, repeatable coverage across pipe surfaces for greater inspection consistency.
  • Integrated inspection portal: Combines PEC, RTR, and video into a unified 3D visualization, streamlining analysis across inspection teams.
  • Actionable insights: Enables more confident planning and risk forecasting through digital, shareable data—not siloed or static.

Real-world results

Petromax Refining adopted ARIX’s robotic inspection systems to modernize its CUI inspections, and its results were substantial and measurable:

  • Inspection time dropped from nine months to 39 days.
  • Costs were cut by 63% compared to traditional methods.
  • Scaffolding use was cut by 99%, reducing safety risks and labor demands.
  • Data accuracy improved, supporting smarter maintenance planning.

Why the time is now

Energy operators face mounting pressure from all sides: aging infrastructure, constrained budgets, rising safety risks, and growing ESG expectations.

In the U.S., downstream operators are increasingly piloting drone and crawler solutions to automate inspection rounds in refineries, tank farms, and pipelines. More than 92% of oil and gas companies report that they are investing in AI or robotics to modernize operations, or plan to do so soon.

The tools are here. The data is here. Smarter inspection is no longer aspirational; it’s operational. Petromax and others are showing what’s possible, and making the change is no longer a leap but a step forward.

———

Tyler Flanagan is director of service & operations at Houston-based ARIX Technologies.


Scientists warn greenhouse gas accumulation is accelerating and more extreme weather will come

Climate Report

Humans are on track to release so much greenhouse gas in less than three years that a key threshold for limiting global warming will be nearly unavoidable, according to a study released June 19.

The report predicts that society will have emitted enough carbon dioxide by early 2028 that crossing an important long-term temperature boundary will be more likely than not. The scientists calculate that by that point there will be enough of the heat-trapping gas in the atmosphere to create a 50-50 chance or greater that the world will be locked in to 1.5 degrees Celsius (2.7 degrees Fahrenheit) of long-term warming since preindustrial times. That point of gas accumulation, which comes from the burning of fuels like gasoline, oil and coal, will arrive sooner than the same group of 60 international scientists calculated in a study last year.

“Things aren’t just getting worse. They’re getting worse faster,” said study co-author Zeke Hausfather of the tech firm Stripe and the climate monitoring group Berkeley Earth. “We’re actively moving in the wrong direction in a critical period of time that we would need to meet our most ambitious climate goals. Some reports, there’s a silver lining. I don’t think there really is one in this one.”

That 1.5 goal, first set in the 2015 Paris agreement, has been a cornerstone of international efforts to curb worsening climate change. Scientists say crossing that limit would mean worse heat waves and droughts, bigger storms and sea-level rise that could imperil small island nations. Over the last 150 years, scientists have established a direct correlation between the release of certain levels of carbon dioxide, along with other greenhouse gases like methane, and specific increases in global temperatures.

In Thursday's Indicators of Global Climate Change report, researchers calculated that society can spew only 143 billion more tons (130 billion metric tons) of carbon dioxide before the 1.5 limit becomes technically inevitable. The world is producing 46 billion tons (42 billion metric tons) a year, and the budget is counted from the start of this year, so that point of inevitability should arrive around February 2028, the scientists wrote. The world now stands at about 1.24 degrees Celsius (2.23 degrees Fahrenheit) of long-term warming since preindustrial times, the report said.
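The February 2028 date follows from straightforward division. As a back-of-envelope check in Python (a sketch using the metric figures quoted above, and assuming emissions hold at the current rate):

    # Back-of-envelope check of the report's timeline; assumes emissions
    # stay at the current rate. Figures are the metric values quoted above.
    budget_gt = 130        # remaining CO2 budget, billion metric tons, from Jan. 2025
    rate_gt_per_year = 42  # current annual emissions, billion metric tons

    years_left = budget_gt / rate_gt_per_year
    print(f"Budget exhausted in about {years_left:.1f} years")  # ~3.1 years
    # About 3.1 years from January 2025 lands in early 2028, matching the
    # report's estimate of around February 2028.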

Earth's energy imbalance

The report, which was published in the journal Earth System Science Data, shows that the rate of human-caused warming has increased to nearly half a degree Fahrenheit (0.27 degrees Celsius) per decade, Hausfather said. And the imbalance between the heat Earth absorbs from the sun and the amount it radiates out to space, a key climate change signal, is accelerating, the report said.

“It's quite a depressing picture unfortunately, where if you look across the indicators, we find that records are really being broken everywhere,” said lead author Piers Forster, director of the Priestley Centre for Climate Futures at the University of Leeds in England. “I can't conceive of a situation where we can really avoid passing 1.5 degrees of very long-term temperature change.”

The increase in emissions from fossil-fuel burning is the main driver. But reduced particle pollution, which includes soot and smog, is another factor because those particles had a cooling effect that masked even more warming from appearing, scientists said. Changes in clouds also factor in. That all shows up in Earth’s energy imbalance, which is now 25% higher than it was just a decade or so ago, Forster said.

Earth’s energy imbalance “is the most important measure of the amount of heat being trapped in the system,” Hausfather said.

Earth keeps absorbing more and more heat than it releases. “It is very clearly accelerating. It’s worrisome,” he said.

Crossing the temperature limit

The planet temporarily passed the key 1.5 limit last year. The world hit 1.52 degrees Celsius (2.74 degrees Fahrenheit) of warming since preindustrial times for an entire year in 2024, but the Paris threshold is meant to be measured over a longer period, usually considered 20 years. Still, the globe could reach that long-term threshold in the next few years even if individual years haven't consistently hit that mark, because of how the Earth's carbon cycle works.
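To illustrate the distinction with made-up numbers (a sketch, not data from the report): a single year can top 1.5 degrees while the 20-year average stays below it.

    # Illustrative only: hypothetical annual warming anomalies, not report data.
    # Shows how one hot year can exceed 1.5 C while the 20-year mean stays below.
    import numpy as np

    anomalies = np.linspace(1.30, 1.45, 20)  # hypothetical 20-year warming trend
    anomalies[-1] = 1.52                     # one year spikes past 1.5 C, as in 2024

    print(f"Hottest single year: {anomalies.max():.2f} C")  # 1.52 C
    print(f"20-year average:     {anomalies.mean():.2f} C")  # ~1.38 C
    # Under the Paris convention described above, only the long-term average
    # crossing 1.5 C counts as breaching the threshold.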

That 1.5 is “a clear limit, a political limit for which countries have decided that beyond which the impact of climate change would be unacceptable to their societies,” said study co-author Joeri Rogelj, a climate scientist at Imperial College London.

The mark is so important because once it is crossed, many small island nations could eventually disappear because of sea level rise, and scientific evidence shows that the impacts become particularly extreme beyond that level, especially hurting poor and vulnerable populations, he said. He added that efforts to curb emissions and the impacts of climate change must continue even if the 1.5 degree threshold is exceeded.

Crossing the threshold "means increasingly more frequent and severe climate extremes of the type we are now seeing all too often in the U.S. and around the world — unprecedented heat waves, extreme hot drought, extreme rainfall events, and bigger storms,” said University of Michigan environment school dean Jonathan Overpeck, who wasn't part of the study.

Andrew Dessler, a Texas A&M University climate scientist who wasn't part of the study, said the 1.5 goal was aspirational and not realistic, so people shouldn’t focus on that particular threshold.

“Missing it does not mean the end of the world,” Dessler said in an email, though he agreed that “each tenth of a degree of warming will bring increasingly worse impacts.”