The five-month program establishes a significant relationship between the 20 selected startups and NOV, beginning with paid pilot programs. Photo via NOV.com

Houston-based NOV is launching a new growth-stage startup accelerator focused on the upstream oil and gas industry.

NOV, a provider of oil and gas drilling and production operations equipment, has announced its new NOV Supernova Accelerator in collaboration with VentureBuilder, a consulting firm, investor, and accelerator program operator led by a group of Houston innovators.

Applications to the program are open online, and the deadline to apply is July 7. Specifically, NOV is looking for companies working on solutions in data management and analytics, operational efficiency, HSE monitoring, predictive maintenance, and digital twins.

The five-month program establishes a significant relationship between the 20 selected startups and NOV, beginning with paid pilot programs.

"This is not a traditional startup accelerator. This is often a first-client relationship to help disruptive startups refine product-market fit and creatively solve our pressing enterprise problems," reads the program's website.

Selected startups will have direct access to NOV's team and resources. The program will require companies to spend one week per month in person at NOV headquarters in Houston and will provide support surrounding several themes, including go-to-market strategy, pitch practice, and more.

“The NOV Supernova Accelerator offers a strategic approach where the company collaborates with startups in a vendor-client relationship to address specific business needs,” says Billy Grandy, general partner of VentureBuilder.vc, in a statement. “Unlike mergers and acquisitions, the venture client model allows corporations like NOV to quickly test and implement new technologies without committing to an acquisition or risking significant investment.”

UH Professor Vedhus Hoskere received a three-year, $505,286 grant from TxDOT for a bridge digitization project. Photo via uh.edu

Houston researcher earns $500,000 grant to tap into digital twin tech for bridge safety

transportation

A University of Houston professor has received a grant from the Texas Department of Transportation (TxDOT) to improve the efficiency and effectiveness of how bridges are inspected in the state.

The $505,286 grant will support the project of Vedhus Hoskere, assistant professor in the Civil and Environmental Engineering Department, over three years. The project, “Development of Digital Twins for Texas Bridges,” will look at how to use drones, cameras, sensors and AI to support Texas' bridge maintenance programs.

“To put this data in context, we create a 3D digital representation of these bridges, called digital twins,” Hoskere said in a statement. “Then, we use artificial intelligence methods to help us find and quantify problems to be concerned about. We’re particularly interested in any structural problems that we can identify - these digital twins help us monitor changes over time and keep a close eye on the bridge. The digital twins can be tremendously useful for the planning and management of our aging bridge infrastructure so that limited taxpayer resources are properly utilized.”
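To make the change-monitoring idea in Hoskere's description concrete, here is a minimal sketch of the kind of comparison a bridge digital twin enables: condition measurements from two inspection dates are attached to the same 3D model, and components that worsened past a threshold are flagged. This is not the UH team's code; the component names, crack widths, and threshold below are hypothetical, chosen only for illustration.

```python
# Hypothetical sketch: flag bridge components whose measured crack width grew
# between two inspections recorded against the same digital twin.
from dataclasses import dataclass


@dataclass
class ComponentState:
    name: str              # identifier of a component in the twin (hypothetical)
    crack_width_mm: float  # widest crack measured on that component


def flag_changes(baseline: list[ComponentState],
                 latest: list[ComponentState],
                 growth_threshold_mm: float = 0.3) -> list[str]:
    """Return components whose crack width grew by at least the threshold."""
    base = {c.name: c.crack_width_mm for c in baseline}
    return [c.name for c in latest
            if c.crack_width_mm - base.get(c.name, 0.0) >= growth_threshold_mm]


# Two inspection snapshots (invented numbers).
inspection_2024 = [ComponentState("girder_3", 0.4), ComponentState("deck_panel_7", 0.20)]
inspection_2025 = [ComponentState("girder_3", 0.9), ComponentState("deck_panel_7", 0.25)]
print(flag_changes(inspection_2024, inspection_2025))  # ['girder_3']
```

In practice, the measurements would come from drone imagery, lidar, and sensors rather than hand-entered numbers, and the AI methods Hoskere mentions would do the detection and quantification; the sketch only shows where a digital twin's change tracking fits in.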

The project began in September and will continue through August 2026. Hoskere is joined on the project by Craig Glennie, the Hugh Roy and Lillie Cranz Cullen Distinguished Chair at Cullen College and director of the National Center for Airborne Laser Mapping, as the project’s co-principal investigator.

According to Hoskere, the project will have implications for Texas's 55,000 bridges (more than twice as many as any other state in the country), which need to be inspected every two years.

Outside of Texas, Hoskere says the project will have an international impact on digital twin research. Hoskere chairs a sub-task group of the International Association for Bridge and Structural Engineering (IABSE).

“Our international efforts align closely with this project’s goals and the insights gained globally will enhance our work in Texas while our research at UH contributes to advancing bridge digitization worldwide,” he said. “We have been researching developing digital twins for inspections and management of various infrastructure assets over the past 8 years. This project provides us an opportunity to leverage our expertise to help TxDOT achieve their goals while also advancing the science and practice of better developing these digital twins.”

Last year, another UH team earned a $750,000 grant from the National Science Foundation for a practical, Texas-focused project that uses AI. The team was backed by the NSF's Convergence Accelerator for its project to help food-insecure Texans and eliminate inefficiencies within the food charity system.

———

This article originally ran on InnovationMap.
Nick Purday, IT director of emerging digital technology for ConocoPhillips, presented at the Reuters Events Data-Driven Oil and Gas Conference 2023 to help dispel any myths about digital twins. Photo courtesy of Shutterstock.

The secret to unlocking efficiency for the energy transition? Data management

SAVING THE BEST FOR LAST

As Nick Purday, IT director of emerging digital technology for ConocoPhillips, began his presentation at the Reuters Events Data-Driven Oil and Gas Conference 2023 in Houston yesterday, he lamented missing the opportunity to dispel any myths about digital twins, given his second-to-last time slot of the conference.

He may have sold himself short.

A hush fell over the crowd as Purday described one of the more challenging applications of digital twins his team tackled late last year. Purday explained, “The large diagram [up there], that’s two trains from our LNG facility. How long did that take to build? We built that one in a month.”

It’s been years since an upstream oil and gas audience has gasped, but Purday won the crowd’s admiration for the swift completion of an arduous task undertaken by his team.

He then addressed the well-known balance of good/fast/cheap in a rare glimpse under the hood of project planning for such novel technology. “As soon as you move into remote visualization applications – think Alaska, think Norway – then you’re going to get a pretty good return on your investment. Think 3-to-1,” Purday explained. “As you would expect, those simulation digital twins, those are the ones where you get huge value. Optimizing the energy requirements of an LNG facility – huge value associated with that.

“Independently, Forrester did some work recently and came up with a 4-to-1 return, so that fits exactly with our data set,” Purday continued before casually bringing up the foundation for their successful effort.

“If you’ve got good data, then it doesn’t take that long and you can do these pretty effectively,” Purday stated plainly.

Another wave of awe rippled across the room.

In an earlier panel session, Nathan McMahan, data strategy chief at ConocoPhillips, commented on the shared responsibility model for data in the industry. “When I talked to a lot of people across the organization, three common themes commonly filtered up: What’s the visibility, access, and trust of data?” McMahan observed.

Strong data governance stretches across the organization, but the Wells team, responsible for drilling and completions activity, stood out to McMahan with its approach to data governance.

“They had taken ownership of [the] data and partnered with business units across the globe to standardize best practices between some of the tools and data ingestion methods, even work with suppliers and contractors, [to demonstrate] our expectations for how we take data,” McMahan explained. “They even went a step further to bring an IT resource onto their floor and start to create roles of the owners and the stewards and the custodians of the data. They really laid that good foundation and built upon that with some of the outcomes they wanted to achieve with machine learning techniques and those sorts of things.”

The key, McMahan concluded, is making the “janitorial effort [of] cleaning up data sustainable… and fun.”

The sentiment of fun continued in Purday's late afternoon presentation as he explained how the application went viral after being shared with just one or two testers, flooding the inbox of the lead developer responsible for managing the model with questions and kudos.

Digital twin applications significantly reduce the carbon footprint created by sending personnel to triage onsite concerns for LNG, upstream, and refining facilities in addition to streamlining processes and enabling tremendous savings. The application Purday described allowed his team to discover an issue previously only resolved by flying someone to a remote location where they would likely spend days testing and analyzing the area to diagnose the problem.

The digital twin found the issue in 10 minutes, and the on-site team resolved the problem within the day.

The LNG operations team now consistently starts their day with a bit of a spark, using the digital twin during morning meetings to help with planning and predictive maintenance.

How Planckton Data is building the sustainability label every industry will need

now streaming

There’s a reason “carbon footprint” became a buzzword. It sounds like something we should know. Something we should measure. Something that should be printed next to the calorie count on a label.

But unlike calories, a carbon footprint isn’t universal, standardized, or easy to calculate. In fact, for most companies—especially in energy and heavy industry—it’s still a black box.

That’s the problem Planckton Data is solving.

On this episode of the Energy Tech Startups Podcast, Planckton Data co-founders Robin Goswami and Sandeep Roy sit down to explain how they’re turning complex, inconsistent, and often incomplete emissions data into usable insight. Not for PR. Not for greenwashing. For real operational and regulatory decisions.

And they’re doing it in a way that turns sustainability from a compliance burden into a competitive advantage.

From calories to carbon: The label analogy that actually works

If you’ve ever picked up two snack bars and compared their calorie counts, you’ve made a decision based on transparency. Robin and Sandeep want that same kind of clarity for industrial products.

Whether it’s a shampoo bottle, a plastic feedstock, or a specialty chemical—there’s now consumer and regulatory pressure to know exactly how sustainable a product is. And to report it.

But that’s where the simplicity ends.

Because unlike food labels, carbon labels can’t be standardized, even for products coming out of the same factory. They depend on where and how a product was made, what inputs were used, how far it traveled, and what method was used to calculate the data.

Even two otherwise identical chemicals—one sourced from a refinery in Texas and the other in Europe—can carry very different carbon footprints, depending on logistics, local emission factors, and energy sources.

Planckton’s solution is built to handle exactly this level of complexity.

AI that doesn’t just analyze

For most companies, supply chain emissions data is scattered, outdated, and full of gaps.

That’s where Planckton’s use of AI becomes transformative.

  • It standardizes data from multiple suppliers, geographies, and formats.
  • It uses probabilistic models to fill in the blanks when suppliers don’t provide details.
  • It applies industry-specific product category rules (PCRs) and aligns them with evolving global frameworks like ISO standards and GHG Protocol.
  • It helps companies model decarbonization pathways, not just calculate baselines.

This isn’t generative AI for show. It’s applied machine learning with a purpose: helping large industrial players move from reporting to real action.
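As a rough illustration of the standardization and gap-filling steps described above, here is a minimal sketch of how a product footprint might be aggregated when only some suppliers report emission factors; a flat industry-average fallback stands in for Planckton's probabilistic models, and every material name and number is invented.

```python
# Hypothetical sketch: cradle-to-gate footprint for one product, filling gaps
# in supplier data with industry-average emission factors (all values invented).
INDUSTRY_AVERAGE_KGCO2E_PER_KG = {
    "surfactant": 2.1,
    "fragrance": 4.8,
    "packaging_pet": 3.0,
}


def footprint_kgco2e(bill_of_materials, reported_factors):
    """Sum kg CO2e over (material, mass_kg) pairs, preferring supplier data."""
    total = 0.0
    for material, mass_kg in bill_of_materials:
        factor = reported_factors.get(material)                # supplier-specific value
        if factor is None:
            factor = INDUSTRY_AVERAGE_KGCO2E_PER_KG[material]  # gap-fill with average
        total += mass_kg * factor
    return total


shampoo_bom = [("surfactant", 0.12), ("fragrance", 0.01), ("packaging_pet", 0.04)]
supplier_data = {"surfactant": 1.6}  # only one supplier reported a factor
print(round(footprint_kgco2e(shampoo_bom, supplier_data), 3))  # 0.36 kg CO2e per bottle
```

The same structure also explains the Texas-versus-Europe example earlier: swap in region-specific emission factors for the same materials and the total changes, even though the product is identical.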

And it’s not a side tool. For many of Planckton’s clients, it’s becoming the foundation of their sustainability strategy.

From boardrooms to smokestacks: Where the pressure is coming from

Planckton isn’t just chasing early adopters. They’re helping midstream and upstream industrial suppliers respond to pressure coming from two directions:

  1. Downstream consumer brands—especially in cosmetics, retail, and CPG—are demanding footprint data from every input supplier.
  2. Upstream regulations—especially in Europe—are introducing reporting requirements, carbon taxes, and supply chain disclosure laws.

The team gave a real-world example: a shampoo brand wants to differentiate based on lower emissions. That pressure flows up the value chain to the chemical suppliers, who, in turn, must track data back to their own suppliers.

It’s a game of carbon traceability—and Planckton helps make it possible.

Why Planckton focused on chemicals first

With backgrounds at Infosys and McKinsey, Robin and Sandeep know how to navigate large-scale digital transformations. They also know that industry specificity matters—especially in sustainability.

So they chose to focus first on the chemicals sector—a space where:

  • Supply chains are complex and often opaque.
  • Product formulations are sensitive.
  • And pressure from cosmetics, packaging, and consumer brands is pushing for measurable, auditable impact data.

It’s a wedge into other verticals like energy, plastics, fertilizers, and industrial manufacturing—but one that’s already showing results.

Carbon accounting needs a financial system

What makes this conversation unique isn’t just the product. It’s the co-founders’ view of the ecosystem.

They see a world where sustainability reporting becomes as robust as financial reporting. Where every company knows its Scope 1, 2, and 3 emissions the way it knows revenue, gross margin, and EBITDA.

But that world doesn’t exist yet. The data infrastructure isn’t there. The standards are still in flux. And the tooling—until recently—was clunky, manual, and impossible to scale.

Planckton is building that infrastructure—starting with the industries that need it most.

Houston as a launchpad (not just a legacy hub)

Though Planckton has global ambitions, its roots in Houston matter.

The city’s legacy in energy and chemicals gives it a unique edge in understanding real-world industrial challenges. And the growing ecosystem around energy transition—investors, incubators, and founders—is helping companies like Planckton move fast.

“We thought we’d have to move to San Francisco,” Robin shares. “But the resources we needed were already here—just waiting to be activated.”

The future of sustainability is measurable—and monetizable

The takeaway from this episode is clear: measuring your carbon footprint isn’t just good PR—it’s increasingly tied to market access, regulatory approval, and bottom-line efficiency.

And the companies that embrace this shift now—using platforms like Planckton—won’t just stay compliant. They’ll gain a competitive edge.

Listen to the full conversation with Planckton Data on the Energy Tech Startups Podcast:

Hosted by Jason Ethier and Nada Ahmed, the Digital Wildcatters’ podcast, Energy Tech Startups, delves into Houston's pivotal role in the energy transition, spotlighting entrepreneurs and industry leaders shaping a low-carbon future.


Gold H2 harvests clean hydrogen from depleted California reservoirs in first field trial

breakthrough trial

Houston climatech company Gold H2 completed its first field trial demonstrating subsurface bio-stimulated hydrogen production, which leverages microbiology and existing infrastructure to produce clean hydrogen.

Gold H2 is a spinoff of another Houston biotech company, Cemvita.

“When we compare our tech to the rest of the stack, I think we blow the competition out of the water,” Prabhdeep Singh Sekhon, CEO of Gold H2, previously told Energy Capital.

The project represented the first-of-its-kind application of Gold H2’s proprietary biotechnology, which generates hydrogen from depleted oil reservoirs, eliminating the need for new drilling, electrolysis or energy-intensive surface facilities. The Woodlands-based ChampionX LLC served as the oilfield services provider, and the trial was conducted in an oilfield in California’s San Joaquin Basin.

According to the company, Gold H2’s technology could yield up to 250 billion kilograms of low-carbon hydrogen, which is estimated to be enough clean power to supply Los Angeles for over 50 years while avoiding roughly 1 billion metric tons of CO2 equivalent.

“This field trial is tangible proof. We’ve taken a climate liability and turned it into a scalable, low-cost hydrogen solution,” Sekhon said in a news release. “It’s a new blueprint for decarbonization, built for speed, affordability, and global impact.”

Highlights of the trial include:

  • First-ever demonstration of biologically stimulated hydrogen generation at commercial field scale with unprecedented results of 40 percent H2 in the gas stream.
  • Demonstrated how end-of-life oilfield liabilities can be repurposed into hydrogen-producing assets.
  • The trial achieved 400,000 ppm of hydrogen in produced gases, which, according to the company, is an “unprecedented concentration for a huff-and-puff style operation and a strong indicator of just how robust the process can perform under real-world conditions.”
  • The field trial marked readiness for commercial deployment with targeted hydrogen production costs below $0.50/kg.

“This breakthrough isn’t just a step forward, it’s a leap toward climate impact at scale,” Jillian Evanko, CEO and president at Chart Industries Inc., a Gold H2 investor and advisor, added in the release. “By turning depleted oil fields into clean hydrogen generators, Gold H2 has provided a roadmap to produce low-cost, low-carbon energy using the very infrastructure that powered the last century. This changes the game for how the world can decarbonize heavy industry, power grids, and economies, faster and more affordably than we ever thought possible.”

Rice University spinout lands $500K NSF grant to boost chip sustainability

cooler computing

HEXAspec, a spinout from Rice University's Liu Idea Lab for Innovation and Entrepreneurship, was recently awarded a $500,000 National Science Foundation Partnership for Innovation grant.

The team says it will use the funding to continue enhancing semiconductor chips’ thermal conductivity to boost computing power. According to a release from Rice, HEXAspec has developed breakthrough inorganic fillers that allow graphics processing units (GPUs) to use less water and electricity and generate less heat.

The technology has major implications for the future of sustainable AI computing.

“With the huge scale of investment in new computing infrastructure, the problem of managing the heat produced by these GPUs and semiconductors has grown exponentially. We’re excited to use this award to further our material to meet the needs of existing and emerging industry partners and unlock a new era of computing,” HEXAspec co-founder Tianshu Zhai said in the release.

HEXAspec was founded by Zhai and Chen-Yang Lin, who both participated in the Rice Innovation Fellows program. A third co-founder, Jing Zhang, also worked as a postdoctoral researcher and a research scientist at Rice, according to HEXAspec's website.

The HEXAspec team won the Liu Idea Lab for Innovation and Entrepreneurship's H. Albert Napier Rice Launch Challenge in 2024. More recently, it also won this year's Energy Venture Day and Pitch Competition during CERAWeek in the TEX-E student track, taking home $25,000.

"The grant from the NSF is a game-changer, accelerating the path to market for this transformative technology," Kyle Judah, executive director of Lilie, added in the release.

---

This article originally ran on InnovationMap.