A team of Texas researchers has landed a nearly $1 million NSF grant to address rural flood management challenges with community input. Photo via Getty Images.
A team from Rice University, the University of Texas at Austin and Texas A&M University has been awarded a National Science Foundation grant under the CHIRRP—or Confronting Hazards, Impacts and Risks for a Resilient Planet—program to combat flooding hazards in rural Texas.
The team is led by Avantika Gori, assistant professor of civil and environmental engineering at Rice. Other members include Rice’s James Doss-Gollin, Andrew Juan of Texas A&M University and Keri Stephens of UT Austin.
Researchers from Rice’s Severe Storm Prediction, Education and Evacuation from Disasters Center and Ken Kennedy Institute, Texas A&M’s Institute for a Disaster Resilient Texas and the Technology & Information Policy Institute at UT Austin are part of the team as well.
Their proposal includes work that introduces a “stakeholder-centered framework” to help address rural flood management challenges with community input.
“Our goal is to create a flood management approach that truly serves rural communities — one that’s driven by science but centers around the people who are impacted the most,” Gori said in a news release.
The project plans to introduce a performance-based system dynamics framework that integrates hydroclimate variability, hydrology, machine learning, community knowledge, and feedback to give researchers a better understanding of flood risks in rural areas.
The research will be implemented in two rural Texas areas that struggle with constant challenges associated with flooding. The case studies aim to demonstrate how linking global and regional hydroclimate variability with local hazard dynamics can work toward solutions.
“By integrating understanding of the weather dynamics that cause extreme floods, physics-based models of flooding and AI or machine learning tools together with an understanding of each community’s needs and vulnerabilities, we can better predict how different interventions will reduce a community’s risk,” Doss-Gollin said in a news release.
At the same time, the project aims to help communities better understand climate science on their own terms. The framework will also consider “resilience indicators,” such as business continuity, transportation access and other features that the team says more adequately address the needs of rural communities.
“This work is about more than flood science — it’s also about identifying ways to help communities understand flooding using words that reflect their values and priorities,” said Stephens. “We’re creating tools that empower communities to not only recover from disasters but to thrive long term.”
OpenSafe.AI, a new platform that utilizes AI, data, and hazard and resilience models to support storm response decision makers, has secured an NSF grant. Photo via Getty Images
Researchers from Rice University have secured a $1.5 million grant from the National Science Foundation to continue their work on improving the safety and resiliency of coastal communities plagued by flooding and hazardous weather.
Together, the team is developing and hopes to deploy “Open-Source Situational Awareness Framework for Equitable Multi-Hazard Impact Sensing using Responsible AI,” or OpenSafe.AI, a new platform that utilizes AI, data, and hazard and resilience models "to provide timely, reliable and equitable insights to emergency response organizations and communities before, during and after tropical cyclones and coastal storm events," reads a news release from Rice.
“Our goal with this project is to enable communities to better prepare for and navigate severe weather by providing better estimates of what is actually happening or might happen within the next hours or days,” Jamie Padgett, Rice’s Stanley C. Moore Professor in Engineering and chair of the Department of Civil and Environmental Engineering, says in the release. “OpenSafe.AI will take into account multiple hazards such as high-speed winds, storm surge and compound flooding and forecast their potential impact on the built environment such as transportation infrastructure performance or hazardous material spills triggered by severe storms.”
The OpenSafe.AI platform will be developed to support decision makers before, during and after a storm.
“By combining cutting-edge AI with a deep understanding of the needs of emergency responders, we aim to provide accurate, real-time information that will enable better decision-making in the face of disasters,” adds Xia Hu, associate professor of computer science at Rice.
In the long term, the team hopes to explore how OpenSafe.AI can be applied to and scaled in other regions in need of equitable resilience to climate-driven hazards.
“Our goal is not only to develop a powerful tool for emergency response agencies along the coast but to ensure that all communities, especially the ones most vulnerable to storm-induced damage, can rely on this technology to better respond to and recover from the devastating effects of coastal storms,” adds Gori, assistant professor of civil and environmental engineering at Rice.
A Rice University study will consider how "design strategies aimed at improving civic engagement in stormwater infrastructure could help reduce catastrophic flooding." Photo via Getty Images
Houston will be the setting of a new three-year National Science Foundation-funded study that focuses on a phenomenon the city is quite familiar with: flooding.
Conducted by Rice University, the study will consider how "design strategies aimed at improving civic engagement in stormwater infrastructure could help reduce catastrophic flooding," according to a statement.
The team will begin its research in the Trinity/Houston Gardens neighborhood and will implement field research, participatory design work and hydrological impact analyses.
Rice professor of anthropology Dominic Boyer and Rice's Gus Sessions Wortham Professor of Architecture Albert Pope are co-principal investigators on the study. They'll be joined by Phil Bedient, director of the Severe Storm Prediction, Education and Evacuation from Disasters Center at Rice, and Jessica Eisma, a civil engineer at the University of Texas at Arlington.
According to Boyer, the study will bring together researchers from across disciplines as well as community members and even elementary-aged students.
“Our particular focus will be on green stormwater infrastructure—techniques like bioswales, green roofs and rain gardens—that are more affordable than conventional concrete infrastructure and ones where community members can be more directly involved in the design and implementation phases,” Boyer said. “We envision helping students and other community members design and complete projects like community rain gardens that offer a variety of beneficial amenities and can also mitigate flooding.”
Rice's Severe Storm Prediction, Education and Evacuation from Disasters Center, or SSPEED Center, is a leader in flood mitigation research and innovation.
In 2021, the center developed its FIRST radar-based flood assessment, mapping, and early-warning system based on more than 350 maps that simulate different combinations of rainfall over various areas of the watershed. The system was derived from the Rice/Texas Medical Center Flood Alert System (FAS), which Bedient created 20 years ago.
There’s a reason “carbon footprint” became a buzzword. It sounds like something we should know. Something we should measure. Something that should be printed next to the calorie count on a label.
But unlike calories, a carbon footprint isn’t universal, standardized, or easy to calculate. In fact, for most companies—especially in energy and heavy industry—it’s still a black box.
That’s the problem Planckton Data is solving.
On this episode of the Energy Tech Startups Podcast, Planckton Data co-founders Robin Goswami and Sandeep Roy sit down to explain how they’re turning complex, inconsistent, and often incomplete emissions data into usable insight. Not for PR. Not for greenwashing. For real operational and regulatory decisions.
And they’re doing it in a way that turns sustainability from a compliance burden into a competitive advantage.
From calories to carbon: The label analogy that actually works
If you’ve ever picked up two snack bars and compared their calorie counts, you’ve made a decision based on transparency. Robin and Sandeep want that same kind of clarity for industrial products.
Whether it’s a shampoo bottle, a plastic feedstock, or a specialty chemical—there’s now consumer and regulatory pressure to know exactly how sustainable a product is. And to report it.
But that’s where the simplicity ends.
Because unlike food labels, carbon labels can’t be standardized, even for products made in a single factory. They depend on where and how a product was made, what inputs were used, how far it traveled, and what method was used to calculate the data.
Even two otherwise identical chemicals—one sourced from a refinery in Texas and the other in Europe—can carry very different carbon footprints, depending on logistics, local emission factors, and energy sources.
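As a back-of-the-envelope illustration of that point (all numbers below are hypothetical, not Planckton data), a cradle-to-gate footprint can be sketched as process energy times the local grid emission factor, plus a logistics term:

```python
# Hypothetical cradle-to-gate footprint for 1 kg of the same chemical
# made in two regions. All factors are illustrative, not real data.

def footprint_kg_co2e(energy_kwh, grid_factor, transport_km, transport_factor):
    """kg CO2e per kg of product: process energy plus outbound logistics."""
    return energy_kwh * grid_factor + transport_km * transport_factor

# Same process energy; different grid carbon intensity and shipping distance.
texas = footprint_kg_co2e(energy_kwh=12, grid_factor=0.40,
                          transport_km=500, transport_factor=0.0001)
europe = footprint_kg_co2e(energy_kwh=12, grid_factor=0.25,
                           transport_km=2000, transport_factor=0.0001)
```

In this toy example the cleaner grid more than offsets the longer shipping leg, which is exactly the kind of location-dependent result a standardized carbon label has to capture.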
Planckton’s solution is built to handle exactly this level of complexity.
AI that doesn’t just analyze
For most companies, supply chain emissions data is scattered, outdated, and full of gaps.
That’s where Planckton’s use of AI becomes transformative.
It standardizes data from multiple suppliers, geographies, and formats.
It uses probabilistic models to fill in the blanks when suppliers don’t provide details.
It applies industry-specific product category rules (PCRs) and aligns them with evolving global frameworks like ISO standards and GHG Protocol.
It helps companies model decarbonization pathways, not just calculate baselines.
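A minimal sketch of the gap-filling idea (illustrative only; Planckton's actual models are proprietary): when a supplier omits its emission factor, fall back to the distribution over the product category and keep the uncertainty explicit instead of guessing a single number.

```python
import statistics

# Known emission factors (kg CO2e/kg) for peer products in the same
# product category -- a stand-in for a category prior.
category_factors = [2.1, 2.4, 2.8, 3.0, 3.5]

def estimate_factor(reported=None):
    """Return (best_estimate, std_dev). Reported data carries zero spread;
    a missing value falls back to the category mean and its sample stdev."""
    if reported is not None:
        return reported, 0.0
    return statistics.mean(category_factors), statistics.stdev(category_factors)

known = estimate_factor(reported=2.6)   # supplier provided a number
missing = estimate_factor()             # fall back to the category prior
```

Carrying the spread forward lets a downstream footprint report distinguish "measured" from "estimated" inputs, which matters for audits.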
This isn’t generative AI for show. It’s applied machine learning with a purpose: helping large industrial players move from reporting to real action.
And it’s not a side tool. For many of Planckton’s clients, it’s becoming the foundation of their sustainability strategy.
From boardrooms to smokestacks: Where the pressure is coming from
Planckton isn’t just chasing early adopters. They’re helping midstream and upstream industrial suppliers respond to pressure coming from two directions:
Downstream consumer brands—especially in cosmetics, retail, and CPG—are demanding footprint data from every input supplier.
Upstream regulations—especially in Europe—are introducing reporting requirements, carbon taxes, and supply chain disclosure laws.
The team gave a real-world example: a shampoo brand wants to differentiate based on lower emissions. That pressure flows up the value chain to the chemical suppliers, who, in turn, must track data back to their own suppliers.
It’s a game of carbon traceability—and Planckton helps make it possible.
Why Planckton focused on chemicals first
With backgrounds at Infosys and McKinsey, Robin and Sandeep know how to navigate large-scale digital transformations. They also know that industry specificity matters—especially in sustainability.
So they chose to focus first on the chemicals sector—a space where:
Supply chains are complex and often opaque.
Product formulations are sensitive.
And pressure from cosmetics, packaging, and consumer brands is pushing for measurable, auditable impact data.
It’s a wedge into other verticals like energy, plastics, fertilizers, and industrial manufacturing—but one that’s already showing results.
Carbon accounting needs a financial system
What makes this conversation unique isn’t just the product. It’s the co-founders’ view of the ecosystem.
They see a world where sustainability reporting becomes as robust as financial reporting. Where every company knows its Scope 1, 2, and 3 emissions the way it knows revenue, gross margin, and EBITDA.
But that world doesn’t exist yet. The data infrastructure isn’t there. The standards are still in flux. And the tooling—until recently—was clunky, manual, and impossible to scale.
Planckton is building that infrastructure—starting with the industries that need it most.
Houston as a launchpad (not just a legacy hub)
Though Planckton has global ambitions, its roots in Houston matter.
The city’s legacy in energy and chemicals gives it a unique edge in understanding real-world industrial challenges. And the growing ecosystem around energy transition—investors, incubators, and founders—is helping companies like Planckton move fast.
“We thought we’d have to move to San Francisco,” Robin shares. “But the resources we needed were already here—just waiting to be activated.”
The future of sustainability is measurable—and monetizable
The takeaway from this episode is clear: measuring your carbon footprint isn’t just good PR—it’s increasingly tied to market access, regulatory approval, and bottom-line efficiency.
And the companies that embrace this shift now—using platforms like Planckton—won’t just stay compliant. They’ll gain a competitive edge.
Listen to the full conversation with Planckton Data on the Energy Tech Startups Podcast:
Hosted by Jason Ethier and Nada Ahmed, the Digital Wildcatters’ podcast, Energy Tech Startups, delves into Houston's pivotal role in the energy transition, spotlighting entrepreneurs and industry leaders shaping a low-carbon future.
Houston climatech company Gold H2 completed its first field trial that demonstrates subsurface bio-stimulated hydrogen production, which leverages microbiology and existing infrastructure to produce clean hydrogen.
“When we compare our tech to the rest of the stack, I think we blow the competition out of the water,” Prabhdeep Singh Sekhon, CEO of Gold H2, previously told Energy Capital.
The project represented the first-of-its-kind application of Gold H2’s proprietary biotechnology, which generates hydrogen from depleted oil reservoirs, eliminating the need for new drilling, electrolysis or energy-intensive surface facilities. The Woodlands-based ChampionX LLC served as the oilfield services provider, and the trial was conducted in an oilfield in California’s San Joaquin Basin.
According to the company, Gold H2’s technology could yield up to 250 billion kilograms of low-carbon hydrogen, which is estimated to provide enough clean power to Los Angeles for over 50 years and avoid roughly 1 billion metric tons of CO2 equivalent.
“This field trial is tangible proof. We’ve taken a climate liability and turned it into a scalable, low-cost hydrogen solution,” Sekhon said in a news release. “It’s a new blueprint for decarbonization, built for speed, affordability, and global impact.”
Highlights of the trial include:
First-ever demonstration of biologically stimulated hydrogen generation at commercial field scale with unprecedented results of 40 percent H2 in the gas stream.
Demonstrated how end-of-life oilfield liabilities can be repurposed into hydrogen-producing assets.
The trial achieved 400,000 ppm of hydrogen in produced gases, which, according to the company, is an “unprecedented concentration for a huff-and-puff style operation and a strong indicator of just how robust the process can perform under real-world conditions.”
The field trial marked readiness for commercial deployment with targeted hydrogen production costs below $0.50/kg.
“This breakthrough isn’t just a step forward, it’s a leap toward climate impact at scale,” Jillian Evanko, CEO and president at Chart Industries Inc., Gold H2 investor and advisor, added in the release. “By turning depleted oil fields into clean hydrogen generators, Gold H2 has provided a roadmap to produce low-cost, low-carbon energy using the very infrastructure that powered the last century. This changes the game for how the world can decarbonize heavy industry, power grids, and economies, faster and more affordably than we ever thought possible.”
HEXAspec, a spinout from Rice University's Liu Idea Lab for Innovation and Entrepreneurship, was recently awarded a $500,000 National Science Foundation Partnership for Innovation grant.
The team says it will use the funding to continue enhancing semiconductor chips’ thermal conductivity to boost computing power. According to a release from Rice, HEXAspec has developed breakthrough inorganic fillers that allow graphic processing units (GPUs) to use less water and electricity and generate less heat.
The technology has major implications for the future of sustainable AI computing.
“With the huge scale of investment in new computing infrastructure, the problem of managing the heat produced by these GPUs and semiconductors has grown exponentially. We’re excited to use this award to further our material to meet the needs of existing and emerging industry partners and unlock a new era of computing,” HEXAspec co-founder Tianshu Zhai said in the release.
HEXAspec was founded by Zhai and Chen-Yang Lin, who both participated in the Rice Innovation Fellows program. A third co-founder, Jing Zhang, also worked as a postdoctoral researcher and a research scientist at Rice, according to HEXAspec's website.
"The grant from the NSF is a game-changer, accelerating the path to market for this transformative technology," Kyle Judah, executive director of Lilie, added in the release.