Nick Purday, IT director of emerging digital technology for ConocoPhillips, presented at the Reuters Events Data-Driven Oil and Gas Conference 2023 to help dispel any myths about digital twins. Photo courtesy of Shutterstock.

As Nick Purday, IT director of emerging digital technology for ConocoPhillips, began his presentation at the Reuters Events Data-Driven Oil and Gas Conference 2023 in Houston yesterday, he lamented missing the opportunity to dispel any myths about digital twins, given his second-to-last time slot of the conference.

He may have sold himself short.

A hush fell over the crowd as Purday described one of the more challenging applications of digital twins his team tackled late last year. Purday explained, “The large diagram [up there], that’s two trains from our LNG facility. How long did that take to build? We built that one in a month.”

It’s been years since an upstream oil and gas audience has gasped, but Purday won the crowd’s admiration for his team’s swift completion of an arduous task.

He then addressed the well-known balance of good/fast/cheap in a rare glimpse under the hood of project planning for such novel technology. “As soon as you move into remote visualization applications – think Alaska, think Norway – then you’re going to get a pretty good return on your investment. Think 3-to-1,” Purday explained. “As you would expect, those simulation digital twins, those are the ones where you get huge value. Optimizing the energy requirements of an LNG facility – huge value associated with that.

“Independently, Forrester did some work recently and came up with a 4-to-1 return, so that fits exactly with our data set,” Purday continued before casually bringing up the foundation for their successful effort.

“If you’ve got good data, then it doesn’t take that long and you can do these pretty effectively,” Purday stated plainly.

Another wave of awe rippled across the room.

In an earlier panel session, Nathan McMahan, data strategy chief at ConocoPhillips, commented on the shared responsibility model for data in the industry. “When I talked to a lot of people across the organization, three common themes filtered up: What’s the visibility, access, and trust of data?” McMahan observed.

Strong data governance stretches across the organization, but the Wells team, responsible for drilling and completions activity, stood out to McMahan with its approach to data governance.

“They had taken ownership of [the] data and partnered with business units across the globe to standardize best practices between some of the tools and data ingestion methods, even work with suppliers and contractors, [to demonstrate] our expectations for how we take data,” McMahan explained. “They even went a step further to bring an IT resource onto their floor and start to create roles of the owners and the stewards and the custodians of the data. They really laid that good foundation and built upon that with some of the outcomes they wanted to achieve with machine learning techniques and those sorts of things.”

The key, McMahan concluded, is making the “janitorial effort [of] cleaning up data sustainable… and fun.”

The sentiment of fun carried into Purday's late-afternoon presentation as he explained how the application went viral after being shared with just one or two testers, flooding the inbox of the lead developer responsible for managing the model with questions and kudos.

Digital twin applications significantly reduce the carbon footprint created by sending personnel to triage onsite concerns for LNG, upstream, and refining facilities in addition to streamlining processes and enabling tremendous savings. The application Purday described allowed his team to discover an issue previously only resolved by flying someone to a remote location where they would likely spend days testing and analyzing the area to diagnose the problem.

The digital twin found the issue in 10 minutes, and the on-site team resolved the problem within the day.

The LNG operations team now consistently starts their day with a bit of a spark, using the digital twin during morning meetings to help with planning and predictive maintenance.


UH's $44 million mass timber building slashed energy use in first year

building up

The University of Houston recently completed assessments on year one of the first mass timber project on campus, and the results show major reductions in energy use and carbon footprint.

Known as the Retail, Auxiliary, and Dining Center, or RAD Center, the $44 million building showed an 84 percent reduction in predicted energy use intensity, a measure of how much energy a building uses relative to its size, compared to similar buildings. Its Global Warming Potential rating, a ratio determined by the Intergovernmental Panel on Climate Change, shows a 39 percent reduction compared to the benchmark for other buildings of its type.
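
For context, energy use intensity is conventionally computed as annual energy consumption divided by gross floor area; the figures below are illustrative placeholders, not numbers from UH's assessment:

\[ \text{EUI} = \frac{\text{annual energy use}}{\text{gross floor area}} \quad\Longrightarrow\quad \frac{1{,}230{,}000\ \text{kBtu/yr}}{41{,}000\ \text{ft}^2} = 30\ \text{kBtu/ft}^2\text{-yr} \]

Against a hypothetical benchmark of 100 kBtu/ft²-yr for a comparable campus building, an 84 percent reduction would land the RAD Center at roughly 16 kBtu/ft²-yr.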

In comparison to similar structures, the RAD Center saved the equivalent of taking 472 gasoline-powered cars driven for one year off the road, according to architecture firm Perkins & Will.

The RAD Center was created in alignment with the AIA 2030 Commitment to carbon-neutral buildings, designed by Perkins & Will and constructed by Houston-based general contractor Turner Construction.

Perkins & Will’s work reduced the building's carbon footprint by incorporating lighter mass timber structural systems, which allowed the RAD Center to reuse the foundation, columns and beams of the building it replaced. Reused elements account for 45 percent of the RAD Center’s total mass, according to Perkins & Will.

Mass timber is considered a sustainable alternative to steel and concrete construction. The RAD Center, a 41,000-square-foot development, replaced the once popular Satellite, which was a food, retail and hangout center for students on UH’s campus near the Science & Research Building 2 and the Jack J. Valenti School of Communication.

The RAD Center uses more than a million pounds of timber, which can store over 650 metric tons of CO2. Aesthetically, the building complements the surrounding campus woodlands and offers students a view both inside and out.
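
A rough back-of-envelope check, assuming dry wood is about half carbon by mass (a standard rule of thumb, not a figure from the UH release), suggests the storage claim is in the right range:

\[ 1{,}000{,}000\ \text{lb} \approx 454\ \text{t of timber}, \qquad 454 \times 0.5 \times \tfrac{44}{12} \approx 830\ \text{t CO}_2 \]

which sits above the 650 metric tons cited, with the gap plausibly reflecting moisture content and conservative accounting.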

“Spaces are designed to create a sense of serenity and calm in an ecologically-minded environment,” Diego Rozo, a senior project manager and associate principal at Perkins & Will, said in a news release. “They were conceptually inspired by the notion of ‘unleashing the senses’ – the design celebrating different sights, sounds, smells and tastes alongside the tactile nature of the timber.”

In addition to its mass timber design, the building was also part of an Energy Use Intensity (EUI) reduction effort. It features high-performance insulation and barriers, natural light to illuminate the building's interior, efficient indoor lighting fixtures, and optimized equipment, including HVAC systems.

The RAD Center officially opened Phase I in Spring 2024. The third and final phase of construction is scheduled for this summer, with a planned opening set for the fall.

Experts weigh in on U.S. energy infrastructure, sustainability, and the future of data

Guest column

Digital infrastructure is the dominant theme in energy and infrastructure, real estate and technology markets.

Data, the byproduct and primary value generated by digital infrastructure, is referred to as “the fifth utility,” along with water, gas, electricity and telecommunications. Data is created, aggregated, stored, transmitted, shared, traded and sold. Data requires data centers. Data centers require energy. The United States is home to approximately 40% of the world's data centers. The U.S. is set to lead the world in digital infrastructure advancement and has an opportunity to lead on energy for a very long time.

Data centers consume vast amounts of electricity due to their computational and cooling requirements. According to the United States Department of Energy, data centers consume “10 to 50 times the energy per floor space of a typical commercial office building.” Lawrence Berkeley National Laboratory issued a report in December 2024 stating that U.S. data center energy use reached 176 TWh in 2023, “representing 4.4% of total U.S. electricity consumption.” This percentage will increase significantly with near-term investment into high performance computing (HPC) and artificial intelligence (AI). The markets recognize the need for digital infrastructure build-out, and developers, engineers, investors and asset owners are responding at an incredible clip.
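
Those two LBNL figures are mutually consistent: dividing the reported consumption by the reported share implies total U.S. electricity demand of roughly 4,000 TWh, in line with published national totals (a sanity check, not a number from the report):

\[ \frac{176\ \text{TWh}}{0.044} = 4{,}000\ \text{TWh} \]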

However, the energy demands required to meet this digital load growth pose significant challenges to the U.S. power grid. Reliability and cost-efficiency have been, and will continue to be, two non-negotiable priorities of the legal, regulatory and quasi-regulatory regime overlaying the U.S. power grid.

Maintaining and improving reliability requires physical solutions. The grid must be perfectly balanced, with neither too little nor too much electricity at any given time. Specifically, new physical power generation and transmission projects (transmission being a topic worthy of another article) must be built. To be sure, innovative financial products such as virtual power purchase agreements (VPPAs), hedges, environmental attributes, and other offtake strategies have been, and will continue to be, critical to growing the U.S. renewable energy markets and facilitating the energy transition, but the U.S. electrical grid needs to generate and move significantly more electrons to support the digital infrastructure transformation.

But there is now a third permanent priority: sustainability. New power generation over the next decade will include a mix of solar (large and small scale, offsite and onsite), wind and natural gas resources, with existing nuclear power, hydro, biomass, and geothermal remaining important in their respective regions.

Solar, in particular, will grow as a percentage of U.S. grid generation. The Solar Energy Industries Association (SEIA) reported that solar added 50 gigawatts of new capacity to the U.S. grid in 2024, “the largest single year of new capacity added to the grid by an energy technology in over two decades.” Solar is leading, as it can be flexibly sized and sited.

Under-utilized technology such as carbon capture, utilization and storage (CCUS) will become more prominent. Hydrogen may be a game-changer in the medium-to-long term. Further, a nuclear power renaissance (conventional and small modular reactor (SMR) technologies) appears to be real, with recent commitments from some of the largest companies in the world, led by technology companies. Nuclear is poised to be a part of a “net-zero” future in the United States, also in the medium-to-long term.

The transition from fossil fuels to zero carbon renewable energy is well on its way – this is undeniable – and will continue, regardless of U.S. political and market cycles. Along with reliability and cost efficiency, sustainability has become a permanent third leg of the U.S. power grid stool.

Sustainability is now non-negotiable. Corporate renewable and low carbon energy procurement is strong. State renewable portfolio standards (RPS) and clean energy standards (CES) have established aggressive goals. Domestic manufacturing of the equipment deployed in the U.S. is growing meaningfully and in politically diverse regions of the country. Solar, wind and batteries are becoming less expensive. But, perhaps more importantly, the grid needs as much renewable and low carbon power generation as possible - not in lieu of gas generation, but as a growing pairing with gas and other technologies. This is not an “R” or “D” issue (as we say in Washington), and it's not an “either/or” issue; it's good business and a physical necessity.

As a result, solar, wind and battery storage deployment, in particular, will continue to accelerate in the U.S. These clean technologies will inevitably become more efficient as the buildout in the U.S. increases, investments continue and technology advances.

At some point in the future (it won’t be in the 2020s, it could be in the 2030s, but, more realistically, in the 2040s), the U.S. will have achieved the remarkable – a truly modern (if not entirely overhauled) grid dependent largely on a mix of zero and low carbon power generation and storage technology. And when this happens, it will have been due in large part to the clean technology deployment and advances over the next 10 to 15 years resulting from the current digital infrastructure boom.

---

Hans Dyke and Gabbie Hindera are lawyers at Bracewell. Dyke's experience includes transactions in the electric power and oil and gas midstream space, as well as transactions involving energy intensive industries such as data storage. Hindera focuses on mergers and acquisitions, joint ventures, and public and private capital market offerings.

Rice researchers' quantum breakthrough could pave the way for next-gen superconductors

new findings

A new study from researchers at Rice University, published in Nature Communications, could lead to future advances in superconductors with the potential to transform energy use.

The study revealed that electrons in strange metals, which exhibit unusual resistance to electricity and behave strangely at low temperatures, become more entangled at a specific tipping point, shedding new light on these materials.

A team led by Rice’s Qimiao Si, the Harry C. and Olga K. Wiess Professor of Physics and Astronomy, used quantum Fisher information (QFI), a concept from quantum metrology, to measure how electron interactions evolve under extreme conditions. The research team also included Rice’s Yuan Fang, Yiming Wang, Mounica Mahankali and Lei Chen along with Haoyu Hu of the Donostia International Physics Center and Silke Paschen of the Vienna University of Technology. Their work showed that the quantum phenomenon of electron entanglement peaks at a quantum critical point, which is the transition between two states of matter.
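
For readers unfamiliar with the metric, the quantum Fisher information of a pure state with respect to an observable A reduces to four times the variance of A; this is the textbook definition, not necessarily the specific witness the Rice team evaluated:

\[ F_Q\big(|\psi\rangle, A\big) = 4\left(\langle\psi|A^2|\psi\rangle - \langle\psi|A|\psi\rangle^2\right) \]

When A is a collective spin operator, values of F_Q exceeding the number of spins witness multipartite entanglement, which is why a peak in QFI at the quantum critical point signals strongly entangled electron spins.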

“Our findings reveal that strange metals exhibit a unique entanglement pattern, which offers a new lens to understand their exotic behavior,” Si said in a news release. “By leveraging quantum information theory, we are uncovering deep quantum correlations that were previously inaccessible.”

The researchers examined a theoretical framework known as the Kondo lattice, which explains how magnetic moments interact with surrounding electrons. At a critical transition point, these interactions intensify to the extent that the quasiparticles—key to understanding electrical behavior—disappear. Using QFI, the team traced this loss of quasiparticles to the growing entanglement of electron spins, which peaks precisely at the quantum critical point.

In terms of future use, the materials share a close connection with high-temperature superconductors, which have the potential to transmit electricity without energy loss, according to the researchers. Researchers believe that unlocking these properties could revolutionize power grids and make energy transmission more efficient.

The team also found that quantum information tools can be applied to other “exotic materials” and quantum technologies.

“By integrating quantum information science with condensed matter physics, we are pivoting in a new direction in materials research,” Si said in the release.