Hear from guest columnist Onega Ulanova on AI and quality management systems in manufacturing.

The concept of quality management is intrinsic to modern manufacturing, yet little understood by the general public, and it has revolutionized our world over the past hundred years.

Yet, in the present day, quality management and the related systems that guide its implementation are far from static. They are continuously evolving, adapting to ever-changing global conditions and to new means of application unleashed by technological innovation.

Now, more than ever, they are essential for addressing and eliminating not only traditional sources of waste in business, such as lost time and money, but also the physical and pollutant waste that threatens the world we all inhabit.

But what are quality management systems, or QMS, exactly? Who created them, and how have they evolved over time? Perhaps most pressingly, where can they be of greatest help in the present world, and when can they be implemented by businesses in need of change and improvement?

In this article, we will explore the history of QMS, explain their essential role in today’s manufacturing practices, and examine how these systems will take us into the future of productivity.

Quality Management Systems: A Definition

In the United States and globally, the gold standard for quality management practices is the American Society for Quality (ASQ). This preeminent organization, with over 4,000 members in 130 countries, was established in 1946 and has guided the practice and implementation of quality management systems worldwide.

The Society defines a quality management system as “a formalized system that documents processes, procedures, and responsibilities for achieving quality policies and objectives,” and further states that “a QMS helps coordinate and direct an organization’s activities to meet customer and regulatory requirements and improve its effectiveness and efficiency on a continuous basis.”

From this definition, it can be understood that a good quality management system’s purpose is to establish the conditions for consistent and ever-increasing improvement through the use of standardized business culture practices.

Which QMS Standards are Most Widely Used?

Quality management’s remarkable growth since the 1940s has led to the rise of a number of widely used standards, which can serve as the basis for companies and organizations to design and implement their own practices. Most of these modern quality management standards are globally recognized, and are specifically tailored to ensure that a company’s newly developed practices include the essential elements that increase the likelihood of success.

The most widely known body producing such guidance is the International Organization for Standardization (ISO), a global organization which develops and publishes technical standards. Since the 1980s, ISO has provided the 9000 series of standards (the most famous of which is ISO 9001:2015), which outline how organizations can meet quality management requirements and create their own best practices.

In 2020, over 1.2 million organizations worldwide held certification to ISO’s quality management standards.

However, it should be understood that the ISO 9000 standards are merely guidelines for the design and implementation of a quality management system; they are not systems in and of themselves.

Furthermore, the ISO is far from the only relevant player in this field. Many industry-specific standards, such as the American Petroleum Institute’s API Q1 standard, have been developed to target the highly specialized needs of particular business practices in the oil and gas industry. These industry-specific standards are generally aligned with the ISO 9000 series, and serve as complementary additional guidance rather than a replacement. It is entirely possible, and in many cases desirable, for a company to receive both ISO certification and certification from an industry-specific standards body, as doing so can help ensure the company’s newly developed QMS procedures are consistent with both broad and specialized best practices.

A History of Quality Management

The concept of quality management is intrinsically tied to the development of industrial production. Prior to the Industrial Revolution, the concept of ‘quality’ was inherently linked to the skill and effort of craftspeople: individual laborers trained in specialized fields who, either individually or in small groups, produced goods for use in society.

Whether they were weaving baskets or building castles, these craftspeople were defined primarily by a skill that rooted them in a specific production methodology, and it was mastery of this skill that determined quality. Guilds of craftspeople would sign their works, placing a personal or group seal on the resulting product and thereby accepting accountability for its quality.

Such signatures and marks are found dating back at least 4,500 years to the construction of Egypt’s Great Pyramid of Giza, and came into widespread practice in medieval Europe with the rise of craft guilds.

In these early confederations of workers, a person’s mastery of a skill or craft could become a defining part of their identity and life, to the extent that many craftspeople of 13th-century Europe lived together in communal settings, while the Egyptian pyramid workers may have belonged to lifelong ‘fraternities’ that returned, year after year, to fulfill their roles in ‘work gangs’.

However, in the Industrial Revolution, craft and guild organizations were supplanted by factories. Though ancient and medieval projects at times reached monumental scale, the rise of thousands of factories, each requiring human and machine contributions to generate masses of identical products, required a completely different scale of quality management.

The emphasis on mass production necessitated the use of workers who were no longer craft masters, and thus resulted in a decrease in product quality. This in turn necessitated the rise of the product inspection system, which was steadily refined from the start of the Industrial Revolution around 1760 into the early 20th century.

However, inspection was merely a system of quality control, rather than quality management; in other words, simply discarding defective products did not in and of itself increase total product quality or reduce waste.

As the influential American engineer Joseph M. Juran recalled of 1920s America, it was common to throw away substantial portions of produced inventory due to defects. When Juran prompted inspectors at his employer’s company to do something about it, they refused, saying that improvement was the responsibility of the production line. Quality control, in and of itself, would not yield quality management.

As is often the case in human history, war was the driver of change. In World War II, the mobilization of millions of American workers into wartime roles coincided with the need to produce greater quantities of high-quality products than ever before.

To counteract the loss of skilled factory labor, the United States government implemented the Training Within Industry program, which used 10-hour courses to teach newly recruited workers how to conduct their work, evaluate their efficiency, and suggest improvements. Similar training programs for the trainers themselves were also developed. By the end of the war, more than 1.6 million workers had been certified under Training Within Industry.

Training Within Industry represented one of the first successful implementations of quality management systems, and its impact was widely felt after the end of the war. In the ashes of conflict, the United States and the other Allied Powers were tasked with helping to rebuild the economies of the other wartime combatants. Nowhere was this a more pressing matter than in Japan, which had seen widespread economic devastation and had lost 40 percent of its factories. Further complicating the situation was the reality that, then as now, Japan lacked sufficient natural resources to serve its economic scale.

And yet, within just 10 years of the war’s end, Japan’s economy was growing twice as fast per year as it had before the fighting started. The driver of this miraculous turnaround was American-derived quality management practices, reinterpreted and implemented with Japanese ingenuity.

In modern business management, few concepts are as renowned, or as often cited for success, as kaizen. This Japanese word, which simply means “improvement,” names the essential lesson and driver of Japan’s postwar economic success.

Numerous books written outside Japan have attempted to explain kaizen’s quality management principles, often by citing them as being ‘distinctly Japanese.’ Yet, the basis for kaizen is actually universal and applicable in any culture or context; it is, simply put, an emphasis on remaining quality-focused and open to evolution. The development of kaizen began in the post-war period when American statistician William Edwards Deming was brought to Japan as part of the US government’s rebuilding efforts.

A student of earlier quality management thought leaders, Deming instructed hundreds of Japanese engineers, executives, and scholars, urging them to place statistical analysis and human relationships at the center of their management practices. Deming used statistics to track the number and origin of product defects, as well as to analyze the effectiveness of remedies. He also reinstated a key idea of the craftsperson creed: that the individual worker is not just a set of hands performing a task, but a person who can, with time, improve both the self and the whole of the company.
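To make that statistical lens concrete, the sketch below shows a p-chart, one of the control-chart tools from the Walter Shewhart tradition that Deming drew on. The defect counts and sample size are hypothetical; the point is how control limits separate routine variation from the ‘special cause’ problems worth investigating.

```python
# A minimal p-chart sketch with hypothetical data (not a production tool).
# Flags samples whose defect rate falls outside the 3-sigma control limits.
from math import sqrt

def p_chart(defects: list[int], sample_size: int) -> None:
    p_bar = sum(defects) / (len(defects) * sample_size)  # average defect rate
    sigma = sqrt(p_bar * (1 - p_bar) / sample_size)      # binomial standard error
    ucl = p_bar + 3 * sigma                              # upper control limit
    lcl = max(0.0, p_bar - 3 * sigma)                    # lower control limit
    for day, d in enumerate(defects, start=1):
        p = d / sample_size
        flag = "investigate" if not (lcl <= p <= ucl) else "ok"
        print(f"day {day}: p={p:.3f} (limits {lcl:.3f}-{ucl:.3f}) {flag}")

# Six days of inspection results, 500 units sampled per day.
p_chart(defects=[12, 9, 14, 11, 31, 10], sample_size=500)
```

Day five’s spike lands above the upper limit and gets flagged, while the ordinary day-to-day wobble does not; that distinction, made with arithmetic rather than intuition, is what Deming urged managers to act on.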

Deming was not alone in these efforts; the aforementioned Joseph M. Juran, who came to Japan as part of the rebuilding program several years later, also gave numerous lectures expounding similar principles.

Like Deming, Juran had previously tried to impart these approaches to American industry, but the lessons often fell on deaf ears. Japanese managers, however, took the lessons to heart and soon began crafting their own quality management systems.

Kaoru Ishikawa, who began by translating the works of Deming and Juran into Japanese, was one of the crucial players who helped to create the ideas now known as kaizen. He introduced a bottom-up approach in which workers from every part of the product life cycle could initiate change, and championed quality circles, where small groups of workers meet regularly to analyze results and discuss improvements.

By 1975, Japanese product quality, which had once been regarded as poor, had transformed into world-class thanks to the teachings of Deming, Juran, and kaizen.

By the 1980s, American industry had lost market share and quality prestige to Japan. It was now time for US businesses to learn from Deming and Juran, both of whom at last found a receptive audience in their home country. Deming in particular achieved recognition for his role in the influential 1980 television documentary If Japan Can... Why Can’t We?, in which he emphasized the universal applicability of quality management.

Kaizen, too, found a global audience, influencing a new generation of thought leaders. Out of this rapid expansion of QMS arose new systems in the 1970s and ’80s, including the Six Sigma approach pioneered by Bill Smith at Motorola in 1987. Ishikawa, who saw his reputation and life transformed as his ideas spread worldwide, attributed their success to the universality of human nature and its desire to improve. As he put it, “wherever they are, human beings are human beings.”

In no small part due to the influence of the thought leaders mentioned, quality management systems are today a cornerstone of global business practice. So influential are the innovators of these systems that they are often called ‘gurus.’ But what are the specific benefits of these systems, and how best can they be implemented?

How QMS Benefits Organizations, and the World

The oft-cited benefits of quality management systems are operational efficiency, employee retention, and reduction of waste. From all of these come improvements to the company’s bottom line and reputation. But far from being dry talking points, each of these benefits not only serves its obvious purpose but can also dramatically benefit the planet itself.

Operational efficiency is the measurement, analysis, and improvement of the processes which occur within an organization, with the purpose of using data and analysis to eliminate or remediate any areas where current practices are not effective.

Quality management systems can increase operational efficiency by utilizing employee analysis and feedback to quickly identify areas where improvements are possible, and then to guide their implementation.

In a joint study conducted in 2017 by Forbes and the American Society for Quality, 56 percent of companies stated that improving operational efficiency was a top concern; in the same survey, 59 percent of companies reported direct operational benefits from quality management system practices, making operations the single largest area of improvement across all business types.

Because operational improvements inherently reduce both waste and cost, conducting business in a fully optimized manner can simultaneously avoid unnecessary resource expenditure, decrease pollutants and discarded materials, and retain more money which the company can invest in further sustainable practices. Efficiency is itself a kind of ‘stealth sustainability’ that turns a profit-focused mindset into a generator of greater good. It is this very point that the United States government’s Environmental Protection Agency (EPA) has emphasized in its guidance for Environmental Management Systems (EMS), which apply quality management practices to environmental performance, benefiting both operational efficiency and the global environment.

The EPA’s studies in preparing these guidelines showcased numerous areas where small companies could reduce environmental waste while simultaneously reducing cost. These added up to substantial reductions and savings, such as a 15 percent wastewater reduction which saved a small metal finishing company $15,000 per year.

Similarly, a 2020 study by McKinsey & Company identified ways that optimizing operations could dramatically aid a company’s sustainability with only small outlays of capital, thereby making environmental benefit a by-product of improved profitability.

Employee retention, and more broadly the satisfaction of employees, is another major consideration of QMS. Defined simply, retention is not only the maintenance of a stable workforce without turnover, but the improvement of that workforce over time as workers gain the skill, confidence, and ability needed for continued self-improvement and organizational improvement. We may live in a post-industrial age, but thanks to the ideas of QMS, some of the concept of the craftsperson has returned to modern thinking: the individual, once more, has great value.

Quality management systems aid employee retention by allowing the people of an organization to have a direct hand in its improvement. In a study published in 2023 by the journal Quality Innovation Prosperity, 40 percent of organizations which implemented ISO 9001 guidance for the creation of a QMS reported that the process yielded greater employee retention.

A crucial success factor for employee satisfaction is how empowered the employee feels to apply judgment. According to a 2014 study by the Harvard Business Review, companies which set clear guidelines, protect and celebrate employee proposals for quality improvement, and clearly communicate the organization’s quality message while allowing employees to help shape and implement it have by far the highest engagement and retention rates. The greatest successes come from cultures where peer-driven approaches increase employee engagement and thereby reduce preventable employee mistakes. Yet the same study also found that nearly half of all employees feel their company’s leadership lacks a clear emphasis on quality, and only 10 percent felt their company’s existing quality statements were truthful and viable.

Then as now, the need to establish a clear quality culture, to manage and nurture that culture, and to empower the participants is critical to earning the trust of the employee participants and thereby retaining workers who in time can become the invaluable craftspeople of today.

Finally, there is the reduction of waste. Waste can be defined in many ways: waste of time, waste of money, waste of resources. The unifying factor in all definitions is the loss of something valuable and irretrievable. All inevitably lead to the increase of another kind of waste: the pollution and discarded detritus which steadily ruin our shared planet.

Reducing waste with quality management can take many forms, but ultimately all center on strategies which use only what is truly needed. This can mean both operational efficiencies and employee quality, as noted above. The Harvard Business Review survey found that, as of 2014, the average large company (26,000 employees or more) was losing a staggering $350 million each year to preventable employee errors, many of which could be reduced, mitigated, or eliminated entirely with better implementation of quality management.

This is waste on an almost unimaginable financial scale. Waste eliminated through practices which emphasize efficiency and sustainability, as noted in the McKinsey & Company study, can also yield tremendous savings. In one example, a company which purchased asphalt, and which had previously prioritized only the per-ton price, found that once the logistical costs of transporting the asphalt from distant suppliers were counted, it was actually paying more than if it had purchased locally. The quality management analysis it performed yielded a cost savings and eliminated 40 percent of the carbon emissions associated with the asphalt’s procurement. In this case, not only was wasteful spending eliminated, but literal waste (pollution) was prevented.
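As a back-of-the-envelope illustration of that asphalt lesson (with entirely hypothetical prices and distances, since the study’s own figures aren’t reproduced here), landed cost is simply the per-ton price plus freight:

```python
# Hypothetical landed-cost comparison: a lower per-ton price can still
# cost more once transport from a distant supplier is included.
def landed_cost(price_per_ton: float, miles: float, freight_per_ton_mile: float) -> float:
    """Total delivered cost per ton: purchase price plus freight."""
    return price_per_ton + miles * freight_per_ton_mile

distant = landed_cost(price_per_ton=80.0, miles=400.0, freight_per_ton_mile=0.15)
local = landed_cost(price_per_ton=95.0, miles=30.0, freight_per_ton_mile=0.15)
print(f"distant supplier: ${distant:.2f}/ton")  # $140.00/ton
print(f"local supplier:   ${local:.2f}/ton")    # $99.50/ton
```

The ‘cheaper’ distant supplier loses on delivered cost, and every avoided ton-mile also avoids the associated emissions.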

In taking these steps, companies can meaningfully improve their bottom lines while at the same time doing something worthwhile and beneficial for the planet. That, in turn, helps burnish their reputations: a remarkable 88 percent of Americans surveyed in a 2017 study said they would be more loyal to a company that supports social or environmental issues.

It is therefore clear that any steps a company can take which save money, improve worker satisfaction, and yield increased positivity in the marketplace are well worth pursuing.

What is the Future of QMS?

Until the 2000s, quality management systems were just that: systems of desirable practices, outlined by individuals and implemented piecemeal. That was the age of the gurus, the visionaries who defined the systems. But what that age lacked was a practical and easy means for companies, sometimes located far from the gurus’ direct guidance, to implement their teachings.

In the intervening years, technology has radically changed that dynamic. Today, QMS software fills the marketplace, allowing businesses small and large to design and guide their quality management plans. But even these software solutions have not yet solved the last great challenge: personalized assistance in putting standards into practice.

That is why the latest innovations, particularly in artificial intelligence, have the potential to upend the equation. Already, major companies have started to use artificial intelligence in connection with QMS datasets managed by software, utilizing the programs for statistical analysis, suggested improvements, and even prediction of potential faults before they occur.
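What such fault prediction can look like is sketched below, assuming scikit-learn and synthetic sensor readings standing in for a real QMS dataset; an actual deployment would involve far more data engineering, but the core idea of scoring units by failure risk before inspection is the same.

```python
# Toy fault-prediction sketch: train a classifier on (synthetic) process
# data, then rank units by predicted failure risk. Assumes numpy and
# scikit-learn are installed; all data here is fabricated for illustration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 3))  # stand-ins for temperature, pressure, vibration
# Synthetic ground truth: failures driven by two of the three signals plus noise.
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=2000) > 1.2).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

risk = model.predict_proba(X_test)[:, 1]  # predicted probability each unit fails
print(f"holdout accuracy: {model.score(X_test, y_test):.2f}")
print(f"units flagged for early inspection: {(risk > 0.8).sum()}")
```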

These are immensely valuable opportunities, which is why major players such as Honeywell are spending billions of dollars to bring innovative AI technology companies into their platforms and refine their existing QMS offerings.

But while AI has already begun to significantly affect the biggest players, small and mid-sized companies remain eager, but not yet able, to take full advantage. The next great evolution of QMS will be the one that brings these emerging technologies to all companies, regardless of size or scale. The future of QMS, and therefore the future of efficiency in business, rests upon this shift: from companies being the recipients of ‘guru knowledge’ to being the designers of their own quality-minded futures.

------

Onega Ulanova is the CEO of QMS2GO, a provider of quality management systems leveraging AI in manufacturing.

This article originally ran on InnovationMap.


How Planckton Data is building the sustainability label every industry will need


There’s a reason “carbon footprint” became a buzzword. It sounds like something we should know. Something we should measure. Something that should be printed next to the calorie count on a label.

But unlike calories, a carbon footprint isn’t universal, standardized, or easy to calculate. In fact, for most companies—especially in energy and heavy industry—it’s still a black box.

That’s the problem Planckton Data is solving.

On this episode of the Energy Tech Startups Podcast, Planckton Data co-founders Robin Goswami and Sandeep Roy sit down to explain how they’re turning complex, inconsistent, and often incomplete emissions data into usable insight. Not for PR. Not for greenwashing. For real operational and regulatory decisions.

And they’re doing it in a way that turns sustainability from a compliance burden into a competitive advantage.

From calories to carbon: The label analogy that actually works

If you’ve ever picked up two snack bars and compared their calorie counts, you’ve made a decision based on transparency. Robin and Sandeep want that same kind of clarity for industrial products.

Whether it’s a shampoo bottle, a plastic feedstock, or a specialty chemical—there’s now consumer and regulatory pressure to know exactly how sustainable a product is. And to report it.

But that’s where the simplicity ends.

Because unlike food labels, carbon labels can’t be standardized even for a single product. They depend on where and how the product was made, what inputs were used, how far it traveled, and what method was used to calculate the data.

Even two otherwise identical chemicals—one sourced from a refinery in Texas and the other in Europe—can carry very different carbon footprints, depending on logistics, local emission factors, and energy sources.

Planckton’s solution is built to handle exactly this level of complexity.

AI that doesn’t just analyze

For most companies, supply chain emissions data is scattered, outdated, and full of gaps.

That’s where Planckton’s use of AI becomes transformative.

  • It standardizes data from multiple suppliers, geographies, and formats.
  • It uses probabilistic models to fill in the blanks when suppliers don’t provide details (a toy sketch of this gap-filling idea appears after this list).
  • It applies industry-specific product category rules (PCRs) and aligns them with evolving global frameworks like ISO standards and GHG Protocol.
  • It helps companies model decarbonization pathways, not just calculate baselines.
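To make the gap-filling idea tangible, here is a deliberately toy sketch, not Planckton’s actual system, whose models and data are proprietary: hypothetical supplier records are standardized into one shape, and missing footprints are imputed from reporting peers in the same product category.

```python
# Toy emissions gap-filling sketch; suppliers, categories, and numbers are
# hypothetical, and the "probabilistic model" is just a peer average with
# a spread, standing in for something far more sophisticated.
from dataclasses import dataclass
from statistics import mean, stdev

@dataclass
class SupplierRecord:
    supplier: str
    category: str                        # e.g. "surfactant", "solvent"
    kg_co2e_per_kg: float | None = None  # None means the supplier gave no data

def impute_footprints(records: list[SupplierRecord]) -> None:
    # Pool the reported values by product category.
    by_category: dict[str, list[float]] = {}
    for r in records:
        if r.kg_co2e_per_kg is not None:
            by_category.setdefault(r.category, []).append(r.kg_co2e_per_kg)

    for r in records:
        if r.kg_co2e_per_kg is not None:
            print(f"{r.supplier}: {r.kg_co2e_per_kg:.2f} kg CO2e/kg (reported)")
            continue
        peers = by_category.get(r.category, [])
        if not peers:
            print(f"{r.supplier}: no basis for an estimate")
            continue
        est = mean(peers)
        spread = stdev(peers) if len(peers) > 1 else 0.0
        print(f"{r.supplier}: {est:.2f} ±{spread:.2f} kg CO2e/kg "
              f"(imputed from {len(peers)} peers)")

impute_footprints([
    SupplierRecord("A", "surfactant", 2.1),
    SupplierRecord("B", "surfactant", 2.5),
    SupplierRecord("C", "surfactant", None),  # the gap to fill
    SupplierRecord("D", "solvent", 1.4),
])
```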

This isn’t generative AI for show. It’s applied machine learning with a purpose: helping large industrial players move from reporting to real action.

And it’s not a side tool. For many of Planckton’s clients, it’s becoming the foundation of their sustainability strategy.

From boardrooms to smokestacks: Where the pressure is coming from

Planckton isn’t just chasing early adopters. They’re helping midstream and upstream industrial suppliers respond to pressure coming from two directions:

  1. Downstream consumer brands—especially in cosmetics, retail, and CPG—are demanding footprint data from every input supplier.
  2. Upstream regulations—especially in Europe—are introducing reporting requirements, carbon taxes, and supply chain disclosure laws.

The team gave a real-world example: a shampoo brand wants to differentiate based on lower emissions. That pressure flows up the value chain to the chemical suppliers, who, in turn, must track data back to their own suppliers.

It’s a game of carbon traceability—and Planckton helps make it possible.

Why Planckton focused on chemicals first

With backgrounds at Infosys and McKinsey, Robin and Sandeep know how to navigate large-scale digital transformations. They also know that industry specificity matters—especially in sustainability.

So they chose to focus first on the chemicals sector—a space where:

  • Supply chains are complex and often opaque.
  • Product formulations are sensitive.
  • And pressure from cosmetics, packaging, and consumer brands is pushing for measurable, auditable impact data.

It’s a wedge into other verticals like energy, plastics, fertilizers, and industrial manufacturing—but one that’s already showing results.

Carbon accounting needs a financial system

What makes this conversation unique isn’t just the product. It’s the co-founders’ view of the ecosystem.

They see a world where sustainability reporting becomes as robust as financial reporting. Where every company knows its Scope 1, 2, and 3 emissions the way it knows revenue, gross margin, and EBITDA.

But that world doesn’t exist yet. The data infrastructure isn’t there. The standards are still in flux. And the tooling—until recently—was clunky, manual, and impossible to scale.

Planckton is building that infrastructure—starting with the industries that need it most.

Houston as a launchpad (not just a legacy hub)

Though Planckton has global ambitions, its roots in Houston matter.

The city’s legacy in energy and chemicals gives it a unique edge in understanding real-world industrial challenges. And the growing ecosystem around energy transition—investors, incubators, and founders—is helping companies like Planckton move fast.

“We thought we’d have to move to San Francisco,” Robin shares. “But the resources we needed were already here—just waiting to be activated.”

The future of sustainability is measurable—and monetizable

The takeaway from this episode is clear: measuring your carbon footprint isn’t just good PR—it’s increasingly tied to market access, regulatory approval, and bottom-line efficiency.

And the companies that embrace this shift now—using platforms like Planckton—won’t just stay compliant. They’ll gain a competitive edge.

Listen to the full conversation with Planckton Data on the Energy Tech Startups Podcast:

Hosted by Jason Ethier and Nada Ahmed, the Digital Wildcatters’ podcast, Energy Tech Startups, delves into Houston's pivotal role in the energy transition, spotlighting entrepreneurs and industry leaders shaping a low-carbon future.


Gold H2 harvests clean hydrogen from depleted California reservoirs in first field trial


Houston climatech company Gold H2 completed its first field trial demonstrating subsurface bio-stimulated hydrogen production, which leverages microbiology and existing infrastructure to produce clean hydrogen.

Gold H2 is a spinoff of another Houston biotech company, Cemvita.

“When we compare our tech to the rest of the stack, I think we blow the competition out of the water,” Gold H2 CEO Prabhdeep Singh Sekhon previously told Energy Capital.

The project represented the first-of-its-kind application of Gold H2’s proprietary biotechnology, which generates hydrogen from depleted oil reservoirs, eliminating the need for new drilling, electrolysis or energy-intensive surface facilities. The Woodlands-based ChampionX LLC served as the oilfield services provider, and the trial was conducted in an oilfield in California’s San Joaquin Basin.

According to the company, Gold H2’s technology could yield up to 250 billion kilograms of low-carbon hydrogen, which is estimated to provide enough clean power to Los Angeles for over 50 years and avoid roughly 1 billion metric tons of CO2 equivalent.

“This field trial is tangible proof. We’ve taken a climate liability and turned it into a scalable, low-cost hydrogen solution,” Sekhon said in a news release. “It’s a new blueprint for decarbonization, built for speed, affordability, and global impact.”

Highlights of the trial include:

  • First-ever demonstration of biologically stimulated hydrogen generation at commercial field scale with unprecedented results of 40 percent H2 in the gas stream.
  • Demonstrated how end-of-life oilfield liabilities can be repurposed into hydrogen-producing assets.
  • The trial achieved 400,000 ppm of hydrogen in produced gases, which, according to the company, is an “unprecedented concentration for a huff-and-puff style operation and a strong indicator of just how robust the process can perform under real-world conditions.”
  • The field trial marked readiness for commercial deployment with targeted hydrogen production costs below $0.50/kg.

“This breakthrough isn’t just a step forward, it’s a leap toward climate impact at scale,” Jillian Evanko, CEO and president at Chart Industries Inc., Gold H2 investor and advisor, added in the release. “By turning depleted oil fields into clean hydrogen generators, Gold H2 has provided a roadmap to produce low-cost, low-carbon energy using the very infrastructure that powered the last century. This changes the game for how the world can decarbonize heavy industry, power grids, and economies, faster and more affordably than we ever thought possible.”

Rice University spinout lands $500K NSF grant to boost chip sustainability


HEXAspec, a spinout from Rice University's Liu Idea Lab for Innovation and Entrepreneurship, was recently awarded a $500,000 National Science Foundation Partnership for Innovation grant.

The team says it will use the funding to continue enhancing semiconductor chips’ thermal conductivity to boost computing power. According to a release from Rice, HEXAspec has developed breakthrough inorganic fillers that allow graphics processing units (GPUs) to use less water and electricity and generate less heat.

The technology has major implications for the future of sustainable AI computing.

“With the huge scale of investment in new computing infrastructure, the problem of managing the heat produced by these GPUs and semiconductors has grown exponentially. We’re excited to use this award to further our material to meet the needs of existing and emerging industry partners and unlock a new era of computing,” HEXAspec co-founder Tianshu Zhai said in the release.

HEXAspec was founded by Zhai and Chen-Yang Lin, who both participated in the Rice Innovation Fellows program. A third co-founder, Jing Zhang, also worked as a postdoctoral researcher and a research scientist at Rice, according to HEXAspec's website.

The HEXAspec team won the Liu Idea Lab for Innovation and Entrepreneurship's H. Albert Napier Rice Launch Challenge in 2024. More recently, it won this year's Energy Venture Day and Pitch Competition during CERAWeek in the TEX-E student track, taking home $25,000.

"The grant from the NSF is a game-changer, accelerating the path to market for this transformative technology," Kyle Judah, executive director of Lilie, added in the release.

---

This article originally ran on InnovationMap.