Hear from guest columnist Onega Ulanova on AI and quality management systems in manufacturing.

The concept of quality management is intrinsic to modern manufacturing, yet little understood by the general public, and it has revolutionized our world over the past hundred years.

Yet, in the present day, quality management and the related systems that guide its implementation are far from static. They are continuously evolving, adapting to ever-changing global conditions and to new means of application unleashed by technological innovation.

Now, more than ever, they are essential for addressing and eliminating not only traditional sources of waste in business, such as lost time and money, but also the physical and pollutant waste that threatens the world we all inhabit.

But what are quality management systems, or QMS, exactly? Who created them, and how have they evolved over time? Perhaps most pressingly, where can they be of greatest help in the present world, and when can they be implemented by businesses in need of change and improvement?

In this article, we will explore the history of QMS, explain their essential role in today’s manufacturing practices, and examine how these systems will take us into the future of productivity.

Quality Management Systems: A Definition

In the United States and globally, the leading authority on quality management standards and practices is the American Society for Quality. This preeminent organization, with over 4,000 members in 130 countries, was established in 1946 and has guided the practice and implementation of quality management systems worldwide.

The Society defines a quality management system as “a formalized system that documents processes, procedures, and responsibilities for achieving quality policies and objectives,” and further states that “a QMS helps coordinate and direct an organization’s activities to meet customer and regulatory requirements and improve its effectiveness and efficiency on a continuous basis.”

From this definition, it can be understood that a good quality management system's purpose is to establish the conditions for consistent, continuous improvement through standardized business culture practices.
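To make the definition concrete, the sketch below models what a single documented QMS procedure might look like in code. The fields and names are hypothetical illustrations, not part of any standard.

```python
from dataclasses import dataclass, field

# A minimal, hypothetical model of one documented QMS procedure: the
# process it covers, who is responsible, and the objective it serves.
@dataclass
class QmsProcedure:
    process: str
    responsible_role: str
    quality_objective: str
    revision: int = 1
    records: list[str] = field(default_factory=list)

doc = QmsProcedure(
    process="Incoming material inspection",
    responsible_role="Quality engineer",
    quality_objective="Reject rate below 0.5%",
)
doc.records.append("2024-06-01: lot 1182 inspected, 0 rejects")
print(doc)
```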

Which QMS Standards are Most Widely Used?

Quality management's remarkable growth since the 1940s has led to the rise of a number of widely used standards, which serve as the basis for companies and organizations to design and implement their own practices. Most of these modern quality management standards are globally recognized and are specifically tailored to ensure that a company's newly developed practices include the essential elements that increase the likelihood of success.

The most widely known entity offering such guidance is the International Organization for Standardization (ISO), a global organization that develops and publishes technical standards. Since the 1980s, the ISO has provided the 9000 series of standards (the most famous of which is 9001:2015), which outline how organizations can meet quality management requirements and create their own best practices.

In 2020, over 1.2 million organizations worldwide held certifications to ISO quality management standards.

However, it should be understood that the ISO 9000 standards are merely guidelines for the design and implementation of a quality management system; they are not systems in and of themselves.

Furthermore, the ISO is far from the only relevant player in this field. Many industry-specific standards, such as the American Petroleum Institute's API Q1, have been developed to target the highly specialized needs of particular industries, such as oil and gas. These industry-specific standards are generally aligned with the ISO 9000 series and serve as complementary guidance rather than a replacement. It is entirely possible, and in many cases desirable, for a company to receive both ISO certification and certification from an industry-specific standards body, as doing so helps ensure the company's newly developed QMS procedures are consistent with both broad and specialized best practices.

A History of Quality Management

The concept of quality management is intrinsically tied to the development of industrial production. Prior to the Industrial Revolution, the concept of 'quality' was inherently linked to the skill and effort of craftspeople: individual laborers trained in specialized fields who, either individually or in small groups, produced goods for use in society.

Whether they were weaving baskets or building castles, these craftspeople were defined by their skill in a specific production method, and it was mastery of this skill that determined quality. Guilds of craftspeople would sign their works, placing a personal or group seal on the resulting product and thereby accepting accountability for its quality.

Such signatures and marks are found dating back at least 4,500 years to the construction of Egypt’s Great Pyramid of Giza, and came into widespread practice in medieval Europe with the rise of craft guilds.

In these early confederations of workers, a person's mastery of a skill or craft could become a defining part of their identity and life, to the extent that many craftspeople of 13th-century Europe lived together in communal settings, while the Egyptian pyramid workers may have belonged to lifelong 'fraternities' whose members returned, year after year, to fulfill their roles in 'work gangs.'

However, in the Industrial Revolution, craft and guild organizations were supplanted by factories. Though ancient and medieval projects at times reached monumental scale, the rise of thousands of factories, each requiring human and machine contributions to generate masses of identical products, required a completely different scale of quality management.

The emphasis on mass production necessitated the use of workers who were no longer craft masters, and thus resulted in a decline in product quality. This in turn gave rise to the product inspection system, which was steadily refined from the start of the Industrial Revolution around 1760 into the early 20th century.

However, inspection was merely a system of quality control, rather than quality management; in other words, simply discarding defective products did not in and of itself increase total product quality or reduce waste.

As the influential American engineer Joseph M. Juran explained, in 1920s-era America it was common to throw away substantial portions of produced inventory due to defects. When Juran prompted the inspectors at his employer's company to do something about it, they refused, saying improvement was the responsibility of the production line. Quality control, in and of itself, would not yield quality management.

As is often the case in human history, war was the driver of change. In World War II, the mobilization of millions of American workers into wartime roles coincided with the need to produce greater quantities of high-quality products than ever before.

To counteract the loss of skilled factory labor, the United States government implemented the Training Within Industry program, which utilized 10-hour courses to educate newly-recruited workers in how to conduct their work, evaluate their efficiency, and suggest improvements. Similar training programs for the trainers themselves were also developed. By the end of the war, more than 1.6 million workers had been certified under the Training Within Industry program.

Training Within Industry represented one of the first successful implementations of quality management systems, and its impact was widely felt after the end of the war. In the ashes of conflict, the United States and the other Allied Powers were tasked with helping to rebuild the economies of the wartime combatants. Nowhere was this a more pressing matter than in Japan, which had seen widespread economic devastation and had lost 40 percent of its factories. Further complicating the situation was the reality that, then as now, Japan lacked sufficient natural resources to serve its economic scale.

And yet, within just 10 years of the war's end, Japan's economy was growing twice as fast per year as it had been before the fighting started. The driver of this miraculous turnaround was American-derived quality management practice, reinterpreted and implemented with Japanese ingenuity.

In modern business management, few concepts are as renowned, and oft-cited for success, as kaizen. This Japanese word, which simply means “improvement,” is the essential lesson and driver of Japan’s postwar economic success.

Numerous books written outside Japan have attempted to explain kaizen’s quality management principles, often by citing them as being ‘distinctly Japanese.’ Yet, the basis for kaizen is actually universal and applicable in any culture or context; it is, simply put, an emphasis on remaining quality-focused and open to evolution. The development of kaizen began in the post-war period when American statistician William Edwards Deming was brought to Japan as part of the US government’s rebuilding efforts.

A student of earlier quality management thought leaders, Deming instructed hundreds of Japanese engineers, executives, and scholars, urging them to place statistical analysis and human relationships at the center of their management practices. Deming used statistics to track the number and origin of product defects, as well as to analyze the effectiveness of remedies. He also reinstated a key idea of the craftsperson's creed: that the individual worker is not just a set of hands performing a task, but a person who can, with time, improve both the self and the whole of the company.
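Deming's statistical approach can be illustrated with a minimal sketch: given defect records, a simple Pareto tally ranks the production stages where defects originate, pointing improvement efforts at the "vital few." All data below is invented for illustration.

```python
from collections import Counter

# Hypothetical defect log: (production stage, defect type) pairs of the
# kind Deming-style tracking might record. Entries are illustrative only.
defects = [
    ("stamping", "burr"), ("stamping", "burr"), ("painting", "run"),
    ("stamping", "crack"), ("assembly", "missing part"),
    ("stamping", "burr"), ("painting", "run"), ("stamping", "burr"),
]

# Count defects by originating stage, then rank them (a simple Pareto view).
by_origin = Counter(stage for stage, _ in defects)
total = sum(by_origin.values())

cumulative = 0
for stage, count in by_origin.most_common():
    cumulative += count
    print(f"{stage:10s} {count:3d}  {100 * cumulative / total:5.1f}% cumulative")
```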

Deming was not alone in these efforts; the aforementioned Joseph M. Juran, who came to Japan as part of the rebuilding program several years later, also gave numerous lectures expounding similar principles.

Like Deming, Juran had previously tried to impart these approaches to American industry, but the lessons often fell on deaf ears. Japanese managers, however, took the lessons to heart and soon began crafting their own quality management systems.

Kaoru Ishikawa, who began by translating the works of Deming and Juran into Japanese, was one of the crucial players who helped create the ideas now known as kaizen. He introduced a bottom-up approach in which workers from every part of the product life cycle could initiate change, and he pioneered quality circles: small groups of workers who meet regularly to analyze results and discuss improvements.

By 1975, Japanese product quality, which had once been regarded as poor, had transformed into world-class thanks to the teachings of Deming, Juran, and kaizen.

By the 1980s, American industry had lost market share and quality prestige to Japan. It was now time for US businesses to learn from Deming and Juran, both of whom at last found a receptive audience in their home country. Deming in particular achieved recognition for his role in the influential 1980 television documentary If Japan Can, Why Can’t We?, in which he emphasized the universal applicability of quality management.

Kaizen, too, found recognition abroad, influencing a new generation of global thought leaders. Out of this rapid expansion of QMS came new systems in the 1970s and '80s, including the Six Sigma approach pioneered by Bill Smith at Motorola in 1987. Ishikawa, who saw his reputation and life transformed as his ideas spread worldwide, eventually attributed their success to the universality of human nature and its desire to improve. As Ishikawa said, "wherever they are, human beings are human beings."

In no small part due to the influence of the thought leaders mentioned, quality management systems are today a cornerstone of global business practice. So influential are the innovators of these systems that they are often called ‘gurus.’ But what are the specific benefits of these systems, and how best can they be implemented?

How QMS Benefits Organizations, and the World

The oft-cited benefits of quality management systems are operational efficiency, employee retention, and reduction of waste. From all of these come improvements to the company's bottom line and reputation. But far from being dry talking points, each benefit not only serves its obvious purpose but can also dramatically benefit the planet itself.

Operational efficiency is the measurement, analysis, and improvement of the processes that occur within an organization, with the purpose of using data to eliminate or mitigate areas where current practices are not effective.

Quality management systems can increase operational efficiency by utilizing employee analysis and feedback to quickly identify areas where improvements are possible, and then to guide their implementation.

In a joint study conducted in 2017 by Forbes and the American Society for Quality, 56 percent of companies stated that improving operational efficiency was a top concern; in the same survey, 59 percent of companies reported direct operational benefits from quality management system practices, making operations the single largest area of improvement across all business types.

Because operational improvements inherently reduce both waste and cost, conducting business in a fully optimized manner can simultaneously save unnecessary resource expenditure, decrease pollutants and discarded materials, and retain more money that the company can invest in further sustainable practices. Efficiency is itself a kind of 'stealth sustainability' that turns a profit-focused mindset into a generator of greater good. It is this very point that the United States Environmental Protection Agency (EPA) has emphasized in its guidance for Environmental Management Systems (EMS). These quality management guidelines, tailored specifically to improve operational efficiency in a business setting, are also designed to benefit the global environment.

Case studies the EPA gathered in preparing these guidelines showcased numerous areas where small companies could reduce environmental waste while simultaneously reducing cost. These added up to substantial reductions and savings, such as a 15 percent wastewater reduction that saved a small metal finishing company $15,000 per year.

Similarly, a 2020 study by McKinsey & Company identified ways that optimizing operations could dramatically aid a company’s sustainability with only small outlays of capital, thereby making environmental benefit a by-product of improved profitability.

Employee retention, and more broadly the satisfaction of employees, is another major consideration of QMS. Defined simply, retention is not only the maintenance of a stable workforce without turnover, but the improvement of that workforce over time as its members gain skill, confidence, and capacity for continued self- and organizational improvement. We may live in a post-industrial age, but thanks to the ideas of QMS, some of the concept of the craftsperson has returned to modern thinking; the individual, once more, has great value.

Quality management systems aid employee retention by allowing the people of an organization to have a direct hand in its improvement. In a study published in 2023 by the journal Quality Innovation Prosperity, 40 percent of organizations which implemented ISO 9001 guidance for the creation of a QMS reported that the process yielded greater employee retention.

A crucial success factor for employee satisfaction is how empowered the employee feels to apply judgment. According to a 2014 study by the Harvard Business Review, companies that set clear guidelines, protect and celebrate employee proposals for quality improvement, and clearly communicate the organization's quality message while allowing employees to help shape and implement it have by far the highest engagement and retention rates. The greatest successes come from cultures where peer-driven approaches increase employee engagement, thereby reducing preventable employee mistakes. Yet the same study also found that nearly half of all employees feel their company's leadership lacks a clear emphasis on quality, and only 10 percent felt their company's existing quality statements were truthful and viable.

Then as now, establishing a clear quality culture, nurturing it, and empowering its participants is critical to earning employees' trust and thereby retaining workers who, in time, can become the invaluable craftspeople of today.

Finally, there is the reduction of waste. Waste can be defined in many ways: waste of time, waste of money, waste of resources. The unifying factor in all definitions is the loss of something valuable and irretrievable. All inevitably lead to the increase of another kind of waste: the pollution and discarded detritus that steadily ruin our shared planet.

Reducing waste with quality management can take many forms, but all ultimately center on strategies that use only what is truly needed. This can mean both operational efficiencies and employee quality, as noted above. The Harvard Business Review survey found that in 2014, the average large company (26,000 employees or more) lost a staggering $350 million each year to preventable employee errors, many of which could be reduced, mitigated, or eliminated entirely with better implementation of quality management.

This is waste on an almost unimaginable financial scale. Waste eliminated through practices that emphasize efficiency and sustainability, as noted in the McKinsey & Company study, can also yield tremendous savings. In one example, a company that purchased asphalt and had previously prioritized only the per-ton price found that, once the logistical costs of transporting the asphalt from distant suppliers were examined, it was actually paying more than if it had purchased locally. The quality management analysis it performed yielded a cost savings and eliminated 40 percent of the carbon emissions associated with the asphalt's procurement. In this case, not only was wasteful spending eliminated, but literal waste (pollution) was prevented.
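A minimal sketch of the kind of analysis described above, with invented numbers: once freight is counted, a lower per-ton price from a distant supplier can cost more than a local one.

```python
# Hypothetical landed-cost comparison: a low per-ton price can still lose
# to a local supplier once freight is counted. All figures are invented.
def landed_cost(price_per_ton, miles, freight_per_ton_mile=0.12):
    return price_per_ton + miles * freight_per_ton_mile

distant = landed_cost(price_per_ton=55.0, miles=300)   # $91.00/ton
local = landed_cost(price_per_ton=70.0, miles=20)      # $72.40/ton
print(f"distant supplier: ${distant:.2f}/ton, local: ${local:.2f}/ton")
```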

In taking these steps, companies can meaningfully improve their bottom lines while doing something worthwhile and beneficial for the planet. That, in turn, helps burnish their reputations. A remarkable 88 percent of Americans surveyed in a 2017 study said they would be more loyal to a company that supports social or environmental issues.

It is therefore clear that any steps a company can take which save money, improve worker satisfaction, and yield increased positivity in the marketplace are well worth pursuing.

What is the Future of QMS?

Until the 2000s, quality management systems were just that: systems of desirable practices, outlined by individuals and implemented individually. That was the age of the gurus: the visionaries who outlined the systems. But what that age lacked was a practical and easy means for companies, sometimes located far away from direct guidance by the gurus, to implement their teachings.

In the intervening years, technology has radically changed that dynamic. Today, QMS software fills the marketplace, allowing businesses small and large to design and guide their quality management plans. But even these software solutions have not yet solved the last great challenge: personalized assistance in putting standards into practice.

That is why the latest innovations, particularly in artificial intelligence, have the potential to upend the equation. Already, major companies have started to use artificial intelligence in connection with QMS datasets managed by software, utilizing the programs for statistical analysis, suggested improvements, and even prediction of potential faults before they occur.
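As a rough sketch of how such prediction can work, the example below trains a simple classifier on synthetic process data to flag production runs with high predicted defect risk. It assumes scikit-learn is available, invents all of its data, and is not any vendor's actual method.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Hypothetical QMS dataset: each row is a production run described by
# process measurements (e.g., temperature, pressure, cycle time), with a
# label marking whether the run later produced a defect. All synthetic.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))                      # 3 process measurements
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 1).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

# Flag upcoming runs whose predicted defect probability is high, so they
# can be inspected before a fault actually occurs.
risk = model.predict_proba(X_test)[:, 1]
print("runs flagged for inspection:", int((risk > 0.8).sum()))
print("held-out accuracy:", round(model.score(X_test, y_test), 2))
```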

These are immensely valuable opportunities, which is why huge players such as Honeywell are spending billions of dollars to bring innovative AI technology companies into their platforms to refine existing quality management systems.

But while AI has already begun to significantly affect the biggest players, small and mid-sized companies remain eager, but not yet able, to take full advantage. The next great evolution of QMS will be the one that brings these emerging technologies to all companies, regardless of size or scale. The future of QMS, and therefore the future of efficiency in business, rests on this shift: from companies being the recipients of 'guru knowledge' to being the designers of their own quality-minded futures.

------

Onega Ulanova is the CEO of QMS2GO, a provider of quality management systems leveraging AI in manufacturing.

This article originally ran on InnovationMap.


UH's $44 million mass timber building slashed energy use in first year

building up

The University of Houston recently completed assessments on year one of the first mass timber project on campus, and the results show it has had a major impact.

Known as the Retail, Auxiliary, and Dining Center, or RAD Center, the $44 million building showed an 84 percent reduction in predicted energy use intensity, a measure of how much energy a building uses relative to its size, compared to similar buildings. Its Global Warming Potential rating, a ratio determined by the Intergovernmental Panel on Climate Change, shows a 39 percent reduction compared to the benchmark for other buildings of its type.
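For reference, energy use intensity is a simple ratio of annual energy use to floor area, and the reported reduction follows directly from it. The numbers below are illustrative placeholders, not the RAD Center's actual measurements.

```python
# Energy use intensity (EUI) is annual energy use divided by floor area,
# typically in kBtu per square foot per year. The figures below are
# illustrative, not the RAD Center's actual measurements.
baseline_eui = 100.0   # hypothetical benchmark for similar buildings
measured_eui = 16.0    # hypothetical measured value

reduction = (baseline_eui - measured_eui) / baseline_eui
print(f"EUI reduction vs. benchmark: {reduction:.0%}")  # -> 84%
```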

In comparison to similar structures, the RAD Center saved the equivalent of taking 472 gasoline-powered cars driven for one year off the road, according to architecture firm Perkins & Will.

The RAD Center was created in alignment with the AIA 2030 Commitment to carbon-neutral buildings, designed by Perkins & Will and constructed by Houston-based general contractor Turner Construction.

Perkins & Will’s work reduced the building's carbon footprint by incorporating lighter mass timber structural systems, which allowed the RAD Center to reuse the foundation, columns and beams of the building it replaced. Reused elements account for 45 percent of the RAD Center’s total mass, according to Perkins & Will.

Mass timber is considered a sustainable alternative to steel and concrete construction. The RAD Center, a 41,000-square-foot development, replaced the once popular Satellite, which was a food, retail and hangout center for students on UH’s campus near the Science & Research Building 2 and the Jack J. Valenti School of Communication.

The RAD Center uses more than a million pounds of timber, which can store over 650 metric tons of CO2. Aesthetically, the building complements the surrounding campus woodlands and offers students a view both inside and out.
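A rough back-of-the-envelope check, assuming dry wood is about half carbon by mass (a common approximation), is consistent with the stated figure:

```python
# Rough sanity check on the stated CO2 storage, assuming dry wood is
# about 50% carbon by mass (a common approximation; figures are rough).
timber_lb = 1_000_000
timber_t = timber_lb * 0.4536 / 1000        # pounds -> metric tons
carbon_t = timber_t * 0.50                  # ~50% carbon by mass
co2_t = carbon_t * 44 / 12                  # CO2 mass per unit of carbon
print(f"~{co2_t:.0f} metric tons of CO2")   # ~832 t, consistent with >650 t
```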

“Spaces are designed to create a sense of serenity and calm in an ecologically-minded environment,” Diego Rozo, a senior project manager and associate principal at Perkins & Will, said in a news release. “They were conceptually inspired by the notion of ‘unleashing the senses’ – the design celebrating different sights, sounds, smells and tastes alongside the tactile nature of the timber.”

In addition to its mass timber design, the building was also part of an Energy Use Intensity (EUI) reduction effort. It features high-performance insulation and barriers, natural light to illuminate the interior, efficient indoor lighting fixtures, and optimized equipment, including HVAC systems.

The RAD Center officially opened Phase I in Spring 2024. The third and final phase of construction is scheduled for this summer, with a planned opening set for the fall.

Experts on U.S. energy infrastructure, sustainability, and the future of data

Guest column

Digital infrastructure is the dominant theme in energy and infrastructure, real estate and technology markets.

Data, the byproduct and primary value generated by digital infrastructure, is referred to as “the fifth utility,” along with water, gas, electricity and telecommunications. Data is created, aggregated, stored, transmitted, shared, traded and sold. Data requires data centers. Data centers require energy. The United States is home to approximately 40% of the world's data centers. The U.S. is set to lead the world in digital infrastructure advancement and has an opportunity to lead on energy for a very long time.

Data centers consume vast amounts of electricity due to their computational and cooling requirements. According to the United States Department of Energy, data centers consume “10 to 50 times the energy per floor space of a typical commercial office building.” Lawrence Berkeley National Laboratory issued a report in December 2024 stating that U.S. data center energy use reached 176 TWh by 2023, “representing 4.4% of total U.S. electricity consumption.” This percentage will increase significantly with near-term investment into high performance computing (HPC) and artificial intelligence (AI). The markets recognize the need for digital infrastructure build-out, and developers, engineers, investors, and asset owners are responding at an incredible clip.
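Those two figures also imply a useful reference point; a quick check recovers overall U.S. electricity consumption from them:

```python
# Sanity-check the reported figures: if data centers used 176 TWh and that
# was 4.4% of U.S. electricity consumption, total consumption follows.
data_center_twh = 176
share = 0.044
total_twh = data_center_twh / share
print(f"implied total U.S. consumption: {total_twh:,.0f} TWh")  # ~4,000 TWh
```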

However, the energy demands required to meet this digital load growth pose significant challenges to the U.S. power grid. Reliability and cost-efficiency have been, and will continue to be, two non-negotiable priorities of the legal, regulatory and quasi-regulatory regime overlaying the U.S. power grid.

Maintaining and improving reliability requires physical solutions. The grid must be perfectly balanced, with neither too little nor too much electricity at any given time. Specifically, new-build, physical power generation and transmission (a topic worthy of another article) projects must be built. To be sure, innovative financial products such as virtual power purchase agreements (VPPAs), hedges, environmental attributes, and other offtake strategies have been, and will continue to be, critical to growing the U.S. renewable energy markets and facilitating the energy transition, but the U.S. electrical grid needs to generate and move significantly more electrons to support the digital infrastructure transformation.

But there is now a third permanent priority: sustainability. New power generation over the next decade will include a mix of solar (large and small scale, offsite and onsite), wind and natural gas resources, with existing nuclear power, hydro, biomass, and geothermal remaining important in their respective regions.

Solar, in particular, will grow as a percentage of U.S. grid generation. The Solar Energy Industries Association (SEIA) reported that solar added 50 gigawatts of new capacity to the U.S. grid in 2024, “the largest single year of new capacity added to the grid by an energy technology in over two decades.” Solar is leading, as it can be flexibly sized and sited.

Under-utilized technology such as carbon capture, utilization and storage (CCUS) will become more prominent. Hydrogen may be a potential game-changer in the medium-to-long-term. Further, a nuclear power renaissance (conventional and small modular reactor (SMR) technologies) appears to be real, with recent commitments from some of the largest companies in the world, led by technology companies. Nuclear is poised to be a part of a “net-zero” future in the United States, also in the medium-to-long term.

The transition from fossil fuels to zero carbon renewable energy is well on its way – this is undeniable – and will continue, regardless of U.S. political and market cycles. Along with reliability and cost efficiency, sustainability has become a permanent third leg of the U.S. power grid stool.

Sustainability is now non-negotiable. Corporate renewable and low-carbon energy procurement is strong. State renewable portfolio standards (RPS) and clean energy standards (CES) have established aggressive goals. Domestic manufacturing of the equipment deployed in the U.S. is growing meaningfully, and in politically diverse regions of the country. Solar, wind, and batteries are becoming less expensive. But, perhaps more importantly, the grid needs as much renewable and low carbon power generation as possible - not in lieu of gas generation, but as an increasingly growing pairing with gas and other technologies. This is not an “R” or “D” issue (as we say in Washington), and it's not an “either/or” issue; it's good business and a physical necessity.

As a result, solar, wind and battery storage deployment, in particular, will continue to accelerate in the U.S. These clean technologies will inevitably become more efficient as the buildout in the U.S. increases, investments continue and technology advances.

At some point in the future (it won’t be in the 2020s, it could be in the 2030s, but, more realistically, in the 2040s), the U.S. will have achieved the remarkable – a truly modern (if not entirely overhauled) grid dependent largely on a mix of zero and low carbon power generation and storage technology. And when this happens, it will have been due in large part to the clean technology deployment and advances over the next 10 to 15 years resulting from the current digital infrastructure boom.

---

Hans Dyke and Gabbie Hindera are lawyers at Bracewell. Dyke's experience includes transactions in the electric power and oil and gas midstream space, as well as transactions involving energy intensive industries such as data storage. Hindera focuses on mergers and acquisitions, joint ventures, and public and private capital market offerings.

Rice researchers' quantum breakthrough could pave the way for next-gen superconductors

new findings

A new study from researchers at Rice University, published in Nature Communications, could lead to future advances in superconductors with the potential to transform energy use.

The study revealed that electrons in strange metals, which exhibit unusual resistance to electricity and behave strangely at low temperatures, become more entangled at a specific tipping point, shedding new light on these materials.

A team led by Rice’s Qimiao Si, the Harry C. and Olga K. Wiess Professor of Physics and Astronomy, used quantum Fisher information (QFI), a concept from quantum metrology, to measure how electron interactions evolve under extreme conditions. The research team also included Rice’s Yuan Fang, Yiming Wang, Mounica Mahankali and Lei Chen along with Haoyu Hu of the Donostia International Physics Center and Silke Paschen of the Vienna University of Technology. Their work showed that the quantum phenomenon of electron entanglement peaks at a quantum critical point, which is the transition between two states of matter.

“Our findings reveal that strange metals exhibit a unique entanglement pattern, which offers a new lens to understand their exotic behavior,” Si said in a news release. “By leveraging quantum information theory, we are uncovering deep quantum correlations that were previously inaccessible.”

The researchers examined a theoretical framework known as the Kondo lattice, which explains how magnetic moments interact with surrounding electrons. At a critical transition point, these interactions intensify to the extent that the quasiparticles—key to understanding electrical behavior—disappear. Using QFI, the team traced this loss of quasiparticles to the growing entanglement of electron spins, which peaks precisely at the quantum critical point.
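For readers curious about the underlying quantity: quantum Fisher information has a standard textbook form for a mixed state and a chosen observable. The expression below is that general definition from quantum metrology, not necessarily the specific witness constructed in the paper.

```latex
% Quantum Fisher information of a state \rho = \sum_k \lambda_k |k\rangle\langle k|
% with respect to an observable A (standard definition from quantum metrology):
F_Q[\rho, A] = 2 \sum_{\substack{k,l \\ \lambda_k + \lambda_l > 0}}
    \frac{(\lambda_k - \lambda_l)^2}{\lambda_k + \lambda_l}
    \left| \langle k | A | l \rangle \right|^2
% For a pure state this reduces to 4\,(\langle A^2 \rangle - \langle A \rangle^2);
% sufficiently large values witness multipartite entanglement.
```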

In terms of future use, these materials share a close connection with high-temperature superconductors, which have the potential to transmit electricity without energy loss, according to the researchers. They believe that unlocking these properties could revolutionize power grids and make energy transmission more efficient.

The team also found that quantum information tools can be applied to other “exotic materials” and quantum technologies.

“By integrating quantum information science with condensed matter physics, we are pivoting in a new direction in materials research,” Si said in the release.