News

Why standardization matters for US quantum technology leadership

CQE and ASCET, a US standardization group, convened experts to develop a ‘collaborative blueprint’ of shared protocols to drive innovation

The average person encounters countless standards in everyday life: the size of a standard sheet of paper, the uniformity of traffic signals and signs, the compatibility of electrical plugs and outlets. Most people give little thought to how these conveniences streamline their activities — until they travel abroad, for instance, and can’t use their own small appliances without an adapter.

One of the first known standardized measures of length was the cubit, established in ancient Egypt more than 5,000 years ago. It’s a unit that sounds a bit like the one quantum stakeholders gathered last week to discuss: the qubit, the fundamental building block of quantum information technologies.

“We are at an important moment in the development of quantum information technology, where standardization is quickly becoming essential,” said David Awschalom, the Liew Family Professor of Quantum Engineering and Physics at the University of Chicago and the director of the Chicago Quantum Exchange (CQE), which convened academic, industry, and government leaders last week for a workshop in partnership with the Advancing Standardization for Critical and Emerging Technologies (ASCET) Center of Excellence, a US-launched initiative to strengthen leadership in international technology standards. 

The event was the second US workshop of the multi-year initiative and part of a growing effort to ensure that the quantum technology sector scales efficiently and effectively as it approaches commercial utility — and that the US maintains a competitive edge.

“Standards development isn’t just about technical requirements,” Awschalom said. “It’s about enabling the kind of shared language and protocols that can really drive innovation and catalyze the field. Efforts like this are part of the CQE mission to build an integrated quantum ecosystem capable of driving a robust quantum economy.”

The workshop, held in the offices of CQE corporate partner Protiviti, focused on how standardization can guide the US quantum sector in alignment with broader innovation goals, provide tools and strategies to streamline pre-standardization efforts, and fill ecosystem-wide gaps to give the US quantum industry a global advantage.

“We think this moment is really critical as quantum is growing, strengthening, and finding substantially more investment of capital from a variety of markets, both domestically and internationally,” said James Dickerson, director of the ASCET Center of Excellence, which was launched in 2025 in partnership with the National Institute of Standards and Technology (NIST) and ASTM International. “We think this is the right time, the right opportunity, for us to help subject matter experts like yourselves understand why standards and standardization truly is critical and show you how to get involved.”

Standards: a necessary balance

Standardization is the invisible bedrock of a technological society — the glue that allows materials to be mass-produced, disparate products to operate together, and researchers to have a common basis upon which to innovate. 

But standardization in the early stages of a technology is a delicate balance, as both too-weak and too-strong standards can stifle innovation and creativity in the field. When standards are too rigid, inventors can be shoehorned into certain approaches, processes, or materials at the expense of others that might mature more slowly. Without any standards, disparate actors can’t effectively communicate or collaborate, manufacturing and supply chains struggle to scale, and discoveries can remain siloed and miss opportunities for emergent breakthroughs.

The speakers at the workshop had a central message: standardization can align definitions, measurements, and expectations to enable trust, comparability, and market confidence across the ecosystem. Shared terminology, measurement methods, and interoperability can drive innovation by improving both competitiveness and collaboration, reducing duplicated efforts and increasing scalability. 

“Standards [in the US] are not just a set of rules, but a collaborative blueprint, driven by experts from industry,” said Glauci Fernandes, senior program manager of Software-Enabled Products, Autonomy and Robotics (SEPAR) at UL Standards & Engagement, a global safety organization that develops and advocates for standards across many fields. “It’s consensus-based, and we leverage the expertise from great minds in the industry to really form the framework.”

But participating in standards development brings financial and resource strains. Companies must send workers to conferences and meetings, paying for their travel and time, as well as membership in standards organizations, among other costs. This can be a disproportionate barrier for smaller startups — but startups can also be the most left behind by early standardization, especially if their technology is more bespoke.

“Those who get involved in standards development ultimately help shape where innovation occurs and what it looks like,” Dickerson said. “And those who don’t get involved in the development of standards still end up being shaped by them.”

That’s where ASCET comes in, said Amy Lee, workforce development and outreach specialist at ASTM International. 

“One of the biggest gaps that we're finding, especially in emerging fields like quantum, is not that there’s a lack of interest — it is that sometimes it’s unclear where the pathway is to participate,” she said. “Building that early infrastructure is not just helpful, it really gives you that competitive advantage.”

Technology also evolves quickly, often requiring standards that change and adapt with it. USB connectors are a good example: the older USB-A gave way to USB-C, which can transfer data and charge devices much faster. For this reason, workshop speakers emphasized that standards development in emerging technologies requires sustained engagement.

“The process is really iterative,” said Maria Knake, lead of the Standards and Conformity Assessment Services group at NIST. “The maintenance is ongoing, especially for emerging technologies, so it’s not just one-and-done. It’s staying for the long haul and making sure that the standard continues to meet market needs.”

Early successes, long way to go

One of the first standardized areas in the field of quantum technology is “post-quantum cryptography” (PQC). As quantum computers grow larger and more powerful, they come closer to being able to break very common types of encryption, such as those protecting online financial transactions. PQC is the development of new kinds of encryption algorithms that remain secure even against powerful quantum computers.

In 2024, NIST released the first PQC standards — among the first official standards in quantum technology, and the result of an eight-year effort with the world’s cryptography experts to develop new algorithms specifically designed to resist decryption by quantum computers.
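For a sense of what using one of these standards looks like in practice, here is a minimal sketch of quantum-safe key establishment with ML-KEM (the algorithm standardized in FIPS 203), assuming the open-source liboqs-python binding (`oqs`); the exact algorithm name exposed depends on the installed liboqs version.

```python
# A minimal sketch of post-quantum key encapsulation, assuming the
# open-source liboqs-python binding ("oqs") with an ML-KEM-enabled build.
import oqs

ALG = "ML-KEM-768"  # FIPS 203 parameter set; name may vary by liboqs version

# The receiver generates a keypair and publishes the public key.
with oqs.KeyEncapsulation(ALG) as receiver:
    public_key = receiver.generate_keypair()

    # The sender encapsulates a fresh shared secret against that public key.
    with oqs.KeyEncapsulation(ALG) as sender:
        ciphertext, secret_sender = sender.encap_secret(public_key)

    # The receiver decapsulates the ciphertext to recover the same secret.
    secret_receiver = receiver.decap_secret(ciphertext)

assert secret_sender == secret_receiver  # both parties now share a key
```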

Another standardization effort that workshop participants cited as a success is the quantum intermediate representation (QIR), which interfaces quantum programming languages (software) with quantum computers regardless of the hardware they’re built on. It is also important for interfacing classical computing with quantum computing — a hybrid approach considered extremely important while quantum computers are still in the early stages of technological development.
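For a concrete sense of what hardware-agnostic quantum software looks like, here is a short sketch assuming the QIR Alliance’s open-source pyqir package (its API may differ between versions): it builds a simple entangling circuit and emits QIR, an LLVM-based text format that any conforming backend could consume.

```python
# A sketch of generating hardware-agnostic QIR for a two-qubit circuit,
# assuming the QIR Alliance's pyqir package (API may differ across versions).
from pyqir import BasicQisBuilder, SimpleModule

module = SimpleModule("bell_pair", num_qubits=2, num_results=2)
qis = BasicQisBuilder(module.builder)

qis.h(module.qubits[0])                      # put qubit 0 in superposition
qis.cx(module.qubits[0], module.qubits[1])   # entangle qubits 0 and 1
qis.mz(module.qubits[0], module.results[0])  # measure both qubits
qis.mz(module.qubits[1], module.results[1])

# The resulting QIR text is independent of any particular qubit hardware.
print(module.ir())
```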

But both PQC and QIR are related to software. The harder problem, workshop participants agreed, is going to be standardizing quantum hardware in quantum computing, networking, and sensing. While all classical computers essentially operate with the same physical parts, qubits and quantum sensors can be atoms, ions, superconducting circuits, or electrons floating on liquid helium, just to name a few. And workshop participants noted that physical interconnects between different types of quantum computers, networks, and sensors are a crucial, underdeveloped space that is challenging to address.

There was also much discussion about comparability between quantum computers: a long-unsettled and nuanced topic considered extremely important for quantum computing’s commercial future. Because of the multitude of different qubit modalities, comparing performance between two different quantum computers can be a very messy endeavor. Simple measures such as the number of qubits can mean drastically different things depending on whether the qubits are superconducting circuits or trapped atoms, or if their error rates differ by even a tenth of a percent. 
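A back-of-the-envelope calculation shows why. Under a toy model of independent gate errors (with purely illustrative numbers, not measurements of any real machine), a tenth-of-a-percent difference in per-gate error rate compounds dramatically over a long computation:

```python
# Toy model: probability that a circuit runs without a single gate error,
# assuming independent errors. Numbers are illustrative, not real benchmarks.
def success_probability(num_gates: int, gate_error_rate: float) -> float:
    return (1.0 - gate_error_rate) ** num_gates

# Two hypothetical machines with the same qubit count, but per-gate error
# rates differing by a tenth of a percent (0.1% vs. 0.2%).
for error_rate in (0.001, 0.002):
    p = success_probability(num_gates=1000, gate_error_rate=error_rate)
    print(f"error rate {error_rate:.1%}: ~{p:.0%} of 1,000-gate runs succeed")

# Prints roughly 37% vs. 14%: identical qubit counts, very different machines.
```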

Participants named numerous other opportunities for standardization in quantum technology: stricter shared definitions for error rates, gates, logical qubits, and decoherence times in quantum computers; standardization of how quantum sensors integrate with classical hardware, given their different power scaling needs and the fact that they seem closest to mass commercialization; and standardization of the transmission of quantum information over long, ground-based networks.

“Quantum, relatively speaking, is early in its journey into off-the-shelf technologies,” Dickerson said. “But we believe it is ripe and primed for deeper engagement in the standards process.”