
Global Quantum Software Research Agenda
(2026–2035)

Quantum Software Alliance

November 15, 2025

Abstract

Quantum computing is moving from laboratory prototypes to programmable systems that integrate with high-performance computing and cloud infrastructures. In this phase, practical utility is shaped by both hardware and software: hardware must scale and improve fidelity, and software must make that scaling efficient and useful through co-design, until the regime of fault-tolerant, application-scale quantum computers is reached. The binding constraints are resource-realistic algorithms that identify where advantage can plausibly arise and that, in the near term, rest on informed architectural choices, use suitable compilers, are testable through verification and benchmarking, and are orchestrated with HPC, and that, in the long term, lead to fault-tolerant quantum computers. Error correction is crucial on the route from noisy prototypes to scalable operation, so the software stack must select codes, schedule syndrome extraction, and integrate decoders into runtime paths while exposing logical-level metrics. This Agenda defines a scientific and policy-ready plan to advance these capabilities over the next decade and to translate them into sector impact through auditable pilots. Algorithms discover quantum advantage, the software stack ensures that hardware scales systematically into deployable applications, and verification and assurance keep results trustworthy and certifiable. The Agenda is a community document maintained by the Quantum Software Alliance and updated annually. It aims to be neutral and does not reproduce any single programme plan.

Executive Summary

Quantum computing is entering a decisive moment. The field has progressed beyond purely theoretical exploration to the systematic construction of processors that grow in scale and capabilities. Hardware roadmaps have advanced [1], yet matching software roadmaps have not kept pace, slowing adoption. The shared challenge is to make quantum computing practical through co-design of hardware and software. This requires quantum software that runs on real devices at the available scale, integrates error correction and decoding into runtime paths, and is verified, tested, benchmarked, and trusted.

Despite significant progress on algorithmic primitives and scaling theory, quantum software development has often proceeded in fragmented efforts that are weakly connected to advances in high-performance computing and artificial intelligence. Moreover, we still need to identify a full set of impactful use cases that will be run on quantum computers in the future, representing another challenge for algorithm developers [2]. This risks slowing progress at the moment when society, industry, and government are demanding useful quantum applications [3]. The frontier has shifted from proof of principle demonstrations toward co-development of hardware and software tools that show practical utility. To remain at the cutting edge, the community must level up practical software development for processors at the current scale of hundreds of qubits, laying the groundwork for the advantage era that follows [4]. The goal is a quantum software ecosystem that connects fundamental research with hardware realities, error correction requirements, data, and end-users, and that is guided by clear goals and shared values.

The Quantum Software Alliance, or QSA for short, will act as the global alliance for quantum software developers and users. We commit to a future in which quantum software is practical, collaborative, impactful, integrated with AI and HPC, and future ready through workflows, standards, and data infrastructures that sustain progress toward quantum advantage. Our action plan is guided by six values: scalable algorithmic development with explicit resource estimates; practical utility aligned with real hardware; principled synergy between quantum, AI and HPC; open standards and shared benchmarks; community and end-user engagement; and strategic handling of quantum data and software artefacts.

This Agenda defines core Capabilities for the software stack and shows how they translate into several Sector Programmes. Each capability is accompanied by measurable outcomes and an indicative progression over years. The capabilities place error correction and fault-tolerant operation within the software remit, including code selection, syndrome scheduling, decoder integration, and exposure of logical level metrics that link hardware performance to application outcomes. The sector programmes prioritise problem families, acceptance criteria, and verification-ready pilots rather than vendor or device specific details. A living artefact repository will include intermediate representation conformance tests, benchmark definitions, reference workloads, and assurance pipelines that make claims testable on real hardware. The QSA will maintain this Agenda as a living document under a General Assembly and Steering Committee, with topic area task forces that coordinate updates and artefact releases.

Purpose and Scope

This Agenda is the community reference for what quantum software capabilities to build, how to measure progress credibly, and where to apply those capabilities so that public research and development investment produces reliable value. It responds to the present phase of quantum computing in which practical utility depends on software that is resource realistic, verifiable, and engineered for hybrid operation with high-performance computing, and it recognises that the economic opportunity is concentrated in the software and application layer but can only be captured through a coordinated international effort.

To operationalise this, QSA will run and grow a global open platform that federates existing community repositories into a neutral, one-stop index of code, datasets, benchmarks, intermediate-representation artefacts, standards drafts, and testbed access procedures, with submission, review, versioning, and conformance tracking managed by topic-area task forces. The platform will curate a living artefact registry that includes IR conformance tests, benchmark suites, reference workloads, verification-ready pipelines, and reproducibility harnesses that embody digital-twin patterns for quantum–HPC orchestration, and it will provide a collaboration registry that connects end-users and research teams for sector pilots. The Agenda sets procurement-ready expectations by defining verification-ready outputs, open artefacts, cross-platform comparability, and platform-level metrics such as conformance status, cross-testbed replication, time-to-replicate, and adoption by facilities.

Scope spans the full software stack needed to convert hardware progress into dependable applications, including algorithms and complexity results relevant to credible advantage, languages and intermediate representations with compilers and conformance tests, verification and benchmarking that tie device metrics to application-level performance, architectures with error correction and hybrid quantum–HPC runtimes, and distributed or networked quantum computing with security. Cross-cutting enablers include standards and interoperability, resource-estimation and cost models, open-science infrastructure and workforce, community and end-user engagement, and responsible communication.

The intended audience includes policymakers, funders, national laboratories, high-performance computing facilities, standards bodies, industry end-users in regulated and strategic sectors, and the academic software community. QSA will maintain the Agenda and the platform as living, public resources under General Assembly and Steering Committee governance, with geographically balanced task forces that curate artefacts in public repositories so that advances remain portable, comparable, and verifiable across regions and hardware modalities.

Vision and Guiding Principles

Quantum computing is entering a decisive phase in which progress depends on building the structures that turn discovery into dependable capability. The Alliance’s vision is not to chase premature end-user solutions, but to operate an engine for discovery in which ideas become rigorous frameworks, are integrated into algorithmic and programming environments, are evaluated through verification and benchmarking techniques, and are connected to high-performance computing and cloud systems [5]. In this model, co-development of software with hardware and error correction is the norm: system and compiler design reflect device constraints, decoders and runtime control are part of the software remit, and verification informs the definition of performance targets. This vision is consistent with the community call to move from proof-of-principle demonstrations to practical software that runs on real devices with credible resource models and testable claims.

The guiding principle is systematic rather than opportunistic discovery. New algorithms and protocols are developed with transparent scaling analysis and open resource-estimation that map design choices to logical depth, qubit counts, runtimes, and energy budgets. Translation is demand-driven yet readiness-aware: application candidates are framed against the best classical baselines, evaluated on cross-testbed pipelines, and progressed only when verification criteria are met. Adaptivity is explicit because hardware progress is non-linear; error-correction milestones, connectivity, and control advances can accelerate or resequence feasibility, so priorities are revisited as evidence accumulates. These practices align with a verification-first workflow in which benchmarking is derived from formal properties, assurance links device metrics to application performance, and reproducibility is achieved through open artefacts and conformance tests.

The Alliance will serve as a neutral, global platform that connects researchers, testbeds, standards bodies, and end-users. Its role is to curate open intermediate representations, compilers, datasets, benchmarks, decoders, and orchestration software, and to maintain a source-of-truth catalogue of artefacts and teams so collaboration is straightforward across regions and modalities. Governance ensures a lean process for annual updates of the Agenda and artefact releases, with geographically balanced leadership and task forces that can act quickly when the evidence base shifts. This open platform mandate and cadence follow directly from the QSA Charter.

This vision also fixes the structure of the Agenda. The next chapters specify the software Capabilities the community should build, the Cross-cutting Enablers that make them portable and comparable, and the Application roadmaps that convert capabilities into sector value under auditable conditions. Each item is accompanied by measurable outcomes on 1–3, 3–6, and 6–10 year horizons so that policy, funding, and procurement can be aligned with credible milestones.

Software Capabilities

C1: Algorithms and Methods

Algorithms determine whether quantum processors deliver practical value. This capability addresses both near term algorithms for the first useful devices and long-term methods for general purpose fault-tolerant systems. Progress requires foundational research on new algorithmic principles and sustained engineering to optimise known algorithms for concrete applications.

Near term focus. Design algorithms for problems that matter to end-users in industry and academia and that can yield a robust advantage on relatively small and noisy devices. Promising families include modelling of physical, material and chemical systems, approximate optimisation, and machine learning tasks amenable to hybrid execution. Development covers design, implementation, and validation on hardware at the available scale.

Longer term focus. As devices mature, priority lines include quantum simulation and scientific computing, amplitude amplification and estimation and related Monte Carlo accelerators, optimisation methods and problem relaxations, search problems, and linear-system and differential-equation solvers. It will be important that quantum advantages are robust and do not concern only highly fine-tuned instances. Complexity and lower bound studies remain essential to identify genuine advantage and to set resource budgets for early fault-tolerant routes.

Methodology. End-to-end baselines against the best classical methods and open resource estimation tools that map algorithmic choices to logical qubits, depth, gate counts, wall clock time, and energy, including the classical computing needed for control and error decoding. Verification oriented development integrates measurable acceptance criteria and assurance pipelines so that correctness and performance are testable on real hardware.
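
To make the mapping from algorithmic choices to costs concrete, the following minimal Python sketch shows the shape such a resource-estimation record could take. The class name, the surface-code footprint formula, and all numerical constants are illustrative assumptions, not a QSA tool or a calibrated model.

from dataclasses import dataclass

@dataclass
class ResourceEstimate:
    logical_qubits: int
    logical_depth: int
    t_gate_count: int
    code_distance: int        # assumed error-correcting code distance
    cycle_time_s: float       # assumed physical QEC cycle time
    decoder_power_w: float    # assumed classical control/decoding power draw

    @property
    def physical_qubits(self) -> int:
        # crude surface-code footprint: ~2 * d^2 physical qubits per logical qubit
        return self.logical_qubits * 2 * self.code_distance ** 2

    @property
    def wall_clock_s(self) -> float:
        # assume one logical time step costs d QEC cycles
        return self.logical_depth * self.code_distance * self.cycle_time_s

    @property
    def energy_j(self) -> float:
        # energy of the classical decoding/control path only
        return self.wall_clock_s * self.decoder_power_w

est = ResourceEstimate(logical_qubits=100, logical_depth=10**6, t_gate_count=10**8,
                       code_distance=25, cycle_time_s=1e-6, decoder_power_w=2000.0)
print(est.physical_qubits, f"{est.wall_clock_s:.0f} s", f"{est.energy_j / 3.6e6:.3f} kWh")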

Deliverables. Reference implementations with explicit resource models; domain specific kernels; interoperable toolchains that connect algorithm design to compilers, runtimes, and data management; reproducible results with classical baselines engineered for hybrid HPC orchestration.

C2: Languages, Intermediate Representations, and Compilers

Intermediate representations with formal semantics must bridge high level programs to hardware aware back ends. Quantum compilers translate high level algorithms or model specifications into runnable code on specific devices; they optimise to reduce resource overhead; they perform architecture aware transformations including routing, synthesis, scheduling, and the introduction and management of error correction [6]; and they verify, to the greatest extent possible, that a program implements the intended algorithm before execution.

Progress requires both hardware agnostic techniques at higher levels of abstraction and hardware sensitive techniques that account for topology, control constraints, calibration data, and error models. Compilation is also a vehicle for assurance: the toolchain should surface proof obligations, enable equivalence checking, and produce artefacts that link algorithmic choices to logical resource estimates and device level constraints so that claims about correctness and performance are testable. Open and interoperable IR ecosystems with conformance tests are necessary to ensure portability and fair comparison and to prevent lock in.
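
As an illustration of what an IR conformance test can check, the toy example below treats an "IR" as a list of gate instructions and verifies that a trivial optimisation pass preserves the circuit unitary on a small instance. Real conformance suites would replace exact matrix comparison with scalable equivalence-checking techniques, and none of the names here correspond to an existing IR.

import numpy as np

I2 = np.eye(2)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CX = np.array([[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0]])

def unitary(ir, n):
    """Build the full unitary of a toy n-qubit IR program."""
    U = np.eye(2 ** n)
    for gate, qubits in ir:
        if gate == "h":
            ops = [H if q == qubits[0] else I2 for q in range(n)]
            G = ops[0]
            for op in ops[1:]:
                G = np.kron(op, G)
        elif gate == "cx" and n == 2:
            G = CX
        else:
            raise ValueError(f"unsupported instruction: {gate} {qubits}")
        U = G @ U
    return U

def cancel_pass(ir):
    """A trivial 'compiler pass': remove adjacent pairs of identical H gates."""
    out = []
    for instr in ir:
        if out and instr[0] == "h" and instr == out[-1]:
            out.pop()
        else:
            out.append(instr)
    return out

def conformance_check(ir, n=2):
    assert np.allclose(unitary(cancel_pass(ir), n), unitary(ir, n)), "pass broke equivalence"

conformance_check([("h", (0,)), ("h", (0,)), ("h", (0,)), ("cx", (0, 1))])
print("conformance check passed")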

C3: Verification, Assurance, and Benchmarking

Verification and testing are design drivers. Classical simulation does not scale to certify large quantum devices, so trustworthy performance requires new theoretical ideas and engineering practices. Future architectures should be shaped by their testability as well as by throughput, with verification hooks exposed at every layer of the stack and integrated with error correction and decoding.

Assurance is layered. At the specification level, program logics, type systems, and property checking establish correctness where possible and surface proof obligations that can be carried through compilation to the device [7]. At the protocol level, cryptographic and interactive verification techniques allow small trusted components to bootstrap trust in larger systems and enable verifiable delegation. Benchmarking is verification inspired: formal tests are compiled into executable workloads that produce platform agnostic capability certificates and link hardware metrics and logical error rates to application level performance. Device characterisation and diagnosis on the hardware level using tools such as randomized benchmarking feed compilers, decoders, and schedulers.
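
A hedged sketch of what a platform-agnostic capability certificate might look like as a data artefact is given below. The field names, benchmark identifier, and metrics are placeholders chosen for illustration, not a QSA-defined schema.

import datetime
import hashlib
import json

def capability_certificate(benchmark_id, platform, metrics, artefacts):
    """Bundle benchmark results with digests of the artefacts that produced them."""
    return {
        "benchmark": benchmark_id,
        "platform": platform,                  # device or backend identifier
        "metrics": metrics,                    # e.g. success probability, logical error rate
        "artefact_digests": {name: hashlib.sha256(blob).hexdigest()
                             for name, blob in artefacts.items()},
        "issued": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    }

cert = capability_certificate(
    benchmark_id="mirror-circuit-depth-64",     # hypothetical benchmark name
    platform="testbed-A",
    metrics={"success_probability": 0.93, "logical_error_rate": 2.1e-3},
    artefacts={"circuit.qasm": b"OPENQASM 3.0; ...", "decoder.cfg": b"distance=5"},
)
print(json.dumps(cert, indent=2))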

Deliverables. A unified assurance pipeline from specification to hardware; verification inspired benchmark suites with open artefacts; IR conformance tests; and design for testability guidelines exercised across multiple platforms and networked settings.

C4: Architectures, Fault Tolerance, and Hybrid Quantum–HPC Systems

This capability defines the systems layer that makes quantum processors usable at scale. It spans architectural choices, error correction and fault-tolerant operation, classical co-processing for decoding and control, and hybrid orchestration with high-performance computing. The goal is predictable performance on real hardware under explicit constraints of noise, latency, and energy.

Architectures and programming models. Provide models that map high level descriptions to physical constraints, including compilation, routing, scheduling, and mapping to limited control surfaces. Plan for modular systems built from interconnected parts with partitioning, placement, and resource aware communication, and with verification hooks for testability.

Error correction and fault-tolerant computing. Select codes, schedule syndrome extraction, integrate into runtime paths the decoders that translate syndrome information into corrective operations, and link logical error rates to application-level performance [8]. Near term variants tailored to small qubit counts reduce overheads and prepare subsystems for subsequent scaling.
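
The runtime pattern described above is sketched below for the smallest possible case: a lookup-table decoder for a three-qubit bit-flip repetition code acting on a classical stand-in for the data. Production decoders (matching, union-find, neural) follow the same extract-decode-correct loop under hard latency budgets.

# syndrome (Z0Z1, Z1Z2) -> index of the qubit to flip, or None for "no error"
SYNDROME_TABLE = {
    (0, 0): None,
    (1, 0): 0,
    (1, 1): 1,
    (0, 1): 2,
}

def extract_syndrome(bits):
    """Parity checks Z0Z1 and Z1Z2 on a classical bit-string stand-in for the code block."""
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

def decode_and_correct(bits):
    """One pass of the runtime loop: measure syndrome, decode, apply the correction."""
    flip = SYNDROME_TABLE[extract_syndrome(bits)]
    if flip is not None:
        bits[flip] ^= 1
    return bits

# a single bit-flip on qubit 1 is detected and corrected
assert decode_and_correct([0, 1, 0]) == [0, 0, 0]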

Hybrid orchestration with HPC. Treat quantum as a first class accelerator [9]. Schedulers, workflow engines, and data management coordinate CPU, GPU, and QPU resources. Runtimes support real-time decoding paths and expose device agnostic job semantics. Orchestration integrates error aware simulation for pre- and post-processing, keeps energy and latency budgets explicit, and enables federation across facilities. Programming models allow quantum accelerated subroutines to be embedded in established HPC codes with audit trails for reproducibility.
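
The sketch below illustrates the orchestration idea in miniature: a workflow declares CPU, GPU, and QPU stages with explicit latency budgets, and the runner records per-stage timings so that latency and energy accounting stays explicit. The stage names, budgets, and results are invented for illustration and do not reference any particular scheduler or runtime.

import time
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class Stage:
    name: str
    resource: str                        # "cpu", "gpu", or "qpu"
    run: Callable[[Dict], Dict]
    latency_budget_s: float

@dataclass
class Workflow:
    stages: List[Stage]
    log: List[tuple] = field(default_factory=list)

    def execute(self, data: Dict) -> Dict:
        for stage in self.stages:
            t0 = time.perf_counter()
            data = stage.run(data)
            elapsed = time.perf_counter() - t0
            self.log.append((stage.name, stage.resource, elapsed,
                             elapsed <= stage.latency_budget_s))
        return data

wf = Workflow([
    Stage("preprocess",  "cpu", lambda d: {**d, "ansatz": "uccsd"}, 1.0),
    Stage("qpu_sample",  "qpu", lambda d: {**d, "counts": {"00": 510, "11": 490}}, 5.0),
    Stage("postprocess", "gpu", lambda d: {**d, "energy": -1.137}, 1.0),
])
result = wf.execute({"molecule": "H2"})
print(result)
print(wf.log)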

Deliverables. Reference architecture and programming model for hybrid operation; open compiler pipeline targeting code families and logical instruction sets; decoder implementations with documented accuracy, latency, and energy profiles; runtime support for real-time decoding and device-agnostic job execution; verification-ready artefacts that link hardware metrics and logical error rates to application level outcomes.

C5: Distributed or Networked Quantum Computing and Security

At scale, useful workloads will involve multi-node execution, remote access, and secure delegation. A global quantum network architecture will distribute entanglement over terrestrial and space links and allow distributed quantum computation coordinated by long-range classical communication [10]. Such networks enable security properties that are not achievable with classical communication alone, including information theoretic guarantees for tasks such as uncloneability, certified deletion, and device or position verification. Networked algorithms must be designed to respect signalling constraints implied by relativity while using entanglement, teleportation, and error correction to reduce communication where possible. The software stack should therefore treat computation and communication as one design problem and optimise responsiveness to data generated in real-time across the network.

Software support is needed for routing and scheduling across heterogeneous nodes, authenticated access and privacy-preserving orchestration, and verifiable delegation protocols that make remote services certifiable [11]. Proofs of quantumness provide an operational path to demonstrate that cloud services are using quantum resources and can anchor service level agreements. A network operating layer should expose explicit latency and bandwidth budgets, manage entanglement distribution and buffering, and record evidence required for compliance and audit. Conformance tests and capability certificates should extend naturally from single node settings to multi-node execution so that performance claims remain portable and comparable.
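
The bookkeeping such a network operating layer performs can be pictured with the toy admission check below, which tracks buffered entangled pairs and link latency and refuses a distributed job whose entanglement and deadline budgets cannot be met. All rates and thresholds are hypothetical.

from dataclasses import dataclass

@dataclass
class Link:
    pairs_buffered: int        # entangled pairs currently stored for this link
    generation_rate_hz: float  # assumed entanglement generation rate
    rtt_s: float               # classical round-trip latency of the link

def schedulable(link: Link, pairs_needed: int, deadline_s: float) -> bool:
    """Admit a distributed job only if entanglement and latency budgets can be met."""
    producible = link.pairs_buffered + link.generation_rate_hz * deadline_s
    return producible >= pairs_needed and link.rtt_s < deadline_s

link = Link(pairs_buffered=40, generation_rate_hz=1000.0, rtt_s=0.02)
print(schedulable(link, pairs_needed=500, deadline_s=0.5))   # True: 540 pairs producible
print(schedulable(link, pairs_needed=500, deadline_s=0.1))   # False: only 140 producible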

Large general-purpose quantum computers will break the current widely deployed public-key cryptography. Migration to quantum-safe alternatives is therefore a mandatory, long-duration systems task. Initial standards have been published and their adoption will take years [12]. The underlying assumptions are largely lattice based and hash based, and increased academic work is required at the interface of quantum algorithms and the relevant mathematics, including revisiting classical proof techniques that do not carry over to quantum attackers. Software deliverables for this migration include parameter-selection tools, reference implementations integrated with mainstream libraries, and crypto agile architectures that allow algorithms and parameters to be updated without service disruption.
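
The crypto-agile pattern can be summarised as binding applications to an algorithm identifier held in configuration rather than to a concrete scheme, as in the minimal sketch below. The registered object is a placeholder exposing a KEM-like interface, not a real ML-KEM implementation or an existing library API.

CONFIG = {"kem": "placeholder-kem-v1"}      # rotated by policy, without code changes

_REGISTRY = {}

def register(name):
    """Associate an algorithm identifier with a factory for its implementation."""
    def wrap(factory):
        _REGISTRY[name] = factory
        return factory
    return wrap

@register("placeholder-kem-v1")
def _demo_kem():
    class DemoKEM:
        # stand-in for the interface a real KEM binding would expose
        def keygen(self):
            return b"public-key", b"secret-key"
        def encapsulate(self, public_key):
            return b"ciphertext", b"shared-secret"
        def decapsulate(self, secret_key, ciphertext):
            return b"shared-secret"
    return DemoKEM()

def current_kem():
    """Resolve whichever scheme the configuration currently names."""
    return _REGISTRY[CONFIG["kem"]]()

kem = current_kem()
pk, sk = kem.keygen()
ct, ss = kem.encapsulate(pk)
assert kem.decapsulate(sk, ct) == ss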

Quantum communication devices for key distribution and random-number generation already exist, but wider uptake depends on certification profiles, interoperable standards, and well-defined interfaces to post-quantum cryptographic schemes. The software stack should provide network components that combine these quantum primitives with post-quantum cryptography in a coherent, auditable system. Discovering where quantum information techniques provide genuine security advantages remains an open research frontier, especially for tasks with quantum inputs or outputs and for large-scale networks where light-speed signalling constraints matter. Progress will be fastest when physicists, engineers, security experts, and cryptographers work to common specifications that tie algorithmic choices to measurable end-to-end outcomes.

Cross Cutting Enablers

Standards and Interoperability 

The current quantum computing landscape is highly fragmented due to vendor-specific abstractions, vertically integrated toolchains, and proprietary programming models. This duplication of effort and lack of standardized benchmarks impede growth. Standards and interoperability are thus critical to building a cohesive global quantum-software ecosystem. Specifically, open, consensus-driven specifications are required to govern language and intermediate representations (IRs), compilers, runtime and job APIs, benchmark schemas, and crypto agility [13]. Crucially, mandatory conformance testing must be fully integrated across all modalities and hardware providers. Over the next decade, global actors will need to align around open, consensus-driven specifications that allow quantum software to be written once and executed across diverse back ends. Key elements of this effort will include: common quantum-software abstractions and IRs; interoperable runtime and job-submission APIs; benchmarking metrics and verification frameworks; and security, cryptography, and governance standards.

The Quantum Software Alliance (QSA) must serve as a neutral convener, bridging academia, industry, national laboratories, and international standards organizations. By fostering pre-competitive collaboration, curating best practices, mapping standardisation gaps, and designing roadmaps, QSA will coordinate global efforts to ensure the quantum software ecosystem develops with core principles of openness, portability, and security.

Resource Estimation and Cost Models 

Accurate and transparent resource estimation will be essential for evaluating quantum algorithms, planning system architectures, and guiding national investments over the next decade. As quantum computation progresses toward fault-tolerance, stakeholders require transparent, community-driven tools that map high-level algorithmic designs to detailed costs, including logical qubit counts, error-correction overheads, runtime under noise, and energy consumption. These tools will create a shared, comparable language between algorithm designers, compiler engineers, and hardware providers, directly informing procurement and policy decisions [14].

Simultaneously, cost models must expand beyond technical resources to include operational considerations such as energy consumption, cooling requirements, latency in hybrid quantum-classical workflows, and total cost of ownership for quantum infrastructure. Standardizing these estimation frameworks will improve clarity for policymakers, procurement officers, and funding agencies, allowing them to assess feasibility, prioritize research directions, and allocate resources effectively. By fostering interoperable tools, shared datasets, and reference workflows, the community can build a consistent methodology for projecting the real-world impact and scalability of quantum computation.
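
A back-of-envelope example of such a cost model is sketched below. Every number is a hypothetical placeholder; the point is only that amortised capital, energy with a cooling overhead, and operational staffing can be folded into a single per-instance figure that procurement can compare across platforms.

def tco_per_instance(capex_eur, lifetime_instances, power_kw, cooling_overhead,
                     runtime_h, energy_price_eur_kwh, staff_eur_per_instance):
    """Total cost of ownership per solved instance: amortised capital + energy + staffing."""
    amortised_capex = capex_eur / lifetime_instances
    energy_cost = power_kw * (1 + cooling_overhead) * runtime_h * energy_price_eur_kwh
    return amortised_capex + energy_cost + staff_eur_per_instance

cost = tco_per_instance(capex_eur=20e6, lifetime_instances=50_000,
                        power_kw=30, cooling_overhead=0.4, runtime_h=2,
                        energy_price_eur_kwh=0.25, staff_eur_per_instance=50)
print(f"{cost:.0f} EUR per instance")   # 400 (capex) + 21 (energy) + 50 (staff) = 471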

Open Science Infrastructure and Workforce 

Expanding open science infrastructure will be vital to ensuring transparency, reproducibility, and equitable participation in quantum software research. Sustained investment in open-source stacks, reference implementations, and standardized datasets enables validation and benchmarking. Furthermore, interoperable cross-testbed harnesses—which allow the same workflow to execute on diverse hardware backends—will accelerate rigorous experimental comparisons and best-practice development. Long-term funding models are needed to strengthen these shared resources, foster community-driven governance, and ensure continuity as platforms evolve.

In parallel, the global quantum software workforce must grow in both size and breadth. Quantum software engineering increasingly depends on expertise in classical domains such as compilers, programming languages, formal verification, distributed systems, cloud orchestration, and cybersecurity. Education and training programs should therefore integrate these disciplines alongside quantum information fundamentals, offering students and professionals accessible pathways into the field. Collaborative initiatives—such as international summer schools, coordinated curriculum frameworks, shared teaching materials, and executive-level training programs—will help harmonize competencies across regions and strengthen global capacity.

QSA can provide guidance for developing and growing robust open science infrastructure and a well-prepared workforce as foundations for sustainable progress in quantum software, enabling reproducible, secure, and globally-inclusive innovation. 

Community and End-User Engagement

Quantum software should be grounded in real needs. Sustained community and end-user engagement is essential to achieving this and to delivering measurable value. Currently, many potential users – from scientists to industry experts – lack clear pathways to articulate requirements or evaluate capabilities. Building mechanisms for ongoing dialogue helps align research priorities with real-world demand, preventing misallocation of effort and accelerating adoption where quantum advantage is plausible. Community and end-user engagement includes shared datasets, open benchmarks, procurement pilots structured with audit trails and reproducibility requirements, and participation in standards development. These enable diverse stakeholders to explore applications and contribute domain expertise and allow end-users to shape specifications, tooling requirements, and interoperability expectations.

QSA can serve as a neutral reference for cultivating and curating these forms of engagement, fostering an inclusive, application-driven ecosystem that ensures global investments address genuine societal and industrial needs. 

Ethics and Responsible Communication 

Ethical and responsible communication are critical to maintaining public trust, guiding informed policy, and preventing unrealistic expectations. It is essential to clearly distinguish between demonstrated results, plausible advances, and speculative projections. Transparent reporting of assumptions, uncertainties, noise models, and methodological limitations is required to ensure decision makers and end-users can accurately interpret claims and make evidence-based choices. A commitment to verification-first practices—including reproducible benchmarks, documented pipelines, and open artefacts—strengthens accountability. Publishing reference implementations, datasets, and experimental logs enables the community to validate findings and refine methodologies collaboratively. This openness reduces the risk of overstated capabilities and accelerates technical progress by making empirical foundations widely accessible. Responsible communication must also address broader ethical dimensions, including equitable access to knowledge, the societal implications of quantum computation, and minimizing hype-driven distortions in public discourse. Coordinated guidelines, community norms, and training in science communication can help researchers, companies, and policymakers articulate realistic expectations while highlighting genuine opportunities.

By institutionalizing these practices, QSA can promote integrity, foster trust, and ensure that the field’s development is grounded in rigor, transparency, and societal responsibility.

Applications and Sector Roadmaps 

Capabilities translate into sector impact through application driven validation loops. Representative problem kernels are selected, classical baselines are established, algorithms and toolchains are designed with explicit resource models [15], hybrid execution is orchestrated, and outcomes are verified with benchmarking pipelines. Sectors follow a common structure to enable comparability.

Health and Life Sciences 

Quantum software targets molecular modelling and discovery workflows that include ground and excited state energies, reaction pathways, ligand–receptor interactions, and structure–activity relationships. Near-term emphasis is on hybrid simulation and embedding strategies that keep classical pre- and post-processing explicit and move only the computationally expensive quantum kernels to hardware. Representative kernels are adiabatically assisted variational eigensolvers for strongly correlated fragments, quantum-enhanced Monte Carlo for rare-event sampling in free-energy estimation, and excited-state solvers for spectroscopy. Compiler support must coordinate basis choices, mappings, routing, and error-aware synthesis, and expose resource models that report qubits, depth, logical error rates, wall-clock time, and energy. Verification uses domain units such as chemical accuracy with uncertainty budgets, results are replicated across testbeds, and classical baselines are mandatory. Privacy-preserving workflows and quantum-safe data pipelines are gating requirements for any deployment in clinical or biomedical settings. 

Materials and Advanced Manufacturing 

For catalysis, battery systems, and strongly correlated matter, progress depends on hybrid strategies that combine disentangling transforms and embedding with compiler support for materials primitives and error-aware synthesis [16]. Domain-driven workflows couple dynamical mean-field methods and other advanced classical solvers with quantum subroutines for spectral features and barrier heights. Verification-inspired benchmarking anchors claims: spectral lines, reaction barriers, and correlation functions are compared to trusted references, results are replicated across testbeds, and resource estimates are reported with transparent assumptions and energy budgets. Early fault-tolerant routes based on linear-algebra primitives and the quantum singular value transformation (QSVT) can be exercised on a small set of community-curated materials problems with open artefacts and explicit logical error targets.

Energy and Climate Systems 

Energy systems combine optimisation, simulation, and data-driven modelling. Target workloads include unit-commitment variants, optimal power-flow under network constraints, short-term forecasting for renewables, and discovery of carbon-capture materials. Software priorities are robust encodings for network-constrained optimisations, hybrid orchestration that meets latency and bandwidth budgets for grid operations, and assurance pipelines suitable for safety-critical deployments. Benchmarks report optimisation-gap reductions and forecast-skill deltas at fixed reliability thresholds, all against strong classical baselines, with explicit reporting of wall-clock, energy, and memory costs. Materials-focused subproblems reuse the simulation kernels from the materials sector, enabling cross-sector reproducibility. 

Finance 

Canonical kernels are risk aggregation (including VaR and CVaR), anomaly and fraud detection, and pricing of path-dependent derivatives. Near-term approaches embed quantum subroutines inside classical pipelines: amplitude-estimation variants for Monte Carlo components, quantum optimisation primitives for portfolio allocation, and sampling-based routines for rare-event analysis. Compilers must handle arithmetic intensity and memory access patterns, expose resource and energy models, and support audit trails appropriate for regulated environments. Acceptance criteria are framed in task-level metrics, include model-risk considerations, and are always evaluated against tuned classical baselines running on modern accelerators. Migration to quantum-safe cryptography on the classical perimeter is a gating requirement for integration with financial infrastructure.
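
The scaling argument behind the amplitude-estimation variants mentioned above can be made concrete with the rough comparison below: classical Monte Carlo needs on the order of 1/eps^2 samples to reach additive error eps, while amplitude estimation needs on the order of 1/eps oracle queries, up to constants. The expressions are textbook asymptotics, not device-level estimates.

import math

def classical_mc_samples(eps):
    """Monte Carlo: statistical error ~ 1/sqrt(N), so N ~ 1/eps^2 samples."""
    return math.ceil(1.0 / eps ** 2)

def amplitude_estimation_queries(eps):
    """Amplitude estimation: error ~ 1/M, so M ~ pi/eps oracle queries (constants omitted)."""
    return math.ceil(math.pi / eps)

for eps in (1e-2, 1e-3, 1e-4):
    print(f"eps={eps:g}  classical={classical_mc_samples(eps):>10,}  "
          f"amplitude estimation={amplitude_estimation_queries(eps):>8,}")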

Cybersecurity and Public Digital Infrastructure 

Security evolves along two coordinated axes. First, the classical perimeter must transition to quantum-safe standards. This is a long-horizon systems engineering task that needs parameter-selection tools, reference implementations integrated with mainstream libraries, and crypto-agile architectures that allow algorithms and parameters to be updated without service disruption. Sector pilots should exercise end-to-end upgrade paths in representative infrastructures such as identity management, secure messaging, data-at-rest, and key management for cloud services. Second, quantum-native capabilities are emerging. Quantum key distribution and quantum random-number generation can provide security properties that are classically unattainable, but broader uptake depends on certification and interoperable standards; software deliverables include conformance tests and deployment profiles. Protocols for verifiable delegation, certified deletion, proofs of quantumness, and task-specific privacy are candidates for early networked demonstrations, with acceptance criteria that include explicit threat models, quantified soundness and completeness, and independently auditable logs. As quantum computing becomes distributed, the network operating layer must enforce authenticity and privacy across heterogeneous nodes, schedule tasks with explicit latency and bandwidth constraints, and record evidence required for compliance. Co-design of computation and communication ensures that resource estimation and verification account for both local processing and network costs. Progress requires close collaboration between cryptographers, quantum information theorists, systems engineers, and standards bodies, with strong connections between post-quantum cryptography and quantum-communication communities.

Manufacturing and Supply Chains 

Routing and scheduling under uncertainty, in-line quality control, and secure traceability are core challenges. Quantum-accelerated optimisation and learning routines are integrated into digital-twin workflows where classical and quantum resources are orchestrated under explicit latency and energy budgets. Runtime systems broker resources across vendors and modalities while preserving auditability. Verification ensures comparability of results, robustness to data-shift, and reproducibility across testbeds. Acceptance criteria include throughput improvements, cycle-time reductions, and scrap-rate reductions under fixed reliability thresholds, with cryptographic signing and quantum-safe pipelines used for provenance in regulated environments.

Science and Engineering 

Quantum simulation remains a leading application area. The immediate goal is to model quantum-mechanical systems that challenge classical methods across chemistry [17], materials, condensed-matter physics [18], and high-energy physics [19], and to do so at realistic noise and scale [20]. Priority kernels include many-body dynamics, lattice models, spectroscopy, and compact electronic-structure tasks that admit hybrid execution. Implementations combine error-aware compilation with explicit classical pre- and post-processing and hybridisation with tensor networks, dynamical mean-field theory, and other advanced solvers. Workflows must state data movement, memory, and latency costs, and expose interfaces that allow quantum subroutines to be used as libraries inside existing HPC codes. Suitability for near-term and early fault-tolerant platforms is assessed before execution through transparent reporting of qubits, depth, logical error rates where relevant, wall-clock time, and energy budgets. Verification captures both physical accuracy and computational performance. Benchmarks compare spectral lines, correlation functions, and thermodynamic quantities to trusted references and are replicated across testbeds with acceptance thresholds defined in domain units and explicit uncertainty budgets. As devices mature, simulation methods based on linear-algebra primitives, QSVT, and amplitude estimation provide routes to higher precision; these should be exercised on community-curated problem sets with open resource estimates and artefacts.

AI and Data Ecosystems 

Quantum machine learning sits at the intersection of artificial intelligence and quantum computing [21, 22]. Algorithmic work indicates potential acceleration in linear-algebra subroutines and related tasks relevant to recommendation, classification, regression, and generative modelling. The sector goal is to turn such ideas into verified end-to-end workloads that operate under realistic constraints of data movement, memory, latency, and energy. Practical progress targets hybrid models that embed quantum subroutines inside classical training and inference loops, for example quantum-enhanced sampling, kernel methods, and structured stochastic components in generative models. Development prioritises problems where input data is low dimensional, compressible, generated in situ by physical models, or prepared by feature pipelines that keep I/O costs explicit. All claims are evaluated against strong classical baselines on modern accelerators with acceptance criteria framed in task-level metrics and reported at fixed energy and latency budgets. Where large quantum memories are unavailable, emphasis is on regimes that operate on compact encodings, synthetic data, or features produced by classical preprocessing. As systems mature, quantum methods for linear systems, gradient estimation, and amplitude estimation can support learning pipelines, with compiler-generated instruction counts, logical error rates, and verification hooks recorded alongside privacy-preserving mechanisms suitable for regulated environments.
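
One way to structure the hybrid pattern is to make the kernel a pluggable callable so that a hardware-evaluated quantum kernel can be swapped in and judged against a tuned classical baseline under the same task-level metric, as in the sketch below. The "quantum" kernel here is a classical stand-in, and the data and hyperparameters are arbitrary.

import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Classical baseline kernel."""
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq_dists)

def quantum_kernel_stub(X, Y):
    """Placeholder for a kernel whose entries would be estimated on a QPU."""
    return rbf_kernel(X, Y, gamma=0.5)

def kernel_ridge_fit_predict(kernel, X_train, y_train, X_test, lam=1e-3):
    """Fit kernel ridge regression with the supplied kernel and predict on X_test."""
    K = kernel(X_train, X_train)
    alpha = np.linalg.solve(K + lam * np.eye(len(X_train)), y_train)
    return kernel(X_test, X_train) @ alpha

rng = np.random.default_rng(0)
X_train = rng.uniform(-1, 1, (40, 2))
y_train = np.sin(3 * X_train[:, 0]) * X_train[:, 1]
X_test = rng.uniform(-1, 1, (10, 2))
y_test = np.sin(3 * X_test[:, 0]) * X_test[:, 1]

for name, kern in [("classical baseline", rbf_kernel), ("quantum stand-in", quantum_kernel_stub)]:
    preds = kernel_ridge_fit_predict(kern, X_train, y_train, X_test)
    print(name, "test MSE:", round(float(np.mean((preds - y_test) ** 2)), 4))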

Milestones and Measurable Outcomes

Within 1 to 3 years, publish IR specifications with conformance tests, release verification-inspired benchmark suites with open artefacts, operate reference hybrid schedulers with digital-twin-style test rigs in HPC facilities, and provide quantum-safe reference implementations. Within 3 to 6 years, support early fault-tolerant subsystems with logical-level compilation and decoder co-design accredited by program-level protocols, make production HPC integration routine with device-agnostic partitioning, and report sector pilots with reproducible gains under certified conditions. Within 6 to 10 years, deliver end-to-end fault-tolerant workflows for at least two sector kernels with distributed execution across heterogeneous nodes. Throughout the decade, capability certificates and open artefacts turn isolated demonstrations into cumulative and comparable progress.

Policy and Funding Actions

Treat software as strategic infrastructure. Fund the five capabilities with explicit verification deliverables and open artefacts. Support standardisation for languages and IRs, runtime APIs, benchmark formats, and quantum-safe conformance. Use procurement to require verification-ready outputs and cross-platform comparability. Invest in hybrid infrastructure that couples HPC facilities with quantum testbeds under common orchestration [23]. Resource the programmes that maintain community tools and datasets. Sector pilots should be small and auditable, with classical baselines and clear acceptance thresholds.

Implementation and Maintenance

QSA will maintain this Agenda as a living document. A General Assembly elects a geographically balanced Steering Committee that oversees topic area task forces. Annual updates are ratified in open meetings. Artefacts such as benchmarks, IR tests, and datasets are curated in public repositories. QSA will engage with governments and international organisations to align efforts and will provide expert guidance on priorities that support long-term impact.

References

[1] Quantum Flagship. Strategic Research and Industry Agenda (SRIA 2030): Roadmap and Quantum Ambitions over this Decade. Technical report, European Commission, Feb. 2024. url: https://qt.eu/about-us/strategic-research-agenda-sria. The official roadmap for the EU’s quantum technology strategy, detailing the transition to industrial-scale software.

[2] R. Babbush, R. King, S. Boixo, W. Huggins, T. Khattar, G. H. Low, J. R. McClean, T. O’Brien, and N. C. Rubin. The grand challenge of quantum applications, 2025. arXiv: 2511.09124 [quant-ph]. url: https://arxiv.org/abs/2511.09124. 

[3] European Quantum Industry Consortium (QuIC). Strategic Industry Roadmap 2025: A Shared Vision for Europe’s Quantum Future. Technical report, QuIC, Apr. 2025. url: https://www.euroquic.org/strategic-industry-roadmap-sir/. The industrial roadmap focusing on the software supply chain and hybrid quantum-HPC integration.

[4] J. Eisert and J. Preskill. Mind the gaps: the fraught road to quantum advantage, 2025. arXiv: 2510.19928 [quant-ph]. url: https://arxiv.org/abs/2510.19928. 

[5] A. D. Carleton, M. Klein, E. Harper, et al. Architecting the Future of Software Engineering: A National Agenda for Software Engineering Research & Development. Technical report, Carnegie Mellon University Software Engineering Institute (SEI), Nov. 2021. url: https://resources.sei.cmu.edu/library/asset-view.cfm?assetid=741193. Establishes quantum software as a formal engineering discipline with specific architectural paradigms.

[6] F. J. Cardama, J. Vázquez-Pérez, T. F. Pena, et al. Quantum compilation process: a survey. The Journal of Supercomputing, 2024. url: https://link.springer.com/article/10.1007/s11227-024-06123-x. Detailed survey of the compilation stack from high-level languages to pulse control (Capability C2).

[7] M. Paltenghi and M. Pradel. A survey on testing and analysis of quantum software. arXiv preprint arXiv:2410.00650, Oct. 2024. url: https://arxiv.org/abs/2410.00650. A comprehensive review of verification and testing methods for Capability C3. 

[8] A. de Marti i Olius, P. Fuentes, R. Orús, P. M. Crespo, and J. Etxezarreta Martinez. Decoding algorithms for surface codes. Quantum, 8:1498, Oct. 2024. url: https://quantum-journal.org/papers/q-2024-10-10-1498/. State-of-the-art review of decoding algorithms relevant to Capability C4.

[9] V. Bartsch, G. Colin de Verdière, J.-P. Nominé, et al. Quantum for HPC: White Paper. Technical report, ETP4HPC (European Technology Platform for High-Performance Computing), 2021. url: https://www.etp4hpc.eu/white-papers.html. Foundational reference for the orchestration of hybrid Quantum-HPC systems.

[10] D. Barral, F. J. Cardama, G. Díaz, et al. Review of distributed quantum computing: from single QPU to high performance quantum computing. arXiv preprint arXiv:2404.01265, Apr. 2024. url: https://arxiv.org/abs/2404.01265. The definitive survey for Capability C5, comprehensively reviewing entanglement distribution, blind delegation, and the network layers described in this Agenda.

[11] Secretariat of Science, Technology and Innovation Policy. Strategy of Quantum Future Industry Development. Technical report, Cabinet Office, Government of Japan, Apr. 2023. url: https://www8.cao.go.jp/cstp/english/quantum/quantum_index.html. Japan’s national strategy for integrating quantum software into the ’Society 5.0’ industrial initiative.

[12] National Institute of Standards and Technology (NIST). FIPS 203: Module-Lattice-Based Key-Encapsulation Mechanism Standard. Technical report, U.S. Department of Commerce, Aug. 2024. url: https://csrc.nist.gov/pubs/fips/203/final. The finalized Post-Quantum Cryptography standard, essential for the cybersecurity sector roadmap.

[13] IEEE Standards Association. International Roadmap for Devices and Systems (IRDS) 2024 Edition. Technical report, IEEE, 2024. url: https://irds.ieee.org/. Global industry standards for quantum computing metrics and volumetric benchmarking.

[14] CSIRO Futures. Growing Australia’s Quantum Technology Industry (Updated Economic Modelling). Technical report, Commonwealth Scientific and Industrial Research Organisation (CSIRO), Oct. 2022. url: https://www.csiro.au/en/work-with-us/services/consultancy-strategic-advice-services/csiro-futures/futures-reports/quantum. Provides economic benchmarks for the value of quantum software, control engineering, and error correction.

[15] A. M. Dalzell, S. McArdle, M. Berta, P. Bienias, C.-F. Chen, A. Gilyén, et al. Quantum algorithms: a survey of applications and end-to-end complexities. arXiv preprint arXiv:2310.03011, Oct. 2023. url: https://arxiv.org/abs/2310.03011. The gold-standard reference for the sector roadmaps, providing the rigorous end-to-end resource estimation for chemistry, finance, and materials called for in this Agenda.

[16] S. E. Herman et al. Quantum computing for chemical and materials sciences. Nature Reviews Chemistry, 7:692–709, 2023. url: https://www.nature.com/articles/s41570-023-00517-9. Supports the Health and Life Sciences roadmap, explicitly discussing the transition from VQE to the early fault-tolerant embedding methods prioritised in this Agenda.

[17] S. McArdle, S. Endo, A. Aspuru-Guzik, S. C. Benjamin, and X. Yuan. Quantum computational chemistry. Rev. Mod. Phys., 92:015003, Mar. 2020. doi: 10.1103/RevModPhys.92.015003. url: https://link.aps.org/doi/10.1103/RevModPhys.92.015003.

[18] Y. Alexeev, M. Amsler, M. A. Barroca, et al. Quantum-centric supercomputing for materials science: a perspective on challenges and future directions. Future Generation Computer Systems, 160:666–710, 2024. doi: 10.1016/j.future.2024.04.060. url: https://www.sciencedirect.com/science/article/pii/S0167739X24002012.

[19] A. Di Meglio, K. Jansen, I. Tavernelli, et al. Quantum computing for high-energy physics: state of the art and challenges. PRX Quantum, 5:037001, Aug. 2024. doi: 10.1103/PRXQuantum.5.037001. url: https://link.aps.org/doi/10.1103/PRXQuantum.5.037001.

[20] Office of Science, U.S. Department of Energy. Basic Research Needs in Quantum Computing and Networking. Technical report, Advanced Scientific Computing Research (ASCR), July 2023. url: https://www.osti.gov/biblio/2001045. Defines priority research directions for the US software stack, including middleware and networked quantum computing.

[21] G. Acampora, A. Ambainis, N. Ares, et al. Quantum computing and artificial intelligence: status and perspectives, 2025. arXiv: 2505.23860 [quant-ph]. url: https://arxiv.org/abs/2505.23860.

[22] Quantum Economic Development Consortium (QED-C). Quantum Computing and Artificial Intelligence Use Cases. Technical report, QED-C, Mar. 2025. url: https://quantumconsortium.org/reports/. Explores the convergence of AI and quantum (QC-for-AI and AI-for-QC) for hybrid applications.

[23] Ministry of Science and ICT (MSIT). Quantum Science and Technology Strategy of Korea. Technical report, Government of the Republic of Korea, June 2023. url: https://www.msit.go.kr/eng/bbs/view.do?sCode=eng&mId=4&mPid=2&pageIndex=&bbsSeqNo=42&nttSeqNo=837. South Korea’s strategy targeting a 1,000-qubit system and a supporting software ecosystem by the early 2030s.