The number that did the rounds in the financial press this week was eight million. That is the count of logical qubits Goldman Sachs's internal research team concluded the bank's target portfolio problem would need, alongside an algorithm runtime measured in millions of years — and the proximate cause for the bank quietly dismantling its quantum programme, as Bloomberg reported on 26 April. The press read this as a referendum: Goldman quit, JPMorgan invested, technology disappointed.
That is the wrong read. The same week, a QuEra/Harvard/MIT collaboration posted results showing a roughly 2:1 physical-to-logical qubit ratio. Meta's engineering team published a serious post-quantum cryptography migration framework. Canadian federal departments hit their statutory PQC migration plan deadline. Three of those four stories should be on a board agenda. The fourth is noise.
I read the lot so you don't have to. Here is what mattered, what didn't, and what your board should take from it.
The big one: Goldman quit, JPMorgan invested, HSBC reported 34% — and the press called it a binary
Bloomberg's feature on Wall Street's "quantum divide" is the story of the week. As reported, Goldman Sachs's internal research, conducted with Amazon, concluded that its target portfolio optimisation problem would require approximately eight million logical qubits and an algorithm runtime measured in millions of years. Goldman's quantum team has since been dismantled as part of the bank's wider cost-cutting programme. JPMorgan Chase, by contrast, retains more than fifty physicists, mathematicians and computer scientists, has worked with Quantinuum's Helios processor on data-processing efficiency, and continues research into portfolio construction, machine learning and cryptography.
The press has framed this as a binary. It is not. Three further data points reframe the picture.
The first is HSBC. As Bloomberg's own piece notes, HSBC has reported up to a 34% improvement on certain bond-trading forecasts using IBM hardware — albeit on smaller datasets. UBS has trained around fifty quant analysts. BBVA is working on portfolio optimisation. Crédit Agricole is studying credit-downgrade prediction. Wall Street's "binary" is in fact a spectrum, and a more interesting one than the Goldman/JPMorgan framing allows.
The second is the eight-million figure itself. It is a real number, sourced to Goldman's internal research as reported by Bloomberg. It is also a snapshot — an artefact of the error-correction overhead assumed at the time of the analysis — and as the next section shows, the field-wide overhead assumptions the figure depends on are visibly recalibrating in the literature this same week.
The third is the editorial distinction the financial press has spent the week blurring. A bank's offensive bet on quantum computing — whether to fund portfolio-optimisation research today — is genuinely optional. Goldman is entitled to conclude the maths does not yet justify the spend. JPMorgan is entitled to conclude that early presence is itself the asset. Both calls are defensible.
A regulated bank's defensive obligation to migrate its cryptographic estate to post-quantum standards is not optional, and it is identical regardless of which offensive call the bank made. Goldman's retreat from quantum research changes nothing about its NIST, NCSC, EU or CNSA 2.0 obligations. CIOs and CROs reading the Bloomberg piece this week should ensure their boards are clear on that distinction before the noise hardens into received wisdom.
The engineering story: QuEra, Harvard and MIT cut the overhead the field has been working against by orders of magnitude
The financial press did not pick this up. It should have.
In the week of 21 April, a research collaboration between QuEra Computing, Harvard and MIT published an arXiv preprint demonstrating a roughly 2:1 physical-to-logical qubit ratio using quantum Low-Density Parity-Check (qLDPC) codes co-designed for reconfigurable neutral-atom hardware. Standard surface-code approaches typically require hundreds to thousands of physical qubits to encode one reliable logical qubit. The new result encodes at rates above one-half, and in simulation reaches the so-called "Teraquop" regime — roughly one error per trillion logical operations.
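The arithmetic is worth a moment. Here is a back-of-envelope sketch, using round illustrative overhead figures rather than either study's exact parameters, and taking Goldman's reported eight-million logical-qubit estimate at face value:

```python
# Back-of-envelope: physical-qubit budget for a fixed logical requirement
# under two error-correction overhead assumptions. Round illustrative
# numbers, not the exact parameters of either study.

LOGICAL_QUBITS = 8_000_000  # Goldman's reported estimate, taken at face value

OVERHEADS = {
    "surface code, ~1,000 physical per logical": 1_000,
    "qLDPC at ~2:1 (QuEra/Harvard/MIT preprint)": 2,
}

for label, ratio in OVERHEADS.items():
    physical = LOGICAL_QUBITS * ratio
    print(f"{label}: {physical:,} physical qubits")

# surface code, ~1,000 physical per logical: 8,000,000,000 physical qubits
# qLDPC at ~2:1 (QuEra/Harvard/MIT preprint): 16,000,000 physical qubits
```

Same logical requirement; nearly three orders of magnitude apart in hardware. That is the sense in which resource estimates are snapshots of their overhead assumptions.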
The work builds on a theoretical breakthrough by Kasai (2026) showing that quantum error-correcting codes with encoding rates above one-half could be made practical, and exploits the ability of neutral-atom arrays to move qubits in parallel using acousto-optic deflectors. The result is for quantum memory rather than full computation; further work on decoders, gates and system integration is needed before this becomes a fault-tolerant architecture.
Read against the Bloomberg piece, the implication is straightforward. The field-wide assumption that logical qubits cost hundreds-to-thousands of physical qubits is in active revision; figures derived from older overhead assumptions are correspondingly dated. That does not mean Goldman was wrong; it means the moment the analysis was anchored to is moving.
The European parallel is worth surfacing. Norwegian University of Science and Technology researchers published a result in Physical Review X (April 2026) on real-time adaptive tracking of fluctuating relaxation rates in superconducting qubits — a measurement technique reportedly more than a hundred times faster than previous methods. The Anglo-American press largely ignored it. It is, in its own way, the same story: the engineering bottleneck the financial press treats as fixed is being dismantled in instalments.
The cryptography story: Meta's PQC Migration Levels framework is the document of the week
On 16 April, Meta's engineering team published an account of its post-quantum cryptography migration. The trade press flattened it to "Meta is doing PQC." That is not the contribution.
The contribution is a framework Meta calls "PQC Migration Levels," which ladders an organisation's migration state by how rapidly it can respond to a relevant quantum event. Meta also articulates four explicit principles guiding the work — effectiveness, timeliness, performance, and cost efficiency — and notes that its cryptographers are co-authors of HQC, one of NIST's selected post-quantum algorithms.
This is the most useful executive document on PQC migration to appear in months, and it is freely available on Meta's engineering blog. CISOs should read it directly. Three observations.
First, the levels framework is the actionable bit. It moves the conversation from "have we adopted ML-KEM?" — a vendor question — to "how quickly could we respond if a cryptographically relevant quantum computer were credibly imminent?" — a governance question. That is the right question, and it is sketched in code after the third observation below.
Second, Meta's framing implicitly endorses the position Cloudflare and Google have both now adopted: 2029 is the realistic internal target for serious PQC coverage of public-facing infrastructure. Cloudflare reported in April that more than 65% of human traffic on its network is already protected by post-quantum methods, with full migration including authentication targeted by 2029. Google has set the same internal deadline. India's National Quantum Mission Task Force has gone further still, mandating full PQC adoption for Critical Information Infrastructure by end-2029 — defence, power, telecom — with broader enterprise migration by 2033. Whichever you index against, the date that matters is closer than most enterprise programmes are planning for.
Third, Meta's contribution to HQC is a reminder that the standards underpinning enterprise migration are still being set by a small number of researchers at named institutions. Crypto-agility — the ability to swap algorithms as standards evolve — is not a future concern. It is the design principle for the migration that is currently in flight.
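To make the first and third observations concrete: a maturity ladder keyed to response speed can be modelled in a few lines. The level names, fields and thresholds below are invented for illustration; they are not Meta's published rubric.

```python
# A toy ladder keyed to response speed, in the spirit of Meta's framework.
# Level names, fields and thresholds are invented for illustration; they
# are not Meta's published rubric.

from dataclasses import dataclass
from enum import IntEnum


class MigrationLevel(IntEnum):
    UNASSESSED = 0    # no cryptographic inventory
    INVENTORIED = 1   # estate mapped, migration not started
    PARTIAL = 2       # PQC or hybrid key exchange on priority systems
    AGILE = 3         # could swap algorithms estate-wide within months
    RESPONSIVE = 4    # could swap algorithms estate-wide within weeks


@dataclass
class Assessment:
    has_inventory: bool
    pqc_on_priority_systems: bool
    estate_swap_time_days: float | None  # None if unknown


def classify(a: Assessment) -> MigrationLevel:
    """Ladder an organisation by how fast it could respond to a
    cryptographically relevant quantum event, not by which algorithm
    it shipped first."""
    if not a.has_inventory:
        return MigrationLevel.UNASSESSED
    if a.estate_swap_time_days is not None:
        if a.estate_swap_time_days <= 30:
            return MigrationLevel.RESPONSIVE
        if a.estate_swap_time_days <= 180:
            return MigrationLevel.AGILE
    if a.pqc_on_priority_systems:
        return MigrationLevel.PARTIAL
    return MigrationLevel.INVENTORIED


print(classify(Assessment(True, True, 90)).name)  # AGILE
```

The design choice that matters is that the top rungs are defined by swap speed rather than by which algorithm is deployed. That is the laddering idea in one function, and it is why the framework reads as governance rather than vendor selection.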
The noise: stock movements, six-day billionaires, and CUDA-Q-as-stock-catalyst
The S&P Kensho Global Quantum Computing Technologies Index is up roughly 40% year-to-date and 161% over the past twelve months, per WTOP citing the index data. Quantum-related stocks were broadly bid this week. Xanadu's founder briefly became a paper billionaire after the company's stock surged following its public listing. Nvidia's CUDA-Q updates produced another round of pure-play quantum-stock movement.
None of this is signal. It is weather.
Specifically: Nvidia's announcement of an Ising-model error-correction approach with reportedly 2.5x speed and 3x accuracy improvements over previous methods is a real engineering update. It is being covered as a stock catalyst. The actual story is that Nvidia is consolidating its position as the hybrid-architecture broker — the layer that sits between any eventual QPU and the GPU infrastructure already in every data centre — and is hedging its bets on whether quantum hardware ever standardises around a single modality. That is a real strategic story; "IonQ stock up on Nvidia headlines" is not.
What is the underlying signal worth tracking instead? Two things, both named earlier in this briefing. Cloudflare's 65% post-quantum coverage figure and Google's 2029 internal deadline. These are the two organisations with the largest visibility into actual internet traffic, and they have both committed to a date that most enterprise PQC programmes are not yet planning against. That is the data point that should be on a board paper. The S&P Kensho index is not.
The China dimension: Origin Wukong's AI integration and the 15th Five-Year Plan
On 21 April, the Anhui Quantum Computing Engineering Research Center announced that Origin Wukong — China's third-generation superconducting quantum computer — has integrated AI computing capabilities and launched a model called Origin Brain. The announcement is state-supported and warrants the usual scepticism on performance claims, but the strategic context is harder to dismiss.
China's recommendations for the 15th Five-Year Plan (2026–2030) name quantum technology first among seven "future industries," per reporting on the National People's Congress and analysis by China-Briefing. The plan is operationalised through MIIT implementation opinions that, as documented in Bird & Bird's 2026 regulatory survey, set explicit 2026 targets — including a measurement-and-control system supporting at least one thousand qubits with sub-microsecond feedback latency. The Chinese quantum sector reached approximately RMB 11.56 billion (roughly US$1.61 billion) in 2025, per CAICT data cited in China-Briefing's reporting.
The point is not that China is ahead. The point is that Western executive briefings on quantum routinely model the field as a US/UK/EU phenomenon with China as a footnote. That model is now incorrect. Whatever timeline assumption a Western board uses for its quantum-security planning needs to be tested against the possibility that progress in China will at some point cease to be public — a point Scott Aaronson made publicly and Cloudflare quoted in its April roadmap update.
What your board should take from this week
The eight-million qubit number is real, but it is a snapshot. The figure is sourced to internal Goldman research as reported by Bloomberg, and reflects the error-correction overhead assumptions in play when the analysis was conducted. Those assumptions are in active revision in the academic literature this same week. Treat the figure as a moment-in-time datum, not a forever constraint.
Quantum computing decisions are optional. Quantum security decisions are not. Goldman's retreat changes nothing about its post-quantum cryptography obligations under NIST, NCSC, EU and CNSA 2.0 timelines. Confusing the offensive and defensive halves of the quantum agenda is the single most common error in current board-level discussion. Make the distinction explicit.
If you have not read your jurisdiction's PQC roadmap, you are behind. Canadian federal departments were required to develop initial PQC migration plans by April 2026, with annual reporting thereafter. The UK NCSC three-phase timeline runs to 2028, then 2031, then full migration by 2035. The EU Coordinated Implementation Roadmap targets full transition by 2035 with national roadmaps due by end-2026. Australia's ASD requires a refined plan by end-2026 and transition of critical systems by end-2028. The defensive obligation is now scheduled.
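The dates in that paragraph reduce to a short, machine-checkable list. A minimal sketch, using the roadmap dates as cited above (year-end dates approximate "by 2028" and similar phrasing) and an assumed briefing date of 28 April 2026:

```python
# The roadmap milestones cited above, arranged for a mechanical
# "what falls inside our planning horizon?" check. Year-end dates are
# approximations of "by 2028" etc.; the structure is illustrative only.

from datetime import date

MILESTONES = [
    ("Canada",    "initial departmental migration plans", date(2026, 4, 30)),
    ("Australia", "ASD refined migration plan",           date(2026, 12, 31)),
    ("EU",        "national roadmaps due",                date(2026, 12, 31)),
    ("UK NCSC",   "phase one complete",                   date(2028, 12, 31)),
    ("Australia", "critical systems transitioned",        date(2028, 12, 31)),
    ("UK NCSC",   "phase two complete",                   date(2031, 12, 31)),
    ("UK NCSC",   "full migration",                       date(2035, 12, 31)),
    ("EU",        "full transition",                      date(2035, 12, 31)),
]

# Assumed briefing date: Tuesday 28 April 2026.
def due_within(years: int, today: date = date(2026, 4, 28)) -> list[str]:
    cutoff = date(today.year + years, today.month, today.day)
    return [f"{who}: {what} by {when:%B %Y}"
            for who, what, when in MILESTONES if when <= cutoff]

for item in due_within(3):
    print(item)
# Five of the eight milestones land inside a three-year horizon.
```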
Crypto-agility is the only sustainable answer. Today's NIST standards — ML-KEM, ML-DSA, SLH-DSA — will themselves age. Meta's PQC Migration Levels framework is right to ladder organisations by their ability to swap algorithms, not by which algorithm they adopted first. Procurement specifications for new systems should require crypto-agility as a baseline. NCSC guidance is explicit on this point.
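What "crypto-agility as a baseline" can look like in an actual codebase: application code binds to an abstract key-encapsulation interface and a named policy entry, never to a concrete algorithm. The interface below is a hypothetical sketch, not any real library's API.

```python
# Minimal crypto-agility pattern: application code binds to an abstract
# KEM and a named policy entry, never to a concrete algorithm. This
# interface is a hypothetical sketch, not any real library's API.

from typing import Protocol


class Kem(Protocol):
    """Key-encapsulation mechanism: the only surface callers see."""
    def keygen(self) -> tuple[bytes, bytes]: ...                      # (public, secret)
    def encapsulate(self, public: bytes) -> tuple[bytes, bytes]: ...  # (ciphertext, shared)
    def decapsulate(self, secret: bytes, ciphertext: bytes) -> bytes: ...


# The algorithm choice lives in exactly one place. Swapping ML-KEM for
# HQC, or layering in a hybrid, becomes a registry edit rather than a
# codebase rewrite.
KEM_REGISTRY: dict[str, Kem] = {}

def register(policy_name: str, impl: Kem) -> None:
    KEM_REGISTRY[policy_name] = impl

def kem_for(policy_name: str) -> Kem:
    return KEM_REGISTRY[policy_name]


# Application code references the policy name only, e.g.:
#   kem = kem_for("default-kex")   # today bound to an ML-KEM implementation
#   public, secret = kem.keygen()
#   ciphertext, shared = kem.encapsulate(public)
```

Procurement language that requires this shape, an algorithm-agnostic interface with a single point of binding, is one concrete reading of what "crypto-agility as a baseline" cashes out to.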
Your suppliers are now the bottleneck. A cryptographic inventory that stops at your own perimeter is theatre. Cloudflare's 2029 roadmap update specifically calls out third-party dependencies — financial services and utilities included — as the long pole. Boards should ask not just "have we inventoried our cryptography?" but "have we written to our top fifty suppliers asking for theirs?"
The week ahead
Three watchpoints. IonQ reports Q1 2026 earnings on 6 May; expect commentary on the 10,000-qubit roadmap and any update on the SkyWater integration. The IETF's PQC TLS standardisation track continues to mature toward the final standards expected around 2027 per NCSC's framing; watch for working-group movement that would unblock browser-vendor full deployment. And expect follow-up reporting in the FT and Bloomberg on which European banks have actually funded PQC migration programmes versus which have inventoried and stopped.
Start now. Or explain why you didn't.
Phil Intallura is a quantum physicist and Group Head of Quantum Technologies at HSBC, where he leads the bank's quantum-safe migration programme. He serves as a Quantum Adviser to the UK Government on the DSIT Strategic Advisory Board. This is a weekly briefing, published every Tuesday.
