Post-quantum cryptography (PQC) has been “the next big thing” for years. But if you’re in the Federal space, it’s no longer living comfortably in the future tense.
That was the central theme of a recent Carahsoft-hosted webinar led by Harvey Morrison (CEO, Marion Square), with Jacquay Henderson (CEO, SquarePeg Technologies) and Evgeny Gervis (CEO, SafeLogic). The discussion wasn’t about which post-quantum algorithms to pick or what vendor to buy. It was about something more practical—and more urgent:
OMB’s PQC reporting deadlines are forcing agencies to build the capability to understand, govern, and evolve their cryptography over time.
In other words, reporting isn’t the goal. It’s the catalyst.
A key point up front: agencies aren’t mobilizing because a cryptographically relevant quantum computer has arrived. They’re mobilizing because OMB reporting requirements are now demanding measurable, defensible answers about cryptography usage and quantum risk.
That pressure exposes a reality many organizations have lived with for a long time: cryptography is embedded everywhere, but few teams can say exactly where it lives, how it is implemented, or who owns it.
PQC reporting brings those issues into the open, fast.
And that’s why the panel emphasized that PQC isn’t just a technical upgrade—it’s a program.
Strip away the acronyms and it comes down to a practical progression: read the memo, inventory your cryptography, understand how it is implemented and managed, prioritize by risk, and then migrate in phases.
The warning was clear: agencies get into trouble when they jump straight from “read the memo” to “start migrating” (or “buy a tool”), without understanding how cryptography is implemented and managed across the enterprise.
That sequence matters because PQC is not a one-time cutover. It’s a long transition that will unfold in phases, across heterogeneous systems, amid shifting standards and evolving threats.
One of the most useful sections of the webinar was the overview of how PQC direction is structured across government, because confusion here leads to poor planning and wasted effort.
The panel described a layered model: NIST sets the algorithm standards, NSA defines requirements for national security systems, and OMB drives agency-wide reporting and migration planning.
Evgeny added a critical point: the “we can’t act because standards aren’t final” excuse has largely expired. The standards foundation is now real and continuing to evolve—so agencies need a plan that can evolve with it.
The webinar returned repeatedly to a misunderstanding that causes unnecessary panic:
OMB is not demanding an immediate PQC cutover.
OMB is demanding agencies be able to show—credibly and repeatedly—that they understand their cryptographic reality and have a risk-based plan.
In practice, that means maintaining a current cryptographic inventory, assessing quantum risk against it, and being able to explain and defend migration priorities on a recurring basis.
The hardest part isn’t producing a report once.
The hardest part is sustaining the ability to report as your environment changes.
Jacquay offered one of the most grounded perspectives: once you complete the inventory, it becomes data—and reporting becomes a data trust problem.
If an agency can’t show where its cryptographic data came from, how current it is, and how it was validated...
...then the reporting won’t stand up to oversight. And oversight isn’t theoretical—agencies answer to OMB, inspectors general, and sometimes Congress.
This is why “defensible evidence” isn’t a buzzword. It’s a requirement.
And it’s why agencies need an operational reporting pipeline, not a one-time spreadsheet scramble.
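To make that concrete, here is a minimal sketch (the record fields and the 90-day freshness threshold are illustrative assumptions, not from the webinar) of an inventory entry that carries its own provenance, so each reported line can answer where the data came from and how current it is:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class CryptoInventoryRecord:
    system: str              # system or application identifier
    algorithm: str           # e.g. "RSA-2048", "ML-KEM-768"
    usage: str               # e.g. "TLS", "code signing", "data at rest"
    source: str              # how the entry was discovered (scan, survey, CMDB)
    last_verified: datetime  # when the entry was last confirmed

    def is_stale(self, max_age_days: int = 90) -> bool:
        """Entries unverified for too long shouldn't be reported as current."""
        age = datetime.now(timezone.utc) - self.last_verified
        return age > timedelta(days=max_age_days)

# An entry last verified years ago would not stand up to oversight.
record = CryptoInventoryRecord(
    system="payments-gateway",
    algorithm="RSA-2048",
    usage="TLS",
    source="network scan",
    last_verified=datetime(2020, 1, 1, tzinfo=timezone.utc),
)
print(record.is_stale())  # True
```

Treating each entry as data with provenance is what turns a one-time spreadsheet into a pipeline that can be re-run and re-defended every reporting cycle.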
The term “crypto-agility” can sound abstract, but the panel made it concrete: it’s the ability to govern and change cryptography deliberately, safely, and repeatedly without rewriting everything or breaking trust.
Harvey framed it as a practical operational question: if an algorithm you depend on were broken or deprecated tomorrow, how quickly could you find everywhere it’s used and replace it?
Evgeny described crypto-agility not as a single feature but as a set of capabilities: knowing where and how cryptography is used, abstracting algorithms away from application code, and governing changes so they can be made deliberately and repeatedly.
His analogy landed: today, many orgs treat cryptography like leaky pipes behind drywall—you can fix them, but every repair means tearing open walls. Crypto-agility is building the “access panel” so changes become routine, not a demolition project.
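The access-panel idea can be sketched as a simple indirection layer: callers resolve an abstract purpose to a concrete algorithm through a central policy, so swapping algorithms is a one-line policy change rather than a rewrite. This is an illustrative toy using hash digests as a stand-in for cryptography generally; the names are hypothetical, not from the webinar:

```python
import hashlib

# Policy maps an abstract purpose to a concrete algorithm.
# A migration (e.g. SHA-256 -> SHA3-256) is one change here, not a change
# in every caller: this is the "access panel".
POLICY = {"integrity": "sha256"}

def digest(purpose: str, data: bytes) -> str:
    """Resolve the algorithm through policy at call time."""
    return hashlib.new(POLICY[purpose], data).hexdigest()

fingerprint = digest("integrity", b"report.csv contents")

# Later, a deliberate, centralized algorithm change:
POLICY["integrity"] = "sha3_256"
new_fingerprint = digest("integrity", b"report.csv contents")
print(fingerprint != new_fingerprint)  # True; no caller was modified
```

Real crypto-agility covers far more than digests (key exchange, signatures, PKI), but the indirection pattern is the same: no algorithm is hard-wired behind the drywall.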
A standout Q&A asked where agencies underestimate PQC complexity the most: the cryptography itself or the broader architecture and operations?
Jacquay’s answer: architecture complexity is the real monster.
Federal environments aren’t uniform. They’re sprawling mixtures of legacy systems, modern cloud services, custom applications, and third-party products, each with its own cryptographic dependencies.
Change cryptography in one place and you may break a fragile dependency somewhere else—sometimes in systems only a handful of people even understand.
Evgeny pointed to additional pain points beyond the architecture itself.
One of the best questions asked: what’s an early indicator that an agency is becoming crypto-agile (vs simply completing a one-time migration)?
Two answers stood out: shrinking how long it takes to discover weak or deprecated cryptography in the environment, and shrinking how long it takes to replace it once found.
Harvey summarized it nicely: crypto-agility shows up as reduced “time-to-know” and “time-to-fix.”
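A crude but honest way to start measuring “time-to-know” is simply whether you can find deprecated algorithms at all. A hypothetical sketch, scanning text artifacts such as configs for algorithm names on a deprecation list (the list and config here are illustrative, not from the webinar):

```python
import re

# Illustrative deprecation list; a real program would draw this from policy.
DEPRECATED = ["SHA-1", "MD5", "RSA-1024", "3DES"]
PATTERN = re.compile("|".join(re.escape(a) for a in DEPRECATED), re.IGNORECASE)

def find_deprecated(text: str) -> list[str]:
    """Return each deprecated algorithm name found in a config or source file."""
    return [m.group(0) for m in PATTERN.finditer(text)]

config = "ssl_cipher RSA-1024; digest md5; backup_digest SHA-256"
print(find_deprecated(config))  # ['RSA-1024', 'md5']
```

String matching will miss compiled binaries and protocol negotiation, so it is a floor, not a ceiling; the point is that even this much shortens “time-to-know” compared to having no scan at all.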
Another practical Q&A: Where does PQC fit into Zero Trust?
The panel’s view: Zero Trust depends on cryptographic foundations (identity, PKI, secure communications). PQC migration isn’t a separate track—it’s tightly connected.
There was also a strategic budgeting implication: agencies with strong Zero Trust momentum may be able to align PQC workstreams with that investment and architecture roadmap.
In closing, the speakers reinforced a mindset shift: stop treating PQC as a one-time compliance exercise and start treating it as a durable capability for managing cryptographic change.
Evgeny’s final advice was especially pragmatic: approach PQC like an agile project. Start where it matters most, start where dependencies are manageable, and improve continuously—so you’re more quantum-resistant tomorrow than you are today.
If you’re staring down PQC reporting requirements, the webinar’s implicit action plan looks like this:

1. Build a cryptographic inventory and treat it as living data, not a one-time deliverable.
2. Stand up a repeatable reporting pipeline with defensible, traceable evidence.
3. Assess quantum risk and prioritize systems accordingly.
4. Invest in crypto-agility so algorithm changes become routine.
5. Migrate in phases, starting where it matters most and where dependencies are manageable.
Because the reality is simple:
The organizations that treat PQC as a living capability will stop scrambling every reporting cycle—and start managing cryptographic change like a normal part of operations.
And that’s the point.