Why Hardware-Backed Identity Matters
Software-only credentials look fine until you stack the threat surfaces. Hardware shifts the cost-per-extraction by orders of magnitude — and that is the frame that actually decides whether your authentication holds at scale.
The question this post answers
Why does hardware matter? You can do public-key crypto in software. The math is the same. Modern operating systems sandbox processes; well-written software keeps secrets in protected memory. So what does hardware-backed identity buy you that software-backed identity does not?
The short answer: entirely different exfiltration economics. The longer answer is the rest of this post. We are going to walk through what software-only credentials are vulnerable to that hardware-backed credentials are not, what "hardware" actually means at different price points, and what a deployment that takes hardware seriously looks like in 2026.
Software-only credentials — the threat surface
A software-only credential is a private key sitting in a file or in a process's memory. The defences that protect it are operating-system access controls (file permissions, ACLs), language-level protections (libraries that zero memory), and the runtime's ability to keep the bytes off the swap file.
That is a lot of defences for a normal threat model. It is not enough for a serious one. Consider the actual exfiltration paths:
- Malware running as the user. Reads the user's files, including private keys. Reads other processes' memory where the OS allows it (ptrace, Windows debugging APIs, LD_PRELOAD injection). Captures keystrokes. The credential is gone.
- Backup software. Backs up everything on the disk. The credential is now in N backups, off-site, plausibly indefinitely.
- Cloud sync. Same shape: the credential is now in iCloud / Google Drive / Dropbox / OneDrive, depending on what is in scope.
- Memory dumps. Crash dumps, hibernation files, vmcore captures. The private key sits in memory; if it is in memory at the moment of dump, it is in the dump.
- Operating-system compromise. Kernel-level attackers can read any process's memory. Once the kernel is owned, software-only credentials are forfeit.
- Cross-tenant attacks on shared hosts. Microarchitectural side channels (Spectre, Meltdown, L1TF, MDS) and fault attacks (Rowhammer) have demonstrated that even nominally isolated processes leak across boundaries. None of these is theoretical; all have working exploits.
Each of these is rare in a single environment. Aggregated across an organisation, with all the operational complexity of a real fleet, "rare per machine" becomes "common across the fleet".
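The aggregation effect is worth making concrete. A toy calculation (the per-machine probability here is an illustrative assumption, not a measured figure):

```python
# Toy model: if credential exfiltration is "rare" on any one machine,
# how likely is at least one exfiltration somewhere across a fleet?
# The 0.1% annual per-machine probability is an illustrative assumption.

def fleet_compromise_probability(p_per_machine: float, fleet_size: int) -> float:
    """P(at least one machine leaks a credential) = 1 - (1 - p)^N."""
    return 1 - (1 - p_per_machine) ** fleet_size

p = 0.001  # 0.1% per machine per year (assumed)
for n in (100, 1_000, 10_000):
    print(f"fleet of {n:>6}: {fleet_compromise_probability(p, n):.1%}")
```

At 100 machines the annual risk is already near 10%; at 10,000 it is effectively certain. "Rare per machine" is not a property that survives scale.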
Hardware-backed credentials — what changes
A hardware-backed credential is a private key that lives inside a chip designed to never expose its key material to anything outside that chip. The chip exposes operations — "sign this", "decrypt this", "derive a session key from this" — but never lets you read the key. Extracting the key requires physically attacking the chip.
Three categories of chip, in increasing price and assurance:
TPM (Trusted Platform Module)
Soldered onto the motherboard or instantiated inside a CPU package (fTPM). Protects keys from the operating system above it; doesn't necessarily protect against motivated physical attackers with $50k of equipment. Fine for laptop platform credentials. Common in Windows Hello deployments.
Secure Element (SE)
A discrete crypto-chip with its own packaging and side-channel countermeasures. Lives inside a smart card, USB key, eSIM, secure enclave (Apple, Samsung Knox), or as an embedded SE on a phone. Certified to higher assurance levels (Common Criteria EAL5+). Designed to resist active physical attacks.
HSM (Hardware Security Module)
Dedicated server-class crypto appliance. Higher throughput, FIPS 140-2 / 140-3 level 3 or 4 certified. Used for issuance, root keys, code-signing, certificate authorities. Not typically a per-user authenticator; it backs the issuance pipeline.
For user authentication, the relevant tier is the secure element (SE). It is what sits inside the OnePass Card, the OnePass USB Key, and inside Apple's secure enclave / Samsung Knox / Google Titan-M.
What the hardware actually does differently
Three concrete properties matter for an authentication credential:
Property 1: key material never leaves the boundary
The private key is generated inside the SE, used inside the SE, destroyed inside the SE. Operating-system-level attackers cannot read it; backup software cannot back it up; cloud sync cannot copy it. The attack surface is the chip's I/O pins and its packaging — both physical, both expensive to attack.
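The shape of the interface is the point: you get operations, never key bytes. A hedged sketch of that shape, stdlib-only — real secure elements speak PKCS#11, PIV, or CTAP, and here an HMAC over an internal secret merely stands in for the chip's signing operation:

```python
# Sketch of the "operations, not key bytes" interface a secure element
# exposes. The class boundary stands in for the chip boundary: the
# secret is generated inside and there is deliberately no export path.
import hashlib
import hmac
import secrets

class SecureElementHandle:
    """Stand-in for a chip: sign() and a public key ID leave; the secret never does."""

    def __init__(self) -> None:
        self._secret = secrets.token_bytes(32)  # generated inside, never exported

    def key_id(self) -> str:
        # A public identifier may cross the boundary; the secret may not.
        return hashlib.sha256(self._secret).hexdigest()[:16]

    def sign(self, challenge: bytes) -> bytes:
        return hmac.new(self._secret, challenge, hashlib.sha256).digest()

    def verify(self, challenge: bytes, signature: bytes) -> bool:
        return hmac.compare_digest(self.sign(challenge), signature)
    # Note: no export() / get_secret() method exists. That is the design.

se = SecureElementHandle()
sig = se.sign(b"auth-challenge-1")
assert se.verify(b"auth-challenge-1", sig)
```

Backup software can copy the handle's ID; it cannot copy the thing that signs.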
Property 2: side-channel countermeasures
Certified secure elements implement countermeasures against power analysis (DPA), electromagnetic analysis, timing attacks, and fault-injection (clock glitching, voltage glitching, optical fault injection). Software cannot reasonably implement these — they require chip-level design.
Property 3: PIN / UV bound to the chip
The PIN that gates use of the credential is verified inside the chip. Brute-force attempts are rate-limited and counter-locked by the chip itself; you can't extract the PIN-verification material to attack offline.
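The retry behaviour can be sketched as a tiny state machine. The limit of 8 consecutive failures mirrors CTAP2's default, but treat the numbers as illustrative; the essential property is that both the comparison and the counter live inside the boundary, so there is nothing to attack offline:

```python
# Sketch of chip-side PIN gating: the PIN comparison and the retry
# counter both live inside the chip, so the verification material is
# never available for offline brute force. Limit of 8 mirrors CTAP2's
# default; all numbers illustrative.
import hmac

class PinGate:
    MAX_RETRIES = 8

    def __init__(self, pin: str) -> None:
        self._pin = pin  # held inside the chip, never readable outside
        self._retries_left = self.MAX_RETRIES
        self._locked = False

    def verify(self, attempt: str) -> bool:
        if self._locked:
            raise PermissionError("authenticator locked; re-enrolment required")
        if hmac.compare_digest(self._pin.encode(), attempt.encode()):
            self._retries_left = self.MAX_RETRIES  # reset on success
            return True
        self._retries_left -= 1
        if self._retries_left == 0:
            self._locked = True  # chip locks out; credential is unusable
        return False
```

An attacker with the device but not the PIN gets 8 guesses, then a brick.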
Each of these is the kind of property you cannot bolt on later. They have to be in the design from the start. That is why the answer to "can we just use software keys?" for high-assurance authentication is structurally no.
The cost-per-extraction frame
Security economics: the right metric is not "is this attack possible?" but "what does it cost to perform?". Rough cost-per-extraction for each credential type:
- Password: $0 to $5. Phishing kit; mass-mailer.
- Software-stored private key on a normal corporate laptop: $50 to $5k per machine. Deploy malware, exfiltrate, or buy from initial-access broker.
- TPM-bound credential: $5k to $50k. Either a kernel-level compromise (after which the OS issues operations), or physical extraction with chip-level equipment.
- Secure-element-bound credential (consumer-grade): $50k to $500k. Acid-etch the chip, microprobe the bond pads, side-channel the operations.
- Secure-element-bound credential (CC EAL5+): $500k to $5m+ per device. Most attempts fail; success requires a well-funded actor with months of effort.
Mass attacks (phishing, credential stuffing) live at the cheap end of this curve, and that is precisely why the defence tier matters: mass attacks only work because they are cheap. Once you push the per-extraction cost into the six-digit range, mass attacks become uneconomic; only targeted attacks remain. That is the business case for hardware.
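The break-even logic is one comparison per tier. A sketch using the illustrative midpoints of the ranges above; the $200 yield per stolen credential is an assumption, not a market figure:

```python
# A mass attack on a credential class is economic only when expected
# yield per extraction exceeds cost per extraction. Costs are the
# illustrative midpoints of the tiers above; the $200 per-credential
# yield is an assumption for the sake of the comparison.
YIELD_PER_CREDENTIAL = 200  # USD, assumed

cost_per_extraction = {
    "password (phishing kit)":      2,
    "software-stored private key":  500,
    "TPM-bound credential":         20_000,
    "secure element (consumer)":    200_000,
    "secure element (CC EAL5+)":    2_000_000,
}

for tier, cost in cost_per_extraction.items():
    economic = YIELD_PER_CREDENTIAL > cost
    print(f"{tier:30s} mass attack economic: {economic}")
```

Only the bottom tier clears the bar. Everything from the software-stored key upward forces the attacker into targeted, per-victim economics.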
"But the platform vendor's secure enclave is good enough"
This is the most common objection from enterprises that bridle at the hardware-procurement cost. It deserves a careful answer.
For consumer use, the platform vendor's secure enclave (Apple Secure Enclave, Android StrongBox, Windows Pluton) is in fact good enough. These are real secure elements with real countermeasures, and the cost-per-extraction frame works the same as a discrete SE.
For workforce use, two issues remain. First, key recovery belongs to the platform vendor. iCloud Keychain backups can be restored to any new Apple device the user owns; Google Password Manager can sync across all the user's Google-attached devices. That is a syncable-passkey posture (BE=1). For high-assurance use you want device-bound (BE=0), which the platform vendor explicitly does not give you.
Second, the trust assumption is "Apple / Google / Microsoft is not compromised, and the user's cloud account is not compromised". Both have been wrong in real incidents. Removing the cloud account from the dependency graph — by using a discrete authenticator that has no cloud account — reduces the failure modes you must defend against.
Net: platform-bound is excellent for consumer; discrete hardware is the right floor for workforce.
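In WebAuthn terms the syncable-vs-device-bound distinction is machine-readable: in the authenticator-data flags byte, bit 3 is BE (backup eligible) and bit 4 is BS (currently backed up). A sketch of the RP-side policy check (the fabricated authenticator data at the bottom is for illustration only):

```python
# RP-side policy check on the WebAuthn authenticator-data flags byte.
# Per WebAuthn Level 3: bit 0 = UP, bit 2 = UV, bit 3 = BE, bit 4 = BS.
FLAG_UP = 0x01  # user present
FLAG_UV = 0x04  # user verified
FLAG_BE = 0x08  # backup eligible (syncable passkey)
FLAG_BS = 0x10  # currently backed up

def enforce_device_bound(auth_data: bytes) -> None:
    """Reject syncable passkeys under a high-assurance workforce policy."""
    flags = auth_data[32]  # flags byte follows the 32-byte rpIdHash
    if flags & FLAG_BE:
        raise ValueError("credential is backup-eligible (BE=1); "
                         "policy requires device-bound (BE=0)")
    if not (flags & FLAG_UV):
        raise ValueError("user verification (UV) required")

# Fabricated minimal authenticator data: rpIdHash + flags + signCount.
device_bound = bytes(32) + bytes([FLAG_UP | FLAG_UV]) + (0).to_bytes(4, "big")
enforce_device_bound(device_bound)  # passes: UV set, BE clear
```

Enforcing BE=0 at registration is how "discrete hardware is the floor for workforce" becomes policy rather than intention.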
The auditability angle
Hardware-backed identity makes auditing tractable in a way software-only does not. With a discrete hardware authenticator you can answer concrete operational questions:
- "How many authenticators have we issued?" — a number you can reconcile with procurement.
- "Which authenticator was used to authenticate this session?" — signCount + AAGUID + serial in your audit log.
- "Have any authenticators been used after the user's offboarding?" — reconcile log against HR.
- "Have any authenticators left the country we ship them to?" — geo-correlate IP against known-shipped device serials.
The same questions are answerable in principle with software credentials, but the device-binding is fuzzier — what does it mean to "have" a software credential when it has been backed up, synced, copied, restored, and possibly forgotten in five places? Hardware gives you a tangible referent.
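The structure that makes those questions answerable is the authenticator data itself: a 32-byte rpIdHash, one flags byte, a 4-byte big-endian signCount, and, at registration time, attested credential data that begins with the 16-byte AAGUID. A parsing sketch (the sample data is fabricated and truncated after the AAGUID for illustration):

```python
# Extract the audit-relevant fields from WebAuthn authenticator data:
# rpIdHash (32 bytes) | flags (1) | signCount (4, big-endian) |
# attested credential data (present when the AT flag is set), which
# begins with the 16-byte AAGUID.
import uuid

def parse_auth_data(auth_data: bytes) -> dict:
    flags = auth_data[32]
    sign_count = int.from_bytes(auth_data[33:37], "big")
    record = {"flags": flags, "signCount": sign_count}
    if flags & 0x40:  # AT flag: attested credential data follows
        record["aaguid"] = str(uuid.UUID(bytes=auth_data[37:53]))
    return record

# Fabricated registration-time data (real data continues with the
# credential ID length and credential ID after the AAGUID).
aaguid = uuid.uuid4()
data = bytes(32) + bytes([0x45]) + (7).to_bytes(4, "big") + aaguid.bytes
rec = parse_auth_data(data)
```

Log `rec["aaguid"]` and `rec["signCount"]` on every ceremony and the reconciliation questions above reduce to queries.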
What hardware does not solve
Three honest limits.
Limit 1: physical attack on a stolen device. If an attacker steals a user's authenticator and has the user's PIN (shoulder-surfed, phished after the theft), they have what the user had. Hardware doesn't fix this; it raises the cost (the attacker must steal the physical thing) but it doesn't make it impossible. Mitigation: tamper detection and prompt revocation on user-reported loss.
Limit 2: supply-chain compromise. If the manufacturer is compromised, the trust at the root is compromised. Mitigations: certified manufacturing lines, audited issuance, FIDO MDS as a status reporting channel.
Limit 3: the social layer. Hardware doesn't stop a user from approving a phishing OAuth scope, signing a malicious transaction, or being coerced into producing a signature. Hardware is a foundation; the application layer above it has to do its job too.
What this looks like in practice
An enterprise deployment with hardware-backed identity at the core has these attributes:
- Every workforce identity backed by at least one secure-element-bound authenticator. OnePass Card for desk + turnstile, OnePass USB Key for laptops.
- Issuance through a SAM-protected pipeline with HSM-rooted key custody. Per-card diversification, attestation-anchored to vendor root.
- RP attestation policy enforces AAGUID allow-list against the FIDO MDS BLOB. Compromised AAGUIDs are rejected at registration time.
- Logging captures AAGUID, serial, signCount on every ceremony. Anomalies feed SOC.
- Recovery is IT-mediated re-enrolment, never password fallback.
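The registration-time AAGUID gate from that list can be sketched as follows. The allow-list entry is hypothetical, and the status strings are a subset of the FIDO MDS AuthenticatorStatus values; treat the specifics as assumptions:

```python
# Registration-time attestation gate: accept only authenticator models
# (AAGUIDs) on an allow-list, and reject any model the FIDO MDS BLOB
# currently flags as compromised. The AAGUID below is hypothetical;
# the status strings are a subset of MDS AuthenticatorStatus values.
ALLOWED_AAGUIDS = {
    "a1b2c3d4-0000-0000-0000-000000000001",  # hypothetical corporate key model
}
REVOKING_STATUSES = {
    "ATTESTATION_KEY_COMPROMISE",
    "USER_KEY_REMOTE_COMPROMISE",
    "USER_KEY_PHYSICAL_COMPROMISE",
    "REVOKED",
}

def registration_allowed(aaguid: str, mds_statuses: set[str]) -> bool:
    """Gate a WebAuthn registration on model allow-list + MDS status."""
    if aaguid not in ALLOWED_AAGUIDS:
        return False  # unknown model: reject at registration time
    return not (mds_statuses & REVOKING_STATUSES)
```

The same check re-run against each refreshed MDS BLOB turns "compromised AAGUIDs are rejected" from a one-time event into an ongoing control.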
The bottom line
Hardware-backed identity is not "the same thing, in a chip". It is a categorically different security posture with categorically different attack economics. For high-assurance environments — financial services, government, regulated healthcare, infrastructure operators — software-only is below the floor. For consumer, platform-bound hardware (Apple, Google, Microsoft) is excellent.
The work for an enterprise is to set the policy at the right tier and then enforce it. WebAuthn + FIDO MDS + a hardware authenticator family gives you the protocol-level tools to do it. The rest is operational.