“Three may keep a Secret, if two of them are dead.”
So wrote Benjamin Franklin, in Poor Richard’s Almanack, in 1735. Franklin knew a thing or two about secrets, as well as about cryptography, given his experience as a diplomat for the fledgling United States, and he’s right: a secret shared is a secret exposed.
But it’s 2017 now, and the Department of Justice and the FBI are still hacking away at encryption. The conversation about the government’s need to access any and all encrypted data keeps hitting the same talking points that then-FBI Director Louis Freeh and Attorney General Janet Reno were pushing in the 1990s; we might imagine King George’s government offering the same arguments in the run-up to the Revolutionary War.
FBI Director Christopher Wray and Deputy Attorney General Rod Rosenstein have been taking the latest version of the “strong encryption is bad” show on the road again, with a new buzzword: “responsible encryption.” While the phrasing continues to morph, the outline is the same: the forces of evil are abusing strong encryption and running wild, destroying our civilization.
Some things have changed since the first battles in the crypto wars were waged more than 25 years ago. For example, the FBI and DOJ have listed money launderers and software pirates alongside the terrorists, human traffickers and drug dealers as part of the existential threat posed by unbreakable encryption.
It all boils down to a single question: Should law-abiding citizens be forbidden to defend themselves with encryption so strong that not even a government can break it, just so criminals can be denied it?
Rosenstein makes it clear that any piece of encrypted data subject to a valid court order must be made accessible to law enforcement agencies. “I simply maintain that companies should retain the capability to provide the government unencrypted copies of communications and data stored on devices, when a court orders them to do so,” he said at the 2017 North American International Cyber Summit, in Detroit on October 30.
If the person who encrypted the data chooses not to unlock it, Rosenstein and Wray believe the company that provided the encryption technology must be able to make that data available upon presentation of a warrant.
In the 1990s, the government demanded a key escrow platform through which all encryption could be reversed on demand. The resulting Clipper Chip was a spectacular failure, both technically and politically. And in 2015, then-FBI Director James Comey injected the term “going dark” into the conversation.
This time around, we’re offered the concept of “responsible encryption.” This is presumably some form of encryption that includes some as-yet-undetermined mechanism by which lawful access to the encrypted data can be granted. The phrase itself is not new; it seems to have originated in 1996 Senate testimony by Freeh:
The only acceptable answer that serves all of our societal interests is to foster the use of “socially-responsible” encryption products, products that provide robust encryption, but which also permit timely law enforcement and national security access and decryption pursuant to court order or as otherwise authorized by law.
As for how that might be achieved, well, that’s not the business of the government, Rosenstein now tells us. Speaking in Detroit, he said, “I do not believe that the government should mandate a specific means of ensuring access. The government does not need to micromanage the engineering.”
However, he does seem to think that the answer is not as difficult as the experts would have us believe — and it would not be necessary to resort to back doors, either. Rosenstein said:
“Responsible encryption is effective secure encryption, coupled with access capabilities. We know encryption can include safeguards. For example, there are systems that include central management of security keys and operating system updates; scanning of content, like your e-mails, for advertising purposes; simulcast of messages to multiple destinations at once; and key recovery when a user forgets the password to decrypt a laptop. No one calls any of those functions a ‘backdoor.’ In fact, those very capabilities are marketed and sought out.”
It seems Rosenstein is suggesting these functions — key management, data scanning, “simulcast” of data and key recovery — can each be part of a “responsible encryption” solution. And since these features have already been deployed individually in commercial products, tech firms need only “nerd harder” and come up with a “responsible encryption” solution by:
- maintaining a giant key repository database, so all encryption keys are accessible to government agents with court orders — but also secure enough to protect against all unauthorized access
- scanning all content before it is encrypted, presumably to look for evidence of criminal activity — but hopefully without producing too many false positives
- “simulcasting” all data, either before it is encrypted or maybe after it is encrypted and the keys are stored for government access — so it can be retrieved or scanned at the government’s leisure
- deploying “key recovery” for encrypted laptops, but for all laptops, everywhere, and accessible to authorized government agents only
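Taken together, those features amount to classic key escrow: every user’s data key is wrapped under a master key and deposited in a central repository. A minimal sketch of the scheme follows; everything here (the toy XOR “cipher,” the `escrow_db` dictionary, the function names) is invented for illustration and deliberately insecure, not a description of any real system:

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy XOR stream cipher (SHA-256 in counter mode).

    Illustration only: no authentication, no nonce handling.
    Never use this for real encryption.
    """
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(d ^ k for d, k in zip(data, out))

# The mandated central repository: every user's wrapped data key, in one place.
escrow_db: dict[str, bytes] = {}

def escrow_encrypt(user_id: str, plaintext: bytes, escrow_key: bytes) -> bytes:
    data_key = secrets.token_bytes(32)
    # Deposit a copy of the data key, wrapped under the escrow master key.
    escrow_db[user_id] = keystream_xor(escrow_key + user_id.encode(), data_key)
    return keystream_xor(data_key, plaintext)

def lawful_access(user_id: str, ciphertext: bytes, escrow_key: bytes) -> bytes:
    # Anyone holding escrow_key, an agent with a warrant or an attacker
    # who breached the repository, can decrypt any user's data.
    data_key = keystream_xor(escrow_key + user_id.encode(), escrow_db[user_id])
    return keystream_xor(data_key, ciphertext)
```

Note where all the risk pools: `escrow_db` plus `escrow_key` is a single target whose compromise unlocks every user at once, which is exactly the scaling problem described below.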
Unfortunately, no answer the government has offered makes key escrow scalable or secure. There are many, many reasons the law enforcement community’s demand for breakable encryption is not a reasonable (or even practical) solution, but two spring to mind immediately:
- Key escrow schemes are massively complicated and produce huge new attack surfaces that could, if successfully breached, destroy the world’s economy. And, they would be breached (see Office of Personnel Management, Yahoo, Equifax and others).
- “Responsible encryption” means law-abiding organizations and people can no longer trust the integrity of their data. With cryptography backdoored, forget about privacy; there is no longer any way to verify that data has not been altered.
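That integrity point can be made concrete with a standard message authentication code: a symmetric MAC proves a message is unaltered only so long as nobody else holds the key. A short sketch using Python’s standard-library `hmac` module (the messages and the “escrowed” key are invented for illustration):

```python
import hashlib
import hmac

# A symmetric key that, under an escrow mandate, a third party also holds.
escrowed_key = b"key-on-file-with-the-escrow-agent"

def tag(message: bytes) -> bytes:
    # HMAC-SHA256 proves integrity only to parties who do NOT share
    # the key with anyone they distrust.
    return hmac.new(escrowed_key, message, hashlib.sha256).digest()

original = b"wire $100 to Bob"
original_tag = tag(original)

# The recipient's check passes for the genuine message...
assert hmac.compare_digest(tag(original), original_tag)

# ...but whoever holds the escrowed key can mint an equally valid tag
# for an altered message. Verification can no longer detect tampering.
forged = b"wire $9,999 to Eve"
forged_tag = tag(forged)
assert hmac.compare_digest(tag(forged), forged_tag)
```

Once a copy of the key sits in an escrow repository, a successful verification no longer tells the recipient who wrote the message or whether it was changed in transit.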
A ban on end-to-end encryption in commercial tech products will only prevent consumers from enjoying its benefits; it won’t prevent criminals and threat actors from using it.
We shouldn’t be surprised that this or any government is interested in having unfettered, universal access to all encrypted data (subject, of course, to lawful court orders).
However, once we allow the government to legislate weaker encryption, we’re lost. As Franklin wrote in the 1741 edition of Poor Richard’s Almanack:
“If you would keep your secret from an enemy, tell it not to a friend.”