I feel the debate about unlocking the San Bernardino attacker’s phone is another one where the press doesn’t get the technical details right, or at least not enough of them.
First of all, it is not a balance between privacy and security or safety. In Fourth Amendment jurisprudence, the heart of the matter is that safety and freedom go hand in hand. More privacy and security means more freedom and, more often than not, a better and safer society. Safety and privacy only conflict at the edge cases. We should not frame this debate as a zero-sum game.
This is the central paradigm of liberty. If we don’t have a shitty government, people aren’t going to behave like criminals as often. I’ve seen a few op-eds to this effect, and it’s worth pointing out at least one instance: the Snowden leaks revealed a massive government betrayal of trust, not so much in the surveillance itself as in the lack of oversight and the abuse of gag orders and other silencing devices to keep things dark. If the government hadn’t screwed this up and we had a robust system to check abuses, the public might not be as pissy about building a government-access-only backdoor or whatever ballyhoo they’re pitching.
An extension of this argument comes up whenever anyone brings up China or Russia and the slippery slope of precedent that could lead to those countries getting master keys to iPhones. If China and Russia were trustworthy countries from an American standpoint, we wouldn’t be using this argument. In other words, shitty national governments are directly the reason shitty situations like this even exist. Or, TL;DR: we need strong encryption because safety against government is a thing!
But that’s not even what I find troubling about the coverage of why this is bad. What’s troubling is what Tim Cook says about the master key. In this case, the master key isn’t so much some rogue version of iOS. It’s two things together: the keys needed to decrypt or “backdoorize” iOS, and the engineer needed to build the hack. I mean, there’s no reason to assume the FBI doesn’t have coders who could likewise build a backdoor if given Apple’s keys to sign the code, which you could then sideload onto that iPhone Of Interest. But of course Apple will not give out those keys, so some Apple engineer is required to produce the hack.
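To make the signing-key point concrete, here’s a toy sketch of why the vendor’s key is the whole ballgame. This is purely illustrative: real iOS firmware validation uses asymmetric (public-key) signatures in a secure boot chain, while this sketch uses an HMAC with a made-up shared secret just to show the control flow — the device refuses to load any image not signed with the vendor’s key, and whoever holds that key can make the device accept any image, backdoored or not.

```python
import hmac
import hashlib

# Hypothetical vendor signing key -- in reality only Apple holds the
# private half of such a key, and it's asymmetric, not a shared secret.
VENDOR_KEY = b"hypothetical-vendor-signing-key"

def sign_firmware(image: bytes, key: bytes) -> bytes:
    """Produce a signature over a firmware image (toy HMAC version)."""
    return hmac.new(key, image, hashlib.sha256).digest()

def device_will_boot(image: bytes, signature: bytes) -> bool:
    """The device only boots images whose signature checks out
    against the vendor key baked into its boot chain."""
    expected = hmac.new(VENDOR_KEY, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

stock = b"stock iOS image"
backdoored = b"iOS image with passcode retry limits removed"

# A third party without the key (say, the FBI) can write the backdoored
# image, but cannot forge a signature the device will accept:
assert not device_will_boot(backdoored, b"\x00" * 32)

# With the vendor's key, ANY image -- including the backdoored one --
# becomes bootable. That's why the key, not the code, is the master key:
assert device_will_boot(backdoored, sign_firmware(backdoored, VENDOR_KEY))
```

The upshot matches the argument above: the hard part isn’t writing the backdoored firmware, it’s getting it signed, which is why the hack requires either Apple’s key or Apple’s engineers.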
If I ran a mid-size company selling smartphones, would I really want to spend engineering time on FBI requests and other law-enforcement requests just so they can investigate crimes that are not as dire as international terrorism? I’m busy shipping code. Maybe for a big company like Apple this ain’t a thang, even if they get hundreds or thousands of requests a year. But if I were, say, Motorola or something, would I even give a damn? Would I get laid off tomorrow because nobody is buying my perfectly okay phones? Or replace Motorola with OnePlus, and replace “buying” with “can buy” in the previous sentence.
This is obviously why MS, Google, and even Verizon signed on. It’s not like the FBI can’t hire people who can code, so why bother these companies? It’s disruptive, it’s intrusive, and it doesn’t do their respective bottom lines any good. It’s arguably even bad for security, and bad for all their customers. It’s not even a zero-sum outcome. It makes sense for all those tech companies to resist.
The alternative is to fight it out in the courts. Okay, maybe you don’t need lawyers to ship code, but that’s an expensive and piecemeal way to deal with the situation. So yeah, of course you want Congress to make the call, just so we can be done with it.
As for the master key itself, I don’t quite buy that it would become more likely to leak just by being made to exist, because the same danger has always existed and still exists today. All the ingredients needed to make that key already exist. Nobody except Apple may know how to put the key together, but writing down how to assemble it doesn’t make leaking the components of the master key AND the key itself significantly more likely. It’s quite possible that a big enough security breach at Apple would cause this to happen anyway, if hackers who obtained certain source code and the signing key were able to reverse-engineer the ingredients and put them together.
But this is all just fancy talk that we don’t even want to deal with, and I agree. I’m all for companies helping law enforcement, but the key distinction has to be the amount of work. Shipping a new version of iOS is way overreach, and it’s unfair to compel any company to do it.
Now maybe you can shame Apple into doing it. That seems like a fair approach.