How much would your agency pay to make them let go?
If you’re the non-federal Hollywood Presbyterian Medical Center, you are willing to fork over about $17,000, paid in 40 bitcoins. According to this CSO report, the hospital was under extortion by hackers who, in return for the money, would return the hospital’s network to functioning status. Hospital officials say they were able to restore their electronic medical record system, which is key to the hospital’s clinical functions. Mission critical, that is.
This little exchange is called ransomware: the act of “kidnapping” an organization’s network, typically by encrypting everything and withholding the keys until the target pays. I think a better term is extortionware. It’s damned ugly, but in this instance, effective.
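The mechanics are simple enough to sketch. Here is a toy illustration in Python (standard library only, with a made-up keystream; this is not real cryptography, just a picture of the pattern): the attacker encrypts the victim’s data with a symmetric key, keeps the only copy of that key, and the data is unreadable until the key comes back.

```python
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    # Toy keystream built by hashing key || counter.
    # Illustration only -- not a real cipher.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, data: bytes) -> bytes:
    # XOR the data against the keystream.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

decrypt = encrypt  # XOR is its own inverse

# The attacker generates a key and keeps the only copy.
key = secrets.token_bytes(32)
record = b"patient: ...; medication: ..."
locked = encrypt(key, record)

assert locked != record                 # unreadable without the key
assert decrypt(key, locked) == record   # the key restores everything
```

The victim’s dilemma falls out of the last two lines: without the key, the data is noise; with it, recovery is instant. Real ransomware adds per-file keys and public-key wrapping, but the leverage is the same.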
Rumor had it the hackers wanted millions in bitcoin. Had they asked for a cashier’s check, wire transfer or suitcase of cash, authorities might’ve been able to track them down. Bitcoin, swirling around in the “dark Web,” apparently is a safer bet, presuming you can launder it into actual money.
The prospect of a government network being seized for ransom raises questions. Does policy exist for such situations? Who decides whether to pay, and how much? How bad is the precedent once the first agency pays? You can see how a difficult standoff could develop.
A difficult standoff is what we’re also witnessing now between Apple and the Justice Department. Each has the other in a good grip. Justice has a court order enforceable, one presumes, by the full powers of the federal government.
How would Apple CEO Tim Cook look in a perp walk to an idling black sedan outside 1 Infinite Loop?
But the government has a point. It’s invoking an early American law, the All Writs Act, which lets courts compel action when no other remedy is available. The FBI has a clear and legitimate need for the information on the phone, but the people who knew the PIN are dead.
Apple has leverage, too. It apparently is the only company on earth that can crack the iPhone used by the dead San Bernardino terrorists. In a letter to its customers, Apple spells out why it’s standing against the court order “with the deepest respect for American democracy and a love of our country.” The letter makes a strong case that building a backdoor to this one iPhone is equivalent to building a backdoor to all iPhones, opening Pandora’s box.
For me, a missing piece is the answer to this question: Is Apple’s encryption scheme so strong that only its own engineers are capable of writing a program to alter it? That’s unknowable at this point. But I’ll bet it has become the new holy grail in hacker circles (if it isn’t already).
I see this dispute as a cyber version of the “ticking bomb” conundrum: What means are justified to get information from a terrorist when you know there is a bomb ticking somewhere about to kill innocent people?
The dispute has been coming to a boil for months, with statements from FBI Director Jim Comey and Cook. I can’t really say where I stand on this one, although I’m fairly certain the government will eventually win.
Looking at this huge gray area, I guess I lean slightly to the FBI here, mainly because I think Apple is overstating the privacy issues. The hundreds of millions of people who use smartphones, for the most part, willingly give up their privacy to Silicon Valley firms hour by hour simply by enabling location services. Google, Parking Panda, Foursquare, Facebook of course: hundreds of these applications know what you do, where you go, what you eat, whom you associate with, what you’re interested in, and your preferences in airlines, restaurants, hotels, books, music and clothes.
Bernie Sanders says the business model of Wall Street is fraud. Well, perhaps the business model of app purveyors is snooping. But people gladly leave the window shades up. I guess the difference is that once you shut off and encrypt your phone, nobody else can see what’s on it.
Short of a showdown at the Supreme Court, maybe Apple and the FBI could work out a system in which the tools to crack the phone are used on this one phone, then themselves encrypted and stored in such a way that only an anonymous Apple engineer and the FBI director, working together, can decrypt and use them again. In the meantime, if you want that Pandora’s box to stay shut, you’ve got to hope that Apple itself, among its 115,000 employees, isn’t harboring someone working on that very code right now.
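The two-party arrangement I’m imagining resembles a classic 2-of-2 secret split, which takes only a few lines to sketch (a hypothetical illustration in Python, with made-up names; real escrow schemes are far more involved): each party holds a share that alone is indistinguishable from random noise, and only the two shares together reconstruct the secret.

```python
import secrets

def split_secret(secret: bytes) -> tuple[bytes, bytes]:
    # XOR-based 2-of-2 secret sharing: one share for the engineer,
    # one for the FBI director. Either share by itself is random noise.
    share_a = secrets.token_bytes(len(secret))
    share_b = bytes(a ^ b for a, b in zip(secret, share_a))
    return share_a, share_b

def combine(share_a: bytes, share_b: bytes) -> bytes:
    # Only both shares together reconstruct the original secret.
    return bytes(a ^ b for a, b in zip(share_a, share_b))

# Hypothetical: the key that unlocks the phone-cracking tools.
unlock_key = b"the-tool-that-cracks-the-phone"
engineer_share, director_share = split_secret(unlock_key)

assert engineer_share != unlock_key               # neither share leaks the key
assert combine(engineer_share, director_share) == unlock_key
```

The design choice worth noting: because the first share is drawn uniformly at random, either share in isolation carries zero information about the key, so neither Apple’s engineer nor the director could act unilaterally.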