The fight against government encryption back doors

I’m sure most people reading this are familiar with the current legal battle between Apple and the FBI, but for those uninitiated: the FBI wants Apple to create custom firmware to bypass the lockout feature that is on by default for any iPhone with a PIN enabled. In this case, the device is the local-government-issued iPhone 5c used by one of the San Bernardino terrorists.

In a nutshell, the FBI made an error during their investigation and changed the password of the local-government-controlled iCloud account associated with the iPhone. This prevented the iPhone from automatically connecting to iCloud and making a backup.

This left the FBI with no choice but to brute force the 4-digit passcode. In a brute force attack, every possible passcode is tried until the correct one is found. For a 4-digit code, this would take only a few hours.
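The arithmetic behind that "few hours" claim is simple enough to sketch. Here is a minimal back-of-the-envelope calculation; the one-guess-per-second rate is purely an illustrative assumption, not a measured figure for any real device:

```python
# Why a 4-digit passcode falls quickly to brute force.
# The attempt rate below is a hypothetical assumption for illustration;
# real rates depend on the device's hardware and firmware delays.

def brute_force_estimate(digits: int, attempts_per_second: float) -> float:
    """Return worst-case seconds to try every passcode of the given length."""
    keyspace = 10 ** digits  # 10 choices per digit
    return keyspace / attempts_per_second

# A 4-digit PIN has only 10,000 possibilities.
worst_case = brute_force_estimate(4, attempts_per_second=1.0)
print(f"4-digit keyspace: {10 ** 4} codes")
print(f"Worst case at 1 guess/sec: {worst_case / 3600:.1f} hours")
```

Even at a deliberately slow one guess per second, exhausting the whole keyspace takes under three hours, which is why the wipe-after-failures protection described next matters so much.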

The problem for the FBI is that after 10 incorrect passcode attempts, the iPhone will delete all data on the device. So the FBI’s solution was to ask Apple to create a special firmware, for use on the suspect’s phone only, that would let them try every possible passcode, unlock the device, and forensically investigate it.
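To make the protection concrete, here is a minimal sketch of a wipe-after-failures policy like the one described above. This is an illustration of the concept only, not Apple's actual implementation (which enforces the counter in hardware and adds escalating delays between attempts):

```python
# A toy model (NOT Apple's real code) of the erase-after-10-failures
# policy: too many wrong guesses and the data is gone for good.

MAX_ATTEMPTS = 10  # failures allowed before the device erases itself

class PasscodeGuard:
    def __init__(self, passcode: str):
        self._passcode = passcode
        self._failures = 0
        self.wiped = False

    def try_unlock(self, guess: str) -> bool:
        if self.wiped:
            return False          # data is already gone; nothing to unlock
        if guess == self._passcode:
            self._failures = 0    # correct guess resets the counter
            return True
        self._failures += 1
        if self._failures >= MAX_ATTEMPTS:
            self.wiped = True     # simulate erasing all user data
        return False
```

Under this policy a brute force attacker gets at most 10 of the 10,000 possible codes before the data self-destructs, which is exactly why the FBI needed firmware that removes the limit.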

The brief summary above lays out the basic dilemma facing the government’s investigation, but it does not begin to explain what is really going on and what is at stake.

It would be easy to assume that the FBI is in the right to demand a method to conduct a terrorism related investigation. It would be just as easy to assume that Apple has the right to protect the privacy of their customers, even if some of those customers turn out to be criminals.

The deeper issues here are not this high-profile case, but the dozens of other cases that are not terror related, and a man by the name of Edward Snowden.

Apple fans and advocates of privacy in general are hailing Apple as a hero for protecting the privacy of its customers. Since iOS 8, all data on iPhones has been encrypted, and if we are to believe the statements from law enforcement, that makes it impossible for all of those three-letter agencies to access the data contained on them. However, prior to the Snowden leaks, Apple was at least somewhat cooperative with most investigations, and in my opinion it is fair to assume that the Snowden revelations were at least part of the reason Apple chose to encrypt iPhones by default.

Remember that this creates customer-service problems for Apple as well, because they cannot unlock iPhones that were accidentally put into a locked state by their owners. I’m not saying that Apple is motivated only by the PR and marketing benefits of taking a strong stand for privacy, but it would be naive to assume that they did not take this into account in their risk assessment.

Taking a stand on encryption also bolsters Apple’s pitch that they sell physical products, not customer data. Android is an amazing creation and a powerful platform, but Google makes its money not from selling hardware, but from selling its users’ data and access to its marketing platforms.

So that’s how we got here. Apple must stand for privacy at any cost because they planted their flag there and have tied their reputation and business model to following through. The government isn’t going to stop, because they want the ability to investigate any person and their devices, and they want to establish a precedent for their right to force a corporation to assist them.

The FBI chose the San Bernardino case because, to the average person, it sounds reasonable to force Apple to help stop terrorism, and the larger argument that “it’s only this phone and only this time” again sounds reasonable.

I have worked in the technology world professionally since 1998, and as early as 1999 the company I worked for began receiving requests from government agencies for assistance. The first such case I was involved in was physically removing the hard drive from a MacBook at the request of the Secret Service. They set up a camera to record me opening the laptop and extracting the hard drive. Once it was out, they placed it in an evidence bag, sealed it in a box, and updated the chain-of-custody form to reflect my involvement. I asked the agents what the suspect was accused of and was told it was counterfeiting. Interestingly, the reason we were asked to help was that the Secret Service did not have the correct screwdrivers to open the case. Sound familiar?

Later, in 2003, a hotel chain in Florida that I worked for regularly hosted then-President George W. Bush. My department was routinely tasked with assisting the Secret Service with communication-related issues such as connecting T1 lines and special phone service in different parts of the buildings. When they were ready to get their crypto equipment out, no one but government agents was even allowed in the room. Doesn’t this sound like Apple fiercely guarding its secrets and property?

I mention these two personal experiences to highlight some points I feel are important. First, in both of the cases outlined above, the government didn’t have to force anyone to assist. Both companies were eager to help out: the tech services firm wanted to brag about helping the Secret Service, and the hotel wanted to brag about hosting the President of the United States. Furthermore, both companies were paid handsomely for their time and trouble. For instance, the hotel once hosted Vice President Dick Cheney for 4 hours. This was a campaign stop while Bush Jr. was running for his second term. Cheney was there just long enough to give a brief speech, woo some donors, and take a shower and change clothes in his room. However, the Secret Service began visiting months ahead of time to plan, prepare, and secure the hotel for the event. The hotel was paid for all of the planning and technology accommodations, as well as above market rate for 3 luxury suites. That’s right: they booked 3 rooms to throw off anyone trying to get a lock on his location, and the other 2 rooms went unused.

What the FBI is telling Apple to do is not only wrong from the perspective of privacy rights; it also puts an undue burden on Apple, both financially and in forcing them to do something that could hurt the trust their customers place in Apple and its products.

While I don’t personally believe that all of Apple’s motivations are purely about protecting the privacy of their users, that doesn’t make them any less correct in doing so. If the FBI gets what it wants, then any company, device maker, or individual can be forced to create the same kind of compromised products in the future. It’s up to those of us who can understand the technology issues of this case to explain them to those who can’t. We must stop this overreach before all of our digital privacy has evaporated, if it’s not already too late.