Apple fighting court order to unlock San Bernardino gunman’s iPhone – and they should!
In the Order Compelling Apple Inc. to Assist Agents in Search, Judge Pym ordered Apple to provide “reasonable technical assistance” to the FBI in unlocking the phone. The order states that the “technical assistance” shall (1) “bypass or disable the auto-erase function” on the phone; (2) “enable the FBI to submit passcodes” for testing on the phone; and (3) ensure that when the FBI submits passcodes, the phone “will not purposefully introduce any additional delay,” among other things. (See the Order here.) She essentially ordered Apple to build special software that would act as a skeleton key capable of unlocking the phone.
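To make the order’s three demands concrete, here is a toy model of the protections at issue: an auto-erase counter that wipes the device after ten failed attempts, and escalating delays that slow brute-force guessing. This is purely illustrative, assuming a simplified design; it is not Apple’s actual firmware, and every name in it is hypothetical.

```python
class ToyPasscodeLock:
    """Toy model of the iPhone protections the order targets.
    NOT Apple's real implementation -- purely illustrative."""

    MAX_ATTEMPTS = 10  # after this many failures, the data is wiped

    def __init__(self, passcode: str):
        self._passcode = passcode
        self._failed = 0
        self._wiped = False

    def delay_seconds(self) -> int:
        # Escalating delays make brute-force guessing impractical;
        # demand (3) of the order asks that no such "additional
        # delay" be introduced when the FBI submits passcodes.
        if self._failed < 4:
            return 0
        return 60 * 2 ** (self._failed - 4)

    def try_unlock(self, guess: str) -> bool:
        # Demand (2): the FBI wants to submit guesses like this
        # electronically rather than typing them by hand.
        if self._wiped:
            raise RuntimeError("data permanently erased")
        if guess == self._passcode:
            self._failed = 0
            return True
        self._failed += 1
        if self._failed >= self.MAX_ATTEMPTS:
            # Demand (1): this is the "auto-erase function"
            # the order asks Apple to bypass or disable.
            self._wiped = True
        return False
```

In this sketch, disabling the wipe and the delay would let an attacker simply loop over all 10,000 four-digit passcodes, which is exactly why Apple calls the requested software a master key.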
In a statement, Timothy D. Cook, Apple’s CEO, said the company would fight the order and resist efforts to provide a “back door” to the iPhone. Mr. Cook called the court order an “unprecedented step” by the federal government. “We oppose this order, which has implications far beyond the legal case at hand,” he wrote.
Apple also said that “we believe it would be wrong to weaken security for hundreds of millions of law-abiding customers so that it will also be weaker for the very few who pose a threat.”
Of course, Apple is not only opposing Judge Pym’s order on the principles of privacy; one of its motivations has to be the preservation of its reputation for robust encryption, at a time of rising concern about identity theft, cybercrime and electronic surveillance by intelligence agencies and overzealous law enforcement agencies. And, nothing is wrong with that. In fact, Apple is right to oppose Judge Pym’s order.
Apple and other technology companies say that creating an opening in their products for government investigators would also create a vulnerability that Chinese, Iranian, Russian, or North Korean hackers could exploit.
As reported by the New York Times in 2014, Apple and Google — whose operating systems are used in 96 percent of smartphones worldwide — announced that they had re-engineered their software with “full disk” encryption, and could no longer unlock their own products as a result.
That move set up a confrontation with police and prosecutors, who want the companies to build, in essence, a master key that can be used to get around the encryption. The technology companies say that creating such a key would have disastrous consequences for privacy. The San Bernardino tragedy gave the government the perfect opportunity to push the issue.
On one side stand the technology companies who want to protect the privacy of customers. On the other side are law enforcement authorities who say that new encryption technologies hamper their ability to prevent and solve crime.
I find the law enforcement argument weak and unsupported. Law enforcement today has access to more data than ever before, data it can use to prevent terrorist attacks, solve crimes and help bring perpetrators to justice. New technology has always been out there, sometimes one step ahead of law enforcement and other times one step behind. The FBI has teams for this, teams of hackers and analysts who help keep the Bureau one step ahead. Forcing a private company to do their job seems wrong. And, while the government may argue that its use would be limited to this case, there is no way to guarantee such control.
If the government wins here, it starts down a slippery slope, potentially gaining the power to reach into anyone’s device and capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.
Asked about Apple’s resistance, the Justice Department pointed to a statement by Eileen M. Decker, the United States attorney for the Central District of California: “We have made a solemn commitment to the victims and their families that we will leave no stone unturned as we gather as much information and evidence as possible. These victims and families deserve nothing less.”
The FBI said that its experts had been unable to access data on Mr. Farook’s iPhone, and that only Apple could bypass its security features. FBI experts have said they risk losing the data permanently after 10 failed attempts to enter the password because of the phone’s security features. The government argued that it does not have alternative means to access the phone.
And, of course, the government argues that access to this phone could provide crucial evidence about the attackers’ communications and contacts before the shooting. But, it may not.
So, are the government’s arguments enough? Should they be enough to force Apple to do something that arguably falls well outside the scope of the Fourth Amendment and, if upheld, would give law enforcement authority to compel technology companies to do almost anything conceivable in the name of a purported investigation or surveillance of a target? That seems to go well beyond what the Constitution and existing law permit law enforcement to do.
It is important to note that law enforcement isn’t asking Apple to provide information that it already has, which is what an ordinary search warrant does. It is essentially asking a federal court to compel Apple to create something that does not exist. The legal issues are fascinating and complicated.
The Electronic Frontier Foundation, a nonprofit organization that defends digital rights, says it best:
“The government is asking Apple to create a master key so that it can open a single phone. And once that master key is created, we’re certain that our government will ask for it again and again, for other phones, and turn this power against any software or device that has the audacity to offer strong security.”
We all should be concerned.