Editorial: Apple vs. FBI on National Security

Courtesy of Flickr

This week, the editorial board’s opinions were split on the ongoing national security dispute between the FBI and Apple. The following are the two opposing views on the issue.

Editorial | In Defense of the FBI

Fourteen. Two touchdowns in a football game. The number of lines in a sonnet. A fortnight.

Fourteen. The number of people killed on Dec. 2, 2015, in a terrorist attack at the Inland Regional Center in San Bernardino, California, for no reason at all.

Nearly three months after the tragedy, the FBI’s investigation continues to examine those responsible, the “violent extremists,” according to FBI Director James Comey. As has been in the news for over a week, the United States government, specifically the FBI, wants Apple Inc. to unlock a phone used by one of the terrorists who was involved in the attack.

Seems like a no-brainer, right? Unlock a terrorist’s phone to complete the investigation of a senseless massacre, get as many answers as possible and help prevent a similar act from occurring in the future. But that has obviously not been the case.

On Tuesday, Feb. 16, the FBI obtained a court order requesting that Apple create software for a “backdoor” that would allow the FBI to access the encrypted iPhone of the San Bernardino terrorist. Apple refused vehemently, claiming that the request was unprecedented. “The implications of the government’s demands are chilling,” CEO Tim Cook said in a letter responding to the FBI’s request.

This point of view is understandable. Apple is concerned that if it were to create software to access this specific iPhone, the same tool could be used to access other iPhones outside the case in the future. Such an instance could, and likely would, alarm a large portion of Apple’s customers.

But what Apple doesn’t seem to realize is the highly unique nature of this case. We’re talking terrorism, possible life and death for hundreds of thousands of Americans. We’ve seen how grave terrorist attacks can be in the past. Obviously, the fewer there are the better. If the legitimate concern is privacy, then destroy or secure the backdoor software after it’s employed in this case.

Creating such individualized software for this one case is not simple given today’s digital landscape, but as a leader in innovative technology, I’m sure Apple can find a way.

Perhaps you are still not convinced. Let’s go back to Cook’s letter to customers on Feb. 16. To use his word, Cook writes some pretty “chilling” things in the section of the letter referencing the San Bernardino case.

“We have no sympathy for terrorists,” he states. Cook’s logic is off somewhere. Apple has no sympathy for terrorists, yet it won’t exercise all means to access one of its products tied to an extreme act of terrorism?

I’m not implying that Apple supports terrorism or has sympathy for terrorists, but the claim that Apple has no sympathy for terrorists is extremely misleading and not the whole truth in light of this specific case. Additionally, Cook stated, “Up to this point, we have done everything that is both within our power and within the law to help them.”

Cook and Apple are missing a slight distinction. It should be written, “Up to this point, we have done ‘almost’ everything that is both within our power and within the law to help them.”

Essentially, Apple has helped with the San Bernardino case when it’s wanted to help.

What makes this current situation even more puzzling is the fact that San Bernardino is just 400 miles from Apple’s headquarters in Cupertino, California. With human emotion, it would only be natural to feel a sort of connection based on geographic proximity to such a horrific event.

Apple, unlock the phone. Erase the “almost.” Take a look at your logo. There’s now a bigger chunk.

 

Dissenting Editorial | In Defense of Apple

Almost three months after the terrorist attack that left 14 dead in San Bernardino, the FBI wants Apple to help hack into the iPhone used by one of the shooters.

The FBI is in possession of shooter Syed Rizwan Farook’s iPhone 5C, which belonged to his employer, the San Bernardino County Department of Public Health, and they have been given the department’s permission to examine it. The feds hope to find information, including the contact list and text messages that are locked behind a password, which could help them investigate Farook.

The problem is that Farook took his passcode to his grave in a shootout with law enforcement officers on Dec. 2. In 2014, Apple decided to no longer store copies of its customers’ passcodes in order to make them less accessible to hackers. But his phone still has more information, which is why the FBI wants Apple to help disable a password-protection feature. The feds want the ability to make tens of millions of guesses at Farook’s password until they eventually guess correctly.

This may seem like a non-issue for Apple at first. Who wouldn’t want to help the government catch a terrorist? Apple CEO Tim Cook, seemingly, does not. According to Cook, a “back-door” key to its operating system would leave Apple vulnerable to criminal hackers across the globe and put users’ privacy at risk.

Last week, a federal magistrate demanded that Apple work with the FBI, a decision which Apple has vowed to appeal.

Is Apple using privacy as a marketing ploy? Yes. Is this more of a business strategy for Apple than it would like you to think? Most likely. Should Apple unlock the phone? Absolutely not.

Farook’s data was backed up on the cloud six weeks before the tragic events in San Bernardino occurred, and Apple has turned over that data.

In order to comply further, Apple would need to create a software program that could be used on other devices, putting users’ security at risk, while the FBI insists it needs the tool only to unlock a single phone.

There’s a larger issue at stake. The reason Apple and other companies created highly secure systems is to ward off criminals and terrorists who, every day, attack weaknesses in computer systems, including at colleges.

The fact that Apple is not able to break into its own iPhones is a good thing. Creating a back door for law enforcement would almost invariably lead to the tool falling into the hands of those who desperately and frequently attempt to obtain it.

The United States is trying to protect valuable assets while warding off daily attacks from hackers who ransack banks and other firms, stealing credit card data, personnel files and trade secrets. This obviously is not an easy position for Apple, as no U.S. company wants to stand in the way of solving a terrorist event. It’s clear how some could see Apple’s move as a national betrayal and boycott Apple goods.

Thankfully for consumers, Apple has been able to outsmart hackers so far. Writing software to defeat its own encryption systems will not sit well with customers. Its competitors, often outsourcing manufacturing out of range of U.S. laws, will use the software to bolster their own systems.

This case is likely to go to the Supreme Court, and with good reason. At play are not only user privacy but also cybersafety concerns. This is not an issue of terrorism anymore, in our opinion. The terror attack has already occurred. It is an issue of precedent and of protection. We don’t want a back-door key to be created, and we trust Apple to be not terrorist sympathizers but privacy enforcers.
