It's a little confusing, so we'll take you through it.
On Tuesday, a federal judge ordered Apple to help FBI agents access data stored on the iPhone of Syed Farook, a suspect in the December terror attack in San Bernardino. Apple refused to comply with the order on the grounds that creating software to unlock Farook's phone could make iPhones everywhere less secure, according to a letter from Apple CEO Tim Cook.
To be clear, Apple cannot extract information directly from Farook’s, or anyone else’s, iPhone. All data on an iPhone is encrypted. The security measures for iOS 8, which rolled out in 2014, ensure that no one, not even Apple, can access information on an iPhone by sneaking through a software “backdoor.”
That's because “the files to be extracted are protected by an encryption key that is tied to the user’s passcode, which Apple does not possess,” according to an explanation Apple posted on its website in 2014.
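Apple's description can be made concrete with a short sketch. The snippet below is an illustration only: it uses Python's standard PBKDF2 function as a stand-in for Apple's actual key-derivation scheme, which additionally entangles a hardware-unique device key that software cannot read, and `device_salt` and the passcodes are hypothetical values. What it demonstrates is the principle Apple is invoking: the encryption key is computed from the passcode, so anyone who lacks the passcode, Apple included, cannot reconstruct the key.

```python
import hashlib

def derive_key(passcode: str, device_salt: bytes) -> bytes:
    # PBKDF2 stands in here for Apple's real key-derivation step; on an
    # actual iPhone the derivation also mixes in a per-device hardware
    # key (the UID) that never leaves the device's secure hardware.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_salt, 100_000)

device_salt = b"example-device-salt"        # hypothetical per-device value
key = derive_key("1234", device_salt)       # correct passcode -> correct key
wrong = derive_key("1235", device_salt)     # one digit off -> unrelated key
assert key != wrong
```

Because the derivation is one-way, there is no shortcut to the key: the only way in is to supply the right passcode, which is why the FBI's request centered on guessing it.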
What the government was really asking Apple to do was to help the FBI guess the passcode for Farook’s iPhone, according to technology security expert Bruce Schneier.
"What was being asked for was not a way to get at the iPhone, but a way to get the iPhone password," Schneier told The Huffington Post on Thursday.
Apple could have done that pretty easily, because the phone in question -- an iPhone 5C, manufactured in 2013 -- has a serious security defect, according to Schneier. All iPhones, even newer ones, have this issue, he added.
“While the data is encrypted, the software controlling the phone is not,” Schneier wrote in an op-ed in the Washington Post on Thursday.
In other words, the part of iOS that handles passcode security could be breached, he told HuffPost.
"[S]omeone can create a hacked version of the software and install it on the phone," he wrote in the Washington Post.
The hacked software the FBI wanted Apple to install on Farook’s iPhone could have disabled two key security measures, Schneier said.
The software could have bypassed a feature that wipes all of an iPhone’s data after 10 failed attempts to guess the passcode. And it could have disabled another feature that imposes escalating delays after failed attempts, preventing one passcode from being entered rapidly after another.
If Apple had bypassed those two measures, Schneier wrote, the FBI would have been able to use what’s known as “brute force” to unlock the iPhone -- a method that automatically tries one passcode after another until one works. A four-digit passcode has only 10,000 possible combinations, so with the wipe and delay features disabled, it would not hold out for long.
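The brute-force step Schneier describes can be sketched in a few lines. This is a toy model, not the FBI's tool: `try_passcode` is a hypothetical stand-in for the phone's passcode check, and the point is only to show why a four-digit passcode falls immediately once nothing stops rapid, unlimited guessing.

```python
from itertools import product

def brute_force(try_passcode):
    # With the auto-wipe and the delay between attempts disabled,
    # just try every 4-digit combination: at most 10,000 guesses.
    for digits in product("0123456789", repeat=4):
        guess = "".join(digits)
        if try_passcode(guess):
            return guess
    return None

# Hypothetical stand-in for the phone's passcode check.
secret = "4821"
print(brute_force(lambda guess: guess == secret))  # prints 4821
```

On a real device the wipe feature would destroy the data after the tenth wrong guess, and the delay feature would stretch 10,000 attempts out for years -- which is exactly why the FBI wanted both disabled.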
“This is a backdoor that exists in Apple phones that anyone could exploit,” Schneier told HuffPost.
Apple's refusal to create a backdoor for law enforcement is the latest move in an ongoing confrontation between technology companies and the government over data privacy. Law enforcement agencies maintain that accessing mobile phone data can help prevent and solve crimes.
But Apple insists that creating a way for law enforcement to access encrypted phone data would threaten the privacy of iPhone users everywhere.
"Building a version of iOS that bypasses security in this way would undeniably create a backdoor," Apple's CEO wrote on Wednesday. "And while the government may argue that its use would be limited to this case, there is no way to guarantee such control."