by Robert Knake
March 29, 2016
An Apple iPhone is pictured next to the logo of Apple in Bordeaux, southwestern France, February 26, 2016. (Regis Duvignau/Reuters).
With yesterday’s announcement that the FBI had gained access to the phone used by Syed Rizwan Farook, the San Bernardino gunman, the tech community is clamoring to find out how they did it. Many commenters believe that any vulnerability used to access the data must be subject to the Vulnerabilities Equities Process (VEP), the process by which the U.S. government decides whether to disclose a computer vulnerability (partially declassified here).
Drawing on a blog post by Michael Daniel, the president’s cybersecurity advisor, that laid out the criteria for disclosure, many have also concluded that it must be disclosed. Having helped to run the process at the White House, I’d say they have a good case.
Daniel laid out nine criteria. A quick run-down suggests that seven of nine favor disclosure:
iPhones are widely used in the U.S. economy;
With knowledge of the vulnerability, data could be extracted from phones used by military personnel, diplomats traveling abroad, corporate executives, and just about anybody else;
The ramifications of this kind of data theft could be devastating to national security;
It would be hard to know whether someone else was exploiting the vulnerability given that it isn’t a remote exploit;
Assuming the exploit only applies to an older model, the utility of protecting the capability goes down each day, which argues for using it on the phone in question and then disclosing;
Someone is likely to figure out how to do it now that everyone knows it is possible; and
It probably can’t be patched by anyone other than Apple.
Against disclosure, there are really only two arguments:
The U.S. government badly needs the intelligence it can get from phones; and
There aren’t other ways for the U.S. government to get it.
The FBI has made a pretty convincing case that data from iCloud and metadata from service providers don’t meet all the needs of the investigation; moreover, cloud providers seem to be moving toward engineering their way out of answering requests for data as fast as they can.
If the FBI could demonstrate through the VEP that the exploit in question only works on iPhone 5Cs running iOS 9, they’d probably have a stronger case for retaining the knowledge.
None of this really matters though. I doubt that the Equities Review Board will ever have a chance to review the vulnerability and weigh these criteria.
If a brilliant GS-14 in an FBI forensics lab discovered the vulnerability, no doubt it would be entered into the process; if the FBI contracted with a defense contractor to find an exploitable vulnerability, the same would be true. When the policy was written in 2010, those scenarios likely covered most vulnerabilities exploited by the federal government.
Today, however, vulnerabilities are big business. The vendor, whether Cellebrite or another forensics firm, likely did not disclose the details of how it extracted the data.
Given that Apple is no longer helping law enforcement for free, extracting data from iPhones is shaping up to be a revenue stream for companies that can figure out how to crack them. Companies aren’t selling the know-how so law enforcement and intelligence agencies can roll their own; they are packaging exploits up as products, complete with customer service and slick graphical user interfaces.
The vendor probably demonstrated that it could extract data from a phone but refused to share the details of how it did so, in order to protect its future market. The week it took to validate the approach likely had less to do with confirming that the method worked and more to do with sorting out contract details, complete with an industry-standard non-disclosure agreement.
All the FBI can likely tell Apple is what they have already made public: there’s a vulnerability in iOS. Good luck finding it.