Tech Talk Live Blog

Apple, the FBI, and AUPs

Mike Williams

The battle between Apple and the FBI has been plastered all over the news and social media these past few days. Following the shootings in San Bernardino, CA, the FBI retrieved an iPhone belonging to one of the shooters. They believe there may be information on that phone which could lead to other co-conspirators. However, the iPhone is locked with a passcode. In trying to simplify the issue, many in the mainstream media have misrepresented what the FBI is actually asking for. Many stories have said that the FBI wants a backdoor to break the encryption. In its open letter to customers, even Apple skirts the issue by talking only about encryption.

What the FBI has actually requested is a custom iOS build that disables the internal booby-traps, making it easier for the FBI to brute-force the passcode without the possibility of destroying the data. Specifically, they want three changes: disable “Erase Data,” which wipes the phone after ten unsuccessful passcode attempts; remove the incremental time delay between failed attempts; and allow passcode attempts to be submitted by means other than the touch screen. The FBI could build this software themselves, but they would not be able to sign the code with Apple’s key so that it would run on the iPhone.
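To see why those three changes matter, here is a back-of-the-envelope sketch of brute-force time against a numeric passcode. This is not Apple’s actual firmware behavior, and the attempt rates are assumptions chosen purely for illustration:

```python
# Rough brute-force estimate for a numeric passcode. The attempt rates
# below are illustrative assumptions, not measured iPhone figures.

PASSCODE_SPACE = {4: 10**4, 6: 10**6}  # passcode length -> possible codes

def worst_case_hours(digits: int, attempts_per_second: float) -> float:
    """Hours needed to try every possible passcode at a constant rate."""
    return PASSCODE_SPACE[digits] / attempts_per_second / 3600

# Electronic entry with the booby-traps disabled (assume ~12 tries/sec):
print(f"4-digit, electronic: {worst_case_hours(4, 12):.2f} h")   # ~0.23 h
# Manual entry on the touch screen (assume one try every 5 seconds):
print(f"4-digit, by hand:    {worst_case_hours(4, 0.2):.1f} h")  # ~13.9 h
```

With the erase-after-ten trap and escalating delays in place, neither scenario even gets started; with them removed and electronic entry allowed, a four-digit passcode falls in well under an hour. That combination, not broken encryption, is what the FBI is after.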

Compliance with the request would not be that difficult for Apple. But should they build it? Sure, this whole issue is about the possibility of finding more information that might stop a future terrorist attack. I can respect that, but is this the only means of gathering more intelligence? The real question is, should Apple break the model of strong security for which they have become an industry leader? If security is compromised once, what are the criteria for future use? What happens to the software after it is built? Could it be reverse-engineered for use on other phones? What happens when it falls into the hands of bad actors, or elements of an over-reaching government? The information released by Edward Snowden already casts suspicion on the actions of some in government. What happens when the IRS decides to target another conservative group, or when a new conservative administration decides to go after the ACLU? While I have my own strong opinions on these matters, I will put away the tin-foil hat for now.

This specific case is not about the Fourth Amendment or government overreach. Due process has been followed. It is pretty clear that the user of the phone is not a target of political persecution, or an innocent who has been swept up in an indiscriminate dragnet. The interesting part about this case is that the phone is not the property of the perpetrator. It is owned by his employer. The investigators have a warrant AND the permission of the owner, but that is where the technical issues come in. Let’s consider what this means for those of us in education. Suppose one of your staff members has committed a crime, and investigators want access to his/her school-issued devices. Most of our Acceptable Use Policies have language indicating there is no expectation of privacy on district-provided equipment and networks, but assuming the school can turn over the device, can you actually provide access? Most of us could easily provide access to computers through administrator accounts, but what about access to disk encryption? Do you maintain a record of disk encryption recovery keys for each device? What about other encrypted disk images created on the machine?
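The recovery-key question above lends itself to a simple audit. The sketch below is hypothetical — the asset tags, field names, and inventory structure are invented, and in practice the data would come from your MDM or asset-management system — but it shows the idea: flag every district device whose disk-encryption recovery key was never escrowed.

```python
# Hypothetical escrow audit. Device records and field names are invented
# for illustration; a real inventory would come from your MDM/asset system.

inventory = {  # asset tag -> device record
    "MBP-0412": {"user": "jdoe",    "recovery_key_escrowed": True},
    "MBP-0977": {"user": "asmith",  "recovery_key_escrowed": False},
    "MBP-1103": {"user": "bmiller", "recovery_key_escrowed": True},
}

missing = sorted(tag for tag, rec in inventory.items()
                 if not rec["recovery_key_escrowed"])

print("No escrowed recovery key:", missing)  # ['MBP-0977']
```

If that list is not empty — or if you have never run anything like this — you may not be able to honor a lawful request even with the device in hand.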

If you have issued mobile devices to your users, hopefully you require a passcode to protect the data on them. Do you keep a record of those passcodes, or does your Mobile Device Management solution provide administrative access to the device? Most importantly, have you discussed these issues with your solicitor?

Of course, what would an article about Apple be without “one more thing”…

For those using Touch ID, this whole issue could be a moot point. Apple’s Touch ID feature allows a fingerprint to unlock the device, and most users enroll a thumb or forefinger. That is four likely “passcode combinations,” and Touch ID allows five attempts before locking out. Since fingerprints are among the first things law enforcement collects, a suspect who uses Touch ID has, in effect, already handed over the passcode upon arrest.

What do you think should be done? Should Apple comply? Should there be sanctions against them if they continue to resist? What implications does this issue hold for your district?


