Apple vs the FBI

I’ve been waiting to comment on this because more information seems to be coming out every day. Also, there has been so much written about this already that I wasn’t sure what I would have to add. But I’m not being hyperbolic when I say that this is a pivotal moment in our democracy, so I couldn’t just ignore it. One thing I haven’t seen is a good summary of what’s really going on here, so let’s start with that.

Just the Facts: What’s Really Going On Here?

First, let’s establish the facts, because media coverage has been muddy. The FBI recovered an iPhone 5c that was used by one of the shooters in the San Bernardino attacks last year. This phone was issued to the shooter by his employer, and therefore was not his private cell phone – meaning that the data on that phone was technically not private. Nevertheless, that data was encrypted by default, because that’s how Apple sets up every modern iPhone. The FBI believes there may be information on that iPhone that could help it find other co-conspirators or uncover clues to some future plot.

This phone was backed up using Apple’s iCloud service, and it’s worth noting that Apple was willing and able to provide the FBI with that backed-up data. However, for some reason, the backups to iCloud stopped about six weeks before the shooting – so the FBI wants to get at the data on the device itself to see what’s missing. Due to some sort of screw-up, the FBI instructed local law enforcement to change the user’s iCloud password, which prevented the phone from performing another automatic backup. If they had taken the device to a Wi-Fi network known to that device, it might have backed up on its own, and the FBI would have had the missing six weeks’ worth of data. But because the password was changed, we’ll never know.

The FBI is not asking Apple to break the encryption on the phone. That’s actually not possible. Encryption works. When done right, it can’t be broken. However, if you can unlock the device, then you can get to all the data on it. Unlocking the device means entering a PIN or password on the lock screen – it could be as simple as a 4-digit number, meaning there are only 10,000 possible codes. With a computer-assisted “guesser”, it would be trivial to run through all 10,000 options until you found the right one to unlock the phone.
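To see just how trivial, here’s a toy sketch. The `check_pin()` function and the PIN value are made up for this example – the real check happens inside the phone’s hardware – but the enumeration loop is exactly what a computer-assisted guesser does:

```python
# Toy illustration of why a 4-digit PIN falls instantly to a computer-assisted
# "guesser". check_pin() is a stand-in for the device's real unlock check.

SECRET_PIN = "7294"  # hypothetical PIN for this example

def check_pin(guess):
    """Stand-in for the device's real unlock check."""
    return guess == SECRET_PIN

def brute_force():
    # Try every code from "0000" through "9999" in order.
    for n in range(10_000):
        guess = f"{n:04d}"
        if check_pin(guess):
            return guess
    return None

print(brute_force())  # a modern computer finishes this loop in milliseconds
```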

To combat this “brute force” attack, Apple added some roadblocks. First, it restricted how often you could try a new number – taking progressively longer between guesses, from minutes up to a full hour. That would make guessing even 10,000 options take a very long time. Second, Apple gave the user the option to completely erase the device if someone entered an incorrect password ten times in a row. This feature is not enabled by default, but it’s easy to turn on (and I highly recommend that everyone do this).
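Some back-of-the-envelope arithmetic shows why those delays matter. The schedule below is my rough approximation of iOS’s behavior, not Apple’s exact numbers, but the conclusion holds either way: once every guess costs an hour, exhausting 10,000 codes takes over a year.

```python
# Rough estimate of a brute-force attack against escalating retry delays.
# The delay schedule here is an approximation for illustration, not a spec.

def delay_for_attempt(n):
    """Approximate delay (in seconds) imposed after n prior failed guesses."""
    if n < 4:
        return 0          # first few attempts: no delay
    if n == 4:
        return 60         # 1 minute
    if n == 5:
        return 5 * 60     # 5 minutes
    if n in (6, 7):
        return 15 * 60    # 15 minutes
    return 60 * 60        # a full hour for every attempt thereafter

total_seconds = sum(delay_for_attempt(n) for n in range(10_000))
print(total_seconds / 86400)  # roughly 416 days of waiting, worst case
```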

The FBI is basically asking Apple to create a new, custom version of its iPhone operating system (iOS) that disables these two features and allows a connected computer to input its guesses electronically (so that a human wouldn’t have to try them all by hand). This would allow the FBI to quickly and automatically guess all possible PIN codes until the phone was unlocked. It’s not breaking the encryption, it’s allowing the FBI to “brute force” the password/PIN guessing. It’s not cracking the safe, it’s allowing a robot to quickly try every possible safe combination until it opens.

That’s just a thumbnail sketch, but I felt it was necessary background. This article from the EFF goes into a lot more depth and answers some excellent questions. If you’d like to know more, I encourage you to read it.

Why Is This Case So Important?

Both the FBI and Apple are putting heavy spin on this issue. The FBI has always disliked Apple’s stance on customer privacy (encrypting iPhone data by default) and picked this terrorism case to invoke maximum public sympathy. Apple is using this opportunity to extol its commitment to protecting its customers’ private data, particularly as compared to Android (which does not encrypt data by default). Despite what the FBI claims, this is not about a single iPhone and a single case; despite what Apple claims, creating this software is not really comparable to creating “software cancer”. We have to try to set all of that aside and look at the bigger picture. This is not a black and white situation – these situations rarely are. However, the implications are enormous and the precedent set here will have far-reaching effects on our society.

In this country, we have the Fourth Amendment, which prevents unreasonable search and seizure, and basically says that you need a warrant from a judge if you want to breach our right to privacy. In this case, the FBI has done its job in this regard. And it’s technically feasible for Apple to create a special, one-time version of its iOS that would allow the FBI to unlock this one iPhone – and this special software would not run on any other iPhone. This is due to a process called “signing”, which is another wonderful application of cryptographic techniques. So in this sense, it’s not a cancer – this special software load can’t be used on other devices. However, if Apple does this once, it can do it again, and there are already many other iPhones waiting at the FBI and in New York that will be next in line for this treatment. There is no doubt that this will set a precedent and open the floodgates for more such requests – not just from US law enforcement, but from repressive regimes around the globe. Furthermore, the very existence of such a tool, even though guarded heavily within Apple’s walls, would be a massive target for spy agencies and hackers around the globe.
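The “signing” idea can be sketched in a few lines. This is my own toy model: Apple’s real system uses asymmetric signatures and dedicated hardware, while the sketch below uses an HMAC as a simplified stand-in. The key point it illustrates is that the signature covers both the software image and a unique device ID, so a build signed for one phone fails verification on every other phone:

```python
# Toy model of device-bound code signing. An HMAC stands in for Apple's real
# asymmetric signature scheme; the key and device IDs are made up.

import hashlib
import hmac

SIGNING_KEY = b"hypothetical-signing-key"  # stands in for Apple's private key

def sign_build(software_image, device_id):
    """Sign a software image bound to one specific device."""
    msg = software_image + device_id.encode()
    return hmac.new(SIGNING_KEY, msg, hashlib.sha256).digest()

def device_will_boot(software_image, device_id, signature):
    """A device refuses to run software whose signature doesn't match it."""
    msg = software_image + device_id.encode()
    expected = hmac.new(SIGNING_KEY, msg, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

image = b"custom unlock firmware"
sig = sign_build(image, device_id="SERIAL-12345")

print(device_will_boot(image, "SERIAL-12345", sig))  # True  - the targeted phone
print(device_will_boot(image, "SERIAL-99999", sig))  # False - any other phone
```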

So the issue is much deeper than simply satisfying a valid warrant (even setting aside the arcane All Writs Act – a law from the late 1700s that the FBI claims should compel Apple, a third party, to help it satisfy this warrant). The outcome of this case will have severe implications for privacy in general – and that’s why Apple is fighting back.

My Two Cents

I’ve read a lot of good articles on this issue, and I’ll point you to a couple of them shortly. But the bottom line is that we, as a society, need to figure out how we handle privacy in the age of digital communications and ubiquitous monitoring. Like it or not, you are surrounded by cameras and microphones, and it’s getting worse rapidly. You carry with you a single device that can simultaneously record video and audio, track your position anywhere on the planet, track many of your friends and family, record your physical movement, and store your personal health and financial data, as well as untold amounts of other personal information. That device is your smartphone. That one device probably has more information about you than any other single thing you own. Beyond that, all of our communications are now digital and can therefore be perfectly preserved forever. And in the grand scheme of things, any person or group of people that can gain surreptitious access to this information – regardless of their intentions – will have unimaginable power over us. This was not envisioned by the Founding Fathers – we’re in new territory here.

It’s long since time that we have an informed, open and frank discussion – as a nation – about how we balance the need for basic human privacy versus the need for discovery in the pursuit of safety. It’s also about targeted surveillance versus mass surveillance, and creating an open, transparent system of checks and balances to govern both. If nothing else, I hope this case leads to a more informed public and some rational, thoughtful debate that thinks about the broader issues here – not just this one, highly-emotional case.

As promised, here are some links with some excellent info and perspectives around these topics:

Here are some links to more general but related topics:


Pre-Sales of the Second Edition!

I’ve launched a Kickstarter campaign for the second edition of Firewalls Don’t Stop Dragons! This second edition will cover Windows 10 and OS X 10.11 (El Capitan), along with a host of other updates. This Kickstarter campaign is essentially a pre-sale of the book, with the idea of taking the proceeds and investing them in marketing and launching the second edition. Backers will also have the opportunity to review drafts of the book and provide input on the content. Help me spread the word!!

On the Ethics of Ad-Blocking

As the saying goes, if you’re not paying for the product, then you are the product. The business model for most of the Internet revolves around advertising – which in and of itself is not a bad thing. It may be an annoying thing, but passive advertising isn’t actually harmful. Passive advertising is placing ads where people can see them. And savvy marketers will place their ads in places where their target audiences tend to spend their time. If you’re targeting middle-aged men, you might buy ad space on fantasy football or NASCAR web sites, for example. If you’re targeting tween girls, you might buy ad space on any site that might feature something about Taylor Swift or Justin Bieber. And if it stopped there, I don’t think many of us would object – or at least have solid grounds for objection. After all, this advertising is paying for the content we’re consuming. Producing the content costs money – so someone has to pay for it or the content goes away.

Unfortunately, online marketing didn’t stop there. On the web, competition for your limited attention has gotten fierce – with multiple ads on a single page, marketers need you to somehow focus on their ad over the others. And being on the Internet (and not a printed page), advertisers are able to do a lot more to grab your attention. Instead of simple pictures, ads can pop up, pop under, flash, move around, or float over the articles you’re trying to read. Worse yet, ad companies want to be able to prove to their customers that they were reaching the right people and that those people were buying their product – because this makes their ad services far more valuable, meaning they can charge more for the ads.

Enter the era of “active advertising”. It has now become very hard to avoid or ignore web page and mobile ads. Worse yet, the code that displays those ads is tracking where you go and what you buy, building up profiles on you and selling those profiles to marketers without your consent (and without most people even realizing it). Furthermore, those ads use precious data on cell phones and take a lot of extra time to download regardless of what type of device you use. And if that weren’t bad enough, ad software has become so powerful, and ad networks so ubiquitous and so commoditized, that bad guys are now using ad networks to distribute “malware” (bad software, like viruses). It’s even spawned a new term: malvertising.

Over the years, browsers have given users the tools they need to tame some of these abuses, either directly in the browser or via add-ons. It’s been a cat-and-mouse game: when users find a way to avoid one tactic, advertisers switch to a new one. The most recent tool in this toolbox is the ad-blocker. These plugins allow the user to completely block most web ads. Unfortunately, there’s really no way for ad blockers to sort out “good” advertising from “bad” advertising. AdBlock Plus (one of the most popular ad-blockers) has attempted to address this with their acceptable ads policy, but it’s still not perfect.
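At their core, these plugins work from filter lists: every outgoing request is checked against known ad and tracker domains before it’s allowed to load. Here’s a minimal sketch of that matching logic (a toy version of what tools like uBlock Origin do – the domains below are made-up examples, not a real blocklist):

```python
# Minimal sketch of ad-blocker request filtering: block any request whose
# host (or a parent domain of it) appears on the blocklist.

from urllib.parse import urlparse

BLOCKLIST = {"ads.example.com", "tracker.example.net"}  # made-up entries

def should_block(url):
    """Return True if the URL's host or any parent domain is blocklisted."""
    host = urlparse(url).hostname or ""
    parts = host.split(".")
    # Check "a.b.c.com", then "b.c.com", then "c.com", ...
    return any(".".join(parts[i:]) in BLOCKLIST for i in range(len(parts)))

print(should_block("https://ads.example.com/banner.js"))      # True
print(should_block("https://cdn.ads.example.com/pixel.gif"))  # True (subdomain)
print(should_block("https://news.example.org/article.html"))  # False
```

Real blockers add a lot on top of this – pattern syntax, cosmetic filtering, exception rules – which is exactly why separating “good” ads from “bad” ones is so hard: the filter sees domains and URLs, not intent.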

But many web content providers need that ad revenue to stay afloat. Last week, Wired Magazine announced that they will begin to block people who use ad-blockers on their web site. You will either need to add Wired to your ad-blocker’s “whitelist” (allowing them to show you ads) or pay them $1 per week. They state clearly that they need that ad revenue to provide their content, and so they need to make sure that if you’re going to consume that content, you pay for it – either directly ($1/week) or indirectly (via ad revenue).

So… what’s the answer here? As always, it’s not black and white. Below is my personal opinion, as things stand right now.

I fully understand that web sites need revenue to pay their bills. However, the business model they have chosen is ad-supported content, and unfortunately the ad industry has gotten over-zealous in the competition for eyeballs. In the process of seeking to make more money and differentiate their services, they’re killing the golden goose. Given the abusive and annoying advertising practices, the relentless and surreptitious tracking of our web habits, the buying and selling of our profiles without our consent, and the lax policing that allows malware into ads, I believe that the ad industry only has itself to blame here. We have every reason to mistrust them and every right to protect ourselves. Therefore, I think that people are fully justified in the use of ad-blockers.

That said, Wired (and other web sites) also have the right to refuse to let us see their content if we refuse to either view their ads or pay them money. However, I think in the end they will find that people will just stop coming to their web sites if they do this. (It’s worth noting that some sites do well with voluntary donations, like Wikipedia.) Therefore, something has to change here. Ideally, the ad industry will realize that they’ve gone too far, that they must stop tracking our online pursuits and stop trafficking in highly personal information without our consent.

The bottom line is that the ad industry has itself to blame here. They’ve alienated users and they’re going to kill the business model for most of the Internet. They must earn back our trust, and that won’t be easy. Until they do, I think it’s perfectly ethical (and frankly safer) to use ad-blocking and anti-tracking tools.

Below are some of my favorite plugins. Each browser has a different method for finding and installing add-ons. You can find help here: Firefox, Safari, Internet Explorer, Chrome.

  • uBlock Origin – ad-blocker
  • Privacy Badger – anti-tracking plugin
  • HTTPS Everywhere – forces secure connections whenever possible
  • Better Privacy – another privacy plugin, slightly different from Privacy Badger

If you would like to get more involved, you might consider contributing to the Electronic Frontier Foundation.