How it started

Jeff Johnson (@lapcatsoftware) seems to have noticed it first:

The day when no one could open software on their Macs because Apple’s OCSP server was very, very slow.

Why it matters

I agree with a lot in this Jeffrey Paul post:

Your computer now serves a remote master, who has decided that they are entitled to spy on you. If you’ve the most efficient high-res laptop in the world [a new Apple Silicon Mac], you can’t turn this off.

Let’s not think very much right now about the additional fact that Apple can, via these online certificate checks, prevent you from launching any app they (or their government) demands be censored.

Frightening, if you fear centralized control of things. As more value from computing is derived from centralized networks and the ability to connect remotely to services you don’t own, and where those services increasingly own more and more of your data, we are step-by-step approaching a future where you don’t own your computer. At least not in the simple, straightforward way we’ve thought of ownership prior to this moment.

Take a silly example. If my car decided I could not turn down a certain street, and physically prevented me from doing so, I would think “perhaps this is not a device which I own?” Ford may have had every good intention in doing this -

  • Ford keeps track of streets which are “invalid” - closed for construction, illegal to drive on, maybe even dangerous in some way
  • Ford pushes a list of those invalid streets to all cars it makes
  • When you turn your steering wheel to head down one of these streets, the GPS inside your car checks with Ford first, and if the road you’re heading towards is invalid, Ford simply takes control of your car and drives you away

This could be a future we live in some day.

A less on-the-nose example might be GPS. I can still drive down any street I want, but my car’s GPS would never route me down one of those invalid streets; I may never know those routes exist. Better? Not quite, but certainly less obvious. If I don’t notice it, then what is the harm?

The Apple+OCSP example is closer to stopping me from turning my steering wheel, though. Apple actively prevents a user from launching software if OCSP determines the certificate is revoked. Every good intention could go into deciding when to revoke a certificate, but over time we should always remember that “the road to hell is paved with good intentions.”
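The failure mode matters here. A revocation check that “fails open” handles a dead server gracefully, but a merely slow server is worse. Here is a minimal, hypothetical sketch (stdlib Python; not Apple’s actual trustd logic) of a soft-fail check:

```python
import socket

def revocation_check(host, port=80, timeout=2.0):
    """Hypothetical soft-fail revocation check (not Apple's real code).

    Soft-fail means: if the responder is unreachable, allow the app
    to launch anyway. The incident's failure mode was different: the
    server accepted connections but responded very slowly, so without
    a short timeout this call blocks, and every app launch blocks with it.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            # A real client would send a DER-encoded OCSP request here
            # and parse the signed response; omitted in this sketch.
            return True
    except OSError:
        # Unreachable responder: fail open, allow the launch.
        return True

# An unreachable responder does not block launches (soft-fail):
print(revocation_check("127.0.0.1", port=9, timeout=0.5))  # → True
```

During the incident the server was up but slow, so the connection attempt succeeded and the long wait stalled every launch; a fully dead server would have failed open almost immediately.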

Nilay Patel (@reckless) had a good take on it as well.

Privacy problems

It turns out Jeffrey Paul’s analysis got one extremely important detail wrong:

It turns out that in the current version of the macOS, the OS sends to Apple a hash (unique identifier) of each and every program you run, when you run it. 

This is incorrect; Jacopo Jannone discovered what is actually being sent:

It is clear that the trustd service on macOS doesn’t send out a hash of the apps you launch. Instead, it just sends information about some certificate - as we would certainly expect after understanding what OCSP is in the first place.

Emphasis mine.

That “some certificate” ends up being the developer’s certificate, the one they use to sign their app. Coupled with the fact that Apple’s OCSP traffic is sent across the wire unencrypted (plain HTTP over port 80), this still has privacy implications, but nothing nearly as bad as sending application hashes across the wire. A third party could only sniff out that I was using software from Microsoft, Adobe, or some small indie developer.

That is still not great! If an ill-intentioned attacker could discern that I was running software from a developer whose products had known security flaws, they could still launch attacks against me. I would prefer that no third party outside my home network know anything about the software I’m running.
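To make the distinction concrete, here is a toy sketch of the observer’s view (the app names and certificate serials are made up for illustration): because the plaintext OCSP lookup identifies the developer’s certificate rather than a per-binary hash, two apps signed by the same developer are indistinguishable on the wire.

```python
import hashlib

# Made-up example data: these developer certificate serials are hypothetical.
apps = {
    "Word.app":     {"developer_cert_serial": "04:AD:3C:72"},
    "Excel.app":    {"developer_cert_serial": "04:AD:3C:72"},  # same developer
    "TinyTool.app": {"developer_cert_serial": "9F:00:11:22"},  # small indie dev
}

def ocsp_lookup_identity(app):
    """What an on-path observer of plaintext OCSP traffic learns:
    which certificate is being checked, not a hash of the app binary."""
    return apps[app]["developer_cert_serial"]

# Two apps signed by the same developer look identical on the wire:
print(ocsp_lookup_identity("Word.app") == ocsp_lookup_identity("Excel.app"))  # → True

# Per-binary hashes, by contrast, would uniquely identify each app:
print(hashlib.sha256(b"Word binary").hexdigest() ==
      hashlib.sha256(b"Excel binary").hexdigest())  # → False
```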

How it’s going

Another blog post from Jeff Johnson dug into the details of a change Apple made:

I didn’t have a lot of data, but the data that I found did indicate that macOS was caching OCSP responses for 5 minutes before Thursday and half a day now.

The best solution (for now) being the simplest, most obvious one: drive down the load on those OCSP servers by decreasing the frequency at which Macs poll for certificate info. That’s something. And the response came quickly, although not quickly enough if you depend upon your Mac for production work and to earn an income.
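The caching change is easy to reason about with a toy model. A sketch (the TTLs mirror the five-minute and half-day figures above; the launch cadence and certificate serial are made up):

```python
import time

class OCSPCache:
    """Minimal TTL cache sketch; not Apple's implementation."""
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.entries = {}  # cert serial -> (response, expiry timestamp)

    def get(self, serial, fetch, now=None):
        now = time.time() if now is None else now
        hit = self.entries.get(serial)
        if hit is not None and hit[1] > now:
            return hit[0]          # fresh cache entry: no network round trip
        response = fetch(serial)   # stale or missing: ask the OCSP responder
        self.entries[serial] = (response, now + self.ttl)
        return response

def count_fetches(cache, launches=1440, interval=60):
    """Count responder hits for one app launched every minute for a day."""
    calls = 0
    def fetch(serial):
        nonlocal calls
        calls += 1
        return "good"
    for i in range(launches):
        cache.get("04:AD:3C:72", fetch, now=i * interval)  # made-up serial
    return calls

before = OCSPCache(ttl_seconds=5 * 60)       # ~5-minute cache, pre-change
after = OCSPCache(ttl_seconds=12 * 60 * 60)  # ~half-day cache, post-change
print(count_fetches(before), count_fetches(after))  # → 288 2
```

Under this made-up cadence, the longer TTL cuts one machine’s responder traffic from 288 requests a day to 2, which is roughly the kind of load shedding a struggling server needs.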

Yet there’s still a core issue, as Jeff notes:

I think it’s important for us outside of Apple to know what’s happening with our computers. I also hope that Apple itself will make a public statement about yesterday’s incident, but sadly Apple’s history of “postmortems” is almost nonexistent.

Apple’s willingness to publicly recognize their mistakes is…just not a thing. And for a company focused on protecting the privacy of its users, the single most important thing they could do is offer honesty and transparency regarding (a) what happened and (b) how they’re going to fix it.

The Washington Post has a great tagline: “Democracy dies in darkness.” It has been proven time and again in the information security world that companies practicing security through obscurity eventually fall victim to a breach. Apple is a behemoth with more money than security sense, so perhaps they can avoid being burned in this arena for a long while yet. But it would generate such an outpouring of goodwill if they would only acknowledge their problems and announce they were redoubling their efforts to do better in the future.