It’s Not The Next Outage

(This column is posted at www.StevenSavage.com, Steve’s Tumblr, and Pillowfort.  Find out more at my newsletter, and all my social media at my linktr.ee)

So the CrowdStrike Outage of 2024 happened. Actually, let me clarify: the CrowdStrike Outage of July 2024. I might as well be specific, because that one was a doozy, it exposed some wide-ranging system instabilities, and considering what a disaster it was, there may well be another.

If you don’t know what I’m talking about, an update to some security software bricked a lot of Windows machines in a disaster that shouldn’t have happened. If “security software shut down systems” sounds bad, yes, it was!

If a security disaster like that happens AND you work in IT, AND your friends are nerds and/or work in IT, you know MY experience. I spent most of that Friday quietly losing my mind.

Of course there are questions of “how do we avoid the next outage,” which is sort of sad, because you’d kind of like there not to be one, or at least not one as widespread. But I don’t think that’s quite the issue: preparing for the next Giant Oopsie misses two things.

First, this exposed just how vulnerable systems are, and I’m worried about intentional attacks. We saw in real time how a software update could destroy systems. We saw how people did, or didn’t, recover. We saw where vulnerabilities might be. We wondered what would have happened had this hit during another crisis: a hurricane, a terrorist attack, and so on.

CrowdStrike was a mix of blueprint, roadmap, and test run for how to screw up IT systems worldwide. This is what happened by accident, which means intentional attacks are now much easier to pull off effectively. We need to worry about intention.

Imagine a CrowdStrike-like outage, but something more destructive, not just an issue that can in theory be fixed by rebooting fifteen times. Something designed to not be recoverable, an IT WMD.

Secondly, we’ve just seen that many major systems are just plain vulnerable, period. Everyone is on Windows, a lot of people use CrowdStrike, and recovery was left to each individual organization. Though I was impressed with the global recovery effort, if you’re an IT pro or hang out with them (I do both), you know this was not easy.

Recovering from a one-shot, quickly caught error is one thing. But it’s a reminder that we are very vulnerable and might want to question how a lot of infrastructure is set up. How many smaller-scale disasters do we not see because they weren’t big news? My general take is that systems need to be easier to recover, more diverse, and, honestly, more walled off.

Also, we need to stop depending on heroism in IT security. It should be incredibly boring.

The next CrowdStrike-type error should not happen. But right now my concern is what happens intentionally, what may happen on a smaller scale at first, and that we’re probably not ready for either.

CrowdStrike was a wake-up call about so many things wrong in modern infrastructure, so many things that could go wrong. As much as the company screwed up massively, there’s far more to worry about.

Steven Savage

The Scale of Victims

(This column is posted at www.StevenSavage.com, Steve’s Tumblr, and Pillowfort.  Find out more at my newsletter, and all my social media at my linktr.ee)

It sure seems there are a lot of IT security breaches lately. In fact, it’s gotten to the point where I can’t remember which one inspired this column. It’s probably just as well, since you can map whatever horrific violation of privacy you heard of this week onto this column. There, I’ve sort of written something relatively timeless, because people are dumb.

One of the things I wonder about is why more CTOs, CIOs, and so forth aren’t being taken to court, followed by reporters, and in general held freaking responsible for their companies having lousy security. Yes, there’s all sorts of shielding from accountability, and you’d think we’d see some effort anyway, but I think one thing protecting them is that the company is seen mostly as a victim.

I’d argue that’s technically right: the companies were attacked by some external force. But treating companies as the equivalent of people ignores their responsibilities. People, individual moral agents, can be victims, but corporations are not people and not moral agents, and treating them as victims the way we treat people lets them off the hook. Sorry, Mitt Romney.

Think about a person who is the victim of a crime. Though people often try to blame victims, those blamers are usually both wrong and assholes (and sometimes justifying their own crimes). A person who is the victim of a crime is a victim in that someone else chose to behave criminally. Even if said victim increased their own danger, it doesn’t remove the culpability of the criminal, who violated social and legal norms that people are expected to follow.

When I watch people shrug as corporation after corporation has customer records dumped on the dark web, I see comments about how crappy their security is, but those comments don’t seem particularly judgmental. This strikes me as an echo of the don’t-blame-the-victim mentality.

But corporations are groups of people: organizations. An organization makes certain agreements and promises in order to exist, and the security of data is, obviously, among them. If customer data is breached, then despite the criminal’s actions, the organization also bears responsibility, because it made those promises. If you’re in leadership, you should be on the line, because you promised this wouldn’t happen.

Organizations are about promises and responsibility. Screw that up and, no matter why, someone has to pay, because your failure hurt the organization and the people involved. That doesn’t mean holding back from going after the people who committed the actual crime, but corporations have made promises. If you can’t keep them, you’ve got a problem.

In fact, I’d say a corporation that suffers a data breach or similar failure must be investigated to see if it violated social norms. If the corporation made guarantees it could not and did not keep, if a good-faith effort was not made, then the corporation is responsible. There is a failure of the company that echoes the action of the criminal: it, too, violated norms.

Of course, we all know that if we ask this at all, we’ll find a lot of corporations have done a terrible job at security. It’s all cost cutting, half-assed integration, and big bonuses. A lot of companies, if they were really investigated for security problems, would be locked down and sold off for being that bad.

(And yes, I work in healthcare, which has insanely strict rules, but everyone should have them for everything, and we should remember that these rules protect people.)

We don’t need to act like corporations are victims the way people are. If they can’t keep their promises, if security violations reveal they’ve done a poor job of protecting people, they’re part of the problem. Some of them should pay. Some shouldn’t exist.

Steven Savage

Living In The Future We Were Sold

(This column is posted at www.StevenSavage.com, Steve’s Tumblr, and Pillowfort.  Find out more at my newsletter, and all my social media at my linktr.ee)

We’re living in the future, and it’s lousy.

So-called AI is just Ultra-Clippy being shoehorned into everything that will temporarily goose stock prices. We’ve got computerized cars that allow us to bluescreen while driving, and universal automated cars are many dreams and lawsuits away. Phones gave us something like Star Trek gadgets, but we’re using them to become depressed by doomscrolling. I could make a comment on the Cybertruck, but honestly, that seems pointless.

We’ve got a lot of things we think are futuristic, and a lot of them are lame, terrible, pointless, or come with side effects. Plus, you know, we’ve got climate change, Nazis, and pandemics as well.

The future isn’t what it used to be? No, the problem is we’re living in the future we were sold.

A lot of our futuristic ideas derive from popular culture, but that popular culture has nothing to do with what we can, should, or even may want to do. A lot of our popular culture is what people could sell us or what worked in the media of the time. It has nothing to do with the possible or the necessary.

AI? It’s easier to just have Hunky Space Captain talk to the computer, because no one wants to watch someone scroll on a monitor. Besides, it sounds cool. Also, if you’re bored, eventually the computer can try to murder people as part of the plot, a real horror-film twist. But do we need it?

Automated cars are a dream, especially if you’ve ever driven . . . well, anywhere. It’s a dream that’s cool and convenient and doesn’t have messy people, and it looks awesome in films. It doesn’t deal with the reality that driving needs a moral actor to make decisions, even if you’re paying them by the mile. It also doesn’t deal with outages, software updates, and crashes.

Then there are our phones, our pocket computers. This is a totally understandable dream, of course, going back to hand-held sci-fi gizmos and communicators. It’s just that we never asked how we’d misuse them, as if people wouldn’t find some weird use for a technology five seconds after inventing it.

All of these are things we’ve seen in pop culture since the ’60s (and I’d argue a lot of what we’re living in is very ’80s). But it’s not stuff from speculative fiction or deep analysis or hard questions about what we want and need in the future. It’s stuff that was fun to put into movies, TV, and comics.

That’s it. For many of us, the future we envision is something that was marketable.

So of course all the backfire we’re experiencing is a surprise. We weren’t buying a warning; we were buying a cool experience.

“A good science fiction story should be able to predict not the automobile but the traffic jam,” said Frederik Pohl. Indeed it should. It’s just that sometimes the warnings don’t sell, and other times people think the warning is cool (see many a stylish dystopia with lots of leather for no reason).

So much of the future that people want, or at least are trying to sell us, seems to just be whatever was lying around in pop culture for a while. It doesn’t have anything to do with speculation, or possibility, or what we need. It’s what many of us assume the future is supposed to be, because we bought it.

But what is the future we really want and need? The struggle is to find that, and perhaps in this time when the future we bought is failing us, we have a chance to find it.

Steven Savage