The Un-Measurable Cost of Bullshit

(This column is posted at www.StevenSavage.com, Steve’s Tumblr, and Pillowfort.  Find out more at my newsletter, and all my social media at my linktr.ee)

As my regular readers are painfully aware, I feel a lot of the world is awash in bullshit, and the technology world triply so. We’re sold products we don’t need, that don’t do what we want, from companies that will then collapse and be sold off for parts. Meanwhile, too much of the media celebrates innovations that basically burn money and forests while delivering nothing but stock prices. And if you think you know what I’m talking about, once again, don’t be so sure: I have a long list of grievances.

And I wonder: how much does this stuff cost us? I’m not just talking money, but time, social damage, environmental damage, and having to clean up after it all falls apart.

I think it’s hard to measure because a lot of the economic bullshit is now a loop.

Investors invest in each other and in the people they know to get a return, even if the service won’t provide anything. The media breathlessly starts a hype cycle about nothing, then does it again weeks or months later having learned little. Bookkeeping flummery keeps the real costs off the books and out of view. Environmental impact is exported. It’s a giant cycle that consumes a lot of time and resources to keep people from asking what time and resources are being consumed.

And we do it all over again, more rapidly each time.

We can’t measure the costs of all this meaninglessness: it moves too fast, there isn’t enough data, much of the data is made up, and we’ll do it all again anyway. We know there’s bullshit in the economy, but we can’t penetrate the veil of it to figure out what it costs us until the bill comes due the hard way.

It’s enough to make you wish you could yell “stop” and we’d all just stop inventing stuff for ten years so we could pick up the pieces and see how much people were lying. And yes, I thought about how long that freeze should be.

I have the unsettling feeling that an enormous amount of our economy is waste that yields little more than line-go-up for a tiny number of people. But I’d like it if we could pause and find out.

Pause voluntarily, that is. Judging by the way our climate is changing, we’re gonna get a pretty hard pause involuntarily.

Steven Savage

It’s Not The Next Outage

(This column is posted at www.StevenSavage.com, Steve’s Tumblr, and Pillowfort.  Find out more at my newsletter, and all my social media at my linktr.ee)

So the CrowdStrike Outage of 2024 happened. Actually, let me clarify: the CrowdStrike Outage of July 2024. I might as well be clear, because that one was a doozy and it showed some wide-ranging system instabilities. Also, considering it was such a disaster, maybe there will be another.

If you don’t know what I’m talking about, an update to some security software bricked a lot of Windows machines in a disaster that shouldn’t have happened. If “security software shut down systems” sounds bad, yes, it was!

If “security disaster happened” AND you work in IT, AND your friends are nerds and/or work in IT, you know MY experience. I spent most of that Friday quietly losing my mind.

Of course there are questions of “how do we avoid the next outage,” which is sort of sad, because you’d kind of like there not to be one, or at least not one as widespread. But I don’t think that’s quite the issue: preparing for the next Giant Oopsie misses two things.

First, this exposed just how vulnerable systems are, and I’m worried about intentional attacks. We saw in real time how a software update could destroy systems. We saw how people did – or didn’t – recover. We saw where vulnerabilities might be. We wondered what would have happened had this hit during another crisis – a hurricane, a terrorist attack, and so on.

CrowdStrike was a mix of blueprint, roadmap, and test run for how to screw up IT systems worldwide. This is what you get by accident, meaning intentional attacks are now much easier to pull off effectively. We need to worry about intention.

Imagine a CrowdStrike-like outage, but more destructive, not just an issue that can in theory be fixed by rebooting 15 times. Something designed to not be recoverable: an IT WMD.


Second, we’ve just seen that many major systems are plain vulnerable, period. Everyone is on Windows, a lot of people use CrowdStrike, and recovery plans were left to each individual organization. Though I was impressed with the global recovery, if you’re an IT pro or hang out with them (I do both), you know this was not easy.

Recovering from a one-shot, quickly caught error is one thing. But it’s a reminder that we are very vulnerable and might want to question how a lot of infrastructure is set up. How many smaller-scale disasters do we never see because they weren’t big news? My general take is that systems need to be easier to recover, more diverse, and honestly more walled off.

Also we need to stop depending on heroism in IT security. It should be incredibly boring.

The next CrowdStrike-type error should not happen. But right now my concern is what happens intentionally, what may happen on a smaller scale at first, and that we’re probably not ready for either.

CrowdStrike was a wake-up call about so many things wrong in modern infrastructure, so many things that could go wrong. As much as the company screwed up massively, there’s far more to worry about.

Steven Savage

The Scale of Victims

(This column is posted at www.StevenSavage.com, Steve’s Tumblr, and Pillowfort.  Find out more at my newsletter, and all my social media at my linktr.ee)

It sure seems there are a lot of IT security breaches lately. In fact, it’s to the point where I can’t remember which one inspired this column. It’s probably just as well, since you can map whatever horrific violation of privacy you heard of this week onto this column. There, I’ve sort of written something relatively timeless, because people are dumb.

One of the things I wonder about is why more CTOs, CIOs, and so forth aren’t being taken to court, followed by reporters, and in general held freaking responsible for their companies having lousy security. Yes, there’s all sorts of shielding from accountability, and you’d think we’d see some effort anyway, but I think one thing protecting them is that the company is seen mostly as a victim.

I’d argue that’s technically right: the companies were attacked by some external force. But treating companies as the equivalent of people ignores their responsibilities. People, individual moral agents, can be victims, but corporations are not people and not moral agents, and treating them as victims the way we treat people lets them out of their responsibilities. Sorry, Mitt Romney.

Think about a person who is a victim of a crime. Though people often try to blame victims, those blamers are usually both wrong and assholes (and sometimes justifying their own crimes). A person who is a victim of a crime is a victim in that someone else chose to behave criminally. Even if said victim increased their own danger, it doesn’t remove the culpability of the criminal, who violated social and legal norms that people are expected to follow.

When I watch people shrug as corporation after corporation has customer records placed on the dark web, I see comments about how crappy their security is, but it doesn’t seem particularly judgmental. This impresses me as an echo of the don’t-blame-the-victim mentality.

But corporations are groups of people – organizations. An organization makes certain agreements and promises in order to exist, and security of data is, obviously, part of them. If someone’s data is breached, then despite the criminal’s actions, you also bear responsibility, because you took that responsibility on. If you’re leadership, you should be on the line, because you made a promise that this wouldn’t happen.

Organizations are about promises and responsibility. Screw that up, and no matter why, someone has to pay, because your failure hurt the organization and the people involved. You don’t have to hold back from going after the people who committed the actual crime, but corporations have made promises. If you can’t keep them, you’ve got a problem.

In fact, I’d say a corporation that suffers a data breach or similar failure must be investigated to see if it violated social norms. If the corporation made guarantees it could not and did not keep, if a good-faith effort was not made, the corporation was responsible. That failure of the company echoes the action of the criminal: it, too, violated norms.

Of course we all know that if we ask this at all, we’ll find a lot of corporations have done a terrible job at security. It’s all cost cutting, half-assed integration, and big bonuses. A lot of companies, if they were really investigated for security problems, would be locked down and sold off for being terrible.

(And yes, I work in Healthcare, which has insanely strict rules, but everyone should have them for everything, and we should remember that these rules protect people.)

We don’t need to act like corporations are victims like people. If they can’t keep their promises, if security violations reveal they’ve done a poor job of protecting people, they’re part of the problem. Some of them should pay. Some shouldn’t exist.

Steven Savage