"[...]someone recently explained to me how great it is that, instead of using data to make decisions, we use political connections, and that the idea of making decisions based on data is a myth anyway; no one does that."
In my experience, the second part is somewhat true. I have yet to see a "data driven" decision that wasn't actually driven by a very political process of choosing what data to gather and how to interpret the results.
(This obviously doesn't mean you should ignore data and focus on politics. Focus on making the politics good so that data can be properly used.)
Spivak · 1h ago
I feel like there must be some shame or stigma among leadership types around admitting that your decisions aren't "data driven", even though we've all experienced that most decisions come down to some executive's intuition. Which is funny, because if the path forward were an obvious conclusion from the data, we wouldn't need decision makers; the company's direction would be set by a council of data science nerds.
themafia · 1h ago
The data you cannot possibly access would be "is our staff going to be capable of handling this challenge within the time frame allotted?" or "will any new issues present themselves during this time frame and reduce the amount of available staff?" Or, well, any of dozens of known failure cases in business.
Good business is about hedging your bets. It's not about creating business processes that always make the right bets.
It's also not the case that good business is the only way to survive, which brings me to one of my favorite stories from FedEx's beginnings. They were short on cash and couldn't make fuel payments or payroll the next day, so the CEO extracted cash from the business, took it to Las Vegas, increased his holdings by gambling, and returned the money to the business the next day.
If it worked, then great, you "saved" the business. If it didn't, then bummer, you're now a felonious embezzler.
It's a piece that lists very good examples of normalization of deviance in organizations.
It happened to me personally as well. I regularly rent a Tesla, and once I took a date on a trip and drove us through the city, her riding shotgun. She said: "Look at the orange line on the screen. You're driving too close to the parked cars on our right."
I answered: "It always does that, the proximity sensor on these Teslas is way too nervous." She looked out of the window and said: "No, you are actually way too close to the parked cars!"
I had totally normalized the proximity warning.
mips_avatar · 3h ago
One unintended consequence of the rolling big tech layoffs is the suppression of weak signals. The layoffs provide very strong signals, and individuals feel no agency to resolve the problems they are observing. Maybe the siloed AI orgs inside Microsoft/Google can still operate, but the rest of these companies are following the layoff orthodoxy.
ashleyn · 1h ago
I think of this concept every time a friend of mine keeps speeding on the highway and brags about not getting caught.
danhite · 1h ago
OP wrote:
> He acknowledged that my way reduced the chance of failure without making the technical consequences of failure worse, but it was more important that we not be embarrassed. Now that I've been working for a decade, I have a better understanding of how and why people play this game, but I still find it absurd.
If OP's embarrassment comment and the topic of normalization of deviance interest you, then you might find this soft (social) science fiction short story to be amusingly enlightening...
"The Trouble with You Earth People" by Katherine MacLean (1968)
https://www.google.com/books/edition/The_Trouble_With_You_Ea...
^ link is to Google Books, and their preview includes the entirety of the titular short story from the collection.
If ^ that short story is tl;dr for you, spoiler alert:
Well-meaning alien-POV discovery that humankind is a self-important and superstitious lot, and not mostly harmless.
tacitusarc · 3h ago
I completely agree with almost all of this.
But…
> Humans are bad at reasoning about how failures cascade, so we implement bright line rules about when it's safe to deploy.
I think aggregate human intuition is often undervalued. Every bright-line rule has a cost, and the total cost of adhering to it must be weighed against the occasional cost of failing to adhere.
Benefits don’t exist in a vacuum.
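(As an illustrative aside, not from the article: a "bright line rule about when it's safe to deploy" usually reduces to a mechanical gate like the hypothetical sketch below, and the cost I mean is every legitimate deploy it blocks.)

    # Hypothetical bright-line deploy gate: block deploys late in the week and
    # outside working hours, regardless of how small or urgent the change is.
    # The days, hours, and the rule itself are made up for illustration.
    from datetime import datetime

    def deploy_allowed(now: datetime) -> bool:
        # No deploys on Friday (weekday 4) or the weekend.
        if now.weekday() >= 4:
            return False
        # Only deploy between 09:00 and 16:00, so someone is around to watch it.
        return 9 <= now.hour < 16

    if not deploy_allowed(datetime.now()):
        raise SystemExit("deploy blocked by bright-line rule")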
bediger4000 · 2h ago
> Microsoft was a joke in the security world for years, until multiple disastrously bad exploits forced them to get serious about security.
Microsoft's security orientation must have peaked before this article (2015), and the culture slid back, because I see a lot of folks bagging on Microsoft security right now. If true, deviance was normalized at Microsoft, de-normalized, and then re-normalized.
jiggawatts · 1h ago
Microsoft's product security in 2025 is nowhere near as bad as it used to be, despite far more code being deployed far more publicly.
For example, Azure offers Microsoft software with various proprietary protocols exposed directly to the Internet, in a way that would have been unthinkable for any competent administrator a decade ago. This includes the SMB file sharing protocol and the SQL Server TDS network protocol.
It's bizarre to me to see a file share and a SQL database just "on" the Internet, no firewall or anything.
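To make that concrete, here's a minimal sketch of what the exposure looks like from the client side, connecting to a hypothetical Azure SQL public endpoint over TDS with pyodbc (server, database, and credential names are made up for illustration):

    import pyodbc

    # Azure SQL Database speaks the TDS protocol on port 1433, reachable from
    # the public Internet; any machine that can resolve the hostname can
    # attempt a connection. All names below are hypothetical.
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 18 for SQL Server};"
        "SERVER=tcp:example-server.database.windows.net,1433;"
        "DATABASE=exampledb;"
        "UID=exampleuser;"
        "PWD=example-password;"
        "Encrypt=yes;"
    )
    cursor = conn.cursor()
    cursor.execute("SELECT @@VERSION")
    print(cursor.fetchone()[0])
    conn.close()

Whether a connection like that is accepted at all comes down to access rules on the endpoint itself rather than a network perimeter in front of it, which is exactly the "just 'on' the Internet" feeling I'm describing.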
Related. Others?
Normalization of Deviance (2015) - https://news.ycombinator.com/item?id=34791106 - Feb 2023 (219 comments)
Normalization of Deviance (2015) - https://news.ycombinator.com/item?id=22144330 - Jan 2020 (43 comments)
Normalization of deviance in software: broken practices become standard (2015) - https://news.ycombinator.com/item?id=15835870 - Dec 2017 (27 comments)
How Completely Messed Up Practices Become Normal - https://news.ycombinator.com/item?id=10811822 - Dec 2015 (252 comments)
What We Can Learn From Aviation, Civil Engineering, Other Safety-critical Fields - https://news.ycombinator.com/item?id=10806063 - Dec 2015 (3 comments)