Tech's Dumbest Mistake: Why Firing Programmers for AI Will Destroy Everything
-
I think the core takeaway is you shouldn't outsource core capabilities. If the code is that critical to your bottom line, pay for quality (which usually means no contractors, local or not).
If you outsource to other developers or to AI, they will most likely care less, and someone else can just as easily come along and do it too.
-
I don't know. I look at it like firing all your construction contractors after building out all your stores in a city. You might need some construction trades to maintain your stores, and you might need to relocate a store every once in a while, but you don't need the same construction staff on hand as you did during the initial build-out.
-
...shouldn't outsource core capabilities.
This right here.
-
Great points. Also:
... AI will be great at giving people exactly what they ask for ...
Honestly, I'm not even sure about this. With hallucinations and increasingly complex prompts that it fails to handle, it's just as likely to regurgitate crap. I don't even know if AI will get to a better state before all of this dev-firing starts to backfire and sours most companies on wanting to touch AI for development at all.
Humans talk with humans and do their best to come up with solutions. AI takes prompts and looks at historical human datasets to try to determine what a human would do. It's bound to run into something novel eventually, especially if there aren't new datasets to pull in because human-generated development solutions have become scarce.
-
Also, an LLM doesn't usually have memory or experience. It's the first page of a Google search every time you put in your tokens. A forever-trainee that will never leave that stage of their career.
Human abilities like pattern recognition, intuition, and the accumulation of proven knowledge combine to make us more and more effective at finding the right solution to anything.
The LLM bubble can't replace that, and it actively hurts it, as people get distanced from actual knowledge by the closed door of the LLM. They learn how to formulate their requests instead of learning how to do the stuff they actually need. This kind of outsourcing makes sense when you need a cookie recipe once a year; it doesn't when you work in a bakery. What makes the dough behave each way? You never need to ask, so you'll never know.
And the difference between asking somewhere like Lemmy and asking a chatbot is the utterly convincing manner in which the chatbot tells you things, while forums, Q&A boards, and blogs run by people usually have some humane quality behind the replies, plus the option for someone else to throw a bag of dicks at the suggestion of formatting your system partition or turning stuff off and on.
-
Sure, but they’re not going to fire all of them. They’re going to fire 90%, then make the remaining 10% put out the fires and patch the leaks while working twice as many hours for less pay.
The company will gradually get worse and worse until it goes bankrupt or gets sold, and the c-suite bails with their golden parachutes.
-
I'm just a dabbler at coding, and even I can see that getting rid of programmers and relying on AI will lead to disaster. AI is useful, but only for the smallest scraps of code, because anything bigger gets too muddled. For me, it liked to come up with its own stupid ideas and then insist on staying stuck on them, so I had to constantly reset the conversation. But I did manage to have it make a useful little function that I couldn't have thought up myself, since it used some complex mathematical tricks.
Also, relying on it is a quick way to kind of get things done, but without understanding at all how they work. Eventually this will lead to such horrible and insecure code that no one can fix or maintain it. Though maybe that's a good thing in the end, since it will bring those shitty companies to ruin.
-
Those executives act like parasites. They bring no value and just leech the life from the companies.
-
I think they were also suckered in by the supposed lower cost of running services, which, as it happens, isn't lower at all and is in fact more expensive. But you laid off the datacenter staff, so... pay up, suckers.
Neat toolsets though.
-
The core takeaway is that, with a few exceptions, the executives still don't understand jack shit, and when a smooth-talking huckster dazzles them with ridiculous magic to make them super rich, they all follow him into the pokey.
Judges and executives understand nothing about computers in 2025. That's the fucked-up part. AI is just how we're doing it this time.
-
AI will always require a human to hand-hold it, because AI can never know what's true.
Because it doesn't "know" anything. It only has ratios of usage maps between connected entities we call "words".
Sure, you can run it and hope for the best. But that will fail sooner or later.
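The "ratios of usage" point can be made concrete with a toy sketch. This is a hypothetical bigram model, not how any real LLM works (real models use learned embeddings and attention), but the principle is the same: the next word is picked by statistical ratio, not by any notion of truth.

```python
import random
from collections import Counter, defaultdict

# Toy "language model": just frequency ratios between adjacent words.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each other word.
follows = defaultdict(Counter)
for a, b in zip(corpus, corpus[1:]):
    follows[a][b] += 1

def next_word(word):
    """Sample a continuation weighted purely by observed ratios."""
    counts = follows[word]
    words, weights = zip(*counts.items())
    return random.choices(words, weights=weights)[0]

# "the" was followed by cat twice, mat once, fish once, so the model
# favors "cat" by ratio, with no idea what a cat actually is.
print(follows["the"])
```

There is no fact-checking step anywhere in that loop; a model like this will confidently emit whatever the ratios favor, which is the whole problem the comment above is pointing at.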
-
It's like over-relying on automated systems in aircraft: you get things like planes going into landing mode because they think they're close to the runway.
-
Executives think they are the most important part of the company. They are high level managers, that is all.
-
Even when I ask AI how to do a process, it will frequently respond with answers for the wrong version (even though I specified the version), parameters that don't work, hand-wavy answers that are useless, etc.
-
It’s utterly bizarre. The customers lose out by receiving an inferior product at the same cost. The workers lose out by having their employment terminated. And even the company loses out by having its reputation squandered. The only people who gain are the executives and the ownership.
-
Well, yeah, but those costs are for tomorrow's executives to figure out; we need those profits NOW.
-
While true, that's a weak analogy. Software rots and needs the constant attention of competent people, or the shit stacks up.
-
In my experience, you actually need more people to maintain and extend existing software compared to the initial build out.
Usually because of scalability concerns, increasing complexity of the system and technical debt coming due.
-
The irony of using an AI generated image for this post...
AI imagery makes any article look cheaper in my view, I am more inclined to "judge the book by its cover".
Why would you slap something so lazy on top of a piece of writing you (assuming it isn't also written by AI) put time and effort into?
-
I work for a Fortune 500 company.
just recently lost a principal engineer who built an entire platform over the last four years.
just before they left I noticed they were using AI an awful lot. like...a lot a lot. like, "I don't know the answer on a screen share so I'll ask ChatGPT how to solve the problem and copy/paste it directly into the environment until it works" a lot.
they got fired for doing unrelated shit.
it's taken us three months and hundreds of hours from at least 5 other principal engineers to try to unravel this bullshit, and we're still not close.
the contributions and architecture scream AI all over it.
Point is. I'll happily let idiots destroy the world of software because I'll make fat bank later as a consultant fixing their bullshit.