Tech's Dumbest Mistake: Why Firing Programmers for AI Will Destroy Everything
-
I find it most useful when asking it for small snippets of code or dealing with boilerplate. Anything more complicated usually results in something broken.
-
There's also the tribal knowledge of people who've worked somewhere for a few years. There's always a few people who just know where or how a particular thing works and why it works that way. AI simply cannot replace that.
-
Institutional knowledge takes years to replace.
-
I also find it is best when I'm very specific and give as many identifiers as possible. App name, version, OS, quoted error code, etc.
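As a sketch of what "very specific" means in practice, compare a vague query with one loaded with identifiers (the app name, version, and error text below are made up for illustration):

```
vague:    "my app crashes when I save, how do I fix it?"

specific: "MyEditor 2.4.1 on Windows 11 23H2 shows
          'EPERM: operation not permitted, open settings.json'
          when saving preferences. What causes this?"
```

The quoted error string and exact version numbers give the model concrete tokens to match against, instead of forcing it to guess at the context.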
-
I know that it's a meme to hate on generated images, but people need to understand just how much that ship has sailed.
Getting upset at generative AI is about as absurd as getting upset at CGI special effects or digital images. Both of these things were the subject of derision when they started being widely used. CGI was seen as a second rate knockoff of "real" special effects and digital images were seen as the tool of amateur photographers with their Photoshop, acting as a crutch in place of photography talent.
No amount of argument from film purists, or nostalgia for the old days of puppets and models in movies, was going to stop computer graphics and digital image capture and manipulation. Today those arguments seem so quaint and ignorant that most people are not even aware there was a controversy.
Digital images and computer graphics have nearly completely displaced film photography and physical model-based special effects.
Much like those technologies, generative AI isn't going away and it's only going to improve and become more ubiquitous.
This isn't the hill to die on, no matter how many upvotes you get.
-
This is a bad analogy.
It would be more akin to firing your fire department because you installed automatic hoses in front of everyone’s homes. When a fire starts, the hoses will squirt water towards the fire, but sometimes they’ll miss, sometimes they’ll squirt backwards, sometimes they’ll squirt the neighbour’s house, and sometimes they’ll squirt the fire.
-
Companies that are incompetently led will fail and companies that integrate new AI tools in a productive and useful manner will succeed.
Worrying about AI replacing coders is pointless. Anyone who writes code for a living understands the limitations that these models have. It isn't going to replace humans for quite a long time.
Language models are hitting some hard limitations, and we're unlikely to see improvements continue at the same pace.
Transformers, Mixture of Experts, and some training-efficiency breakthroughs all happened around the same time, which gave the impression of an AI explosion. But current models are essentially already taking advantage of all of them, and we're seeing pretty strong diminishing returns on larger training sets.
So language models, absent a new revolutionary breakthrough, are largely as good as they're going to get for the foreseeable future.
They're not replacing software engineers, at best they're slightly more advanced syntax checkers/LSPs. They may help with junior developer level tasks like refactoring or debugging... but they're not designing applications.
-
This is absolutely by design. The corporate raider playbook is well-read. See: Sears, Fluke, DeWalt, Boeing, HP, Intel, Anker, any company purchased by Vista (RIP Smartsheet, we barely knew ye), and so on. Find a brand with an excellent reputation, gut it, strip mine that goodwill, abandon the husk on a golden parachute, and make sure to not be the one holding the bag.
-
Yeah, I'm sure they left the spelling mistake in the image on purpose to get increased engagement from pedants like me. I'm sorry, it works on me.
-
-
people don't like generated art because it's trained on copyrighted data, but if you don't believe in copyright then it's a tool like any other
-
give it a few more years
-
I'm fine with this. Let it all break, we've earned it.
-
There are thousands of different diffusion models, not all of them are trained on copyright protected work.
In addition, substantially transformative works are allowed to use content that is otherwise copy protected under the fair use doctrine.
It's hard to argue that a model, a file containing the trained weight matrices, is in any way substantially similar to any existing copyrighted work. TL;DR: There are no pictures of Mickey Mouse in a GGUF file.
Fair use has already been upheld in the courts concerning machine learning models trained using books.
For instance, under the precedent established in Authors Guild v. HathiTrust and upheld in Authors Guild v. Google, the US Court of Appeals for the Second Circuit held that mass digitization of a large volume of in-copyright books in order to distill and reveal new information about the books was a fair use.
And, perhaps more pragmatically, the genie is already out of the bottle. The software and weights are already available and you can train and fine-tune your own models on consumer graphics cards. No court ruling or regulation will restrain every country on the globe and every country is rapidly researching and producing generative models.
The battle is already over, the ship has sailed.
-
I wonder if there will eventually be a real Butlerian Jihad
-
this post is about programmers being replaced by ai. the writer seems ok with artists being replaced.
-
https://defragzone.substack.com/p/run-massive-models-on-crappy-machines
the author doesn't oppose AI, just programmers being replaced for it.
-
I don't disagree with that, but there's so many "wtf is this shit" moments that defy all logic and known practices.
like for example, six different branches of the same repo that deploy to two different environments in a phased rollout. branches 1-3 are prod, 4-6 are dev. phases go 3,1,2 for prod and 6,4,5 for dev. they are numbered as well.
also, the pipelines create a new bucket every build. so there's over 700 S3 buckets with varying versions of the frontend....that then gets moved into....another S3 bucket with public access.
my personal favorite is the publicly accessible and non-access-controlled lambdas with hard-coded lambda invocation URLs in them. lambda A has a public invocation URL configured instead of using API Gateway. Lambda B has that invocation URL hard-coded into the source that's deployed.
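As a side note, that function-URL anti-pattern is at least easy to lint for, since bare Lambda function URLs follow AWS's `https://<id>.lambda-url.<region>.on.aws/` shape. A minimal sketch (the endpoint below is hypothetical, not from the repo described above):

```python
import re

# Matches a bare Lambda function URL, i.e. traffic that bypasses
# API Gateway (no auth, throttling, or WAF in front of it).
LAMBDA_URL_RE = re.compile(r"https://[a-z0-9]+\.lambda-url\.[a-z0-9-]+\.on\.aws/?")

def is_bare_lambda_url(url: str) -> bool:
    """True if `url` is a direct Lambda function URL."""
    return bool(LAMBDA_URL_RE.fullmatch(url))

# hypothetical hard-coded endpoint like the ones described above
print(is_bare_lambda_url("https://abc123.lambda-url.us-east-1.on.aws/"))  # True
print(is_bare_lambda_url("https://api.example.com/v1/backend"))           # False
```

Grep the deployed source for that pattern and you'd catch Lambda B's hard-coded dependency before it ships.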
there's so much negligent work here I swear they did it on purpose.
-
Literally anybody who thought about the idea for more than ten seconds already realized this a long time ago; apparently this blog post needed to be written for the people who didn't even do that...
-
A reason I didn't see listed: they are just asking for competition. Yes, by all means, get rid of your most talented people, the ones who know how your business is run.