New Junior Developers Can’t Actually Code.
-
One can classify attitudes toward progress into at least four popular camps:
The dumbest, most clueless jerks think it means replacing something known with something known and better. Progress enthusiasts who don't know a single thing about the areas they're enthusiastic about usually land here.
The cautious and somewhat intellectually limited think it means replacing something known with something unknown. They can sour the mood, but they're generally safe for those around them.
The idealistic idiots think it means replacing something unknown with something known; these are the "order bringers" and revolutionaries. Everybody knows how revolutionaries do things; anyone who doesn't can look at Musk and DOGE.
The only sane kind think it means replacing something unknown with something unknown. That is, when you replace one thing with another, you break more than just the parts you could see and had listed for replacement. Because nature doesn't fscking care what you want to see.
-
I honestly don't know how anyone's been able to code anything production-worthy predominantly using AI.
Maybe it's the way I'm using AI, and to be honest I've only used ChatGPT so far, but if I ask it to generate a bit of code, then ask it to build on it and do the next thing, by about the third or fourth iteration it's forgotten half of what we talked about and dropped bits of the code.
On a number of occasions it's given me a solution, and when I question its accuracy and ask why a bit of it probably won't work, I just get "oh yes, let me adjust that for you".
Maybe I'm doing AI wrong, I don't know, but quite frankly I'll stick with Stack Overflow, thanks.
-
Frankly, I've only used those to generate pictures and sometimes hello-worlds for a few languages, which didn't work and didn't seem to make sense. That was long enough ago.
Also, I have ASD, so it's hard enough for me to make consistent, clear sense of something small. Machine-generated junk to "give ideas" is the last thing I need; my thought process is different.
-
It's only useful for stuff that's been done a million times before, in my experience. As soon as you do anything outside of that, it just starts hallucinating.
-
Unless AI dramatically improves from where LLMs are today (in ways that it so far hasn't), I'm looking forward to the drastic shortage of experienced senior devs in a few years' time.
-
You have to aggressively purge the current chat and give it more abstract references for context. With enough context it can rewrite some logic loops, maybe start a design pattern. You just have to aggressively check the changes.
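To be clear about the scale of task I mean, here's a hypothetical before/after (C++, names invented purely for illustration) of the sort of mechanical loop rewrite it can usually pull off once it has enough context:

    #include <algorithm>
    #include <cctype>
    #include <string>
    #include <vector>

    // Before: a hand-rolled loop collecting upper-cased names.
    std::vector<std::string> upcase_loop(const std::vector<std::string>& names) {
        std::vector<std::string> out;
        for (size_t i = 0; i < names.size(); ++i) {
            std::string s = names[i];
            for (size_t j = 0; j < s.size(); ++j)
                s[j] = static_cast<char>(std::toupper(static_cast<unsigned char>(s[j])));
            out.push_back(s);
        }
        return out;
    }

    // After: the same logic via <algorithm> -- the kind of rewrite an LLM
    // can usually manage with context, and which is easy to verify by eye.
    std::vector<std::string> upcase_algo(const std::vector<std::string>& names) {
        std::vector<std::string> out(names.size());
        std::transform(names.begin(), names.end(), out.begin(), [](std::string s) {
            std::transform(s.begin(), s.end(), s.begin(), [](unsigned char c) {
                return static_cast<char>(std::toupper(c));
            });
            return s;
        });
        return out;
    }

Anything much more creative than that, and you're back to checking every line anyway.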
-
I've said it before, but this is a 20-year-old problem.
After Y2K, all those shops that over-porked on devs began shedding the priciest ones; worse in 'at-will' states.
Who were those devs? Mentors. They shipped less code, closed fewer tickets, and cost more, but their value wasn't in tickets and code: it was in investing in the next generation. And they had to go, because #numbersGoUp.
And they left. And the first gen of devs with no mentorship joined and started their careers. No idea about edge cases, missing middles, or memory management. No lint, no warnings; build and ship and fix the bugs as they come.
And then another generation. And these were the true 'lost boys' of dev. C is dumb, C++ is dumb, Perl is dumb, it's all old, supply-chain exploits don't exist, I made it go so I'm done, fuck support, look at my numbers. It's all low attention span, baling wire, and trophies, because #numbersGoUp.
And let's be fair: they're good at this game, the new way of working where it's a fast finish, a head-pat, and someone else's problem. That's what the companies want, and that's what they built.
They say now that relying on AI means you never really exercise critical thought and problem-solving, and I see it when I'm forced to write fucking YAML for fucking Ansible. I let the GPTs do that for me, without worrying that I won't learn to code YAML for Ansible; coding YAML for Ansible is NEVER going to be on my list of things I want to remember. But we're seeing people do that with actual work, with Go and Rust code, and yeah, no concept of why we want to check for completeness, let alone a concept of how.
One day these new devs will proudly install a patch in the RTOS flashed into your heart monitor and that annoying beep will go away. Sleep tight.
-
Recently my friend was trying to get me to apply for a junior dev position. "I don't have the right skills," I said. "The biggest project I ever coded was a calculator for my Java final, in college, a decade and a half ago."
It did not occur to me that showing up without the skills and using an LLM to half-ass it was an option!
-
Poisoning AI with backdoored code is surely a real risk now? I can see this getting quite nasty.
-
While there is some truth to what you said, it sounded to me too much like "old man yells at clouds", because you are over-generalizing. Not everything new is bad. Don't get stuck in the past; that's just as dumb as relying on AI.
-
Feels like it would be quicker and easier just to write the code myself at that point...
-
You and I read a very different comment, apparently. There was nothing there saying new is bad. Maybe read it again.
-
I remember talking to someone about where LLMs are and aren't useful. I pointed out that LLMs would be absolutely worthless for me as my work mostly consists of interacting with company-internal APIs, which the LLM obviously hasn't been trained on.
The other person insisted that that is exactly what LLMs are great at. They wouldn't explain how exactly the LLM was supposed to know how my company's internal software, which is a trade secret, is structured.
But hey, I figured I'd give it a go. So I fired up a local Llama 3.1 instance and asked it how to set up a local copy of ASDIS, one such internal system (name and details changed to protect the innocent). And Llama did give me instructions... on how to write the American States Data Information System, a Python frontend for a single MySQL table containing basic information about the member states of the USA.
Oddly enough, that's not what my company's ASDIS is. It's almost as if the LLM had no idea what I was talking about. Words fail to express my surprise at this turn of events.
-
I'm a little defeatist about it. I saw with my own 3 eyes how a junior asked ChatGPT how to insert something into an std::unordered_map. I tell them about cppreference. The little shit tells me "Sorry unc, ChatGPT is objectively more efficient". I almost blew a fucking gasket, mainly cuz I'm not that god damn old. I don't care how much you try to convince me that LLMs are efficient; there is no shot they are more efficient than opening a static page with all the info you would ever need. Not even considering energy efficiency. Utility aside, the damage we have dealt to developing minds is irreversible. We have convinced them that thought is optional. This is gonna bite us in the ass. Hard.
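For the record, the cppreference page settles it in seconds. A minimal sketch of the usual ways to insert (plain C++17; the map and values here are made up for illustration):

    #include <iostream>
    #include <string>
    #include <unordered_map>

    int main() {
        std::unordered_map<std::string, int> ages;

        ages["alice"] = 30;        // operator[]: inserts, or overwrites an existing key
        ages.insert({"bob", 25});  // insert: does nothing if the key already exists
        ages.emplace("carol", 41); // emplace: constructs the element in place

        for (const auto& [name, age] : ages)
            std::cout << name << ": " << age << '\n';
    }

The difference between overwriting and leaving an existing key alone is exactly the kind of detail a chatbot summary tends to flatten.
-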
What are you guys working on where ChatGPT can figure it out? Honestly, I haven't been able to get a scrap of working code beyond a trivial example out of that thing or any other LLM.
-
I have seen this too much. My current gripe isn't fresh devs, as long as they are teachable and care.
My main pain over the last several years has been the bulk of 'give-no-shit' perms/contractors who don't want to think or try when they can avoid it.
They run a web of lies until it is no longer sustainable (or, for contractors, until the project is done), and then, again, it's someone else's problem.
There are plenty of 10-or-20-year-plus devs who don't know what they are doing and don't care whose problem it will be, as long as it isn't theirs.
I'm sick of writing coding-101 standards for 1k-plus-a-day 'experts'. Even more sick of PR feedback where it's a battle to get things done in a maintainable manner by said 'experts'.
-
How is it more efficient than reading a static page? The kids can't read.
They weren't taught phonics; they were taught to guess words from context clues. It's called "whole language" or "balanced reading".
-
Junior devs could never code, yes, including us.
-
Yeah, and the way it will confidently give you a wrong answer instead of either asking for more information or saying it just doesn't know is equally annoying.