New Junior Developers Can’t Actually Code.
-
What are you guys working on where ChatGPT can figure it out? Honestly, I haven't been able to get a scrap of working code beyond a trivial example out of that thing or any other LLM.
When I had to get up to speed on a new language, it was very helpful. It's also great for writing low-to-medium-complexity scripts in Python, PowerShell, and Bash, and for making Ansible tasks. That said, I've been programming for ~30 years and could have done those things myself if I needed to, but it would have taken some time (a lot of it looking up documentation and writing boilerplate code). It's also nice for writing C# unit tests.
However, the times I've been stuck on my main languages, it's been utterly useless.
-
How is it more efficient than reading a static page? The kids can't read.
They weren't taught phonics; they were taught to guess the word from context clues. It's called "whole language" or "balanced reading". Literacy rates are on a severe decline in the US, and AI is only going to make that worse.
Over half of Americans between 16 and 74 read below a 6th grade level (that's below the expected reading level of an 11-year-old!)
-
I've said it before, but this is a 20-year-old problem.
After Y2K, all those shops that over-porked on devs began shedding the most pricey ones; worse in 'at will' states.
Who were those devs? Mentors. They shipped less code, closed fewer tickets, cost more, but their value wasn't in tickets and code: it was investing in the next generation. And they had to go because #numbersGoUp
And they left. And the first gen of devs with no mentorship joined and started their careers. No idea about edge cases, missing middles or memory management. No lint, no warnings, build and ship and fix the bugs as they come.
And then another generation. And these were the true 'lost boys' of dev. C is dumb, C++ is dumb, Perl is dumb, it's all old, supply chain exploits don't exist, I made it go so I'm done, fuck support, look at my numbers. It's all low attention span, baling wire, and trophies because #numbersGoUp.
And let's be fair: they're good at this game, the new way of working where it's a fast finish, a head-pat, and someone else's problem. That's what the companies want, and that's what they built.
They say now that relying on AI means one never really exercises critical thought and problem-solving, and I see it when I'm forced to write fucking YAML for fucking Ansible. I let the GPTs do that for me, without worrying that I won't learn to code YAML for Ansible. Coding YAML for Ansible is NEVER going to be on my list of things I want to remember. But we're seeing people do that with actual work, with Go and Rust code, and yeah, no concept of why we want to check for completeness, let alone how.
One day these new devs will proudly install a patch in the RTOS flashed into your heart monitor and that annoying beep will go away. Sleep tight.
and I see it when I'm forced to write fucking YAML for fucking Ansible. I let the GPTs do that for me, without worrying that I won't learn to code YAML for Ansible. Coding YAML for Ansible is NEVER going to be on my list of things I want to remember.
Feels like this is the attitude towards programming in general nowadays.
-
I'm a little defeatist about it. I saw with my own 3 eyes how a junior asked ChatGPT how to insert something into an `std::unordered_map`. I tell them about cppreference. The little shit tells me "Sorry unc, ChatGPT is objectively more efficient". I almost blew a fucking gasket, mainly cuz I'm not that god damn old. I don't care how much you try to convince me that LLMs are efficient, there is no shot they are more efficient than opening a static page with all the info you would ever need. Not even considering energy efficiency. Utility aside, the damage we have dealt to developing minds is irreversible. We have convinced them that thought is optional. This is gonna bite us in the ass. Hard.
I work in a small company that hardly hires at all... Stories like this scare me because I have no way to personally quantify how common that kind of attitude might be.
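(Re the `std::unordered_map` question above: the static-page answer the junior was avoiding fits in a few lines. A minimal sketch, with made-up keys and values, just to show the usual insertion routes:)
```cpp
#include <iostream>
#include <string>
#include <unordered_map>

int main() {
    std::unordered_map<std::string, int> ages;

    ages.insert({"alice", 30});  // no-op if the key already exists
    ages.emplace("bob", 25);     // constructs the pair in place
    ages["carol"] = 41;          // inserts, or overwrites an existing value

    for (const auto& [name, age] : ages)
        std::cout << name << ": " << age << '\n';
}
```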
-
Literacy rates are on a severe decline in the US, and AI is only going to make that worse.
Over half of Americans between 16 and 74 read below a 6th grade level (that's below the expected reading level of an 11-year-old!)
We have the same problem with literacy here in Sweden. It’s unnerving to think that these kids will need to become doctors, lawyers and police officers in the future.
-
It's only useful for stuff that's been done a million times before, in my experience. As soon as you do anything outside of that, it just starts hallucinating.
This is exactly right. AI can only interpolate between data points. I used to write code for research papers, and ChatGPT couldn't understand a thing I asked of it.
-
I'm a little defeatist about it. I saw with my own 3 eyes how a junior asked ChatGPT how to insert something into an `std::unordered_map`. I tell them about cppreference. The little shit tells me "Sorry unc, ChatGPT is objectively more efficient". I almost blew a fucking gasket, mainly cuz I'm not that god damn old. I don't care how much you try to convince me that LLMs are efficient, there is no shot they are more efficient than opening a static page with all the info you would ever need. Not even considering energy efficiency. Utility aside, the damage we have dealt to developing minds is irreversible. We have convinced them that thought is optional. This is gonna bite us in the ass. Hard.
It's going to get worse. I suspect this'll end with LLMs taking the part of production programs. Juniors just feed it scenarios to follow, hook the thing up to a database and a web page, and let it run. It'll gobble power like there's no tomorrow and be a nightmare to maintain, but it goes live in a quarter of the time, so every manager goes with that.
-
Junior devs could never code, and yes, that includes us.
Agreed. A few years back, the devs looking for quick fixes would go over to StackOverflow and just copy answers without reading the explanations. This caused the same type of problems that OP is talking about. That said, the ease of AI might be making things even worse.
-
I work in a small company that hardly hires at all... Stories like this scare me because I have no way to personally quantify how common that kind of attitude might be.
Look, ultimately the problem is the same as it has always been: juniors doing junior shit. There's just more of it going on. If you're hiring one, you put a senior on them ready to extinguish fires. A good review process is a must.
Now that I think about it, there was this one time the same young'un I was talking about tried to commit this insane subroutine that was basically resizing a vector in the most roundabout way imaginable. Probably would have worked, but you can also just use the `resize` method, y'know? In retrospect, that was probably some Copilot bullshit, but because we have a review process in place, it was never an issue.
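(A sketch of the difference; the roundabout version is invented for illustration, since the original subroutine isn't shown:)
```cpp
#include <vector>

// The roundabout way: build a bigger vector by hand and copy everything over.
std::vector<int> grow_roundabout(const std::vector<int>& v, std::size_t n) {
    std::vector<int> bigger(n);
    for (std::size_t i = 0; i < v.size() && i < n; ++i)
        bigger[i] = v[i];
    return bigger;
}

// The one-liner: std::vector already knows how to do this.
void grow(std::vector<int>& v, std::size_t n) {
    v.resize(n);  // grows (value-initializing new elements) or shrinks to n
}
```
-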
Yeah, and the way it will confidently give you a wrong answer instead of either asking for more information or saying it just doesn't know is equally annoying.
Because giving answers is not an LLM's job. An LLM's job is to generate text that looks like an answer. We then try to coax that machinery into generating correct answers as often as possible, with mixed results.
-
When I had to get up to speed on a new language, it was very helpful. It's also great for writing low-to-medium-complexity scripts in Python, PowerShell, and Bash, and for making Ansible tasks. That said, I've been programming for ~30 years and could have done those things myself if I needed to, but it would have taken some time (a lot of it looking up documentation and writing boilerplate code). It's also nice for writing C# unit tests.
However, the times I've been stuck on my main languages, it's been utterly useless.
ChatGPT is extremely useful if you already know what you're doing. It's garbage if you're relying on it to write code for you. There are nearly always bugs and edge cases and hallucinations and version mismatches.
It's also probably useful for looking like you kinda know what you're doing as a junior in a new project. I've seen some shit in code reviews that was clearly AI slop. Usually from exactly the developers you expect.
-
What are you guys working on where ChatGPT can figure it out? Honestly, I haven't been able to get a scrap of working code beyond a trivial example out of that thing or any other LLM.
I used it a few days ago to translate a math formula into code.
Here is the formula:
https://wikimedia.org/api/rest_v1/media/math/render/svg/126b6117904ad47459ad0caa791f296e69621782
It's not the most complicated thing. I could have done it myself, but it would have taken me some time. I just input the formula directly along with the desired language, and the result was well done and worked flawlessly.
It saved me some time.
-
I've said it before, but this is a 20-year-old problem.
After Y2K, all those shops that over-porked on devs began shedding the most pricey ones; worse in 'at will' states.
Who were those devs? Mentors. They shipped less code, closed fewer tickets, cost more, but their value wasn't in tickets and code: it was investing in the next generation. And they had to go because #numbersGoUp
And they left. And the first gen of devs with no mentorship joined and started their careers. No idea about edge cases, missing middles or memory management. No lint, no warnings, build and ship and fix the bugs as they come.
And then another generation. And these were the true 'lost boys' of dev. C is dumb, C++ is dumb, Perl is dumb, it's all old, supply chain exploits don't exist, I made it go so I'm done, fuck support, look at my numbers. It's all low attention span, baling wire, and trophies because #numbersGoUp.
And let's be fair: they're good at this game, the new way of working where it's a fast finish, a head-pat, and someone else's problem. That's what the companies want, and that's what they built.
They say now that relying on AI means one never really exercises critical thought and problem-solving, and I see it when I'm forced to write fucking YAML for fucking Ansible. I let the GPTs do that for me, without worrying that I won't learn to code YAML for Ansible. Coding YAML for Ansible is NEVER going to be on my list of things I want to remember. But we're seeing people do that with actual work, with Go and Rust code, and yeah, no concept of why we want to check for completeness, let alone how.
One day these new devs will proudly install a patch in the RTOS flashed into your heart monitor and that annoying beep will go away. Sleep tight.
No one wants mentors. The way to move up in IT is to switch jobs every 24 months. So if you're paying mentors huge salaries to train juniors from velocity drags into velocity boosters, you do it knowing they're going to leave and take all that investment with them to a higher paycheck.
I'm not saying this is right, but that's the reality from the paycheck side of things, and I think there needs to be radical change on both sides. Like a trade union or something. The union takes responsibility for certifying skills and suitability, companies can be more confident in their hires, juniors have mentors to learn from, mentors ensure juniors have the aptitude and intellectual curiosity necessary to do the job well, and I guess pay becomes more skill- and experience-based so developers don't have to hop jobs to get paid what they're worth.
-
What are you guys working on where ChatGPT can figure it out? Honestly, I haven't been able to get a scrap of working code beyond a trivial example out of that thing or any other LLM.
Agreed. I wanted to test a new config in my router yesterday, which is configured using scripts. So I thought it would be a good idea for ChatGPT to figure it out for me, instead of 3 hours of me reading documentation and trying tutorials. It was a test scenario, so I thought it might do well.
It did not do well at all. The scripts were mostly correct but often in the wrong order (referencing a thing before actually defining it). Sometimes the syntax would be totally wrong, and it kept mixing version 6 syntax with version 7 syntax (I'm on 7). It also makes mistakes, and when I point one out it says "Oh, you are totally right, I made a mistake," then goes on to explain the mistake and output new code. However, more often than not the new code contained the exact same mistake. This is probably due to a lack of training data, where it's referencing only one example and that example just had a mistake in it.
In the end I gave up on ChatGPT, searched for my test scenario, and it turned out a friendly dude on a forum had put together a tutorial. I followed that and it almost worked right away. A couple of minutes of tweaking and testing and I got it working.
I'm afraid of a future where forums and such don't exist and sources like Reddit get fucked and nuked. In an AI-driven world, the incentive for creating new original content is way lower. So when the AI doesn't know the answer, you're just hooped and have to re-invent the wheel yourself. In the long run this will destroy productivity and not give the gains people are hoping for at the moment.
-
What are you guys working on where ChatGPT can figure it out? Honestly, I haven't been able to get a scrap of working code beyond a trivial example out of that thing or any other LLM.
I'm forced to use Copilot at work, and as far as code completion goes, it gets it right 10-15% of the time... the rest of the time it just suggests random, credible-looking noise or hallucinates variables and shit.
-
We have the same problem with literacy here in Sweden. It’s unnerving to think that these kids will need to become doctors, lawyers and police officers in the future.
Sweden of all places? What happened in the last decade that Sweden is slowly losing its reputation as the country to emulate on social issues?
-
No one wants mentors. The way to move up in IT is to switch jobs every 24 months. So if you're paying mentors huge salaries to train juniors from velocity drags into velocity boosters, you do it knowing they're going to leave and take all that investment with them to a higher paycheck.
I'm not saying this is right, but that's the reality from the paycheck side of things, and I think there needs to be radical change on both sides. Like a trade union or something. The union takes responsibility for certifying skills and suitability, companies can be more confident in their hires, juniors have mentors to learn from, mentors ensure juniors have the aptitude and intellectual curiosity necessary to do the job well, and I guess pay becomes more skill- and experience-based so developers don't have to hop jobs to get paid what they're worth.
Yeah, those job hoppers are the worst. You can always tell right away what kind of people they are. I've had to work with a "senior" dev who had 15 years of experience, and to be honest he sucked at his job. He couldn't do simple tasks, didn't think before he started writing code, and often got stuck asking other people for help. But he got paid big bucks, because all he did his entire career was work somewhere for 2-3 years, then job hop and trade up. By the time a company figured out the dude was useless, he'd moved on to the next one.
Such a shitty attitude, which is a shame because he was a good dude otherwise. I got along with him on a personal level. And honestly good on him for making the most he can, fuck the company. But I personally couldn't do that, I take pride in my work.
-
This post did not contain any content.
Not in any way a new phenomenon. There's a reason FizzBuzz was invented; there's been a steady stream of CS graduates who can't code their way out of a wet paper bag ever since the profession hit the mainstream.
Actually fucking interview your candidates, especially if you're sourcing them from a country with for-profit education and/or a rote-learning culture, both of which suck at failing people who didn't learn anything. No BS coding tests; go for "explain this code to me" kind of stuff. Worst case, they can understand code but suck at producing it, and that's still prime QA material right there.
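(For reference, FizzBuzz is the canonical weed-out exercise: print 1 through 100, replacing multiples of 3 with "Fizz", multiples of 5 with "Buzz", and multiples of both with "FizzBuzz". A minimal version:)
```cpp
#include <iostream>

int main() {
    for (int i = 1; i <= 100; ++i) {
        if (i % 15 == 0)     std::cout << "FizzBuzz\n";  // divisible by both 3 and 5
        else if (i % 3 == 0) std::cout << "Fizz\n";
        else if (i % 5 == 0) std::cout << "Buzz\n";
        else                 std::cout << i << '\n';
    }
}
```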
-
Sweden of all places? What happened in the last decade that Sweden is slowly losing its reputation as the country to emulate on social issues?
This is only a guess, but it could be related to increased use of technology. Many of the things we interact with are simplified: if you come across a word you don't know, your phone can give you simple synonyms, and if you can't spell, autocorrect will catch it.
It's the same problem people are talking about with LLMs, just through a different lens.
-
This post did not contain any content.
Very "back in my day" energy.
I don't support AI, but programming is about solving problems, not about writing code. If we're going to fixate on the tools, should developers still be using punched cards as well? Is moving past those a bad thing?