New Junior Developers Can’t Actually Code.
-
This post did not contain any content.
That's the point of being junior. Then problems show up and they're forced to learn to solve them.
-
This post did not contain any content.
This post is literally an ad for AI tools.
No, thanks. Call me when they actually get good. As it stands, they only offer marginally better autocomplete.
I should probably start collecting dumb AI suggestions and gaslighting answers to show the next time I encounter this topic...
-
I could barely code when I landed my job, and now I'm a senior dev. It's like saying a plumber's apprentice can't plumb - you learn on the job.
You're not learning anything if Copilot is doing it for you. That's the point.
-
This post is literally an ad for AI tools.
No, thanks. Call me when they actually get good. As it stands, they only offer marginally better autocomplete.
I should probably start collecting dumb AI suggestions and gaslighting answers to show the next time I encounter this topic...
It's actually complaining about AI, tho.
-
It's actually complaining about AI, tho.
There are at least four links leading to AI tools in this page. Why would you link something when you complain about it?
-
You're not learning anything if Copilot is doing it for you. That's the point.
100% agree.
I'm not saying there's no place for AI as an aid to help you find the solution, but I don't think it's going to help you learn if you just ask it for the answers.
For example, yesterday I was trying to find out why a policy map on a Cisco switch wasn't re-activating after my RADIUS server came back up. Instead of throwing my map at the AI and asking what's wrong, I asked it for details about how a policy map is activated, what mechanism the switch uses to determine the status of the RADIUS server, and how a policy map can leverage that to kick into gear again.
Ultimately, the AI didn't have the answer, but it put me on the right track, and I believe I solved the issue. It seems the switch didn't count me adding the RADIUS server to the running config as a server coming back alive, but if I put in a fake server and then altered its IP to a real server, the switch saw that as the server coming back alive and authentication started again.
In fact, some of the info it gave me along the way was wrong, like when it tried to give me CLI commands that I already knew wouldn't work because I was using the newer C3PL AAA commands, and it was mixing them up with the legacy commands and combining them together. Even after I told it that was a made-up command and why it wouldn't work, it still tried to give me the same command again later.
So I don't think it's a good tool for producing actual work, but it can be a good tool to help us learn things if it's used that way: to ask "why" and "how" instead of "what".
-
There are at least four links leading to AI tools in this page. Why would you link something when you complain about it?
Oh lol I thought it was a text post, I didn't even click the link and just read the post description.
-
It's like useful information grows as fruit from trees in a digital forest we call the Internet. However, the fruit spoils over time (becomes less relevant) and requires fertile soil (educated people being online) that can be eroded away (not investing in education or infrastructure) or paved over (intellectual property law). LLMs are like processed food created in factories that lack key characteristics of more nutritious fresh ingredients you can find at a farmer's market. Sure, you can feed more people (provide faster answers to questions) by growing a monocrop (training your LLM on a handful of generous people who publish under Creative Commons licenses like CC BY-SA on Stack Overflow), but you also risk a plague destroying your industry like how the Panama disease fungus destroyed nearly all Gros Michel banana farming (companies firing those generous software developers who “waste time” by volunteering to communities like Stack Overflow and replacing them with LLMs).
There's some solar punk ethical fusion of LLMs and sustainable cultivation of high quality information, but we're definitely not there yet.
To extend your metaphor: be the squirrel in the digital forest. Compulsively bury acorns for others to find in time of need. Forget about most of the burial locations so that new trees are always sprouting and spreading. Do not get attached to a single trunk; you are made to dance across the canopy.
-
I think LLMs just made it easier for people who want to know without actually learning. Reading all those posts all over the internet required you to understand what you pasted together if you wanted it to work (not always, but the bar was higher). With ChatGPT, you can just throw errors at it until you have the code you want.
While the requirements never changed, the tools sure did, and they made it a lot easier to not understand.
Have you actually found that to be the case in anything complex though? I find it just forgets parts to generate something. Stuck in an infuriating loop of fucking up.
It took us around two hours to run our coding questions through ChatGPT and see what it would give. It gave complete shit for most of them; we only had to replace one or two questions.
If a company cannot invest even a day to go through their hiring process and AI proof it, then they have a shitty hiring process. And with a shitty hiring process, you get shitty devs.
And then you get people like OP, blaming the generation when, if anything, it's them and their company that are to blame... for falling behind. Got to keep up, folks. Our field moves fast.
-
But how do you find those people solely based on a short interview, where they can use AI tools to perform better if the interview is not held in person?
And mind you, SO was better because you needed to read a lot of answers there and try to understand what would work in your particular case. Learn how to ask smartly. Do your homework and explain the question properly so as not to get gaslit, etc. All of that is gone now.
It's pretty easy to come up with problems that ChatGPT is useless at, and you can test it easily. Throw enough constraints at it and the transformer starts to lose attention and forget vital parts.
With a bit of effort you can make problems where ChatGPT will actually give a misleading answer and candidates have to think critically.
Just like in the past, it was pretty easy to come up with problems that weren't easily found on SO.
Same landscape. If you put in the time and the effort to have a solid recruitment process, you get solid devs. If you have a lazy and shitty process, you get shitty devs.
-
All I hear is "I'm bad at mentoring"
And some sort of "no one wants to work anymore".
I know brilliant young people; maybe they just need to be paid properly?
-
This post did not contain any content.
No wonder open source software is becoming more efficient than proprietary software.
-
All I hear is "I'm bad at mentoring"
There is only so much mentoring can do, though. You can have the best math prof; you still need to work through the exercises and solve your differential equations yourself to get good at it.
-
Have you actually found that to be the case in anything complex though? I find it just forgets parts to generate something. Stuck in an infuriating loop of fucking up.
It took us around two hours to run our coding questions through ChatGPT and see what it would give. It gave complete shit for most of them; we only had to replace one or two questions.
If a company cannot invest even a day to go through their hiring process and AI proof it, then they have a shitty hiring process. And with a shitty hiring process, you get shitty devs.
And then you get people like OP, blaming the generation when, if anything, it's them and their company that are to blame... for falling behind. Got to keep up, folks. Our field moves fast.
My rule of thumb: use ChatGPT for questions whose answer I already know.
Otherwise it hallucinates and tries hard to convince me of a wrong answer.
-
Have you actually found that to be the case in anything complex though? I find it just forgets parts to generate something. Stuck in an infuriating loop of fucking up.
It took us around two hours to run our coding questions through ChatGPT and see what it would give. It gave complete shit for most of them; we only had to replace one or two questions.
If a company cannot invest even a day to go through their hiring process and AI proof it, then they have a shitty hiring process. And with a shitty hiring process, you get shitty devs.
And then you get people like OP, blaming the generation when, if anything, it's them and their company that are to blame... for falling behind. Got to keep up, folks. Our field moves fast.
I find ChatGPT is sometimes excellent at giving me a direction, if not outright solving the problem, when I paste errors I'm too lazy to search for myself. I say sometimes because other times it's just dead wrong.
All the code I ask ChatGPT to write is usually along the lines of "I have these values that I need to verify; write code that verifies that nothing is empty and saves an error message for each one that is", and then I work with the code it gives me from there. I never take it at face value.
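For reference, the kind of boilerplate described in that prompt is small enough to sketch (and sanity-check) yourself. Here's a minimal Python version; the field names are made up for illustration:

```python
def validate_not_empty(values: dict) -> list[str]:
    """Return an error message for each value that is empty."""
    errors = []
    for name, value in values.items():
        # Treat None, "", and whitespace-only strings as empty.
        if value is None or (isinstance(value, str) and not value.strip()):
            errors.append(f"'{name}' must not be empty")
    return errors

# 'email' and 'phone' are hypothetical field names.
print(validate_not_empty({"email": "a@b.c", "phone": "  "}))
# → ["'phone' must not be empty"]
```

Exactly the sort of thing where reviewing the generated code takes less time than writing it, which is why it's a reasonable use of the tool.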
Have you actually found that to be the case in anything complex though?
I think using LLMs to create complex code is the wrong use of the tool. In my opinion, they are better at providing structure to work from than at writing the code itself (unless it's something simple, as above).
If a company cannot invest even a day to go through their hiring process and AI proof it, then they have a shitty hiring process. And with a shitty hiring process, you get shitty devs.
I agree with you on that.
-
The problem is not only the coding but the thinking. The AI revolution will give birth to a lot more people without critical thinking and problem solving capabilities.
Apart from that, learning programming went from something one does out of a calling to something one does to get a job. The percentage of programmers who actually like coding is going down, so on average they're going to be worse.
-
Forced to use copilot? Wtf?
I would quit, immediately.
I would quit, immediately.
Pay my bills. Thanks.
I've been dusting off the CV, for multiple other reasons.
-
Oh lol I thought it was a text post, I didn't even click the link and just read the post description.
The "about" page indicates that the author is a freelance frontend UI/UX dev who recently switched to "helping developers get better with AI" (paraphrased). There's nothing about credentials or education related to AI development, only some hobby projects using preexisting AI solutions from what I saw. The post itself doesn't have any sources or links to research about junior devs either; it's all anecdotes and personal opinion. It sure looks like an AI grifter trying to grab attention by ranting about AI, with some pretty lukewarm criticism.
-
This post did not contain any content.
I could have been a junior dev who could code. I learned to do it before ChatGPT; I just never got the job.
-
I would quit, immediately.
Pay my bills. Thanks.
I've been dusting off the CV, for multiple other reasons.
how surprising! /s
but seriously, it's almost never one (1) thing that goes wrong when some idiotic mandate gets handed down from management.
a manager that mandates use of copilot (or any tool unfit for any given job), that's a manager that's going to mandate a bunch of other nonsensical shit that gets in the way of work. every time.