New Junior Developers Can’t Actually Code.
-
I could barely code when I landed my job and now I’m a senior dev. It’s like saying a plumber’s apprentice can’t plumb - you learn on the job.
-
It's generally just a matter of time. Your title depends on your years on the job.
-
The problem is not only the coding but the thinking. The AI revolution will produce a lot more people without critical-thinking and problem-solving skills.
-
But how do you spot those people based solely on a short interview, where they can use AI tools to perform better if the interview isn't held in person?
And mind you, SO was better because you needed to read a lot of answers there and try to understand what would work in your particular case. You had to learn how to ask smartly, do your homework, and explain the question properly so as not to get gaslit, etc. That's all gone now.
-
That's the point.
Along with censorship.
-
Evil me: ask questions that have no solution, but that ChatGPT will happily give incorrect answers to, running itself in circles trying to answer correctly as you feed it the error messages.
-
All I hear is "I'm bad at mentoring"
-
That's the point of being a junior. Problems show up, and dealing with them forces you to learn how to solve them.
-
This post is literally an ad for AI tools.
No, thanks. Call me when they actually get good. As it stands, they only offer marginally better autocomplete.
I should probably start collecting dumb AI suggestions and gaslighting answers to show the next time I encounter this topic...
-
You're not learning anything if Copilot is doing it for you. That's the point.
-
It's actually complaining about AI, tho.
-
There are at least four links to AI tools on this page. Why would you link to something you're complaining about?
-
100% agree.
I'm not saying there's no place for AI as an aid to help you find the solution, but I don't think it's going to help you learn if you just ask it for the answers.
For example, yesterday I was trying to find out why a policy map on a Cisco switch wasn't re-activating after my RADIUS server came back up. Instead of throwing my map at the AI and asking what's wrong, I asked it for details about how a policy map gets activated, what mechanism the switch uses to determine the status of the RADIUS server, and how a policy map can leverage that to kick into gear again.
Ultimately, the AI didn't have the answer, but it put me on the right track, and I believe I solved the issue. It seems the switch didn't count me adding the RADIUS server to the running config as a server coming back alive, but if I put in a fake server and then changed its IP to the real server's address, the switch saw that as the server coming back alive and authentication started again.
In fact, some of the info it gave me along the way was wrong, like when it tried to give me CLI commands that I already knew wouldn't work because I was using the newer C3PL AAA commands; it was mixing them up with the legacy commands and combining them together. Even after I told it that was a made-up command and why it wouldn't work, it still tried to give me the same command again later.
So I don't think it's a good tool for producing actual work, but it can be a good tool to help us learn things if it's used that way: to ask "why" and "how" instead of "what".
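For anyone curious, here's a rough, from-memory sketch of the IBNS 2.0 / C3PL pieces involved. All the names (MY-RADIUS, CRITICAL_AUTH, DOT1X-POLICY, the probe user) are placeholders, and the exact syntax varies by platform and IOS-XE version, so treat it as illustrative rather than copy-paste config:

```
! Hypothetical names throughout; syntax may differ on your platform/version.
! RADIUS server with a periodic probe so the switch can mark it dead/alive
! on its own - this is the "aliveness" mechanism in question.
radius server MY-RADIUS
 address ipv4 192.0.2.10 auth-port 1812 acct-port 1813
 key MySharedSecret
 automate-tester username probe-user

! Global criteria for when a non-responding server is declared dead,
! and how long it stays marked dead.
radius-server dead-criteria time 10 tries 3
radius-server deadtime 5

! C3PL side: put ports into critical auth while AAA is down, and react
! when the server comes back (the aaa-available event).
class-map type control subscriber match-all AAA_SVR_DOWN_UNAUTHD_HOST
 match result-type aaa-timeout
 match authorization-status unauthorized

policy-map type control subscriber DOT1X-POLICY
 event authentication-failure match-first
  10 class AAA_SVR_DOWN_UNAUTHD_HOST do-until-failure
   10 activate service-template CRITICAL_AUTH
 event aaa-available match-all
  10 class always do-until-failure
   10 resume reauthentication
```

At least as I understand it, it's the probe/dead-criteria state change on an existing server entry that fires the aaa-available event, which would explain why simply adding a new server stanza to the running config didn't count as the server "coming back".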
-
Oh lol I thought it was a text post, I didn't even click the link and just read the post description.
-
To extend your metaphor: be the squirrel in the digital forest. Compulsively bury acorns for others to find in time of need. Forget about most of the burial locations so that new trees are always sprouting and spreading. Do not get attached to a single trunk; you are made to dance across the canopy.
-
Have you actually found that to be the case for anything complex, though? I find it just forgets parts of whatever it's generating, stuck in an infuriating loop of fucking up.
It took us around two hours to run our coding questions through ChatGPT and see what it gives, and it gave complete shit for most of them. We only had to replace one or two questions.
If a company can't invest even a day in going through their hiring process and AI-proofing it, then they have a shitty hiring process. And with a shitty hiring process, you get shitty devs.
And then you get people like OP blaming the generation, when if anything it's them and their company that are to blame... for falling behind. Got to keep up, folks. Our field moves fast.
-
Pretty easy to come up with problems that ChatGPT is useless at, and you can test that pretty easily: throw enough constraints at it and the transformer starts to lose attention and forget vital parts.
With a bit of effort you can make problems where ChatGPT will actually give a misleading answer and candidates have to think critically.
Just like in the past it was pretty easy to come up with problems that weren't easily found on SO.
Same landscape. If you put in the time and the effort to have a solid recruitment process, you get solid devs. If you have a lazy and shitty process, you get shitty devs.
-
And some sort of "no one wants to work anymore" attitude.
I know brilliant young people; maybe they just have to be paid properly?
-
No wonder open-source software is becoming more efficient than proprietary software.
-
There is only so much mentoring can do, though. You can have the best math prof; you still need to put in the exercises and solve your differential equations yourself to get good at it.