Microsoft Study Finds Relying on AI Kills Your Critical Thinking Skills
-
Umm...ok. Thanks for that relevant to the conversation bit of information.
-
I consider myself very average, and all my average interactions with AI have been abysmal failures that are hilariously wrong. I invested time and money into trying various models to help me with data analysis work, and they can't even do basic math or correctly summarize a PDF and the data it contains.
I was impressed with how good these things are at interpreting human fiction, jokes, writing and feelings. Which is really weird: in the context of our perceptions of what AI would be like, it's the exact opposite. The first AIs aren't emotionless robots, they're whiny, inaccurate, delusional and unpredictable bitches. That alone is worth the price of admission but certainly not worth upending society over; it's still just a huge novelty.
-
This was one of the posts of all time.
-
Should he say: "I saw this documentary" or "I read this article"?
-
I was talking to someone who does software development, and he described his experiments with AI for coding.
He said that he was able to use it successfully and come to a solution that was elegant and appropriate.
However, what he did not do was learn how to solve the problem, or indeed learn anything that would help him in future work.
-
It makes HAL 9000 from 2001: A Space Odyssey seem realistic. In the movie he is a highly technical AI but doesn't understand the implications of what he wants to do. He sees Dave as a detriment to the mission that can be better accomplished without him... never stopping to think about the implications of what he is doing.
-
Misleading headline: No such thing as "AI". No such thing as people "relying" on it. No objective definition of "critical thinking skills". Just a bunch of meaningless buzzwords.
-
I'm a senior software dev who uses AI to help me with my job daily. There are endless tools in the software world, all with their own instructions on how to use them. Often they have issues, and the solutions aren't included in those instructions. It used to be that I had to go hunt down any references to the problem I was having through online forums in the hopes that somebody else had figured out how to solve the issue, but now I can ask AI and it generally gives me the answer I'm looking for.
If I'd had AI when I was still learning core engineering concepts, I think shortcutting the learning process could have been detrimental, but now I just need to know how to get X done with Y this one time and probably never again.
-
Why do you think AI doesn't exist? Or that there's "no such thing as people 'relying' on it"? "AI" is commonly used to refer to LLMs right now. Within the context of a gizmodo article summarizing a study on the subject, "AI" does exist. A lack of precision doesn't mean it's not descriptive of a real thing.
Also, I don't personally know anyone who "relies" on generative AI, but I don't see why it couldn't happen.
-
But Peterson is a fuckhead... So it's accurate in this case. Afaik he does do the things it says.
-
I feel you, but I've asked it why questions too.
-
I've found that questions about niche tools tend to get worse answers. I was asking it some stuff about jpackage and it couldn't give me any working suggestions or correct information. Stuff I've asked about Docker was much better.
-
New copypasta just dropped
-
I'd rather learn from slightly unreliable teachers than teachers who belittle me for asking questions.
-
That's the addiction talking. Use common sense! AI bad
-
Well no shit Sherlock.
-
how does he know that the solution is elegant and appropriate?
-
what's regex got to do with critical thinking?
-
100% this. I generally use AI to help with edge cases in software or languages that I already know well, or for situations where I really don't care to learn the material because I'm never going to touch it again. In my case, for python or golang, I'll use AI to get me started in the right direction on a problem, then go read the docs to develop my solution. For some weird ugly regex that I just need to fix and never touch again, I just ask AI, test the answer it gives, then play with it until it works, because I'm never going to remember how to properly use a negative look-behind in regex when I need it again in five years.
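For anyone who also forgets the syntax, here's a minimal Python sketch of a negative look-behind; the pattern and text are made up purely to illustrate the construct, not taken from any real problem:

```python
import re

# Negative look-behind (?<!...): match a standalone number only when it
# is NOT immediately preceded by a dollar sign.
pattern = re.compile(r"(?<!\$)\b\d+\b")

text = "cost 100 or $200"
print(pattern.findall(text))  # -> ['100']
```

The look-behind rejects "200" because the character right before it is "$", while "100" passes; this is exactly the kind of one-off detail that's easier to ask about than to memorize.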
I do think AI could be used to help the learning process, too, if used correctly. That said, it requires the student to be proactive in asking the AI questions about why something works or doesn't, then going to read additional information on the topic.
-
The ability of AI to write things with lots of boilerplate like Kubernetes manifests is astounding. It gets me 90-95% of the way there and saves me about 50% of my development time. I still have to understand the result before deployment because I'm not going to blindly deploy something that AI wrote and it rarely works without modifications, but it definitely cuts my development time significantly.