Microsoft Study Finds Relying on AI Kills Your Critical Thinking Skills
-
Damn. I wonder where all the calculus identities and mathematical puzzle solving abilities in my head disappeared to then. Surely not into the void that is Wolfram Mathematica. Surely not...
-
-
Based on what? Did they take 1000 people ten years ago, test their critical thinking skills, then retest them after heavy AI use? How many people were in the control group?
-
Pretty shit “study”. If workers use AI for a task, obviously the results will be less diverse. That doesn’t mean their critical thinking skills deteriorated. It means they used a tool that produces a certain outcome. This doesn’t test their critical thinking at all.
“Another noteworthy finding of the study: users who had access to generative AI tools tended to produce “a less diverse set of outcomes for the same task” compared to those without. That passes the sniff test. If you’re using an AI tool to complete a task, you’re going to be limited to what that tool can generate based on its training data. These tools aren’t infinite idea machines, they can only work with what they have, so it checks out that their outputs would be more homogenous. Researchers wrote that this lack of diverse outcomes could be interpreted as a “deterioration of critical thinking” for workers.”
-
“That doesn’t mean their critical thinking skills deteriorated. It means they used a tool that produces a certain outcome.”
Dunning, meet Kruger
-
- Handjobs at Starbucks
Well that's just solid policy right there, cum on.
-
You mean an AI that literally generates text by applying a mathematical function to input text doesn't do reasoning for me? (/s)
I'm pretty certain every programmer alive knew this was coming as soon as we saw people trying to use it years ago.
It's funny because I never get what I want out of AI. I've been thinking this whole time "am I just too dumb to ask the AI to do what I need?" Now I'm beginning to think "am I not dumb enough to find AI tools useful?"
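To put the "mathematical function" snark above in concrete terms, here's a toy sketch in plain Python. The lookup table and names are made up for illustration; a real LLM replaces the hand-written table with a huge learned function over billions of weights, but the loop around it is the same: evaluate the function on the text so far, sample a next word, repeat.

```python
import random

# Toy stand-in for a trained model: maps the last word seen to a
# probability distribution over possible next words. A real LLM
# computes this distribution with a learned function instead of a
# hand-written table, but the generation loop is identical.
NEXT_WORD_PROBS = {
    "the": {"cat": 0.5, "dog": 0.5},
    "cat": {"sat": 0.7, "ran": 0.3},
    "dog": {"sat": 0.4, "ran": 0.6},
    "sat": {"down": 1.0},
    "ran": {"away": 1.0},
}

def generate(prompt: str, max_words: int = 5) -> str:
    """Repeatedly apply the 'function' to the text so far and sample the next word."""
    words = prompt.lower().split()
    for _ in range(max_words):
        dist = NEXT_WORD_PROBS.get(words[-1])
        if dist is None:  # no known continuation: stop
            break
        choices, weights = zip(*dist.items())
        words.append(random.choices(choices, weights=weights)[0])
    return " ".join(words)

print(generate("the"))  # e.g. "the cat sat down"
```

No reasoning anywhere in that loop, just sampling from a distribution.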
-
Cars for the mind.
Cars are killing people.
-
Critical thinking skills are what hold me back from relying on AI.
-
Not sure if sarcasm...
-
I agree with all of this. My comment is meant to refute the implication that not needing to memorize phone numbers is somehow analogous to critical thinking. And yes, internalized axioms are necessary, but the core element is largely memorizing how those axioms are used, not necessarily their rote text.
-
Seriously, ask AI about anything you're actually an expert in. It's laughable sometimes... However, you need to know the subject to know it's wrong. Do not trust it implicitly about anything.
-
When it was new to me I tried ChatGPT out of curiosity, like with any new tech, and I just kept getting really annoyed at the expansive bullshit it gave to the simplest of inputs. "Give me a list of 3 X" led to fluff-filled paragraphs for each. The bastard children of a bad encyclopedia and the annoying kid in school.
I realized I was understanding it wrong: it's supposed to be taken not as a useful tool, but as something close to interacting with a human, pointless prose and all. That just made me more annoyed. It still blows my mind that people say they use it when writing.
-
How else can the "elite" separate themselves from the common folk? The elite love writing 90% fluff and require high word counts in academia instead of actually producing concise, clear, articulate articles that are easy to understand. You have to hit a certain word count to qualify as "good writing" in any elite group. Look at Law, Political Science, History, Scientific Journals, etc. I had a professor who would tell me they could easily find the information they needed in those articles, and that one day we would be able to as well. That's why ChatGPT spits out a shit ton of fluff.
-
Never used it in any practical function. I tested it to see if it was realistic and I found it extremely wanting. As in, it sounded nothing like the prompts I gave it.
-
Yeah, if you repeated this test with people who did or didn't have access to Stack Exchange, you'd see the same results. There's not much difference between someone mindlessly copying an answer from Stack Overflow and copying it from AI. Both lead to more homogeneous answers and lower critical thinking skills.
-
The only beneficial use I've had for "AI" (LLMs) has been rewriting text, whether that's re-explaining a topic based on a source or, for instance, sorting and shortening/condensing a list.
Everything other than that has been completely incorrect, unreadably long, context-lacking slop.
-
Weren't these assholes just gung-ho about forcing their shitty "AI" chatbots on us like ten minutes ago?
Microsoft can go fuck itself right in the gates.
-
It would wake me up more than coffee, that's for sure.
-
Garbage in, garbage out. Ingesting all that internet blather didn't make the AI much smarter, if at all.