Something Bizarre Is Happening to People Who Use ChatGPT a Lot
-
But how? The thing is utterly dumb. How do you even have a conversation without quitting in frustration from its obviously robotic answers?
But then there's people who have romantic and sexual relationships with inanimate objects, so I guess nothing new.
-
The digital Wilson.
-
I plugged this into gpt and it couldn't give me a coherent summary.
Anyone got a tldr?
-
But how? The thing is utterly dumb. How do you even have a conversation without quitting in frustration from its obviously robotic answers?
But then there's people who have romantic and sexual relationships with inanimate objects, so I guess nothing new.
In some ways, it's like Wikipedia but with a gigantic database of the internet in general (stupidity included). Because it can string together confident-sounding sentences, people think it's this magical machine that understands broad contexts and can provide facts and summaries of concepts that take humans lifetimes to study.
It's the conspiracy theorists' and reactionaries' dream: you too can be as smart and special as the educated experts, and all you have to do is ask a machine a few questions.
-
But how? The thing is utterly dumb. How do you even have a conversation without quitting in frustration from its obviously robotic answers?
But then there's people who have romantic and sexual relationships with inanimate objects, so I guess nothing new.
If you're also dumb, chatgpt seems like a super genius.
-
I plugged this into gpt and it couldn't give me a coherent summary.
Anyone got a tldr?
It’s short and worth the read, however:
tl;dr you may be the target demographic of this study
-
people tend to become dependent upon AI chatbots when their personal lives are lacking. In other words, the neediest people are developing the deepest parasocial relationship with AI
Preying on the vulnerable is a feature, not a bug.
-
But how? The thing is utterly dumb. How do you even have a conversation without quitting in frustration from its obviously robotic answers?
But then there's people who have romantic and sexual relationships with inanimate objects, so I guess nothing new.
You are clearly not using its advanced voice mode.
-
I plugged this into gpt and it couldn't give me a coherent summary.
Anyone got a tldr?
Based on the votes it seems like nobody is getting the joke here, but I liked it at least.
-
It’s short and worth the read, however:
tl;dr you may be the target demographic of this study
Lol, now I'm not sure if the comment was satire. If so, bravo.
-
Long story short, people that use it get really used to using it.
-
Long story short, people that use it get really used to using it.
Or people who get really used to using it, use it
-
Lol, now I'm not sure if the comment was satire. If so, bravo.
Probably being sarcastic, but you can't be certain unfortunately.
-
Same type of addiction as people who think the Kardashians care about them, or who schedule their whole lives around going to Disneyland a few times a year.
-
But how? The thing is utterly dumb. How do you even have a conversation without quitting in frustration from its obviously robotic answers?
But then there's people who have romantic and sexual relationships with inanimate objects, so I guess nothing new.
At first glance I thought you wrote "inmate objects", but I was not really relieved when I noticed what you actually wrote.
-
But how? The thing is utterly dumb. How do you even have a conversation without quitting in frustration from its obviously robotic answers?
But then there's people who have romantic and sexual relationships with inanimate objects, so I guess nothing new.
Don't forget people who act like animals... addicts gonna addict
-
those who used ChatGPT for "personal" reasons — like discussing emotions and memories — were less emotionally dependent upon it than those who used it for "non-personal" reasons, like brainstorming or asking for advice.
That’s not what I would expect. But I guess that’s cuz you’re not actively thinking about your emotional state, so you’re just passively letting it manipulate you.
Kinda like how ads have a stronger impact if you don’t pay conscious attention to them.
-
those who used ChatGPT for "personal" reasons — like discussing emotions and memories — were less emotionally dependent upon it than those who used it for "non-personal" reasons, like brainstorming or asking for advice.
That’s not what I would expect. But I guess that’s cuz you’re not actively thinking about your emotional state, so you’re just passively letting it manipulate you.
Kinda like how ads have a stronger impact if you don’t pay conscious attention to them.
It's a roundabout way of writing "it's really shit for this use case, and people that actively try to use it that way quickly find that out."
-
people tend to become dependent upon AI chatbots when their personal lives are lacking. In other words, the neediest people are developing the deepest parasocial relationship with AI
Preying on the vulnerable is a feature, not a bug.
I kind of see it more as a sign of utter desperation on the human's part. They lack connection with others to such a degree that anything similar can serve as a replacement. Kind of reminiscent of Harlow's experiment with baby monkeys. The videos from that study are interesting, but they make me feel pretty bad about what we do to nature. Anywho, there you have it.