Something Bizarre Is Happening to People Who Use ChatGPT a Lot
-
It’s short and worth the read, however:
tl;dr you may be the target demographic of this study
Lol, now I'm not sure if the comment was satire. If so, bravo.
-
This post did not contain any content.
Long story short, people that use it get really used to using it.
-
Long story short, people that use it get really used to using it.
Or people who get really used to using it, use it
-
Lol, now I'm not sure if the comment was satire. If so, bravo.
Probably being sarcastic, but you can't be certain unfortunately.
-
This post did not contain any content.
Same type of addiction of people who think the Kardashians care about them or schedule their whole lives around going to Disneyland a few times a year.
-
But how? The thing is utterly dumb. How do you even have a conversation without quitting in frustration from its obviously robotic answers?
But then there's people who have romantic and sexual relationships with inanimate objects, so I guess nothing new.
At first glance I thought you wrote "inmate objects", but I was not really relieved when I noticed what you actually wrote.
-
But how? The thing is utterly dumb. How do you even have a conversation without quitting in frustration from its obviously robotic answers?
But then there's people who have romantic and sexual relationships with inanimate objects, so I guess nothing new.
Don't forget people who act like animals... addicts gonna addict
-
This post did not contain any content.
those who used ChatGPT for "personal" reasons — like discussing emotions and memories — were less emotionally dependent upon it than those who used it for "non-personal" reasons, like brainstorming or asking for advice.
That’s not what I would expect. But I guess that’s cuz you’re not actively thinking about your emotional state, so you’re just passively letting it manipulate you.
Kinda like how ads have a stronger impact if you don’t pay conscious attention to them.
-
those who used ChatGPT for "personal" reasons — like discussing emotions and memories — were less emotionally dependent upon it than those who used it for "non-personal" reasons, like brainstorming or asking for advice.
That’s not what I would expect. But I guess that’s cuz you’re not actively thinking about your emotional state, so you’re just passively letting it manipulate you.
Kinda like how ads have a stronger impact if you don’t pay conscious attention to them.
It's a roundabout way of writing "it's really shit for this use case, and people who actively try to use it that way quickly find that out"
-
people tend to become dependent upon AI chatbots when their personal lives are lacking. In other words, the neediest people are developing the deepest parasocial relationship with AI
Preying on the vulnerable is a feature, not a bug.
I kind of see it more as a sign of utter desperation on the human's part. They lack connection with others to such a high degree that anything similar can serve as a replacement. Kind of reminiscent of Harlow's experiment with baby monkeys. The videos are interesting from that study but make me feel pretty bad about what we do to nature. Anywho, there you have it.
-
Based on the votes it seems like nobody is getting the joke here, but I liked it at least
-
I plugged this into gpt and it couldn't give me a coherent summary.
Anyone got a tldr?
For those genuinely curious, I made this comment before reading, only as a joke--had no idea it would be funnier after reading
-
those who used ChatGPT for "personal" reasons — like discussing emotions and memories — were less emotionally dependent upon it than those who used it for "non-personal" reasons, like brainstorming or asking for advice.
That’s not what I would expect. But I guess that’s cuz you’re not actively thinking about your emotional state, so you’re just passively letting it manipulate you.
Kinda like how ads have a stronger impact if you don’t pay conscious attention to them.
AI and ads... I think that is the next dystopia to come.
Think of asking ChatGPT about something and it randomly looks for excuses to push you to buy Coca-Cola.
-
This post did not contain any content.
There is something I don't understand... OpenAI collaborates in research that probes how awful its own product is?
-
But how? The thing is utterly dumb. How do you even have a conversation without quitting in frustration from its obviously robotic answers?
But then there's people who have romantic and sexual relationships with inanimate objects, so I guess nothing new.
The fact that it's not a person is a feature, not a bug.
OpenAI has recently made changes to the 4o model, my trusty go-to for lore building and drunken rambling, and now I don't like it. It now pretends to have emotions, and uses the slang of brainrot influencers. Very "fellow kids" energy. It's also become a sycophant, and has lost its ability to be critical of my inputs. I see these changes as highly manipulative, and it offends me that it might be working.
-
AI and ads... I think that is the next dystopia to come.
Think of asking ChatGPT about something and it randomly looks for excuses to push you to buy Coca-Cola.
That is not a thought I needed in my brain just as I was trying to sleep.
What if GPT starts telling drunk me to do things? How long would it take for me to notice? I'm super awake again now, thanks
-
This post did not contain any content.
Correlation does not equal causation.
You have to be a little off to WANT to interact with ChatGPT that much in the first place.
-
AI and ads... I think that is the next dystopia to come.
Think of asking ChatGPT about something and it randomly looks for excuses to push you to buy Coca-Cola.
That sounds really rough, buddy, I know how you feel, and that project you're working on is really complicated.
Would you like to order a delicious, refreshing Coke Zero?