When I read about dialogues with AI, where people try to get life advice, support, and therapy from the algorithm, I'm reminded of this photograph.
-
ELIZA, the first chatbot, created in the '60s, just parroted your responses back to you:
I'm feeling depressed
Why do you think you're feeling depressed?
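That reflection trick fits in a few lines of Python; here's a minimal sketch of the general idea, not Weizenbaum's actual DOCTOR script (the rule table and function names are invented for illustration):

    import re

    # Reflect first person to second person so a statement can be
    # echoed back as a question. (Toy rule table, invented for this sketch.)
    REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are"}

    def reflect(text):
        words = text.lower().rstrip(".!?").split()
        return " ".join(REFLECTIONS.get(w, w) for w in words)

    def respond(statement):
        # "I'm feeling depressed" -> "Why do you think you're feeling depressed?"
        m = re.match(r"i'?m (.+)", statement, re.IGNORECASE)
        if m:
            return "Why do you think you're " + reflect(m.group(1)) + "?"
        # Fallback: bounce the whole statement back.
        return "Why do you say " + reflect(statement) + "?"

    print(respond("I'm feeling depressed"))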
It was incredibly basic, and its inventor, Weizenbaum, didn't think it was particularly interesting, but he got his secretary to try it and she became addicted. So much so that she asked him to leave the room while she "talked" to it.
She knew it was just repeating what she said back to her in the form of a question, but she still formed a genuine emotional bond with it.
Now that chatbots are more sophisticated, it really highlights how our idiot brains just want something to talk to; whether we know it's real or not doesn't really matter.
-
This is an interesting comparison, because the wire monkey study suggests that we need physical contact from a caregiver even more than nourishment. In the case of AI, we’re getting some sort of mental nourishment from the AI, but no physical contact.
The solution? AI tools integrated into either hyper-realistic humanoid robots or human robo-puppets.
Or we could leverage our advancing technology to support the working class, implementing UBI through reduced production costs and a more even distribution of wealth and resources.
But who wants that? I, a billionaire, sure don’t.
-
ELIZA, the first chatbot, created in the '60s, just parroted your responses back to you:
I'm feeling depressed
Why do you think you're feeling depressed?
It was incredibly basic, and its inventor, Weizenbaum, didn't think it was particularly interesting, but he got his secretary to try it and she became addicted. So much so that she asked him to leave the room while she "talked" to it.
She knew it was just repeating what she said back to her in the form of a question, but she still formed a genuine emotional bond with it.
Now that chatbots are more sophisticated, it really highlights how our idiot brains just want something to talk to; whether we know it's real or not doesn't really matter.
One of the last posts I read on Reddit was about a CompSci class where the professor put a pair of googly eyes on a pencil and said, "I'm Petie the Pencil! I'm not sentient, but you think I am because I can say full sentences." The professor then snapped the pencil in half, which made the students gasp.
The point was that humans anthropomorphize things that seem human, assigning them characteristics that make us bond with things that aren't real.
-
This is an interesting comparison, because the wire monkey study suggests that we need physical contact from a caregiver even more than nourishment. In the case of AI, we’re getting some sort of mental nourishment from the AI, but no physical contact.
The solution? AI tools integrated into either hyper-realistic humanoid robots or human robo-puppets.
Or we could leverage our advancing technology to support the working class, implementing UBI through reduced production costs and a more even distribution of wealth and resources.
But who wants that? I, a billionaire, sure don’t.
How about just hugging a real human? Problem solved.
-
How about just hugging a real human? Problem solved.
slavery was made illegal decades ago
-
slavery was made illegal decades ago
Was it really tho?
-
Was it really tho?
officially iirc yes; in practice no, because prisoners aren't considered human ¯\_(ツ)_/¯
-
get that monkey a real buddy. NOW!
-
One of the last posts I read on Reddit was about a CompSci class where the professor put a pair of googly eyes on a pencil and said, "I'm Petie the Pencil! I'm not sentient, but you think I am because I can say full sentences." The professor then snapped the pencil in half, which made the students gasp.
The point was that humans anthropomorphize things that seem human, assigning them characteristics that make us bond with things that aren't real.
that's just a bit from Community
-
It reminds me of this one
-
People are using it wrong. Use it to figure out what sort of behaviors would help you get out of your funk and come up with ideas that motivate you. The AI's answers won't make you feel better, but they can help you help yourself.
-
officially iirc yes; in practice no, because prisoners aren't considered human ¯\_(ツ)_/¯
Neither are the disabled, apparently
-
In a world with probably over 8 billion humans, people are so desperately lonely that they view a machine built to trick them into thinking it's sentient as the best option for conversation.