Lemmy be like
-
“Guns don’t kill people, people kill people”
Edit:
Controversial reply, apparently, but this is literally part of the script to a Philosophy Tube video (relevant part is 8:40 - 20:10)
We sometimes think that technology is essentially neutral. It can have good or bad effects, and it might be really important who controls it. But a tool, many people like to think, is just a tool. "Guns don't kill people, people do." But some philosophers have argued that technology can have values built into it that we may not realise.
...
The philosopher Don Ihde says tech can open or close possibilities. It's not just about its function or who controls it. He says technology can provide a framework for action.
...
Martin Heidegger was a student of Husserl's, and he wrote about the ways that we experience the world when we use a piece of technology. His most famous example was a hammer. He said when you use one you don't even think about the hammer. You focus on the nail. The hammer almost disappears in your experience. And you just focus on the task that needs to be performed.
Another example might be a keyboard. Once you get proficient at typing, you almost stop experiencing the keyboard. Instead, your primary experience is just of the words that you're typing on the screen. It's only when it breaks or it doesn't do what we want it to do, that it really becomes visible as a piece of technology. The rest of the time it's just the medium through which we experience the world.
Heidegger talks about technology withdrawing from our attention. Others say that technology becomes transparent. We don't experience it. We experience the world through it. Heidegger says that technology comes with its own way of seeing.
...
Now some of you are looking at me like "Bull sh*t. A person using a hammer is just a person using a hammer!" But there might actually be some evidence from neurology to support this.
If you give a monkey a rake that it has to use to reach a piece of food, then the neurons in its brain that fire when there's a visual stimulus near its hand start firing when there's a stimulus near the end of the rake, too! The monkey's brain extends its sense of the monkey body to include the tool!
And now here's the final step. The philosopher Bruno Latour says that when this happens, when the technology becomes transparent enough to get incorporated into our sense of self and our experience of the world, a new compound entity is formed.
A person using a hammer is actually a new subject with its own way of seeing - 'hammerman.' That's how technology provides a framework for action and being. Rake + monkey = rakemonkey. Makeup + girl is makeupgirl, and makeupgirl experiences the world differently, has a different kind of subjectivity because the tech lends us its way of seeing.
You think guns don't kill people, people do? Well, gun + man creates a new entity with new possibilities for experience and action - gunman!
So if we're onto something here with this idea that tech can withdraw from our attention and in so doing create new subjects with new ways of seeing, then it makes sense to ask when a new piece of technology comes along, what kind of people will this turn us into.
I thought that we were pretty solidly past the idea that anything is “just a tool” after seeing Twitler scramble Grok’s innards to advance his personal politics.
Like, if you still had any lingering belief that AI is “like a hammer”, that really should’ve extinguished it.
But I guess some people see that as an aberrant misuse of AI, and not an indication that all AI has an agenda baked into it, even if it’s more subtle.
Yet gun control works.
Same idea.
-
The LLM shills have made "AI" refer exclusively to LLMs.
Yes, I agree, and it's unacceptable to me. Now most people here are falling into the same hole. I'm not here to promote, support, or stand with LLMs or Gen-AI; I want to correct what is wrong. You can hate something, but please be objective and rational.
Language is descriptive not prescriptive.
If people use the term "AI" to refer to LLMs, then it's correct by definition.
-
Thanks!! You're the first one to notice.
If you don't know about [email protected] yet, go have a look around there. I'm trying to build a bit of a community there.
I'll check it out, thanks for the suggestion! I emigrated from South Africa when I was 4, so I can speak it fairly well, but my reading and writing comprehension is non-existent, so I'm using Google Translate to help lol.
-
This post did not contain any content.
It is true though, AI bad
-
Yeah. I hate the naming of it too. It's not AI in the sense that science fiction imagined it. History repeats itself in the name of marketing. I'm still very annoyed with these marketers destroying the term "hoverboard".
AI includes a lot of things
The way the ghosts in Pac-Man chase you is AI
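For the skeptics, here's roughly what that ghost "AI" amounts to; a minimal sketch of the classic Blinky-style chase rule (the grid, the `chase_step` name, and the demo values are my own stand-ins, not the actual arcade code):

```python
# Minimal sketch of a Pac-Man-style chase rule: at each step the ghost moves to
# whichever open neighbouring tile is closest, in straight-line distance, to
# the player. That handful of comparisons is the entire "AI".

def chase_step(ghost, player, walls):
    """Return the ghost's next tile, greedily closing the distance to the player."""
    gx, gy = ghost
    neighbours = [(gx + dx, gy + dy) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]
    open_tiles = [t for t in neighbours if t not in walls]
    return min(open_tiles, key=lambda t: (t[0] - player[0]) ** 2 + (t[1] - player[1]) ** 2)

# Tiny demo on an empty grid: the ghost walks straight at the player.
ghost, player, walls = (0, 0), (4, 3), set()
for _ in range(7):
    ghost = chase_step(ghost, player, walls)
    print(ghost)
```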
-
Bad faith comparison.
The reason we can argue for banning guns and not hammers is specifically because guns are meant to hurt people. That's literally their only use. Hammers have a variety of uses and hurting people is definitely not the primary one.
AI is a tool, not a weapon. This is kind of melodramatic.
GenAI is a bad tool that does bad things in bad ways.
-
This post did not contain any content.
lemmycirclejerk
-
Language is descriptive not prescriptive.
If people use the term "AI" to refer to LLMs, then it's correct by definition.
It's partially correct, but AI doesn't always mean LLM. Etymology is important here. Don't normalize illiteracy.
-
Seriously, the AI hate gets old fast. Like you said it's a tool,
get over it people.
-
It's partially correct, but AI doesn't always mean LLM. Etymology is important here. Don't normalize illiteracy.
This is how etymology works.
Do you think all the words we use today meant exactly the same thing 300 years ago?
No, people used them "incorrectly", that usage gained popularity, and that made it correct. What you call illiteracy is literally how etymology works.
-
gey over it
Edited. That's what I get for trying to type fast while my dog is heading for the door after doing her business.
-
GenAI is a bad tool that does bad things in bad ways.
"Video games are dangerous."
-
Guns don’t kill people. People with guns kill people.
Ftfy
Secure your guns. Toddlers kill an insane number of people. https://www.snopes.com/fact-check/toddlers-killed-americans-terrorists/
-
This post did not contain any content.
Pfft, as if a post on lemmy would ever get more than 1K upvotes.
-
I'll check it out, thanks for the suggestion! I emigrated from South Africa when I was 4, so I can speak it fairly well, but my reading and writing comprehension is non-existent, so I'm using Google Translate to help lol.
Ah nice, most people in SA can speak English well; that community is mostly English.
-
This post did not contain any content.
I don't hate the concept as such; I hate how it is being marketed and shoved everywhere and into everything by sheer hype and the need for returns on the absurd amounts of money that were thrown at it.
Companies use it to justify layoffs, create cheap vibed-up products, and delegate responsibilities to a computer program that is absolutely not sentient or intelligent. And that's not even mentioning the colossal amount of natural and financial resources being thrown down this drain.
I read a great summary yesterday somewhere on here that essentially said "they took a type of computer model made to give answers to very specific questions it has been trained on, and then trained it on everything to make a generalist". Except that doesn't work: the broader the spectrum a model has to cover, the less accurate it gets (see the toy sketch below).
Identifying skin cancer? Perfect tool for the job.
Giving drones the go ahead on an ambiguous target? Providing psychological care to people in distress? FUCK NO.
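Here's a toy illustration of that point, under assumptions I'm inventing for the demo (synthetic data, two deliberately conflicting tasks, the same fixed-capacity linear model for both runs); it isn't anyone's real benchmark, just the breadth-vs-accuracy trade-off in miniature.

```python
# Toy sketch (hypothetical setup, synthetic data): a "specialist" model trained
# on one narrow task versus a "generalist" of identical capacity trained on a
# pool of conflicting tasks. The generalist loses accuracy on the narrow task.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_task(label_rule, n=2000):
    """Generate a synthetic classification task from a labelling rule."""
    X = rng.normal(size=(n, 5))
    return X, label_rule(X).astype(int)

# Task A is the narrow job we actually care about; task B contradicts it.
XA, yA = make_task(lambda X: X[:, 0] > 0)
XB, yB = make_task(lambda X: X[:, 0] < 0)

specialist = LogisticRegression().fit(XA, yA)
generalist = LogisticRegression().fit(np.vstack([XA, XB]),
                                      np.concatenate([yA, yB]))

# Evaluate both on fresh data from task A only.
X_test, y_test = make_task(lambda X: X[:, 0] > 0)
print("specialist accuracy on task A:", specialist.score(X_test, y_test))  # close to 1.0
print("generalist accuracy on task A:", generalist.score(X_test, y_test))  # roughly chance
```

The conflict here is deliberately extreme, but that's the shape of the argument: one model of fixed capacity asked to cover objectives that pull in different directions gives up accuracy somewhere.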
-
Think about your argument for a minute.
I know you think this will harm you and everyone you know, but it'll be much better if you just stay quiet instead of vocally opposing it
When has that ever been good advice?
So everything related to AI is negative?
If so, do you understand why we can't have any conversation on the subject?
-
AI includes a lot of things
The way the ghosts in Pac-Man chase you is AI
There is a distinction between video game AI and computer science AI. People know that video game AI isn't really AI. The way LLMs are marketed with terms like "superintelligence" is deception.
-
AI is an umbrella term that covers many things. We have been referring to simple pathfinding algorithms in video games as AI for two decades; LLMs are AI.
There is a distinction between video game AI and computer science AI. People know that video game AI isn't really AI. The way LLMs are marketed with terms like "superintelligence" is deception.
No one is typing prompts out to an NPC asking if dogs can eat chocolate.