Lemmy be like
-
Does this count? https://sopuli.xyz/post/1138547
ah, mid 2023, the honeymoon times
-
May I ask for a link ?
Never saw that in the communities I consult. Never.
Or at least not above 5 downvotes. I'll keep an eye out, but I don't have votes visible, so I can only really tell sentiment from comments.
Aside, but I highly recommend hiding vote counts. They're even more pointless here than they were on Reddit. They're meaningless noise on the frontend.
-
Good lord stop comparing LLMs to airplanes in your replies. This is why you think "AI bad" is an unserious statement.
I used that comparison a total of two times (and might use it more). How about refuting my argument instead of getting mad at me for using a good comparison twice?
Airplanes emit SHITLOADS of carbon into the atmosphere, they have directly caused the death of tens of thousands of people. Airplanes are heavily used in war and to spy on people. Airplanes are literally used to spray pesticides and other chemicals into the air etc.
They can mostly just be used by the rich, etc.
Just like with AI, there are many reasons airplanes are bad; that doesn't mean we should get rid of them.
-
Yeah. I hate the naming of it too. It's not AI in the sense science fiction imagined it. History repeats itself in the name of marketing. I'm still very annoyed with these marketers destroying the term "hover board".
-
Give me one real world use that is worth the downside.
As a dev, I can already tell you it's not coding or anything around code. Projects get spammed with low-quality, nonsensical bug reports; AI-generated code rarely works and doesn't integrate well (on top of pushing all the work onto the reviewer, which is already the hardest part of coding); and AI-written documentation is riddled with errors and isn't legible.
And even if AI were remotely good at something, it would still be the equivalent of a microwave trying to replace the entire restaurant kitchen.
I can run a small LLM locally which I can talk to using voice to turn certain lights on and off, set reminders for me, play music, etc.
There are MANY examples of LLMs being useful. It has its drawbacks, just like any big technology, but saying it has no uses that are worth it is ridiculous.
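The setup described above can be sketched roughly. Everything here is hypothetical: a real setup would pipe microphone audio through a speech-to-text step, ask a locally hosted model to emit a structured intent, and then dispatch it to a smart-home API. The `parse_intent` function below is a stand-in keyword router so the sketch runs without any model installed.

```python
def parse_intent(text: str) -> dict:
    """Stand-in for the local LLM: map an utterance to a structured intent."""
    t = text.lower()
    if "light" in t:
        # Treat "on"/"off" as whole words so "monday" doesn't match "on".
        return {"action": "set_light", "on": "on" in t.split()}
    if "remind" in t:
        return {"action": "reminder", "text": text}
    if "play" in t:
        return {"action": "play_music", "query": t.split("play", 1)[1].strip()}
    return {"action": "unknown"}

def dispatch(intent: dict) -> str:
    """Route a structured intent to a (stubbed) smart-home action."""
    if intent["action"] == "set_light":
        return f"lights {'on' if intent['on'] else 'off'}"
    if intent["action"] == "reminder":
        return "reminder set"
    if intent["action"] == "play_music":
        return f"playing {intent['query']}"
    return "sorry?"
```

In the real version, `parse_intent` would be replaced by a call to the local model with a prompt asking for JSON output; the dispatcher stays the same either way, which is what keeps the LLM's role small and auditable.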
-
Yes. AI can be used for spam, job cuts, and creepy surveillance, no argument there, but pretending it’s nothing more than a corporate scam machine is just lazy cynicism. This same “automatic BS” is helping discover life-saving drugs, diagnosing cancers earlier than some doctors, giving deaf people real-time conversations through instant transcription, translating entire languages on the fly, mapping wildfire and flood zones so first responders know exactly where to go, accelerating scientific breakthroughs from climate modeling to space exploration, and cutting out the kind of tedious grunt work that wastes millions of human hours a day. The problem isn’t that AI exists, it’s that a lot of powerful people use it selfishly and irresponsibly. Blaming the tech instead of demanding better governance is like blaming the printing press for bad propaganda.
This same “automatic BS” is helping discover life-saving drugs, diagnosing cancers earlier than some doctors
Not the same kind of AI. At all. Generative AI vendors love this motte-and-bailey.
-
I personally think of AI as a tool, what matters is how you use it. I like to think of it like a hammer. You could use a hammer to build a house, or you could smash someone's skull in with it. But no one's putting the hammer in jail.
“Guns don’t kill people, people kill people”
Edit:
Controversial reply, apparently, but this is literally part of the script to a Philosophy Tube video (relevant part is 8:40 - 20:10)
We sometimes think that technology is essentially neutral. It can have good or bad effects, and it might be really important who controls it. But a tool, many people like to think, is just a tool. "Guns don't kill people, people do." But some philosophers have argued that technology can have values built into it that we may not realise.
...
The philosopher Don Ihde says tech can open or close possibilities. It's not just about its function or who controls it. He says technology can provide a framework for action.
...
Martin Heidegger was a student of Husserl's, and he wrote about the ways that we experience the world when we use a piece of technology. His most famous example was a hammer. He said when you use one you don't even think about the hammer. You focus on the nail. The hammer almost disappears in your experience. And you just focus on the task that needs to be performed.
Another example might be a keyboard. Once you get proficient at typing, you almost stop experiencing the keyboard. Instead, your primary experience is just of the words that you're typing on the screen. It's only when it breaks or it doesn't do what we want it to do, that it really becomes visible as a piece of technology. The rest of the time it's just the medium through which we experience the world.
Heidegger talks about technology withdrawing from our attention. Others say that technology becomes transparent. We don't experience it. We experience the world through it. Heidegger says that technology comes with its own way of seeing.
...
Now some of you are looking at me like "Bull sh*t. A person using a hammer is just a person using a hammer!" But there might actually be some evidence from neurology to support this.
If you give a monkey a rake that it has to use to reach a piece of food, then the neurons in its brain that fire when there's a visual stimulus near its hand start firing when there's a stimulus near the end of the rake, too! The monkey's brain extends its sense of the monkey body to include the tool!
And now here's the final step. The philosopher Bruno Latour says that when this happens, when the technology becomes transparent enough to get incorporated into our sense of self and our experience of the world, a new compound entity is formed.
A person using a hammer is actually a new subject with its own way of seeing - 'hammerman.' That's how technology provides a framework for action and being. Rake + monkey = rakemonkey. Makeup + girl is makeupgirl, and makeupgirl experiences the world differently, has a different kind of subjectivity because the tech lends us its way of seeing.
You think guns don't kill people, people do? Well, gun + man creates a new entity with new possibilities for experience and action - gunman!
So if we're onto something here with this idea that tech can withdraw from our attention and in so doing create new subjects with new ways of seeing, then it makes sense to ask when a new piece of technology comes along, what kind of people will this turn us into.
I thought that we were pretty solidly past the idea that anything is “just a tool” after seeing Twitler scramble Grok’s innards to advance his personal politics.
Like, if you still had any lingering belief that AI is “like a hammer”, that really should’ve extinguished it.
But I guess some people see that as an aberrant misuse of AI, and not an indication that all AI has an agenda baked into it, even if it’s more subtle.
-
Peak misunderstanding between AI and LLM
Why the hell are you being downvoted? You are completely right.
People will look back at this and "hover boards" and will think "are they stupid!?"
Mislabeling a product isn't great marketing, it's false advertising.
-
I personally think of AI as a tool, what matters is how you use it. I like to think of it like a hammer. You could use a hammer to build a house, or you could smash someone's skull in with it. But no one's putting the hammer in jail.
Seriously, the AI hate gets old fast. Like you said, it's a tool.
Get over it, people.
-
“Guns don’t kill people, people kill people”
My skull-crushing hammer that is made to crush skulls and nothing else doesn't crush skulls, people crush skulls
In fact, if more people had skull-crushing hammers in their homes, I'm sure that would lead to a reduction in the number of skull-crushings. The only thing that can stop a bad guy with a skull-crushing hammer is a good guy with a skull-crushing hammer.
-
“Guns don’t kill people, people kill people”
Guns don’t kill people. People with guns kill people.
Ftfy
-
Why the hell are you being downvoted? You are completely right.
People will look back at this and "hover boards" and will think "are they stupid!?"
Mislabeling a product isn't great marketing, it's false advertising.
IDK LMAO, that's what I really hate about Reddit/Lemmy: the voting system. People downvote but don't say where they think I'm wrong. I mean, at least argue; state your (supposedly harmless) opinion out loud. I even added a disclaimer there that I don't promote LLMs and such stuff. I don't really care either; I stand with correctness and do what I can to correct what is wrong. I totally agree with @[email protected] tho.
-
This post did not contain any content.
But like... Good.
-
Guns don’t kill people. People with guns kill people.
Ftfy
Hey, that level of pedantry is my job
-
I personally think of AI as a tool, what matters is how you use it. I like to think of it like a hammer. You could use a hammer to build a house, or you could smash someone's skull in with it. But no one's putting the hammer in jail.
Yeah, except it's a tool that most people don't know how to use but everyone can use, leading to environmental harm, a rapid loss of media literacy, and a huge increase in wealth inequality due to turmoil in the job market.
So... It's not a good tool for the average layperson to be using.
-
Why the hell are you being downvoted? You are completely right.
People will look back at this and "hover boards" and will think "are they stupid!?"
Mislabeling a product isn't great marketing, it's false advertising.
AI is an umbrella term that covers many things. We've been referring to simple pathfinding algorithms in video games as AI for two decades; LLMs are AIs.
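The "simple pathfinding" kind of game AI mentioned above can be shown in a few lines: breadth-first search on a grid. This is illustrative only (real games typically use A* with a heuristic), but it is exactly the sort of code that has always been called "AI" without anyone objecting.

```python
from collections import deque

def bfs_path(grid, start, goal):
    """Shortest path from start to goal on a grid of 0s (free) and
    1s (walls), as a list of (row, col) cells, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}          # visited set + back-pointers in one dict
    queue = deque([start])
    while queue:
        r, c = queue.popleft()
        if (r, c) == goal:
            # Walk the back-pointers to reconstruct the path.
            path, node = [], goal
            while node is not None:
                path.append(node)
                node = prev[node]
            return path[::-1]
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in prev):
                prev[(nr, nc)] = (r, c)
                queue.append((nr, nc))
    return None
```

An enemy walking around a wall is just this search re-run each time the player moves; no learning, no model, and yet "game AI" for decades.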
-
One could have said many of the same things about a lot of new technologies.
The Internet,
Nuclear,
Rockets,
Airplanes, etc.
Any new disruptive technology comes with drawbacks and can be used for evil.
But that doesn't mean it's all bad, or that it doesn't have its uses.
Of those, only the internet was turned loose on an unsuspecting public, and they had decades of the faucet slowly being opened, to prepare.
Can you imagine if, after WW2, Wernher von Braun came to the USA and then just, like... gave every man, woman, and child a rocket, with no training? Good and evil wouldn't even come into it; it'd be chaos and destruction.
Imagine if every household got a nuclear reactor to power it, but none of the people in the household got any training in how to care for it.
It's not a matter of good and evil, it's a matter of harm.
-
Yes. AI can be used for spam, job cuts, and creepy surveillance, no argument there, but pretending it’s nothing more than a corporate scam machine is just lazy cynicism. This same “automatic BS” is helping discover life-saving drugs, diagnosing cancers earlier than some doctors, giving deaf people real-time conversations through instant transcription, translating entire languages on the fly, mapping wildfire and flood zones so first responders know exactly where to go, accelerating scientific breakthroughs from climate modeling to space exploration, and cutting out the kind of tedious grunt work that wastes millions of human hours a day. The problem isn’t that AI exists, it’s that a lot of powerful people use it selfishly and irresponsibly. Blaming the tech instead of demanding better governance is like blaming the printing press for bad propaganda.
Aren't those different types of AI?
I don't think anyone hating AI is referring to the code that makes enemies move, or sorts things into categories.
-
I can run a small LLM locally which I can talk to using voice to turn certain lights on and off, set reminders for me, play music etc.
There are MANY examples of LLMs being useful. It has its drawbacks, just like any big technology, but saying it has no uses that are worth it is ridiculous.
That's like saying "asbestos has some good uses, so we should just give every household a big pile of it without any training or PPE."
Or "we know leaded gas harms people, but we think it has some good uses so we're going to let everyone access it for basically free until someone eventually figures out what those uses might be"
It doesn't matter that it has some good uses and that later we went "oops, maybe let's only give it to experts to use". The harm has already been done by eager supporters, intentional or not.
-
“Guns don’t kill people, people kill people”
Bad faith comparison.
The reason we can argue for banning guns and not hammers is specifically because guns are meant to hurt people. That's literally their only use. Hammers have a variety of uses and hurting people is definitely not the primary one.
AI is a tool, not a weapon. This is kind of melodramatic.