Lemmy be like
-
AI is literally making people dumber:
And books destroyed everyone's memory. People used to have fantastic memories.
They are a massive privacy risk:
No different from the rest of cloud tech. Run your AI locally, like the rest of your self-hosting.
Are being used to push fascist ideologies into every aspect of the internet:
Hitler used radio to push fascism into every home. It's not the medium, it's the message.
And they are a massive environmental disaster:
AI uses a GPU just like gaming uses a GPU. Building a new AI model uses about the same energy Rockstar spent developing GTA5. But it's easier to point at one centralized, polluting data center than at thousands of game developers spread across multiple offices creating even more pollution.
Stop being a corporate apologist
Run your own AI! Complaining about "corporate AI" is like complaining about corporate email. Host it yourself.
Run your own AI!
Oh sure, let me just pull a couple billion out of the couch cushions to spin up a data center in the middle of the desert.
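For context on the "host it yourself" point above: at the individual scale, running your own AI means a local model server, not a data center. A minimal sketch, assuming an Ollama install serving on its default port with a model already pulled (the model name below is just an example):

```python
# Minimal sketch: query a locally hosted model through Ollama's HTTP API.
# Assumes Ollama is installed, serving on its default port (11434), and that
# a model (here "llama3", as an example) has already been pulled.
import requests

def ask_local_llm(prompt: str, model: str = "llama3") -> str:
    # Ollama exposes a simple JSON endpoint for one-shot generation.
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]

if __name__ == "__main__":
    print(ask_local_llm("Summarize why self-hosting matters, in one sentence."))
```

Consumer hardware limits you to smaller models, but operationally it's the same as any other self-hosted service.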
-
Natural is a superfluous word that could mean multiple things. Genuine means something specific and vetted.
ie: "Natural Flavors"
As in Genuine Leather?
-
Do you really need to have a list of why people are sick of LLMs and AI slop?
With the number of times that refrain is regurgitated here ad nauseam, need is an odd way to put it.
Sick of it might fit sentiments better.
Done with this & not giving a shit is another.
Evil must be fought as long as it exists.
-
If we develop them into tools that assist work.
Spoilers: We will not
I believe the AI business and the tech hype cycle is ultimately harming the field.
I think this is just an American way of doing business. And it's awful, but at the end of the day people will adopt technology if it makes them greater profit (or at least screws over the correct group of people).
But where the Americanized AI seems to suffer most is in their marketing fully eclipsing their R&D. People seem to have forgotten how DeepSeek spiked the football on OpenAI less than a year ago by making some marginal optimizations to their algorithm.
The field isn't suffering from the hype cycle nearly so much as it suffers from malinvestment. Huge efforts to make the platform marketable. Huge efforts to shoehorn clumsy chat bots into every nook and cranny of the OS interface. Vanishingly little effort to optimize material consumption or effectively process data or to segregate AI content from the human data it needs to improve.
Spoilers: We will not
Generative inpainting/fill is enormously helpful in media production.
-
Depending on context, jargon and terminology change.
In this context, I'd agree that LLMs are a subset of tech under the umbrella term "AI". But in common English discourse, LLM and AI are often used interchangeably. That's not wrong, because correctness is defined by the actual usage of native speakers of the language.
I also come from a tech background. I'm a developer with 15 years of experience at a large company, and my job is currently integrating LLMs and more traditional ML models into our products, because our shareholders think we need to.
Specificity is useful in technical contexts, but in public contexts like this one, almost everyone knows what we're talking about, so the way we're using language is fine.
You know it's bad when someone with my username thinks you're being too pedantic lol. Don't be a language prescriptivist.
... almost everyone knows what we're talking about, so the way we're using language is fine.
You said it yourself: almost. Not everyone knows or understands, so wouldn't it be better to use the correct term instead of repeating the wrong one? You're saying almost because we're on Lemmy, and yes, most Fediverse users are techies. I have friends who talk about "AI" (I've edited the second paragraph to lessen confusion). To be more specific, they say things like "AI is going to take our jobs, it can do copywriting for me," but when I ask further, they're actually talking about LLMs, which is not the same thing. And you yourself know it's wrong, since you work in the field. When I hear that, I just tell them, "It's an LLM, and LLMs are bla bla bla." Whether they nod or not is on them, but at least they've been told the correct thing.
I accept being called a language prescriptivist in this case, because we’re here on Lemmy, most people are techies or nerds, and we’re discussing technology. In everyday conversation I’m not pedantic, but in technical contexts, precision matters.
This isn't "whataboutism." I'm not opposing the substance of what's being said; I'm pointing out how it's being said. If we already know the correct term, why not use it? That's not gatekeeping, that's making the discussion clearer for everyone. As I already said in my previous comment, as an activist your role is also to educate. Without education, activism turns into noise.
I think this is how it should end. I agree with the substance of what’s being said, and you’ve already acknowledged my earlier point about where LLMs fit within the AI field. Since saying “AI is bad” as activism should also involve educating people with the correct term, I see this as a technical context rather than a public one. I respect your view since you’ve provided argumentation. Thanks.
-
Whether intentional or not, this is gaslighting. "Here's the trendy reaction those wacky lemmings are currently upvoting!"
Getting to the core issue, of course we're sick of AI, and have a negative opinion of it! It's being forced into every product, whether it makes sense or not. It's literally taking developer jobs, then doing them worse. It's burning fossil fuels and VC money and then hallucinating nonsense, but still it's being jammed down our throats when the vast majority of us see no use-case or benefit from it. But feel free to roll your eyes at those acknowledging the truth...
it's literally making its users nuts, or exacerbating their existing mental illness. not hyperbole, according to psychologists. and this isn't conjecture:
https://futurism.com/openai-investor-chatgpt-mental-health
https://futurism.com/chatgpt-psychosis-antichrist-aliens
-
AI takes far more power to serve a single request than a website does, though.
And remember, AI requires those websites too, for training data.
So it's not just more power hungry; it also has the initial power consumption added on top.
A good chunk of Internet usage is HD video, which is far more power hungry than AI. I agree it's added on top... just like streaming was in 2010, and as new things will continue to be.
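To put rough numbers on both halves of this exchange, here is a back-of-envelope sketch. Every figure is an assumed, commonly cited ballpark for illustration only, not a measurement:

```python
# Back-of-envelope per-request energy comparison. All constants are assumed
# ballparks for illustration -- not measurements.
WH_PER_WEB_SEARCH = 0.3        # assumed: old ballpark for a conventional search
WH_PER_LLM_QUERY_LOW = 0.3     # assumed: optimistic chatbot-query estimate
WH_PER_LLM_QUERY_HIGH = 3.0    # assumed: pessimistic chatbot-query estimate
WH_PER_HOUR_HD_STREAMING = 80  # assumed: rough figure for an hour of HD video

print(f"LLM query vs web search: {WH_PER_LLM_QUERY_LOW / WH_PER_WEB_SEARCH:.0f}x "
      f"to {WH_PER_LLM_QUERY_HIGH / WH_PER_WEB_SEARCH:.0f}x")
print(f"Hour of HD streaming vs one LLM query: "
      f"{WH_PER_HOUR_HD_STREAMING / WH_PER_LLM_QUERY_HIGH:.0f}x "
      f"to {WH_PER_HOUR_HD_STREAMING / WH_PER_LLM_QUERY_LOW:.0f}x")
```

Under these assumptions a chatbot query costs somewhere between one and ten web searches, while an hour of HD streaming costs tens to hundreds of queries, which is roughly what both commenters are claiming.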
-
It's a comparison of people, not of subjects. In becoming blind with rage upon seeing the letters A and I, you act the same as a conservative person seeing the word "pronouns."
Commit to this. Let AI write all your responses from now on.
-
Do you think hammers grow out of the ground? Or that they magically spawn the building materials to work on?
Everything we do has a cost. We should definitely strive for efficiency and responsible use of resources. But to use this as an excuse, while you read this on a device made of metals mined by children, is pretty hypocritical.
No consumption is ethical under capitalism; take responsibility instead for what you do with that consumption.
Most of the material used in hammers does come from the ground.
-
Because when your employer catches on, they'll bring you back up to 40 anyway.
And probably because those 15 hours now produce shit quality.
My employer is pushing AI usage; if the work is done, the work is done. This is the reality we're supposed to be living in with AI. Just conforming to the current predatory system because "AI bad" actively harms more than it helps.
-
Whether intentional or not, this is gaslighting. "Here's the trendy reaction those wacky lemmings are currently upvoting!"
Getting to the core issue, of course we're sick of AI, and have a negative opinion of it! It's being forced into every product, whether it makes sense or not. It's literally taking developer jobs, then doing them worse. It's burning fossil fuels and VC money and then hallucinating nonsense, but still it's being jammed down our throats when the vast majority of us see no use-case or benefit from it. But feel free to roll your eyes at those acknowledging the truth...
What is the gaslighting here? A trend, or the act of pointing out a trend, does not seem like gaslighting to me. At most it seems like bandwagon propaganda or the satire thereof.
For the second paragraph, I agree we (Lemmings) are all pretty against it and we can be echo-chambery about it. You know, like Linux!
But I would also DISagree that we (population of earth) are all against it.
-
The point is, most wouldn’t.
People currently want it despite it being stupid which is why corporations are in a frenzy to be the monopoly that provides it. People want all sorts of stupid things. A different system wouldn't change that.
Define "people," because obviously people here don't. The average person I talk to IRL on a daily basis doesn't know what it is, has never used it, and likely never will. And a system where the people currently pushing this wouldn't exist would certainly change things.
Your argument basically amounts to "nu uh".
-
Do you really need to have a list of why people are sick of LLMs and AI slop?
AI is literally making people dumber:
https://www.theregister.com/2025/06/18/is_ai_changing_our_brains/
They are a massive privacy risk:
https://www.youtube.com/watch?v=AyH7zoP-JOg&t=3015s
Are being used to push fascist ideologies into every aspect of the internet:
https://newsocialist.org.uk/transmissions/ai-the-new-aesthetics-of-fascism/
And they are a massive environmental disaster:
https://news.mit.edu/2025/explained-generative-ai-environmental-impact-0117
Stop being a corporate apologist and stop wrecking the environment with this shit technology.
Edit: thank you to every AI apologist outing themselves in the comments. Thank you for making blocking you easy.
Gish gallop
-
You asked for one example, I gave you one.
It's not just voice; I can ask it complex questions and it can understand context and turn on lights or close blinds based on that context.
I find it very useful, with no real drawbacks.
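For context on how the "understands context" part typically works: the model translates a free-form request into a structured command, and ordinary automation code executes it. A minimal sketch, assuming a locally hosted model behind Ollama; set_device_state and the model name are hypothetical placeholders:

```python
# Sketch of how an LLM can turn a contextual request into a device command.
# Assumes a local Ollama server; set_device_state() is a hypothetical stand-in
# for whatever smart-home API (Home Assistant, etc.) you actually call.
import json
import requests

SYSTEM = (
    "You control a smart home. Reply ONLY with JSON like "
    '{"device": "...", "action": "on"|"off"|"open"|"close"}.'
)

def set_device_state(device: str, action: str) -> None:
    # Hypothetical placeholder -- wire this to your real home-automation API.
    print(f"-> {device}: {action}")

def handle_request(text: str, model: str = "llama3") -> None:
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": f"{SYSTEM}\nUser: {text}", "stream": False},
        timeout=60,
    )
    resp.raise_for_status()
    # A real integration would parse defensively; models sometimes add extra prose.
    command = json.loads(resp.json()["response"])
    set_device_state(command["device"], command["action"])

handle_request("It's too bright in here and I'm trying to watch a movie.")
```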
The fact that was the best you could come up with is far more damning than not even having one.
-
Spoilers: We will not
Generative inpainting/fill is enormously helpful in media production.
Implicit costs refer to the opportunity costs associated with a firm's resources, representing the income that could have been earned if those resources were employed in their next best alternative use.
-
When someone disagrees with me - echo chamber.
When someone agrees with me - logical discussion.
Then why are you guys avoiding a logical discussion around environmental impact and spouting misinformation instead?
The fact of the matter is that eating a single steak or a pound of ground beef will eclipse almost all people's AI usage. Obviously most can't escape driving, but for those of us in cities, biking will cut your environmental impact far more than skipping AI will.
Serving AI models isn't even as bad as watching Netflix. This counterculture to AI is largely misdirected anger that should be thrown at unregulated capitalism. Unregulated data centers. Unregulated growth.
Training is bad, but it's a small piece of the puzzle that happens infrequently, and it again circles back to the regulation problem.
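A back-of-envelope version of the steak comparison, with every figure an assumed ballpark (inference only; training and data-center construction excluded):

```python
# Rough check of the steak comparison. All numbers are assumed ballparks for
# illustration -- not measurements.
KG_CO2E_PER_KG_BEEF = 30.0      # assumed: mid-range published estimate
KG_BEEF_PER_LB = 0.454
WH_PER_LLM_QUERY = 3.0          # assumed: pessimistic per-query energy
KG_CO2_PER_KWH_GRID = 0.4       # assumed: typical grid carbon intensity

beef_kg_co2e = KG_CO2E_PER_KG_BEEF * KG_BEEF_PER_LB             # ~13.6 kg CO2e
query_kg_co2 = (WH_PER_LLM_QUERY / 1000) * KG_CO2_PER_KWH_GRID  # ~0.0012 kg

print(f"1 lb of beef ~= {beef_kg_co2e / query_kg_co2:,.0f} LLM queries")
# -> roughly 11,000 queries under these assumptions (training and data-center
#    overheads excluded, as the comment above notes).
```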
-
Implicit costs refer to the opportunity costs associated with a firm's resources, representing the income that could have been earned if those resources were employed in their next best alternative use.
I don't see the relevance here. Inpainting saves artists from time-consuming and repetitive labor for (often) no additional cost. Many generative inpainting models will run locally, but they're also just included with an Adobe sub.
-
AI use = small dick energy.
Yeah, go cry about it. People use AI to help themselves while you’re just being technophobic, shouting ‘AI is bad’ without even saying which AI you mean. And you’re doing it on Lemmy, a tiny techno-bubble. Lmao.
-
I don't see the relevance here. Inpainting saves artists from time-consuming and repetitive labor for (often) no additional cost. Many generative inpainting models will run locally, but they're also just included with an Adobe sub.
I don't see the relevance here
Anthropic is losing $3 billion or more (after revenue) in 2025.
OpenAI is on track to lose more than $10 billion.
xAI, makers of “Grok, the racist LLM,” is losing over $1 billion a month.
I don't know that generative infill justifies these losses.