Is It Just Me?
-
"Stolen" images.. if its on the web its free to learn from.
That's not how intellectual property rights work.
In the US, an artistic work is automatically protected by copyright when it is created. Displaying the art publicly does not remove the artist's copyright. Only the artist/copyright owner can grant someone else rights to use the work. Again, public accessibility of the work does not degrade the copyright.
-
Not necessarily. There are "do not crawl" directives that AI bots bypass, burdening sites with greater server load.
Seriously, copyright doesn't just go away because it's online. The concept of "right of reproduction" is a vast and well defined area of law.
You can argue copyright law is garbage and archaic and needs to be overhauled sure, but right now "if it's on the Web it's free" only counts if you're Meta and can pay off a judge or something
-
Unfortunately the masses will do as they're told. Our society has been trained to do this. Even those that resist are playing their part.
On the contrary: society has repeatedly rejected a lot of ideas that industries have come up with.
HD DVD, 3D TV, cryptocurrency, NFTs, LaserDiscs, 8-track tapes, UMDs. A decade ago everyone was hyping up how VR would be the future of gaming, yet it's still a niche novelty today.
The difference with AI is that I don't think I've ever seen a supply-side push this strong before. I'm not seeing a whole lot of demand for it from individual people. It's "oh, this is a neat little feature I can use," not "this technology is going to change my life" the way that the laundry machine, the personal motor vehicle, the telephone, or the internet did. I could be wrong, but I think that as long as we can survive the bubble bursting, we will come out on the other side with LLMs being a blip on the radar. And one consequence will be that if anyone makes a real AI, they will need to call it something else for marketing purposes, because "AI" will be ruined.
-
This post did not contain any content.
Is there a way for me to take a picture of food and find its nutritional values without AI? I sometimes use duck.ai to ask because, when making a tortilla for example, I don't know the exact values; while I can read the values for the tortilla itself, I don't have a way to check the same for the meat and other similar stuff I put in it.
-
This post did not contain any content.
I feel the same way. I was talking with my mom about AI the other day and she was still on the "it's not good that AI is trained on stolen images, how it's making people lazy and taking jobs away from ppl" which is good, but I had to explain to her how much one AI prompt costs in energy and resources, how many people just mindlessly make hundreds of prompts a day for largely stupid shit they don't need and how AI hallucinates, is actively used by bad actors to spread mis- and disinformation and how it is literally being implemented into search engines everywhere so even if you want to avoid it as a normal person, you may still end up participating in AI prompting every single fucking time you search for anything on Google. She was horrified.
There definitely are some net positives to AI, but currently the negatives outweigh the positives and most people are not using AI responsibly at all. I have little to no respect for people who use AI to make memes or who use it for stupid everyday shit that they could have figured out themselves.
The most dystopian shit I have seen recently was when my boyfriend and I went to watch Weapons in the cinema and we got an ad for an AI assistant. The ad is basically this braindead bimbo at a laundromat deciding to use AI to tell her how to wash her clothes instead of looking at the fucking care labels on her clothes and putting two and two together. She literally takes a picture of the label, has the AI assistant tell her how to do it, and then goes "thank you so much, I could have never done this without you."
I fucking laughed in the cinema. Laughed and turned to my boyfriend and said: this is so fucking dystopian, dude.
I feel insane for seeing so many people just mindlessly walking down this path of utter retardation. Even when you tell them how disastrous it is for the planet, it doesn't compute in their heads because it is not only convenient to have a machine think for you. It's also addictive.
-
I can't take anyone seriously that says it's "trained on stolen images."
Stolen, you say? Well, I guess we're going to have to force those AI companies to put those images back! Otherwise, nobody will be able to see them!
...because that's what "stolen" means. And no, I'm not being pedantic. It's a really fucking important distinction.
The correct term is "copied," but that doesn't sound quite as severe. Also, if we want to get really specific, the images are presently on the Internet. Right now. Because that's what ImageNet (and similar) is: a database of URLs pointing to images that people are offering up for free to anyone on the Internet who wants them.
Did you ever upload an image anywhere publicly, for anyone to see? Chances are someone could've annotated it and included it in some AI training database. If it's on the Internet, it will be copied and used without your consent or knowledge. That's the lesson we learned back in the 90s, and if you think that's not OK, then go try to get hired by the MPAA/RIAA, and you can try to bring the world back to the time when you had to pay $10 for a ringtone and pay again if you got a new phone (because—to the big media companies—copying is stealing!).
Now that's clear, let's talk about the ethics of training an AI on such data: There's none. It's an N/A situation! Why? Because until the AI models are actually used for any given purpose they're just data on a computer somewhere.
What about legally? Judges have already ruled in multiple countries that training AI in this way is considered fair use. There's no copyright violation going on... Because copyright only covers distribution of copyrighted works, not what you actually do with them (internally; like training an AI model).
So let's talk about the real problems with AI generators so people can take you seriously:
- Humans using AI models to generate fake nudes of people without their consent.
- Humans using AI models to copy works that are still under copyright.
- Humans using AI models to generate shit-quality stuff for the most minimal effort possible, saying it's good enough, then not hiring an artist to do the same thing.
The first one seems impossible to solve (to me). If someone generates a fake nude and never distributes it... Do we really care? It's like a tree falling in the forest with no one around. If they (or someone else) distribute it though, that's a form of abuse. The act of generating the image was a decision made by a human—not AI. The AI model is just doing what it was told to do.
The second is—again—something a human has to willingly do. If you try hard enough, you can make an AI image model get pretty close to a copyrighted image... But it's not something that is likely to occur by accident. Meaning, the human writing the prompt is the one actively seeking to violate someone's copyright. Then again, it's not really a copyright violation unless they distribute the image.
The third one seems likely to solve itself over time as more and more idiots are exposed for making very poor decisions to just "throw it at the AI" then publish that thing without checking/fixing it. Like Coca-Cola's idiotic mistake last Christmas.
There might be as many shit takes in this post as there are em dashes. I mean, wow.
-
This post did not contain any content.
Yes, you're the weird one. Once you realize that 43% of the USA is FUNCTIONALLY ILLITERATE, you start realizing why people are so enamored with AI. (Since I know some twat is gonna say shit: I'm using the USA here as an example, I'm not being US-centric.)
Our artificial intelligence is smarter than 50% of the population (don't get me started on 'hallucinations'... do you know how many hallucinations the average person has every day?!) -- and stupider than the top 20% of the population.
The top 20% wonder if everyone has lost their fucking minds, because to them it looks completely worthless.
It's more just that the top 20% are naive to the stupidity of the average person.
-
This post did not contain any content.
The worst is in the workplace. When people routinely tell me they looked something up with AI, I now have to assume that I can't trust what they say any longer, because there is a high chance they are just repeating some AI hallucination. It is really a sad state of affairs.
-
This post did not contain any content.
The way I look at it is that I haven't heard anything about NFTs in a while. The bubble will burst soon enough when investors realize that it's not possible to get much better without a significant jump forward in computing technology.
We're running out of atomic-scale room to make things smaller only slightly more slowly than we're running out of ways to make smaller things at all, and for a computer to think like a person, as well as as quickly or faster, we'd need processing power to keep increasing exponentially per unit of space. Silicon won't get us there.
-
The worst is in the workplace. When people routinely tell me they looked something up with AI, I now have to assume that I can't trust what they say any longer, because there is a high chance they are just repeating some AI hallucination. It is really a sad state of affairs.
Do you also check if they listen to Joe Rogan? Fox News? Nobody can be trusted. AI isn't the problem, it's that it was trained on human data -- of which people are an unreliable source of information.
-
That's not how intellectual property rights work.
In the US, an artistic work is automatically protected by copyright when it is created. Displaying the art publicly does not remove the artist's copyright. Only the artist/copyright owner can grant someone else rights to use the work. Again, public accessibility of the work does not degrade the copyright.
Yeah, but fuck copyright.
-
This post did not contain any content.
I absolutely agree that AI is becoming a mental crutch that a disturbing number of people are snatching up and hobbling around on. It feels like the setup of Wall-E, where everyone is rooted in their floating rambler scooters.
I think the fixation on individual consumer use of AI is overstated. The bulk of the AI's energy/water use is in the modeling and endless polling. The random guy asking "@Grok is this true?" is having a negligible impact on energy usage, particularly in light of the number of automated processes that are hammering the various AI interfaces far faster than any collection of humans could.
I'm not going to use AI to write my next adventure or generate my next character. I'm not going to bemoan a player who shows up to game with a portrait with melted fingers, because they couldn't find "elf wizard in bearskin holding ice wand while standing on top of glacier" in DeviantArt.
For the vast majority of users, this is a novelty. What's more, it's a novelty that's become a stand-in for the OG AI of highly optimized search engines that used to fulfill the needs we're now plugging into the chatbot machine. I get why people think it sucks and abstain from using it. I get why people who use it too much can straight up drive themselves insane. I get that our Cyberpunk-style waste management strategy is going to get one of the next few generations into a nightmarish blight. But I'm not going to hang that on the head of someone who wants to sit down at a table with their friends, look them in the eye, and say "Check out this cool new idea I turned into a playable character."
Because if you're at the table and you're excited to play with other humans in a game about going out into the world on adventures, that's as good an antidote to AI as I could come up with.
And hey, as a DM? If you want to introduce the Mind Flayer "Idea Sucker" machine that lures people into its brain-eating maw by promising to give them genius powers? And maybe you want to name the Mind Flayer Lord behind the insidious plot Beff Jezos or Mealon Husk or something? Maybe that's a good way to express your frustration with the state of things.
-
On the contrary: society has repeatedly rejected a lot of ideas that industries have come up with.
HD DVD, 3D TV, cryptocurrency, NFTs, LaserDiscs, 8-track tapes, UMDs. A decade ago everyone was hyping up how VR would be the future of gaming, yet it's still a niche novelty today.
The difference with AI is that I don't think I've ever seen a supply-side push this strong before. I'm not seeing a whole lot of demand for it from individual people. It's "oh, this is a neat little feature I can use," not "this technology is going to change my life" the way that the laundry machine, the personal motor vehicle, the telephone, or the internet did. I could be wrong, but I think that as long as we can survive the bubble bursting, we will come out on the other side with LLMs being a blip on the radar. And one consequence will be that if anyone makes a real AI, they will need to call it something else for marketing purposes, because "AI" will be ruined.
VR was and is also still a very inaccessible tool for most people. It costs a lot of money and time to even get to the point where you're getting the intended VR experience and that is what it mostly boils down to: an experience. It isn't convenient or useful and people can't afford it. And even though there are many gamers out there, most people aren't gamers and don't care about mounting a VR headset on their cranium and getting seasick for a few minutes.
AI is not only accessible and convenient, it is also useful to the everyday person, if the AI doesn't hallucinate like hell, that is. It has the potential to optimize workloads in jobs with a lot of paperwork, calculations and so on.
I completely agree with you that AI is being pushed very aggressively in ways we haven't seen before and that is because the tech people and their investors poured a lot of money into developing these things. They need it to be a success so they can earn their money back and they will be successful eventually because everybody with money and power has a huge interest in this tool becoming a part of everyday life. It can be used to control the masses in ways we cannot even imagine yet and it can earn the creators and investors a lot of money.
They are already making AI computers. According to some, they will entirely replace the types of computers we are used to today. From what I can understand, they are meant to be preferable to the open AI setups we have currently, which are burning our planet to a crisp with the number of data centers needed to keep them active. Supposedly the AI computer will run everything locally on the laptop and will therefore demand fewer resources, but I'm so fucking skeptical about all this shit that I'm waiting to see how much power a computer with an AI operating system will need to swallow in energy. I'm too tech-ignorant to understand the ins and outs of what this and that means, but we are definitely going to have to accept that AI is here to stay, and the current setup, with open AIs and forced LLMs in every search engine, is a massive environmental nightmare. It probably won't stop or change a fucking lick, because people don't give a fuck as long as they are comfortable, and the companies are getting people to use their trash tech just like they wanted, so they won't stop it either.
-
I absolutely agree that AI is becoming a mental crutch that a disturbing number of people are snatching up and hobbling around on. It feels like the setup of Wall-E, where everyone is rooted in their floating rambler scooters.
I think the fixation on individual consumer use of AI is overstated. The bulk of the AI's energy/water use is in the modeling and endless polling. The random guy asking "@Grok is this true?" is having a negligible impact on energy usage, particularly in light of the number of automated processes that are hammering the various AI interfaces far faster than any collection of humans could.
I'm not going to use AI to write my next adventure or generate my next character. I'm not going to bemoan a player who shows up to game with a portrait with melted fingers, because they couldn't find "elf wizard in bearskin holding ice wand while standing on top of glacier" in DeviantArt.
For the vast majority of users, this is a novelty. What's more, it's a novelty that's become a stand-in for the OG AI of highly optimized search engines that used to fulfill the needs we're now plugging into the chatbot machine. I get why people think it sucks and abstain from using it. I get why people who use it too much can straight up drive themselves insane. I get that our Cyberpunk-style waste management strategy is going to get one of the next few generations into a nightmarish blight. But I'm not going to hang that on the head of someone who wants to sit down at a table with their friends, look them in the eye, and say "Check out this cool new idea I turned into a playable character."
Because if you're at the table and you're excited to play with other humans in a game about going out into the world on adventures, that's as good an antidote to AI as I could come up with.
And hey, as a DM? If you want to introduce the Mind Flayer "Idea Sucker" machine that lures people into its brain-eating maw by promising to give them genius powers? And maybe you want to name the Mind Flayer Lord behind the insidious plot Beff Jezos or Mealon Husk or something? Maybe that's a good way to express your frustration with the state of things.
What's more, its a novelty that's become a stand-in for the OG AI of highly optimized search engines that used to fulfill the needs we're now plugging into the chatbot machine.
I don’t think it’s temporary. That was the whole goal: suck up everybody’s work, dark-magick it into a chatbot, and voilà, no more need for anyone’s webpage.
The fact that it's broken what was working is more than just a metaphor for gen-AI in any setting. It’s fundamentally changed it for the worse and we’ll never get the unfucked version back.
-
Yes, you're the weird one. Once you realize that 43% of the USA is FUNCTIONALLY ILLITERATE, you start realizing why people are so enamored with AI. (Since I know some twat is gonna say shit: I'm using the USA here as an example, I'm not being US-centric.)
Our artificial intelligence is smarter than 50% of the population (don't get me started on 'hallucinations'... do you know how many hallucinations the average person has every day?!) -- and stupider than the top 20% of the population.
The top 20% wonder if everyone has lost their fucking minds, because to them it looks completely worthless.
It's more just that the top 20% are naive to the stupidity of the average person.
Our artificial intelligence is smarter than 50% of the population
"Smartness" and illiteracy are certainly different things, though. You might be incapable of reading, yet be able to figure out a complex escape room via environmental cues that the most high quality author couldn't, as an example.
There are many places an AI might excel compared to these people, and many areas where it will fall behind. Any sort of unilateral statement here disguises the fact that while a lot of Americans are illiterate, stupid, or even downright incapable of doing simple tasks, "AI" today is very similar, except that it will complete a task incorrectly, make up a fact instead of just "not knowing" it, or confidently state a summary of a text that is less accurate than a first grader's interpretation.
Sometimes it will do better than many humans. Other times, it will do much worse, but with a confident tone.
AI isn't necessarily smarter in most cases, it's just more confident sounding in its incorrect answers.
-
On the contrary: society has repeatedly rejected a lot of ideas that industries have come up with.
HD DVD, 3D TV, Crypto Currency, NFT's, Laser Discs, 8-track tapes, UMD's. A decade ago everyone was hyping up how VR would be the future of gaming, yet it's still a niche novelty today.
The difference with AI is that I don't think I've ever seen a supply side push this strong before. I'm not seeing a whole lot of demand for it from individual people. It's "oh this is a neat little feature I can use" not "this technology is going to change my life" the way that the laundry machine, the personal motor vehicle, the telephone, or the internet did. I could be wrong but I think that as long as we can survive the bubble bursting, we will come out on the other side with LLM's being a blip on the radar. And one consequence will be that if anyone makes a real AI they will need to call it something else for marketing purposes because "AI" will be ruined.
AI's biggest business is (if not already, it will be) surveillance systems sold to authoritarian governments worldwide. Israel is using it in Gaza. It's both used internally and exported as a product by China. Not just cameras on street corners doing facial recognition, but monitoring the websites you visit, the things you buy, the people you talk to. AI will be used on large datasets like these to label people as dissidents, to disempower them financially, and to isolate them socially. And if the AI hallucinates in this endeavor, that's fine. Better to imprison 10 innocent men than to let 1 rebel go free.
In the meantime, AI is being laundered to the individual consumer as a harmless if ineffective toy. "Make me a portrait, give me some advice, summarize a meeting," all things it can do if you accept some amount of errors. But given this domain of problems it solves, the average person would never expect that anyone would use it to identify the first people to pack into train cars.
-
This post did not contain any content.
Billionaires: *invest heavily in water.*
Billionaires: "In the future there's going to be water wars. You need to invest NOW! Quick before it's too late. I swear I'm not just trying to pump the stock."
Billionaires: "Water isn't accruing value fast enough. Let's invent a product that uses a shit ton of it!"
Billionaires: "No one likes or is using the product. Force them to. Include it in literally all software and every website. Make it so they're using the product even when they don't know they're using it. Include it in every web search. I want that water gone by the end of this quarter!"
-
Is there a way for me to take a picture of a food and find nutritional values without AI? I sometimes use duck.ai to ask because, when making tortilla for example idk what could be exact because while I can read values for a tortilla, I don't have a way to check the same for meat and other similar stuff I put in tortilla.
You're probably just gonna have to get better at guesstimating (e.g., by comparing to similar pre-made options and their nutrition labels), or use a nutrition-tracking app that integrates with OpenFoodFacts (or a similar database, though most use OpenFoodFacts even if they have their own, too) and get a scale to weigh your ingredients.
I don't really know of any other good ways to just take photos and get a good nutritional read, and pretty much any implementation would use "AI" to some degree anyway, though probably a dedicated machine-learning model rather than an LLM (which would use more power and water). But the method of just weighing out each part of a meal and putting it in an app works pretty well.
Like, for me, I can scan the barcode of the tortillas I buy to import the nutrition facts into the (admittedly kind of janky) app I use (Waistline), then plop my plate on my scale, put in some ground beef, scan the barcode from the beef packaging, and then I can put in how many grams I have. Very accurate, but a little time consuming.
Not sure if that's the kind of thing you're looking for, though.
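For what it's worth, the math those apps do after the barcode scan is dead simple: OpenFoodFacts-style entries store nutriments per 100 g, and the app just scales them by your weighed portion. A minimal sketch (the field names follow OpenFoodFacts' per-100g naming convention; the tortilla numbers are made-up illustrative values, not real label data):

```python
def scale_nutriments(per_100g: dict, grams: float) -> dict:
    """Scale per-100g nutriment values (OpenFoodFacts-style keys) to an actual portion weight."""
    factor = grams / 100.0
    return {key: round(value * factor, 2) for key, value in per_100g.items()}

# Hypothetical per-100g values for a flour tortilla (illustrative only).
tortilla_per_100g = {"energy-kcal_100g": 310.0, "proteins_100g": 8.0, "fat_100g": 7.5}

# A weighed 64 g tortilla:
print(scale_nutriments(tortilla_per_100g, 64.0))
# → {'energy-kcal_100g': 198.4, 'proteins_100g': 5.12, 'fat_100g': 4.8}
```

So the scale, not the camera, is doing the heavy lifting; the database lookup just supplies the per-100g numbers.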
-
Do you also check if they listen to Joe Rogan? Fox news? Nobody can be trusted. AI isn't the problem, it's that it was trained on human data -- of which people are an unreliable source of information.
AI also just makes things up. Like how RFK Jr.'s "Make America Healthy Again" report cites studies that don't exist and never have, or literally a million other examples. You're not wrong about Fox News and how corporate- and Russian-backed media distort the truth and push false narratives, and you're not wrong that AI isn't the only problem, but it is certainly a problem, and a big one at that.
-
Do you also check if they listen to Joe Rogan? Fox news? Nobody can be trusted. AI isn't the problem, it's that it was trained on human data -- of which people are an unreliable source of information.
Joe Rogan doesn't tell them false domain knowledge.