AI Training Slop
-
Ah, when stuff is behind a password but not encrypted and still on their servers. Yes.
Correct.
-
The point is that OP (most probably) didn’t train it — they downloaded a pre-trained model and only did fine-tuning and inference.
Right, my point is exactly that: OP, having just downloaded the model, might not realize the training costs. They might be low, but on average they are quite high, at least relative to fine-tuning or inference. So my question was precisely to highlight that running locally while not knowing the training cost is naive, ecologically speaking. They did clarify that they do not care, so that's coherent for them. I'm insisting on the point because others might think "Oh... I can run a model locally, then it's not <<evil>>", so I'm trying to clarify (and please let me know if I'm wrong) that local use is good for privacy, but the upfront training costs are not insignificant and might lead some people to prefer NOT relying on very costly-to-train models, choosing other models or even a totally different solution instead.
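To make "quite high relative to inference" concrete, here's a back-of-envelope sketch; every figure in it (GPU count, power draw, PUE, runtimes) is an illustrative assumption, not data for any real model:

```python
# Back-of-envelope energy comparison: one hypothetical pre-training run
# vs. one hypothetical local inference. All numbers are made-up assumptions.

GPU_POWER_KW = 0.7   # assumed draw of one accelerator (~700 W class)
PUE = 1.2            # assumed datacenter power usage effectiveness

# Hypothetical training run: 1,000 GPUs busy for 30 days.
train_kwh = 1_000 * 30 * 24 * GPU_POWER_KW * PUE

# Hypothetical single local inference: one GPU busy for 10 seconds
# (no PUE factor, since it runs at home).
infer_kwh = (10 / 3600) * GPU_POWER_KW

print(f"training:  {train_kwh:,.0f} kWh")      # ~604,800 kWh
print(f"one query: {infer_kwh:.6f} kWh")       # ~0.001944 kWh
print(f"queries to match training: {train_kwh / infer_kwh:,.0f}")
```

Under these made-up numbers, the training run is equivalent to hundreds of millions of local queries, which is the scale the argument is about.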
-
This post did not contain any content.
I've given up and assume that my friends and family have already handed over my contact info, pictures, messages, DNA, etc
-
Right, my point is exactly that: OP, having just downloaded the model, might not realize the training costs. They might be low, but on average they are quite high, at least relative to fine-tuning or inference. So my question was precisely to highlight that running locally while not knowing the training cost is naive, ecologically speaking. They did clarify that they do not care, so that's coherent for them. I'm insisting on the point because others might think "Oh... I can run a model locally, then it's not <<evil>>", so I'm trying to clarify (and please let me know if I'm wrong) that local use is good for privacy, but the upfront training costs are not insignificant and might lead some people to prefer NOT relying on very costly-to-train models, choosing other models or even a totally different solution instead.
The model exists already — abstaining from using it doesn’t make the energy consumption go away. I don’t think it’s reasonable to let historic energy costs drive what you do, else you would never touch a computer.
-
The model exists already — abstaining from using it doesn’t make the energy consumption go away. I don’t think it’s reasonable to let historic energy costs drive what you do, else you would never touch a computer.
Indeed, the argument is mostly about future usage and future models. The overall point is that assuming training costs are negligible is either naive or a sign that one does not care much for the environment.
From a business perspective: if I'm Microsoft or OpenAI and I see a trend toward models that minimize training costs, or even users avoiding costly-to-train models, I will adapt to it. On the other hand, if I see that nobody cares, or that even building more data centers drives the value up, I will build bigger models regardless of usage or energy cost.
The point is that training is expensive, and pointing only to inference is like the Titanic steaming full speed toward the iceberg while remarking how small it looks. It is not small.
-
Just curious, do you know how many trees were MOLESTED to create that air you're breathing?
I know at least seven were. It would've been more but I got a splinter and that really turned me off.
-
Straw-man much, or just learning about logistics and sourcing in our globalized supply chain?
Satirically pointing out that worrying about electricity usage for model creation is ridiculous.
It's already spent. The model exists. It's probably MORE moral to use it as much as possible to get some positive value out of it. Otherwise it was just wasted.
-
Satirically pointing out that worrying about electricity usage for model creation is ridiculous.
It's already spent. The model exists. It's probably MORE moral to use it as much as possible to get some positive value out of it. Otherwise it was just wasted.
Yes indeed, yet my point is that we keep on training models TODAY, so if we keep on not caring, we just postpone the same problem, cf https://lemmy.world/post/30563785/17400518
Basically yes, use an already-trained model today if you want, but if we don't set a trend then, despite the undeniable ecological impact, there will be no corrective measure.
It's not enough to just say "Oh well, it used a ton of energy. We MUST use it now."
Anyway, my overall point was that training takes a ton of energy. I'm not asking you or OP or anyone else NOT to use such models. I'm solely pointing out that doing so without understanding the process that led to them, including but not limited to the energy spent on training, is naive at best.
Edit: it's also important to point out alternatives that are not models, namely the plenty of specialized tools that are already MORE efficient AND accurate today. So even if a model took a ton of energy to train, in such cases it's still not rational to use it. The training energy is a sunk cost.
-
I've given up and assume that my friends and family have already handed over my contact info, pictures, messages, DNA, etc
Sorry.
I didn't know you minded.
-
This post did not contain any content.
hmm. tools are useful for what they are designed for. maybe design a bot to design bots.
-
Just curious, do you know, even as a rough estimate (maybe via the model card), how much energy was used to train the initial model, and if so, how do you believe it was done in an ecologically justifiable way?
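For what it's worth, some Hugging Face model cards do publish a co2_eq_emissions field; a minimal sketch for checking it, assuming the model is hosted there (the repo id below is a placeholder, and many cards simply omit the field):

```python
# Check whether a Hugging Face model card reports training emissions.
# "some-org/some-model" is a hypothetical repo id, not a real model.
from huggingface_hub import ModelCard

card = ModelCard.load("some-org/some-model")  # placeholder repo id
emissions = card.data.to_dict().get("co2_eq_emissions")
print(emissions if emissions is not None else "No emissions data published.")
```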
Just curious, do you know how much energy went into powering every computer and office for the 3 years while the latest video game/Hollywood movie/etc. was being made?
Should we ban every single non-essential thing in the world, or only the ones you don't enjoy?
And please hop off Lemmy; do you know how much power the devs used to program this site!
-
I just saw an ad for a “training course” to “qualify” people to interact with AI as a profession.
Are you in Europe? The AI Act requires some unspecified "AI literacy" from staff working with AI. Some sort of grift, I guess.
-
This post did not contain any content.
I remember, years ago, someone in my class decided to make Russian look-alike pictures of everyone in the class and post them as a gag on the doors. I forget what the app was called, but several of my classmates were angry that the person had taken their pictures without consent and given them to some weird Russian picture algorithm.
At this point, I have no doubt that all kinds of pictures and information about me are in the hands of people and companies I don't care for. A lot of it is my own doing, and some of it is out of my hands.
It is hard to avoid when you have no control over your own information, because people share your pictures and your info without consulting you. All the time, and without malice. It is what it is.
-
Are you in Europe? The AI Act requires some unspecified "AI literacy" from staff working with AI. Some sort of grift, I guess.
Not at the time I saw it. It was just an internet ad.
-
I remember, years ago, someone in my class decided to make Russian look-alike pictures of everyone in the class and post them as a gag on the doors. I forget what the app was called, but several of my classmates were angry that the person had taken their pictures without consent and given them to some weird Russian picture algorithm.
At this point, I have no doubt that all kinds of pictures and information about me are in the hands of people and companies I don't care for. A lot of it is my own doing, and some of it is out of my hands.
It is hard to avoid when you have no control over your own information, because people share your pictures and your info without consulting you. All the time, and without malice. It is what it is.
Pff it's easy
Cut contact with all friends and family, get plastic surgery, live as a hermit in the mountains
-
Indeed, the argument is mostly about future usage and future models. The overall point is that assuming training costs are negligible is either naive or a sign that one does not care much for the environment.
From a business perspective: if I'm Microsoft or OpenAI and I see a trend toward models that minimize training costs, or even users avoiding costly-to-train models, I will adapt to it. On the other hand, if I see that nobody cares, or that even building more data centers drives the value up, I will build bigger models regardless of usage or energy cost.
The point is that training is expensive, and pointing only to inference is like the Titanic steaming full speed toward the iceberg while remarking how small it looks. It is not small.
If you're the company, you don't care what the home user does. They didn't pay for the model, so to you their existence mostly marks a missed opportunity for market share.
No one is saying training costs are negligible. They're saying the cost has already been paid and that they had no say in influencing it, then or in the future. If you don't pay for the model and the company can't tell how often you use it, your behavior can't really influence them.
It's like being overly concerned with the impact of a microwave you found by the road. The maker doesn't care about your opinion of it because you don't give them money. They don't even know you exist. The only thing you can meaningfully influence is how it's used today.
-
I've given up and assume that my friends and family have already handed over my contact info, pictures, messages, DNA, etc
Honestly, giving up is reasonable. We need EVERYONE to respect privacy for this whole thing to work.
You could be the most privacy-focused individual, and your mom's Facebook page would still have your graduation picture, with the name of the high school you went to and your home address in the background somewhere.
-
Honestly, giving up is reasonable. We need EVERYONE to respect privacy for this whole thing to work.
You could be the most privacy-focused individual, and your mom's Facebook page would still have your graduation picture, with the name of the high school you went to and your home address in the background somewhere.
so what you are saying is that it's already over and we lost.
-
Yes indeed, yet my point is that we keep on training models TODAY, so if we keep on not caring, we just postpone the same problem, cf https://lemmy.world/post/30563785/17400518
Basically yes, use an already-trained model today if you want, but if we don't set a trend then, despite the undeniable ecological impact, there will be no corrective measure.
It's not enough to just say "Oh well, it used a ton of energy. We MUST use it now."
Anyway, my overall point was that training takes a ton of energy. I'm not asking you or OP or anyone else NOT to use such models. I'm solely pointing out that doing so without understanding the process that led to them, including but not limited to the energy spent on training, is naive at best.
Edit: it's also important to point out alternatives that are not models, namely the plenty of specialized tools that are already MORE efficient AND accurate today. So even if a model took a ton of energy to train, in such cases it's still not rational to use it. The training energy is a sunk cost.
How much electricity was wasted for you to post, and for us to receive, your human slop?
-
Yes I have.
With a model I fine-tuned myself and ran locally on my own hardware.
Suck it
yea, this attitude right here is why AI bros are so beloved