AI Training Slop
-
so what you are saying is that it's already over and we lost.
Overwhelmingly so.
-
Pff it's easy
Cut contact with all friends and family, get plastic surgery, live as a hermit in the mountains
Aka living the dream
-
No one is saying training costs are negligible.
It's literally what the person I initially asked said though, they said they don't know and don't care.
That's far from saying they're negligible. What they're saying is in line with my point. If you find a microwave, are you going to research how green its manufacturing was so you can ensure you only find good ones for free in the future?
Irrelevant or moot is different from negligible. One says it's small enough to not matter, and the other says it doesn't affect your actions.
I play with AI models on my own computer. I think the training costs are far from negligible and for the most part shouldn't have been bothered with. (I'm very tolerant of research models that are then made public. Even if the tech isn't scalable or as world-changing as some think, that doesn't mean it isn't worth understanding or that it won't lead to something more viable later. Churning it over and over without open results or novelty isn't worth it, though.)
I also think that the training costs are irrelevant with regards to how I use it at home. They're spent before I knew it existed, and they never have or will see information or feedback from me.
My home usage has had less impact than using my computer for games has.
-
Pff it's easy
Cut contact with all friends and family, get plastic surgery, live as a hermit in the mountains
My dream. Get a death certificate and become invisible. Live in the mountains. Raise chickens. And live a peaceful life.
-
Honestly giving up is reasonable. We need EVERYONE to respect privacy for this whole thing to work.
You could be the most privacy-focused individual and your mom's Facebook page would still have your graduation picture, with the name of the high school you went to and your home address visible in the background somewhere.
That's also ignoring how all of your actual personal information (full name, address, social security number, phone number, email, etc.) has already been leaked 16 times this year alone.
-
Are you in Europe? The AI Act requires some unspecified "AI literacy" from staff working with AI. Some sort of grift, I guess.
Do you have a source for this? Not doubting you, I'm just not European so I must have missed this
-
Do you have a source for this? Not doubting you, I'm just not European so I must have missed this
I'm always glad when someone is interested and conscientious enough to ask for a source.
Article 4 in full:
Providers and deployers of AI systems shall take measures to ensure, to their best extent, a sufficient level of AI literacy of their staff and other persons dealing with the operation and use of AI systems on their behalf, taking into account their technical knowledge, experience, education and training and the context the AI systems are to be used in, and considering the persons or groups of persons on whom the AI systems are to be used.
AI Act -> https://eur-lex.europa.eu/eli/reg/2024/1689/oj/eng
BTW. That site is the official repository for EU law. It's also how EU law is promulgated. What you find there is, by definition, the correct version (unless stated otherwise).
-
Pics or it didn't happen.
-
I remember years ago someone in my class decided to make Russian look-alike pictures of everyone in the class and post them as a gag on the doors. I forget what it was called, but several of my classmates were angry that the person had taken their pictures without consent and given them to some weird Russian picture algorithm.
At this point in time, I have no doubt that all kinds of pictures and information regarding me is in the hands of people and companies I don't care for. A lot of it is my own doing and some is out of my hands.
It is hard to avoid when you don't have any control over your own information because people share your pictures and your info without consulting you. All the time and without malice. It is what it is.
There's stuff I could do, like remove tags from myself on fb (is that possible?) or delete my account, but it's enough work and enough of a loss (what if I need to find an old contact) that I just ignore the problem.
-
There's stuff I could do, like remove tags from myself on fb (is that possible?) or delete my account, but it's enough work and enough of a loss (what if I need to find an old contact) that I just ignore the problem.
It sure is possible, because I untagged myself from all the pictures people had tagged me in, then deleted all the comments I ever wrote and all the pictures I ever posted myself, and then deleted my Facebook after that.
For years, the only thing that kept me on Facebook was that I had a few people I only had contact with through Messenger due to us being from different countries.
When I learned about Signal, I immediately got those people onto that app so we could stay in contact and then I went on a mass destruction rampage of my profile. Literally went from "but I have to keep it because of my connections" to "let me simulate digital dementia, bitch".
I understand that most people can't do what I did. For me it was several years of gradual detachment from the platform that made it super easy to pull the plug in the end. It's a bit harder for those who actively use fb every day for social connections and jobs and so on. So I get it.
But yeah, you can't really control whether or not people keep posting about you after you leave. I have already had that happen after visiting an old friend and honestly, I cannot bring myself to care about it.
-
That's far from saying they're negligible. What they're saying is in line with my point. If you find a microwave, are you going to research how green its manufacturing was so you can ensure you only find good ones for free in the future?
Irrelevant or moot is different from negligible. One says it's small enough to not matter, and the other says it doesn't affect your actions.
I play with AI models on my own computer. I think the training costs are far from negligible and for the most part shouldn't have been bothered with. (I'm very tolerant of research models that are then made public. Even if the tech isn't scalable or as world-changing as some think, that doesn't mean it isn't worth understanding or that it won't lead to something more viable later. Churning it over and over without open results or novelty isn't worth it, though.)
I also think that the training costs are irrelevant with regards to how I use it at home. They're spent before I knew it existed, and they never have or will see information or feedback from me.
My home usage has had less impact than using my computer for games has.
I'm playing games at home. I'm running models at home (I linked to it in other, similar answers) for benchmarking.
My point is that models are just like anything else I bring into my home: I try to only buy products that are manufactured responsibly. Someone else in this thread asked me about child labor in electronics, and IMHO that was actually a good analogy. You mention buying a microwave here, and that's another good example.
Yes, if we do want to establish feedback in the supply chain, we must know how everything we rely on is made. It's that simple.
There are already quite a few initiatives for that, e.g. Fair Trade certification for coffee, ISO 14001, or Fair Materials in electronics.
The point being that there are already mechanisms for feedback in other fields, and in ML there are already model cards with a co2_eq_emissions field, so why couldn't feedback also work in this field?
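As a rough illustration of what that field looks like in practice, here's a minimal sketch assuming the huggingface_hub Python library; the model ID is just a placeholder, and whether a given card actually declares the field is up to the model's authors:

```python
# Minimal sketch: read whatever CO2 metadata a model card declares (if any).
# "some-org/some-model" is a placeholder ID, not a real repository.
from huggingface_hub import ModelCard

card = ModelCard.load("some-org/some-model")
metadata = card.data.to_dict()

# The model card spec allows co2_eq_emissions to be either a single number
# (kg of CO2-eq) or a dict with details like "emissions", "source" and
# "hardware_used", so just print whatever is there.
co2 = metadata.get("co2_eq_emissions")
if co2 is None:
    print("No CO2 information declared on this card.")
else:
    print("Declared training emissions:", co2)
```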