AI Training Slop
-
This post did not contain any content.
It's already done, if you have any photographs of yourself on the internet. No need to fight that battle, accept and push forward.
-
how naive of him to think companies didn't already scrape his facial data from anywhere he might have had a picture 10 years ago
Yup. Last year some Harvard students put together a demo where they used Meta's smart glasses and commercial apps to scan people's faces, find their social media profiles, and summarize info about them in real time, like where they live and work, their phone numbers, and the names of their relatives.
-
Your "facial data" isn't private information. You give it away every time you go outside.
You're talking about the American concept of having no privacy in public. Not all countries are like that.
-
Don't paparazzi make plenty of money off of selling unauthorized photos of celebrities? Celebrities can control some uses of their likeness, but not all of them.
True, though for now paparazzi photos generally are “here’s the celebrity in real life doing [x]” whereas AI is “celebrity never did this thing and we applied their image / voice to it like they did.” Really difficult for celebs to shut down tabloid or fan AI-generated garbage, but I think the bigger issue for them right now is film or music studios just using their likeness to keep the profits churning.
-
This is my source: Forbes.
The source of the article is Imperva 2024 Bad Bot Report, but I cannot download the report. I do not know how they measured traffic. In this age of social media, I am going to guess it is by data volume and site visits.
Here’s the report:
-
It's already done, if you have any photographs of yourself on the internet. No need to fight that battle, accept and push forward.
I suppose I'll accept it and just start pushing forward with setting fires.
-
This post did not contain any content.
the cool thing about consent is that you're allowed to attack everyone who pretends it isn't real with any amount of force
-
Yup. Last year some Harvard students put together a demo where they used Meta's smart glasses and commercial apps to scan people's faces, find their social media profiles, and summarize info about them in real time, like where they live and work, their phone numbers, and the names of their relatives.
So basically Watch_Dogs profilers IRL
-
Like private subreddits or private messages.
Ah, when stuff is behind a password but not encrypted and still on their servers. Yes.
-
It's already done, if you have any photographs of yourself on the internet. No need to fight that battle, accept and push forward.
And what if there's no photograph of myself online?
-
This post did not contain any content.
wow that's evil
-
the cool thing about consent is that you're allowed to attack everyone who pretends it isn't real with any amount of force
i mean, I would allow you, but the law doesn't unfortunately
-
And what if there's no photograph of myself online?
Be happy
-
wow that's evil
It's cleverly addressing a valid point. If your face is visible on the internet, it can be used in an AI database without your consent. That's just where we're at.
-
Like private subreddits or private messages.
Reddit is about to make that somewhat more "public". I heard they're changing PMs and DMs over to a chat system.
-
It's cleverly addressing a valid point. If your face is visible on the internet it can be used in an ai database without your consent. That's just where we're at.
yeah fair enough, but every use of the Studio Ghibli image generator is one too many
-
And what if there's no photograph of myself online?
Pics or it didn't happen.
-
This post did not contain any content.
Yes I have.
With a model I fine tuned myself and ran locally on my own hardware.
Suck it
-
Yes I have.
With a model I fine tuned myself and ran locally on my own hardware.
Suck it
Just curious: do you know, even as a rough estimate (maybe via the model card), how much energy was used to train the base model? And if so, do you believe that was ecologically justifiable?
-
Your "facial data" isn't private information. You give it away every time you go outside.
Your face being visible outside isn't your "facial data". At minimum, someone has to capture an image of it in good enough quality and then link it to some piece of your identity, e.g. your name or social security number. If you just walk around and people take photos of your face, they don't have your "facial data". That's the entire reason reverse image search and similar services exist. It is NOT an easy problem, technically speaking.
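To make the "image plus identity link" point concrete, here's a toy sketch of how face matching is usually framed: a photo is reduced to an embedding vector, and it only becomes "facial data" once that vector is matched against a database of labeled identities. Everything here (the 128-dimensional random embeddings, the names) is hypothetical, not any real service's pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical labeled database: one 128-dim unit embedding per known identity.
db_names = ["alice", "bob", "carol"]
db_vecs = rng.normal(size=(3, 128))
db_vecs /= np.linalg.norm(db_vecs, axis=1, keepdims=True)

# A street photo of "bob": same face, different angle/lighting,
# modeled here as his database embedding plus a little noise.
probe = db_vecs[1] + 0.05 * rng.normal(size=128)
probe /= np.linalg.norm(probe)

# Cosine similarity against every identity; the best match names the face.
scores = db_vecs @ probe
best = int(np.argmax(scores))
print(db_names[best], round(float(scores[best]), 2))
```

Without the labeled database (the identity link), the probe vector is just numbers; the matching step is what turns a snapshot into "facial data", and doing it at internet scale is the hard part.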