GenAI website goes dark after explicit fakes exposed
-
Even when consent is informed it can still be fucky. Do you think I want to consent to an arbitration agreement with my employer or a social media platform? Fuck no, but I want a job and interaction so I go where the money/people are. I can't hunt around for a place that will hire me and also doesn't have arbitration.
Consent at the barrel of a gun, no matter how well informed, is no consent at all.
Do you think I want to consent to an arbitration agreement
In many countries mandatory arbitration agreements in a B2C context are invalid. They have no legal power.
-
It fetishizes the subject's images, and nobody knows whether it would lead to recidivism in child predators. It is generally accepted that producing drawings of CP alone is bad, let alone by AI. I remember some dude getting arrested at the Canadian border for sexual drawings of Bart and Lisa. Regardless, I would say it is quite controversial and probably not what you'd want your company to be known for ...
Japan has a vibrant drawn-CP market, yet it is not even close to having the highest rate of child abuse. https://undispatch.com/here-is-how-every-country-ranks-on-child-safety/
-
That's fair, but I still think it shouldn't be accepted or allowed.
I agree it shouldn't be accepted, but I disagree on being allowed. I think it should be allowed because it doesn't hurt anyone.
-
Very true, thanks for your sensitivity @dumbass
It's pronounced "doo mah."
-
It's pronounced "doo mah."
Wow, so it's from the Duh region of France? Here I thought it was just sparkling dumbass.
-
Gen AI doesn't take CP content and recreate it; there would be no point to gen AI if that were the case. It knows what regular porn looks like and what a child looks like, and it generates an image. With those inputs it can create something new and, at the same time, hurt nobody.
Prove it. Please, show me the full training data to guarantee you're right.
But also, all the kids used for "kids face data" didn't sign up to be porn
-
Plenty of hentai out there covering questionable subjects to train AI on as well.
-
It's pronounced "doo mah."
Shawshank reference?
-
AI-generated CP should be made illegal even if its creation did not technically harm anyone. The reason is that it presumably looks too close to real CP, so close that it: 1) normalizes consumption of CP, 2) grows a market for CP, and 3) lets producers of real CP off the hook by claiming it is AI.
While there are similar reasons to be against clearly unreal CP (e.g. hentai), that type at least does not have problem #3: there doesn't need to be an investigation into whether a picture is real or not.
-
AI-generated CP should be made illegal even if its creation did not technically harm anyone. The reason is that it presumably looks too close to real CP, so close that it: 1) normalizes consumption of CP, 2) grows a market for CP, and 3) lets producers of real CP off the hook by claiming it is AI.
While there are similar reasons to be against clearly unreal CP (e.g. hentai), that type at least does not have problem #3: there doesn't need to be an investigation into whether a picture is real or not.
It is here at least.
If it weren't, you could just flood everything with it, and it would be impossible to go after actual CP.
-
Shawshank reference?
A&W root beer
-
Prove it. Please, show me the full training data to guarantee you're right.
But also, all the kids used for "kids face data" didn't sign up to be porn
I don't need to; that's just the way gen AI works. It takes images of things it knows and then generates NEW content based on what it thinks you want from your prompts.
If I'm looking for an infant flying an airplane, gen AI knows what a pilot looks like and what a child looks like, and it creates something new.
Also, "kids face data" doesn't mean they take the actual face of an actual child and paste it on a body. It might take an eyebrow and a freckle from one kid, a hairstyle from another, and eyes from someone else.
Lastly, the kids' parents consented when they uploaded images of their kids to social media.
-
I'm not advocating for CP. I'm advocating for freedom.
A crime is only a crime if someone is negatively affected. Gen AI is just a more accessible Photoshop.
-
AI-generated CP should be made illegal even if its creation did not technically harm anyone. The reason is that it presumably looks too close to real CP, so close that it: 1) normalizes consumption of CP, 2) grows a market for CP, and 3) lets producers of real CP off the hook by claiming it is AI.
While there are similar reasons to be against clearly unreal CP (e.g. hentai), that type at least does not have problem #3: there doesn't need to be an investigation into whether a picture is real or not.
The biggest issue with this line of thinking is: how do you prove it's CP without a victim? I suppose at a certain threshold it becomes obvious, but that can be a very blurry line (there was a famous case where a porn star had to be flown to a court hearing to prove the video wasn't CP, but I can't find the link right now).
So you're left with a crime that was committed with no victim and no proof, which can be really easy to abuse.
-
Who actually gets hurt in AI generated cp? The servers?
Making a sexualized photo of a child based on real photos essentially uses the child from the training data as the one in the act.
-
AI-generated CP should be made illegal even if its creation did not technically harm anyone. The reason is that it presumably looks too close to real CP, so close that it: 1) normalizes consumption of CP, 2) grows a market for CP, and 3) lets producers of real CP off the hook by claiming it is AI.
While there are similar reasons to be against clearly unreal CP (e.g. hentai), that type at least does not have problem #3: there doesn't need to be an investigation into whether a picture is real or not.
AI CP seems like a promising way to destroy demand for the real thing. How many people would risk a prison sentence making or viewing the real thing when they could push a button and have a convincing likeness for free with no children harmed? Flood the market with cheap fakes and makers of the real thing may not find it profitable enough to take the risk.
-
They probably got all the data to train it from the Pentagon. They're known for having tons of it, and a lot of their staff (more than 25%) are used to seeing it frequently.
It's easily searchable, though I don't like to search for that shit. Here's one post: if you literally add "pentagon" to c____ p___ in a search, a million articles on DIFFERENT subjects (other than this House bill) come up. https://thehill.com/policy/cybersecurity/451383-house-bill-aims-to-stop-use-of-pentagon-networks-for-sharing-child/
When my dad worked for the DoD, he was assigned a laptop for work that had explicit photos of children on it.
-
Making a sexualized photo of a child based on real photos essentially uses the child from the training data as the one in the act.
But who is actually getting hurt? No kid has been hurt by gen AI.
-
A&W root beer