European police say KidFlix, "one of the largest pedophile platforms in the world," busted in joint operation.
-
Well, some pedophiles have argued that AI-generated child porn should be allowed, since no real humans are harmed or exploited.
I'm conflicted on that. Naturally, I'm disgusted and repulsed.
But if no real child is harmed...
I don't want to think about it, anymore.
Understand you’re not advocating for it, but I do take issue with the idea that AI CSAM will prevent children from being harmed. While it might satisfy some of them (at first, until the high from that wears off and they need progressively harder stuff), a lot of pedophiles are just straight up sadistic fucks and a real child being hurt is what gets them off. I think it’ll just make the “real” stuff even more valuable in their eyes.
-
Geez, two million? Good riddance. Great job everyone!
-
Maybe Jeff Bezos will write an article about him and editorialize about "personal liberty". I have to keep posting this because every day another MAGA lover, religious bigot, or otherwise pretend-upstanding community member is indicted or arrested for heinous acts against women and children.
-
With the number of sites that are easily accessed on the dark net through the Hidden Wiki and other directories, this might as well have been a honeypot from the start. And it doesn't only apply to CP, but to drugs, fake IDs, and other shit.
No judge would authorise a honeypot that runs for multiple years, hosting original child abuse material, meaning that children are actively being abused to produce content for it. That would be an unspeakable atrocity. A few years ago the Australian police seized a similar website and ran it for a matter of weeks to gather intelligence, and even that was considered too far for many.
-
Even then, a common bit you'll hear from people actually defending pedophilia is that the damage caused is a result of how society reacts to it, or of the way it's done because of the taboo against it, rather than something inherent to the act itself. That claim would be even harder to research than pedophilia outside a criminal context already is. For starters, you'd need to find some culture that openly engaged in adult sex with children in some social context and was willing to be examined to see whether the same (or different, or any) damages show themselves.
And that's before you get into the question of defining where exactly you draw the age line before it "counts" as child sexual abuse, which doesn't have a single, coherent answer. The US alone has at least three different answers to how old someone has to be before having sex with them is not illegal based on age alone (16-18, with 16 being the most common), and many states have exceptions that go lower (exceptions for partners who are close "enough" in age are pretty common). For example, in my state the age of consent is 16, with an exception if the parties are less than 4 years apart in age. In California, by comparison, if two 17-year-olds have sex, they've both committed a misdemeanor unless they are married.
None of this applies to the comment they cited as an example of defending pedophilia.
-
Understand you’re not advocating for it, but I do take issue with the idea that AI CSAM will prevent children from being harmed. While it might satisfy some of them (at first, until the high from that wears off and they need progressively harder stuff), a lot of pedophiles are just straight up sadistic fucks and a real child being hurt is what gets them off. I think it’ll just make the “real” stuff even more valuable in their eyes.
I feel the same way. I've seen the argument that it's analogous to violence in videogames, but it's pretty disingenuous since people typically play videogames to have fun and for escapism, whereas with CSAM the person seeking it out is doing so in bad faith. A more apt comparison would be people who go out of their way to hurt animals.
-
I used to work in netsec, and unfortunately governments everywhere still suck at hiring security experts.
That being said, hiring here is extremely hard: you need to find someone with below-market salary expectations who is willing to work on such an ugly subject. Very few people can do that. I do believe money fixes this, though. Just pay people more; I'm sure every European citizen wouldn't mind a 0.1% tax increase for a more effective investigative force.
Most cases of "we can't find anyone good for this job" can be solved with better pay. Make your opening more attractive, then you'll get more applicants and can afford to be picky.
Getting the money is a different question, unless you're willing to touch the sacred corporate profits....
-
As said before, that person was not advocating for anything. He made a qualified statement, which you answered with examples of kids in cults, and you flipped out calling him all kinds of nasty things.
Lmfao, as I stated, they said that physical sexual abuse "PROBABLY" harms kids, but they have only done research into their voyeurism kink as it applies to children.
Go off defending pedos, though
-
Qualifying that as advocating for pedophilia is crazy. All they said is that they don't know about studies regarding it, so they said "probably" instead of making a definitive statement. Your response is extremely over the top and hostile to someone who didn't advocate for what you're saying they advocate for.
It's none of my business what you do with your time here, but if I were you I'd be more cool-headed about this, because this is giving QAnon.
They literally investigated specific time frames of their voyeurism kink in medieval times extensively, but couldn't be bothered to do the most basic research showing that sex abuse is harmful to children.
-
Search “AI woman porn miniskirt,”
Did it with SafeSearch off and got a bunch of women clearly in their late teens or 20s. Plus, I don't want to derail my main point, but I think we should acknowledge the difference between a picture of a real child actively being harmed vs. a 100% fake image. I didn't find any AI CP, but even if I did, it's in an entirely different universe of morally bad.
r/jailbait
That was, what, fifteen years ago? It's why I said "in the last decade".
"Clearly in their late teens," sure.
Obviously there's a difference between AI porn and real porn; that's why I told you to search AI in the first place??? The convo isn't about AI porn, but AI porn uses existing images, including CSAM, to seed its new images.
-
I feel like what he’s trying to say is that it shouldn’t be the end of the world if a kid sees a sex scene in a movie; it should be OK for them to know it exists. But the way he phrases it is questionable at best.
When I was a kid, I was forced to leave the room when any intimate scenes were in a movie, and I honestly do feel like it fucked with my perception of sex a bit. Like it’s this taboo thing that should be hidden away and never discussed.
He wants children to full-on watch adults have sex because he has a voyeurism kink. Solved that for you.
-
They literally investigated specific time frames of their voyeurism kink in medieval times extensively, but couldn't be bothered to do the most basic research showing that sex abuse is harmful to children.
"They knew some things and didn't know some things" isn't worth getting so worked up over. They knew the mere concept of sex being taboo negatively affected them, and they didn't want to make definitive statements about things they hadn't researched. Believe it or not, Lemmy comments are not dissertations, and most people just talk and don't bother researching every tangential topic just to make a point they want to make.
-
typical file-sharing networks
Tox messaging network
Matrix channels
I would consider all of these to be trawling dark waters.
File-sharing and online chat seem like basic internet activities to me.
-
No judge would authorise a honeypot that runs for multiple years, hosting original child abuse material, meaning that children are actively being abused to produce content for it. That would be an unspeakable atrocity. A few years ago the Australian police seized a similar website and ran it for a matter of weeks to gather intelligence, and even that was considered too far for many.
"That would be an unspeakable atrocity," yet there is a contradiction in the final sentence. The issue is: what evidence is there to prove such an operation actually works? And, as my last point implied, what stops the government from abusing this sort of operation? With "covert" operations like this, the outcome can be catastrophic for everyone.
-
The name of it sounds like a streaming service for children's movies and TV shows. Like, Netflix for kids.
In the past 5 years I have seen at least 3 deep-web social communities that started out normally, with a lot of people talking shit and enjoying anonymous free speech. Then I log in a couple of weeks or months later to find CP being posted and no mods doing anything to stop it. In all those cases, I reported the site to the FBI anonymously and erased my login from my password manager.
-
During the investigation, Europol’s analysts from the European Cybercrime Centre (EC3) provided intensive operational support to national authorities by analysing thousands of videos.
I don't know how you can do this job and not get sick, because looking away is not an option.
I'm sure many of them numb themselves to it and pretend it isn't real in order to do the job. Then, unfortunately, I'm sure some of them get addicted themselves.
Similar to undercover cops who do drugs while undercover, then get addicted to the drugs.
-
During the investigation, Europol’s analysts from the European Cybercrime Centre (EC3) provided intensive operational support to national authorities by analysing thousands of videos.
I don't know how you can do this job and not get sick, because looking away is not an option.
This kind of shit is why I noped out of the digital forensics field. I would have killed myself if I had to see that shit every day.
-
It can hide in plain sight, and then when you dig into someone's profile, it can lead to a person or a group discussing CSAM and bestiality, not just CP, on a site similar to r/pics or on a porn site.
I can definitely see how people could find it while looking for porn. I don't understand how people can do this stuff out in the open with no consequences.
-
"Clearly in their late teens," sure.
Obviously there's a difference with AI porn vs real, that's why I told you to search AI in the first place??? The convo isn't about AI porn, but AI porn uses images to seed their new images including CSAM
It's fucking AI; the face is actually like 3 days old because it is NOT A REAL PERSON'S FACE.
-
This ain't the early 2000s. The unwashed masses have found the internet, and it has been cleaned for them. 97% of the internet has no idea what Matrix channels even are.