European police say KidFlix, "one of the largest pedophile platforms in the world," busted in joint operation.
-
with a catchy name clearly thought up by a marketing person
A marketing person? They took "Netflix" and changed the first three letters lol
So you are saying it is too creative for the average person in marketing?
-
Gonna ruin me, but seconding. Brick in the window video?
They're probably referencing the video where a woman was killed after a brick flew through the windshield. I haven't watched it, but it is on YouTube and I've heard that the husband's cries are not so nice.
I don't remember if it was kids throwing bricks off of a bridge or if it was something else.
-
This post did not contain any content.
1.8 million users and they only caught 1000?
-
It's not an overpass. A loose brick falls off a truck going in the opposite direction, bounces off the pavement once, then goes through the windshield.
Well, I know what other video I'm never watching.
And people wonder why I don't like being around any vehicle that carries things...
-
Which countries do you have in mind where videos of sexual child abuse are legal?
Context is important I guess. So two things.
Is something illegal if it's not prosecuted?
Is it CSA if the kid is 9 but that's marrying age in that country?
If you answer yes, then no, then we'll not agree on this topic.
-
Does it feel odd to anyone else that a platform for something this universally condemned in any jurisdiction can operate for 4 years, with a catchy name clearly thought up by a marketing person, its own payment system, and a nearly six-figure number of videos? I mean, even if we assume that some of those 4 years were intentional to allow law enforcement to catch as many perpetrators as possible, this feels too similar to fully legal operations in scope.
With the number of sites that are easily accessed on the dark net through the Hidden Wiki and other sites, this might as well be a honeypot from the start. And it doesn't only apply to CP but to drugs, fake IDs and other shit.
-
This has nothing to do with privacy! Criminals have their techniques and methods to protect themselves and their "businesses" from discovery, both in the real world and in the online world. Even in a complete absence of privacy they would find a way to hide their stuff from the police - at least for a while.
In the real world, criminals (e.g. drug dealers) also use cars, so you could argue that drug trafficking is a side effect of people having cars...
This platform used Tor. And because we want to protect privacy, they can make use of it.
-
Context is important I guess. So two things.
Is something illegal if it's not prosecuted?
Is it CSA if the kid is 9 but that's marrying age in that country?
If you answer yes, then no, then we'll not agree on this topic.
I am not talking about CSA, I am talking about video material of CSA. Most countries with marriage ages that low have much more widespread bans on videos including sex of any kind.
As for prosecution, yes, it is still illegal if it is not prosecuted. There are many reasons not to prosecute something, ranging all the way from a lack of resources and other practical concerns to intentionally turning a blind eye, and only a small minority of them would lead a country to actively sabotage a major international investigation, especially after the trade-offs are considered (such as loss of international reputation by refusing to cooperate).
-
This platform used Tor. And because we want to protect privacy, they can make use of it.
This particular platform used Tor. It doesn't mean all platforms are using privacy-centric anonymous networks. There are incidents of people using Kik, Snapchat, Facebook and other clearnet services to perform criminal actions such as selling drugs or sharing CP.
-
This post did not contain any content.
Every now and again I am reminded of my sentiment that the introduction of "media" onto the Internet is a net harm. Maybe 256 dithered color photos like you'd see in Encarta 95 and that's the maximum extent of what should be allowed. There's just so much abuse from this kind of shit... despicable.
-
On average, around 3.5 new videos were uploaded to the platform every hour, many of which were previously unknown to law enforcement.
Absolutely sick and vile. I hope they honey potted the site and that the arrests keep coming.
I just got ill
-
Every now and again I am reminded of my sentiment that the introduction of "media" onto the Internet is a net harm. Maybe 256 dithered color photos like you'd see in Encarta 95 and that's the maximum extent of what should be allowed. There's just so much abuse from this kind of shit... despicable.
I think it just shows all the hideousness of humanity in all its glory, in a way that we have never confronted before. It shatters the illusion that humanity has grown out of its barbaric ways.
-
Every now and again I am reminded of my sentiment that the introduction of "media" onto the Internet is a net harm. Maybe 256 dithered color photos like you'd see in Encarta 95 and that's the maximum extent of what should be allowed. There's just so much abuse from this kind of shit... despicable.
Let’s get rid of the printing press because it can be used for smut. /s
-
Every now and again I am reminded of my sentiment that the introduction of "media" onto the Internet is a net harm. Maybe 256 dithered color photos like you'd see in Encarta 95 and that's the maximum extent of what should be allowed. There's just so much abuse from this kind of shit... despicable.
Raping kids has unfortunately been a thing since long before the internet. You could legally bang a 13 year old right up to the 1800s and in some places you still can.
As recently as the 1980s people would openly advocate for it to be legal, and remove the age of consent altogether. They'd get it in magazines from countries where it was still legal.
I suspect it's far less prevalent now than it's ever been. It's now pretty much universally seen as unacceptable, which is a good start.
-
Every now and again I am reminded of my sentiment that the introduction of "media" onto the Internet is a net harm. Maybe 256 dithered color photos like you'd see in Encarta 95 and that's the maximum extent of what should be allowed. There's just so much abuse from this kind of shit... despicable.
It is easy to feel very disillusioned with the world, but it is important to remember that there are still good people all around willing to fight the good fight. And it is also important to remember that technology is not inherently bad; it is a neutral tool that people can use for either good or bad purposes.
-
Let’s get rid of the printing press because it can be used for smut. /s
great pointless strawman. nice contribution.
-
great pointless strawman. nice contribution.
It’s satire of your suggestion that we hold back progress, but I guess it went over your head.
-
Raping kids has unfortunately been a thing since long before the internet. You could legally bang a 13 year old right up to the 1800s and in some places you still can.
As recently as the 1980s people would openly advocate for it to be legal, and remove the age of consent altogether. They'd get it in magazines from countries where it was still legal.
I suspect it's far less prevalent now than it's ever been. It's now pretty much universally seen as unacceptable, which is a good start.
The youngest Playboy model, Eva Ionesco, was only 12 years old at the time of the photo shoot, and that was back in the late 1970s... It ended up being used as evidence against Eva's mother (who was also the photographer), and she ended up losing custody of Eva as a result. The mother had started taking erotic photos (ugh) of Eva when she was only like 5 or 6 years old, under the guise of "art". It wasn't until the Playboy shoot that authorities started digging into the mother's portfolio.
But also worth noting that the mother still holds copyright over the photos, and has refused to remove/redact/recall photos at Eva’s request. The police have confiscated hundreds of photos for being blatant CSAM, but the mother has been uncooperative in a full recall. Eva has sued the mother numerous times to try and get the copyright turned over, which would allow her to initiate the recall instead.
-
This post did not contain any content.
Here’s a reminder that you can submit photos of your hotel room to law enforcement, to assist in tracking down CSAM producers. The vast majority of sex trafficking media is produced in hotels. So being able to match furniture, bedspreads, carpet patterns, wallpaper, curtains, etc in the background to a specific hotel helps investigators narrow down when and where it was produced.
-
Honestly, if the existing victims have to deal with a few more people masturbating to the existing video material, and in exchange it leads to fewer future victims, it might be worth the trade-off, but it is certainly not an easy choice to make.
It doesn't though.
The most effective way to shut these forums down is to register bot accounts scraping links to the clearnet direct-download sites hosting the material and then reporting every single one.
If everything posted to these forums is deleted within a couple of days, their popularity would falter. And victims much prefer having their footage deleted than letting it stay up for years to catch a handful of site admins.
Frankly, I couldn't care less about punishing the people hosting these sites. It's an endless game of cat and mouse and will never be fast enough to meaningfully slow down the spread of CSAM.
Also, these sites don't produce CSAM themselves. They just spread it - most of the CSAM exists already and isn't made specifically for distribution.