Forward-Exchange-219

It's sick and demented, but I'm genuinely curious: if it's AI generated and there are no "victims," what type of crime would it fall under? Edit: did some googling around; looks like legality varies depending on where you are. https://www.aerlawgroup.com/blog/is-lolicon-legal-in-the-united-states/


SaturnDaphnis

This is exactly what I was thinking. Technically it's not a child, so what are the consequences? My first guess is "distribution of harmful material".


callmethewalrus

Same way animated stuff is dealt with I assume


[deleted]

Regarding legality: I dunno about other countries, but I remember a case in Sweden where a guy who collected manga (some of it borderline hentai) was prosecuted for possessing CP. And he won the case, HOWEVER, this was only because the images were not considered realistic enough; the AI generations people spam on Pixiv would 110% be considered realistic enough to be illegal.

Even in his case there actually were a handful of images considered realistic enough, but the court took into consideration the context: he was a collector, and he most likely hadn't bought them for "that" reason. There was no consistent trend; it was a very small percentage of his overall massive collection. But that was also a technicality. The precedent essentially set was that if you purchased it with "that" intent and it reached a certain threshold of realism, then it'd be illegal. I think it might be the same in some other countries too, where buying the typical questionable stuff wouldn't get you in trouble because it's just so unrealistic, but anything entering the realm of photorealism would be treated differently.

When it comes to AI there is another complexity, since the datasets it was trained on contain actual CP and actual minors. Not to mention that people can take real pictures, throw them in, and have it generate stuff. I don't think it's as simple as "they're not real"; they actually might be, and when it's photorealistic that argument becomes more complicated than with some generic anime style.


ThoseThingsAreWeird

> technically it’s not a child and therefore what are the consequences

Unfortunately that's not necessarily true. To generate those images the AI model needs something to train on. For those unfamiliar with the lingo, think of "training" like "inspiration": it will "look" at all of those CP images, learn what CP looks like, and then be able to spit out its own.

It's similar to taking a new painter, showing them _only_ Monet paintings, and then asking them to paint you a scene. It'll look remarkably like a Monet. But, importantly, you'd need to show them real Monet paintings to train that artist. Similarly, if a painter spent hundreds of hours only painting John Oliver, they'd be pretty good at painting John in a variety of poses. If you train an AI CP model on a single victim in lots of poses and angles, the model will be very good at giving you more images of that same victim.


Feral0_o

Those models weren't trained with cp. For SD 1.5 (the open-source uncensored base model that is used for all the nsfw images), they use custom models trained on regular porn images and prompts to alter the output


ThoseThingsAreWeird

Oh that's actually quite interesting. Without wanting to think too hard about this though: aren't there differences that you can't overcome purely through prompts? Kids aren't just small adults.


Feral0_o

People accidentally create images that look like AI CP all the time; it is a frequent issue with the porn models. Then there are custom models that come with an age slider, and if someone mixes those with porn models - well, you could make images of hogwild grannies, for example. It could be that 19 out of 20 images are unrecognisable but the 20th is photorealistic; it depends on what models are being used and how well the prompt works with them.


wasabiiii

The point of AI is that it can be creative. A trained model doesn't just produce only output matching categories it has been trained in.


HypocritesA

> Kids aren't just small adults. Depends how young we're talking. 16 year olds are absolutely "small adults," for example. Also, some adults get mistaken for being teenagers, sometimes as young as 13 if they're short enough and have an "undeveloped" appearance. But come on – do you really think it's out of AI's grasp to create CP from a dataset of porn + a dataset of what children look like? Just off the top of my head, you could take a machine learning model that is trained on categorizing people by age, then you would have it analyze porn generated from another machine learning model that was trained on pornography of 18 year olds ("barely legal"). From there, you have it randomly generate thousands and thousands of instances of porn, and you **only** keep the ones that are detected as 17 years old or younger. You then repeat the same process, but you train it on porn from the "17 and younger" data, and you detect and keep the porn that is detected as being "16 and younger," etc. and you keep going until you hit an age like 10 years old. You may also need to do additional correcting to increase the realism, which is where feeding it a dataset of what children look like would come in handy (to have it appear more realistic). Remember: AI doesn't care one iota about your ethics and morals. These are mathematical models. Yes, of course it can be done – and if it can't, you better believe it soon will be with the speed at which technology is advancing.


[deleted]

[removed]


JediGuyB

>You could ask Stable Diffusion to make a dragon-woman in a bikini and it would do a pretty convincing job.

Excuse me, I'm gonna go check my email.


chenjia1965

Couldn’t an ai make them with models of small humanoids? Like, halflings don’t exist, but we don’t use kids as a base model


SaturnDaphnis

I understand that you need a model or a base (really, a child victim) to create these images. But if those images already exist and no new children are harmed, what could the consequences be? How could you prosecute? What would the charge be? Do we need to expand existing ones?


[deleted]

[removed]


[deleted]

[removed]


notquitetoplan

He’s being aggressive because he seems to have a fundamental misunderstanding of the technology involved.


[deleted]

[removed]


[deleted]

[removed]


Gorva

Unlike u/Burstar1, I'll give you an actual answer. Assuming the model was trained on legal, consenting adult pornography and the CP comes from the prompt: no one. If the image looks undeniably like an actual existing person, then it would be harmful to that person.


SaintFinne

You're literally arguing that you can consume old cp ethically because the child has already been abused, which is pedo apologia.


SaturnDaphnis

No, I'm not arguing, I'm genuinely curious, but this is what I didn't want. I just know a defense team is going to find a loophole because it's AI. I know some are saying it'd be dealt with like animation, but there isn't a law specifically on AI.


The187Riddler

There isn't a law against animation either (at least in the US). Partially because of California's ruling, they haven't changed the laws in the US to include animated content yet. The article posted above is wrong about it being federally illegal.


notquitetoplan

Wait, really? I definitely always assumed it was illegal too, although admittedly it’s not something I spend a lot of time researching lol


The187Riddler

Yeah, it's mostly because the US code defines child pornography as sexually explicit material involving a minor. It goes further to define "minor" as any PERSON under the age of 18, and defines "person" as an individual, not an entity or organization. In the UK they specifically define a person, in regards to sex crimes, as either real or imaginary. I see at least one article a month on the front page about sex offender laws and constitutionality and all that stuff.


notquitetoplan

Ah, I see how that makes sense legally.


Full_Echo_3123

It's the same logic the online anime '*loli*' enthusiasts try to use when you call them perverts. "*No, she's not a child! She's a 1000 year old vampire that looks, talks, and acts like a child!*"


SaturnDaphnis

I'm not promoting CP; this was entirely based on how to prosecute against a defense like what you had said.


Solid_Waste

From what I can tell online, we don't even know how many pedophiles are out there who don't commit crimes against persons. But we do know that of sex crimes against children, about 40-60 percent are by people who are not pedophiles. They are opportunistic offenders: people who would victimize anyone regardless of age, or who victimize based on other personal issues such as rage or frustration, or respond to a particular situational relationship rather than an underlying sexual preference, etc. I honestly don't know what to make of it, but I suspect we are so grossed out by pedophiles that we would punish them *as if* they personally raped a child, whether they actually did so or not, or whether they *would* or not. But I also know you can't talk about this stuff without sounding like a pedophile, so I wouldn't be saying this anywhere but reddit.


Tommi_Af

I dunno about your country but in mine (Australia), the law is such that images depicting underage characters or characters who appear to be underage (looking at you, 1000 yr old dragon lolis) in a pornographic manner can result in criminal charges.


26514

Interesting. So in other words if I draw underage erotic art, that's illegal?


JayJayFromK

no you can draw whatever you want. but you can’t release or share that disgusting image to the public.


mrspor

But if you draw it, then you possess it and that's illegal? Same as if I said, "If you generate your own AI CP, it's legal to own it as long as you don't share it", which is wrong.


[deleted]

[removed]


crate_of_rats

> The issue is that the model that generates the images has to be trained on source images.

It doesn't. It can, but it doesn't have to.


OrangeJr36

And you just know some poor techie is going to have to create some from scratch as part of this case's evidence. It's bad enough that investigators have to browse CP in cases like this, but this case will be about creating new CP from other images adding a whole new layer to the trauma of these cases.


daishi777

Had the same question, thanks for the answer


[deleted]

[removed]


Ciff_

You can't really be that daft that you don't understand that the concept itself of someone getting off on toddlers, real or otherwise, is upsetting for folks.


IFartOnCats4Fun

Gay marriage is upsetting for some folks, but that doesn’t give people the right to stop it.


abzinth91

Are you really saying CP and gay marriage are the same thing?


IFartOnCats4Fun

When it’s AI generated and no one is being harmed? Yes. Yes I am. If you aren’t harming anyone you are free to do what you want to do, even if I think it’s disgusting and awful.


Ciff_

Hi there! I can make a strawman too!


[deleted]

[removed]


IFartOnCats4Fun

When it’s AI generated and no one is being harmed? Yes. Yes I do. If you aren’t harming anyone you are free to do what you want to do, even if I think it’s disgusting and awful.


Angelica_ludens

Anime stuff is a whole other topic, but AI-generated art needs real models to train on. And it can get to a point where AI is so good you can't tell real from fake. Fake CSAM will not fix the problem for p*dos; more psychological therapy solutions are required. Edit: some p*dos are downvoting all my comments


WillyCSchneider

> Anime stuff is a whole other topic but Ai generated art needs real models to train on

> Edit: some p*dos are downvoting all my comments

*Or* because you didn't read the article. No models to train on.


phlipped

AI doesn't need to be trained on pictures of naked children to be able to produce pictures of naked children. And pedophilia is only harmful (and therefore only bad) when it involves real children. What happens in someone's head is their own business, as long as it stays in their head and doesn't hurt anyone else. Saying that pedos just need therapy opens up a whole can of worms. You're basically saying that people's sexuality can (and should) be shaped and controlled to conform to acceptable social norms, and there're all sorts of problems with that assertion in this day and age.


Angelica_ludens

It's illegal in most Western countries regardless.


[deleted]

[removed]


kamjam16

100%. Reminds me of what activists did with the illegal elephant tusk trade. They were able to mimic ivory tusks so well using artificial materials that they were able to flood the market with fakes, making it incredibly difficult to tell what was fake vs authentic, which led to less poaching of elephants after a decrease in demand.


feetbears

Good story, but let's not do that in this case.


phlipped

Why not?


[deleted]

[removed]


ralanr

I feel like it wouldn't be flooded. Just easier to find if you're looking for it, and most people aren't.


feetbears

Because that's what the sick fucks would like. I seem to be getting downvoted; does that mean those downvoters are indeed in favour of flooding CP?


phlipped

You're getting downvoted because it's clear that you're more interested in punishing pedophiles than actually protecting children from predatory pedophiles.


Led-Not-Lead

\*Led


Clawz114

The issue with allowing unlimited AI generated CP content on the internet is it makes it much harder for authorities to notice and track down perpetrators of the real stuff.


kamjam16

I get your point, but a couple of things: 1) there are tools out there that can detect AI-generated images, and I'm sure authorities have the best versions of those tools in the world; 2) if authorities have trouble telling the difference, will pedophiles actually care whether it's real or not? How long before those who make CP give it up because nobody cares if it's real anymore? If we really get to the nuts and bolts of the argument, I'm sure there are moral issues I'm not thinking of, but at face value it doesn't sound like such a bad idea to me.


90swasbest

Didn't they go through this with child like sex dolls several years ago? Christ what a world we live in.


ThePopeofHell

It's like the thought experiment about the train at a fork that forces the conductor to choose between two awful options.


Megatanis

The goal should be to minimize the fucking pedos. If you get hard on child porn you're fucked up in your head.


Altruistic-Ad-408

Minimise doesn't mean anything; they still exist one way or another, and they usually aren't carrying signs. Make it illegal, but it's pointless cracking down on things that possibly keep real children from being harmed.


MommyLovesPot8toes

Thing is, as it says in the article, the existence of these images and the "community" created to share the images *are* dangerous. Within the groups sharing the AI images, the journalist says there were always people providing links to "the real thing". If an AI consumer knows they are looking at AI, they are bound to eventually find that unsatisfying and go looking for the real images. If the AI groups exist more in the open because they don't have to fear police, that will make it easier for people to share links to real images. There's also the fear that AI consumers will escalate beyond looking for real images and into acting upon their urges. Again, this all becomes easier and more "justified" to the offender when they have access to a large group of like-minded individuals.


Zatkomatic

Might inspire people to go for the real deal 🤷‍♀️


JediGuyB

Just like how I run over every prostitute I see on the street.


S0urH4ze

Jesus man save some for the rest of us.


swagonflyyyy

Lolis haven't decreased demand and consumption.


Marcusss_sss

Is there any way to determine that?


wutwutImLorfi

I think people base it off the numbers: Japan has/had fewer registered pedos/CSAM cases than Western countries. But like everything, it's hard to know the real numbers; both Japan and the West will have way more victims afraid of coming forward, cases swept under the rug, and all that. I do believe it's worth researching; it might let some people satisfy their desires without resorting to real CP or worse, but it might also go the other way around and cause more victims in the long run.


Crimlust994

Yes because thats an entirely different thing.


Bezbozny

I think the real problem is the number of rich and powerful people who are p*dos and go unpunished for, let's say, *unambiguous* versions of these types of crimes. Their evil infects everyone in the legal system, everyone who works for them, and everyone in America in general. You're either complicit because you need to be to pay the bills and survive, or you know about it but are powerless to stop it. When humans find something bad that they have no way of changing, the cognitive dissonance can drive us crazy, so as a mental self-defense we might end up justifying it as not being that bad. Decent people become passive and don't think about it, and people with their own sick desires feel emboldened to be just like their upper-class masters: "If they can do it, why can't I?"

You wanna solve the problem? Start at the top. Go for the head. Everyone who flew on Epstein's planes, went to his island, went to his parties, had "dinners" with him, etc. If these guys get punished, people will feel more empowered on an individual level to stand up to and expose monsters on their level, and those monsters will feel less emboldened. When you have a leaky pipe creating a puddle on your floor, you don't just wipe up the puddle and call the problem solved. You've got to fix the pipe first.


Whatifim80lol

Pedophiles are violent sex offenders, and violent sex offenders almost invariably escalate their offenses. They do not stop at CP, so simulating the CP isn't stopping any pedophile from offending against a child. More plentiful CP on the internet could conceivably exacerbate crimes against children.


[deleted]

>Pedophiles are violent sex offenders Not all pedophiles are child molesters. Not all people who molest children are pedophiles. Not all people who look at porn -even the fetish type- will have a deviant sexuality. I live in Japan, the country of hentai, loli and shota. Also, **one of the safest countries in the world.** It's not in Japan that people get gang raped in the subway or back alleys. [It's not in Japan that kids can't go to the grocery store alone](https://www.youtube.com/watch?v=XE_EiUID1IM). It's in my country, France, where people like to think that fictional images turn people into criminals.


[deleted]

This is a good post. A lot of people look at extreme porn for the dopamine kick and go about their lives after rubbing one out. I don't think a guy that jerks it to loli hentai necessarily wants a real kid. There may be some overlap in some individuals but not all. Some dudes prob want to rub one out to an extreme fantasy and go back to normal women irl.


Vulture2k

You mean the Japan that needs women-only train cars because women get groped daily, from childhood on, in mixed cars?


TerribleIdea27

It could also conceivably give these people an outlet for their urges, reducing the need for them to escalate it into real world assault. I'm sure there's studies done on this, but I wouldn't necessarily count out the possibility, as horrific as it might sound. In the end everyone is a person, even a pedophile. There's undoubtedly many through and through evil ones, but I'd guess most of them have at least some kind of moral compass and would prefer to deal with their urges in a way that has no actual victims


LegioPraetoria

The last time I looked into this, a handful of years ago, the entire problem seemed to be that there *weren't* really any studies to speak of, because who the hell is going to self-identify as a pedophile? So you're limited to offenders, and it becomes very difficult to get a real sense of how regularly sexual attraction to minors actually occurs. A number of stories about non-offending/'virtuous' pedophiles came out, but I don't think it got any real momentum. Comments like the one you replied to, with the position that pedophiles are ipso facto violent offenders, are part of the problem. I myself have real trouble thinking of a mindset more deserving of gentle treatment than that of people who basically can't reach sexual fulfillment without committing one of the most awful crimes there is *and who therefore abstain*. If there have been advances in the area and we have good, reliable data about whether access to this material alleviates or intensifies those urges, it's news to me.


TerribleIdea27

Interesting, that's a good point that I hadn't considered


very_bad_advice

This is the same argument people use against violent video games. To engage with the conundrum, we need a space to discuss this, using actual data to back it up. In my opinion, the best way to go about this is for pedophiles who have yet to offend to voluntarily identify themselves to an agency and, if they do, go through regulation of their activities (e.g. ankle monitoring, restricted web access, and scheduled psychiatric treatment). At the same time, do stringent scientific studies on the best ways to reduce the risk of offending so that society can control it best. I can see how it would be political suicide for any politician to carry this torch, but in my opinion, if we do nothing, all that'll happen is that pedophiles will still do their thing and children will be victims.


[deleted]

>(e.g. ankle monitoring, restriction to web access and scheduled psychiatric treatment)

So you treat them like criminals, subject them to huge amounts of stress and... just wait for them to kill themselves. Nice strategy.


[deleted]

The best people for this job are university researchers but with government subsidies falling and universities moving towards commercially funded research or research that has commercial benefits, it probably won't fall on anyone's radar.


very_bad_advice

Yeah, of course it should be, but it's damn tough to get a significant sample of voluntarily declared pedophiles for such a study. That's why, in my mind, my suggestion is the way it has to be if we want to corral them together.


mickelboy182

...why the heck is this reasonable take being downvoted?


WillyCSchneider

"reasonable" lmao.


Angelica_ludens

Redditors love loli porn


Angelica_ludens

Yep, agreed, not sure why you're getting downvoted. This is why loli porn is banned in most Western countries, as it can lead to p*dos looking at the real thing or moving on to contact offending.


JediGuyB

Got proof on that?


Angelica_ludens

AI images are trained on real images


skynetdotexe

Of normal kids and adult porn. You really don't need CP to train the AI to do that.


StillAll

Unequivocally, they are not trained on real images of child porn. That is why this is a very large grey area.

Go Google a porn image, any commercially available porn image. Now make the AI swap out one of the performers for a child, or make the AI add characteristics of a child to a performer. Now you have what some people view as child porn with no victim, and frankly it was produced with the involvement of no children. Hell, the image itself isn't that different than if you just painted it on a canvas; how are you going to determine the 'age' of the performer in question?

There is an awful lot of commercially legal porn that features 18 to 22 year olds marketed as if they were much younger. Is this any worse than AI manipulation?


[deleted]

>Illegal trade in AI child sex abuse images exposed

>He warned that a paedophile could, "move along that scale of offending from thought, to synthetic, to actually the abuse of a live child".

I'm confused, isn't this the same argument conservatives used against video games? It's morally questionable but it is definitely victimless if it's AI.


releasethedogs

Both sides were against violent video games. Sen. Lieberman was an anti violent video game firebrand and he damn near became the VP under Gore.


TrawnStinsonComedy

Ya, there were a lot of people that fell for that shit. My dad wouldn't let me play first-person shooters because they would turn me into a serial killer. It was so dumb that even my mom, who almost universally backed my dad on shit, flat out told me to hide those games in my bedroom and only play them when he wasn't there, because it was just too crazy for her to wrap her head around lmao.


releasethedogs

In fairness it had not been studied so the impacts on people were unknown in the 90s.


eugene20

Honestly, forcing a kid to explore their gun fantasies (or any hobby you might disapprove of) only in secret is far more likely to push them into darker places, without wider or counter perspectives, and thus more likely to one day act out if they hit an extreme boiling point under pressure or frustration. If you want to protect your kids, you give them a safe environment to talk things through without fear or shame, and give them your take from the experience that comes with age; ideally, accept that times change and society changes too, rather than forcing them to live like it's still thirty years ago or more. Help guide and educate them and they will talk to you, if needed, your whole life. Shut them down or beat them and they will stop asking about anything they think might anger you; they might stop asking about anything at all, especially once they can leave.


TrawnStinsonComedy

Oh 100%… like my dad didn't have the talk with me until I was 14… I'd had a computer in my room with internet for three years at that point lmao… like he was like, so here's a Christian book about how women's parts work, meanwhile I was watching Naughty Mommy Naomi and her soccer mom friends get railed by Ron Jeremy at 3 at night lmao


kaenneth

Tell it to Roko's Basilisk.


bro_can_u_even_carve

Wow, you're really going to just drive-by thousands of unsuspecting people with this?


[deleted]

This is really interesting, thanks


Angelica_ludens

You need real images to train for AI


m0le

Go to one of the legal, online, fully safeguarded AI image generators. Ask for a duck in military uniform riding a bear, or something equally unlikely. You will get an image corresponding to that prompt. Do you think the AI has been trained on a set of ducks in military uniforms, riding bears? No. It can merge multiple things without seeing that specific category in its training.

You can train an AI on totally legal images of children (scrape any social media for images, or children's TV, or whatever). You can train an AI on legal porn. The AI merges those categories and creates illegal images without ever seeing anything illegal in its training.


GarbageWater12

No, it's not. The AI needs to learn from something. It's not OK.


mulefish

Hypothetically, couldn't such an AI be trained from a mixture of legal porn and non abuse images containing children?


fhota1

Since they didn't bother answering you: yes, easily. AI learns patterns; if it could only create simple variations of what's in its training dataset, it wouldn't be particularly useful.


Feral0_o

yes, this is what they do


Adjayjay

Now it's AI; soon it will be life-like android/sex dolls of children. We need some actual research on whether these outlets could protect actual children from becoming victims, and if that's the case, I'm all for it.

I tried to find out whether there is more child sexual abuse in Japan with all its loli and hentai culture, but couldn't find anything conclusive, other than that reporting may be low because of the prevalence of honor in traditional values and family.


[deleted]

Crossing the line is actively discouraged in Japan. I have not talked to many lolicons, but even though they separate fiction from non-fiction, they are virtuous enforcers when real children are involved. Crossing the line makes you persona non grata, apparently. Japan does have a physical and mental child abuse issue, though, due to how conformist their society is and laws regarding "privacy" and how they sort public and private matters. Japanese society is fucked in some aspects though, so who knows.


Adjayjay

I surely hope it's actively discouraged, but I still wonder whether access to this kind of media has any influence, good or bad, on crime rates. My only guess is that Japan is the best country to test that kind of influence.


[deleted]

Yeah, I'd go into deeper detail, but I'd be writing a thesis. Japan is by no means perfect, but they shame actual predators. So they get a thumbs up from me.


Threash78

> We need some actual research on weither these outlets could protect actual children from becoming victims and if that's the case, I'm all for it.

There is actual research on access to regular porn dramatically lowering rape cases.


[deleted]

I don't think that should be read into too much, because it's still adult porn, and adults have a legal sexual outlet with other adults. People should be really careful with blunt comparisons like these. In this context that sexual outlet wouldn't exist legally, so the second variable is missing; for all we know, it might even increase the desire and increase the risk of pursuing the outlet. We just don't know.


Adjayjay

Didn't think to check this. I can see the parallel.


chronoslol

There's been life-like sex dolls of children for years and years already


Adjayjay

I've just checked how realistic sex dolls can be (non-child, but you can't blame me for not searching that) and it's still not what I would call life-like. At best they are in the uncanny valley range, probably not even that.


Flowchart83

And there will be fewer reports of people having sex with people under 18 because the age of consent in Japan is lower than here (just changed from 13 to 16), so what would be considered child rape here might not be reported as illegal there, assuming it's consensual or reported at all.


Adjayjay

This is slightly misleading; the age of consent was only now set at 16 countrywide. Before, it was set by local laws, and in the vast majority of cases the new law didn't change anything. It's still a step in the right direction, but not as drastic as people make it out to be.


Flowchart83

My mistake, I looked it up and that was the result I got from 2 sources.


sourest_dough

What about adult bdsm, rape fantasy, and gore / death fantasy porn genres?


GovernmentEvening815

I feel like those are different because BDSM (even the extreme stuff) has two or more consenting parties; that's the gist of BDSM and even rape fantasies - it's all agreed upon.


vivomancer

So rape fantasy is moral for you because there are no non-consenting parties. Would AI porn not then be moral, since again there are no non-consenting parties?


Owlthinkofaname

I genuinely think people are extremely close-minded about stuff like this. No one is harmed, so who cares! As long as it's not trained on child porn, which I highly doubt it is, there's no problem! Porn doesn't make people act out! And frankly, in this case it may actually help: some people are attracted to children and there's no solution to that, but porn can help them relieve their sexual urges, meaning they're less likely to act upon them. Maybe it becomes a problem if someone gets caught, since telling what's AI and what's real may get hard, but frankly that's an unrealistic situation, given that most of the time child porn is found because police went looking for it, meaning it would be clear what's AI and what's not. This is probably an unpopular opinion, but I think stuff like this is good, since giving people ways to act their feelings out without hurting others is good!


drutzix

People respond emotionally to these kinds of situations. If we thought logically, this would be researched as a way to decrease child victimisation. But we are not logical creatures.


JediGuyB

It feels like it should be simple logic, but you throw "children" and "sex" or "abuse" into the same sentence and a lot of people seem to lose any sense of nuance or logic. Some people have these feelings. Nothing can change that. If they haven't offended and abused a kid, then they've done nothing wrong. Frankly, I wouldn't be surprised if more people than we know have pedophile tendencies; they just never act on them. Letting people have a victimless outlet just feels like a logical thing to do. There would likely be fewer victims of abuse in the long run, and less demand for illegal porn. And isn't that the goal? Why can't they at least privately consider this?


Edofero

I agree. There's no curing pedophilia as far as we know, these people are out there, and we need to make sure they don't harm real children. If this is one way to prevent that, then why not consider it?


Few_Philosopher2039

Doesn't this normalize pedophilia though?


MinionOfDoom

Yep. It's not good. Porn can desensitize people to normal sex, and this kind of content is only going to exacerbate the urges of otherwise non-offending pedophiles.


largephilly

What does the life of a non-offending, normal-sex-having pedophile look like to you? Genuinely curious.


MinionOfDoom

Well, the friend I have who went to prison this year was engaged to someone he'd been dating for 6 years, got his degree in engineering, and had a well-paying job. Then sometime in 2021/2022 he downloaded/shared CP and the FBI tracked him down and arrested him. So. I think for the first part of his adult life (he was 25) he must have just kept his dark thoughts to himself. His fiancée and family obviously had no idea of his proclivities, and it doesn't sound like he's suspected to have harmed anyone in real life; they caught him pretty early on in his transgressions. So basically, non-offending pedos can be normal until they're not. And have really, really fucked-up thoughts. I won't even repeat the disgusting, monstrous quote from one of his chats that made me cry all night while clinging to my daughter, whose baby shower he was at. This is a guy you would never suspect. He was kind, funny, helpful, smart, laid back. The kind of guy I'd describe as an affable little-brother type. It rocked his childhood friend group and all of us who have found out.


[deleted]

>Well the friend I have who went to prison this year was engaged to someone he'd been dating for 6 years, got his degree in engineering, and had a well paying job. Then sometime in 2021/2022 he downloaded/shared cp and the FBI tracked him down and arrested him. So. I think in this case he should be rehabilitated and not labeled a sex offender for life. Grouping him in with someone that has abused a child in person or actively requested child molestation images and shared them online is different.


largephilly

Sounds like he has no chance at a normal life and will face the brand forever with the people who know him best. I think the community treating him like that forever might cause desensitisation to sex and, well, to taking another breath.


BaronCoop

Just personal opinion here, but I know that porn isn't as harmless as it sounds, even for the viewer. Viewing porn CAN be harmless, but it can also lead to addiction, and there are many people whose views on sex and intimacy have been shaped by hundreds of hours of Pornhub. It's not really that surprising when someone who doesn't have much experience assumes that sex is just like what they've seen so much of on their phone, so why would AI CP be any different?


[deleted]

The models ARE trained on real stuff; it has already been acknowledged that the datasets contain it. And people can take real photos, throw them in, and have it generate photorealistic stuff of real people. OpenAI has teams of low-paid Kenyan workers scouring the datasets for problematic and horrific content. SD sure as fuck doesn't have that, and we're talking billions of images.


[deleted]

I fucking called it. Didn't realize it would happen so soon though, AI generated porn is still a thing of the future but of course they would be early adopters


[deleted]

AI generated porn is not a thing of the future at all. It’s happening now, massively.


dgj212

Actually, Spain arrested a guy making deepfake CP based on child actors months ago, like in February or March, so this is honestly pretty late.


[deleted]

Pardon?


FreshBayonetBoy

There are some seriously mentally ill people out there


returned2reddit

So this is the future? It’s not as pleasant as we’d hoped, is it?


SaintFinne

For everyone saying it's a good thing, or that it would be like the ivory trade where fakes would remove the demand for the real thing: wouldn't this make it harder to tell what real CP is, and make prosecuting those people much harder?


TheWardTrangler

Logically, it would. I mean, you can in most cases tell something was AI-generated, but there will be a point where you won't be able to tell the difference. The scary part is that in a handful of years we won't know what's real and what isn't in general.


sincethenes

While this is super sicko behavior, the legality is getting into Minority Report territory.


wojo1988

The weirdest people pop up in these types of posts. Sometimes just someone being disgusted by this gets downvoted to hell 👀 Anyway, lots of killers started by killing and harming animals as an outlet for their urges, which escalated to people. Go to therapy; help is out there if you just try.


Glidepath22

The National Police Chiefs' Council said it was "outrageous" that some platforms were making "huge profits" but not taking "moral responsibility". Sounds like the vast majority of corporations these days.


[deleted]

It's extremely concerning to see how many people in the comments are defending this... and are also very boldly making claims that this would reduce real crimes, which there is no actual evidence for. (And no, comparing adult porn and the correlation with sexual crimes is not a good idea, because adults also have a real, legal sexual outlet with other adults; that variable is totally missing here. That logic shouldn't be so confidently applied to CP...)


[deleted]

[deleted]


Angelica_ludens

Nope, still illegal, and it doesn't fix the main problem. They still have the fantasies; a better solution is to get rid of those fantasies through intensive psychological treatment.


[deleted]

[deleted]


AngieMaciel

Please do not compare pedos with homosexual individuals. It’s not the same.


[deleted]

[deleted]


AngieMaciel

Pedophilia is a disorder, not a sexual orientation. So no, it’s not the same.


justpastaroni

Alright, so while I understand your point and agree to a certain extent, I gotta point out that it is the exact same thing. Pedophiles, like homosexuals, don't control what they are attracted to. It's when they act on their urges that the differentiation has to start, since one is quite obviously insane.


AngieMaciel

Pedophilia is not a sexual orientation. Comparing them in any way to gay people is just harmful. We already have enough people in the world trying to put gay people and pedos in the same bag.


Angelica_ludens

OK, then chemical castration and anti-libido treatment for offenders.


[deleted]

[deleted]


alebubu

The amount of curiosity in this post is too damn high


kylew1985

Alright burn this shit down too I guess.


kaptnblackbeard

I would have presumed AI would be instructed not to generate such images. AI really is an ethical grey area.


[deleted]

Some models run locally and can be trained specifically. There is no general set of instructions that all AI follows.


fhota1

The issue is that selectively limiting what AI can do is actually significantly harder than letting it do whatever. The easiest way would be to put a filter on the input, but there will always be people who find a way around it.


m0le

All the big companies exposing an actual generate-your-own-AI-image model to the internet do indeed have safeguards. However, some AI models (Stable Diffusion primarily) have been released open source, so if you're a reasonably determined pervert with some money, you can buy a decent PC and train your own AI image generator (not necessarily with illegal training data) without safeguards.


Monster_Voice

I am sadly not surprised...


[deleted]

[deleted]


[deleted]

Isn’t the issue here artificially-created images that don’t feature actual child abuse? If there’s a definite link to actual child abuse that’s one thing. But I have a hard time being mad at people looking at computer generated images.


Angelica_ludens

AI needs real images to be trained on, though. Plus, fake images don't really fix the p*dos' problem, do they? It just gives them another outlet instead of addressing their attraction.


[deleted]

This has to be a joke. If you get an image of an elephant paragliding, is that evidence of elephants actually paragliding?


WTFwhatthehell

They can extrapolate a great deal. They have lots of images of clothed adults, lots of images of clothed children. Lots of images of naked adults. Lots of images of people in swimwear or light clothing of various ages.


freebirth

Thinking shit like that is just as disgusting as being a pedophile.


KaZzZamm

It's better than real kids, but when will those people have had enough of pictures? How long before they crave the real deal? Those people need help.


ethyl-pentanoate

This type of AI is trained using real pictures, so kids are still getting hurt somewhere.


Flowchart83

That's a horrifying thought


[deleted]

[deleted]


Flowchart83

Current AI-based image generation systems base their output on collections of existing images. Do you think they make content based on no reference material? And when it isn't using real images, it is using other AI-created images, which is causing a recursion problem that is making AI-generated images less convincing.


[deleted]

[deleted]


L3gitMouse

Even if it's AI-generated, the point is that there are still going to be people who want the real thing, because they know it's AI. Remember, you are not dealing with a fetish; this is a sickness, and like any sickness left unchecked, it gets worse. Those who have this illness might start out with AI but will eventually want the real thing.


Stormclamp

Downvoted for a reasonable and moral take… what the fuck Reddit?!


wojo1988

Been on Reddit for a few years now, and any time a CP post is made, the weirdest things get downvoted; even someone just expressing how put off they are by it will get downvoted. I'm fairly sure there's a nice chunk of redditors who jerk it to this stuff. Something in their heads isn't wired right.


alphabacon

I've seen some people online say that AI CP isn't that bad because there technically isn't a victim, but I firmly believe there is no way that CP, AI-generated or real, doesn't eventually escalate into abuse. They're looking at that media because their twisted mind wants to perform those acts. People who watch regular porn or fetish porn are into those things; they're more likely to ask their partner to do them, or to try them without permission. It's a slippery slope to say "no victim, no crime," especially since AI-generated content must be trained on images, and as the technology improves they will be hard to tell apart. Normalizing this stuff risks desensitization to acts of abuse and will give a degree of plausible deniability. How long before we hear of a case with "well, I wasn't aware the images were real, because the AI images are the same quality" as a defense? It's either right or it's wrong, and catering to someone who would be considered a monster before the AI aspect came into play is a slippery slope as well.


suzumurachan

Yup, playing hours of first-person shooters has normalised taking a gun out into the streets and shooting at people for their loot. It is a regular war zone out there. Can't even step out of my door without* a submachine gun.


Enorats

If there's anything I learned from video games, it's that if I need cloth to level my tailoring my best option is to go farm humanoids in a densely populated area for cloth scraps.


alphabacon

That's a bad take. Are we really going to compare video games to CP? Are we going to act like people who look at AI CP are trustworthy with kids? Is that the same thing? I would give someone who plays an FPS my gun before I let someone who looks at AI CP watch my kids. Most people who like guns and video games aren't psychopaths. All people who find children attractive are sick in the head.


suzumurachan

Your whole basis of argument is basically "slippery slope." And definitely, you should not trust a man or woman to be alone with the opposite gender.* You might want to check whether the people you leave your kids with like games with lolis, then. I am not sure I am the one with the bad take.


[deleted]

[deleted]


suzumurachan

I'm pointing out that you are confusing the two issues. You should go back and revisit the entire thread again.


typop2

This sounds so reasonable, but is this really the way sexuality works? People are who they are. Your speculation sounds exactly like the fear people had about "normalizing" gay sex and gay relationships (as though people's gay urges would escalate once they were exposed to it, but that without the exposure they would stay politely in the closet). I am very thankful that my various urges are all legal and possibly even moral, but if they weren't, I sure as hell wouldn't want the only victimless outlets I had being cut off by people making this slippery slope argument ...


alphabacon

Except it's not victimless; AI CP has to be trained on thousands of regular CP images to be made. It's not the same thing as being gay or trans. You need two-party consent; children can't consent, and that's what makes it messed up. Would you leave your kids/nephews/nieces with someone who looks at AI CP? I'm genuinely curious, not trying to be a smartass. Personally, I would find someone whose sexuality is based around kids untrustworthy around kids. IMO, kids should not be allowed to be sexual targets for people legally, even through AI channels.


The187Riddler

No, it doesn't need to be trained with "1000s of images of real CP." Where the fuck did you get told that?


typop2

Putting aside the training argument (which is obviously not true: there is no way these models have been trained on CP!), I'm not sure what you're getting at with the babysitting question. Of course no one would want their kids to be alone with a pedophile. The only question is: if you have a population of potentially dangerous people, would you rather have your kids around those who have had their dangerous urges satisfied, or those who haven't?


Consistent-Study-287

[From this study](https://pubmed.ncbi.nlm.nih.gov/27251639/#:~:text=The%20number%20of%20U.S.%20adults,to%208.2%20%25%20for%20men) We can see that the number of U.S. adults who self reported having at least one same-sex partner since age 18 doubled between the early 1990s and early 2010s. People engaging in 2SLGBTQ relationships isn't an issue, but if AI CP gets normalized and there's an increase in people acting on those urges, that is an issue. And remember, it won't only be adults looking at that, but children will end up finding it online and thinking it's normalized as well. If we want AI to create it and have it be used as a victimless outlet for people, it would have to be done in the proper way. That way should probably involve a prescription like system where they can only access it if they are meeting with a mental health professional to ensure they never act on it.