A Constitutionally Dubious California Bill Would Ban Possession of AI-Generated Child Pornography
The proposal seems to conflict with a Supreme Court ruling against laws that criminalize mere possession of obscene material.

Back in 2016, a study found that it was increasingly difficult for subjects to distinguish between actual photographs of people and computer-generated simulations of them. The researchers suggested that development would complicate prosecution of child pornography cases. That concern has been magnified by rapid improvements in artificial intelligence, prompting a California bill that would, among other things, make it a felony to possess virtual child pornography when it qualifies as "obscene." This provision seems constitutionally problematic in light of the U.S. Supreme Court's holding that the First Amendment bars legislators from criminalizing the mere possession of obscene material.
Assembly Bill 1831, introduced by Assemblymember Marc Berman (D–Palo Alto) on January 12, aims to expand the state's definition of child pornography to include "representations of real or fictitious persons generated through use of artificially intelligent software or computer-generated means, who are, or who a reasonable person would regard as being, real persons under 18 years of age, engaging in or simulating sexual conduct." Since that new definition would pose obvious First Amendment problems as applied to constitutionally protected images, the bill specifies that such representations must meet the state's definition of obscenity: material that "to the average person, applying contemporary statewide standards, appeals to the prurient interest"; "depicts or describes sexual conduct in a patently offensive way"; and "taken as a whole, lacks serious literary, artistic, political, or scientific value."
That definition of obscenity tracks the test that the Supreme Court established in the 1973 case Miller v. California. But four years earlier in Stanley v. Georgia, the Court unanimously rejected a state law that made it a crime to possess "obscene matter." Writing for the Court, Justice Thurgood Marshall drew a distinction between that ban and other obscenity laws: "Whatever may be the justifications for other statutes regulating obscenity, we do not think they reach into the privacy of one's own home. If the First Amendment means anything, it means that a State has no business telling a man, sitting alone in his own house, what books he may read or what films he may watch. Our whole constitutional heritage rebels at the thought of giving government the power to control men's minds."
Berman evidently did not view the Supreme Court's reading of the First Amendment as an obstacle to his goals, and he is by no means alone in that. Way back in 1996, Congress tried to ban "any visual depiction, including any photograph, film, video, picture, or computer or computer-generated image or picture," that "is, or appears to be, of a minor engaging in sexually explicit conduct." The Supreme Court deemed that law unconstitutional in the 2002 case Ashcroft v. Free Speech Coalition, noting that "the literal terms of the statute embrace a Renaissance painting depicting a scene from classical mythology" as well as "Hollywood movies, filmed without any child actors, if a jury believes an actor 'appears to be' a minor engaging in 'actual or simulated…sexual intercourse.'"
Congress tried again in 2003. The PROTECT Act covered any "digital image, computer image, or computer-generated image" that is "indistinguishable" from "that of a minor engaging in sexually explicit conduct." Unlike Berman's bill, it did not require that such material qualify as obscene, making it even more constitutionally questionable. But it did include an obscenity test for another category of proscribed material: "a visual depiction of any kind, including a drawing, cartoon, sculpture, or painting," that "depicts a minor engaging in sexually explicit conduct." And the law applied a less demanding test to any visual depiction of "a minor engaging in graphic bestiality, sadistic or masochistic abuse, or sexual intercourse, including genital-genital, oral-genital, anal-genital, or oral-anal." The PROTECT Act made such material illegal if it "lacks serious literary, artistic, political, or scientific value," dispensing with the other two prongs of the obscenity test.
In 2008, the U.S. Court of Appeals for the 4th Circuit considered the case of a Virginia man, Dwight Whorley, who was charged with violating the PROTECT Act by "knowingly receiving on a computer 20 obscene Japanese anime cartoons depicting minors engaging in sexually explicit conduct." Whorley argued that the law's prohibition on receiving obscene images was "facially unconstitutional" because "receiving materials is an incident of their possession, and possession of obscene materials is protected by the holding of Stanley v. Georgia."
The 4th Circuit rejected that claim. "Stanley's holding was a narrow one, focusing only on the possession of obscene materials in the privacy of one's home," the majority said. "The Court's holding did not prohibit the government from regulating the channels of commerce." The appeals court perceived the provision under which Whorley was charged as "focusing on the movement of obscene material in channels of commerce, and not on its mere possession." So even though receiving, viewing, and possessing images are all essentially the same thing in the context of the internet, the appeals court concluded that Whorley's prosecution did not run afoul of Stanley. But even that debatable reading does not seem to help Berman's bill, which explicitly applies to "every person who knowingly possesses or controls" the newly prohibited images.
Whorley also argued that the PROTECT Act was "unconstitutional under the First Amendment, as applied to cartoons, because cartoons do not depict actual minors." The 4th Circuit also rejected that argument, noting that cartoons are covered by the law only when they are "obscene" and that obscenity is not protected by the First Amendment.
That point does aid the defense of Berman's bill, but again not insofar as it applies to mere possession. In other cases involving cartoons, such as manga, Simpsons porn, and "incest comics," federal defendants have pleaded guilty to possession charges, avoiding a constitutional test.
As applied to distribution, A.B. 1831's obscenity requirement follows the approach that New York University law professor Rosalind Bell recommended in a 2012 law review article. Bell argued that the PROTECT Act provision covering digital images "indistinguishable" from the real thing, which does not require a finding of obscenity, is clearly unconstitutional.
In the 1982 case New York v. Ferber, Bell noted, "the Court established that the First Amendment does not extend to child pornography because the state has a special interest in protecting children from harm." That interest, the Court held eight years later in Osborne v. Ohio, justifies even a ban on private possession of child pornography. But those cases involved actual child pornography, and the Court's reasoning focused on the injury that its production and dissemination inflicts on the children whose abuse it documents.
"Post-Ferber child pornography regulation and court decisions interpreting this regulation have become untethered from the Supreme Court's crucial limiting interest in protecting children from physical and emotional harm," Bell wrote. "Increasingly, congressional action and court opinions reflect concerns about controlling private thoughts rather than preventing and punishing direct harm."
Bell noted that Adrian Lyne's 1997 film adaptation of "Vladimir Nabokov's famous novel Lolita" went "straight to cable" because distributors worried that law enforcement agencies might deem it child pornography. "Writers and artists have explored the theme of adolescent sexuality in countless valuable works," she wrote. "By banning non-obscene virtual depictions of child sexuality without reference to their social value, we exceed the First Amendment's crucial dictates and jeopardize these works, including acclaimed films like Romeo and Juliet, The Tin Drum, American Beauty, and Taxi Driver."
The "serious value" of such material presumably would protect it from Berman's bill, which is why the obscenity requirement is crucial. But the ban on possession still flies in the face of the Supreme Court's conclusion that "a State has no business telling a man" what he can look at while "sitting alone in his own house." Although the Court later made an exception for pornography involving actual children, that exception does not encompass images that can be produced without violating anyone's rights.
Is that you, buttplug?
Sullum's not fat enough to be Jeffy.
'Back in 2016, a study found that it was increasingly difficult for subjects to distinguish between actual photographs of people and computer-generated simulations of them. The researchers suggested that development would complicate prosecution of child pornography cases.'
Yeah, but it makes it SOOOO much easier to prosecute undesirable people based on photographic "evidence".
So if someone creates a crappy drawing of a naked 17 1/2-year-old, then they should get imprisoned? Sounds both extreme and absurd.
And yes this law makes it even easier to plant illegal material on someone's phone or computer, and arrest them.
Who is harmed by artificially generated images?
democrats.
If you need to view child pornography, the state will assign you items from the approved list, as soon as you register.
Most likely, democrats will have the government schools implement shadow grooming programs to match young children with ‘big brothers’……. like Pluggo. As an extension of the tranny shadow grooming programs already in effect in democrat controlled states.
Sex workers. What happens when your OnlyFans and feet pics can be generated by AI?
Let the sex workers… Learn to code!!! AND then also let them eat cake ass well!!!!
(They already eat ass; now let them eat-the-meat and the ass-cakes ass well!!! If'n they will only cum to MY place, I will tell them that I baked them an ass-cake!!! Hemorrhoids on the side!)
Underage persons are presumably too immature to understand the full ramifications of permitting others to create and distribute sexually explicit pictures of them, so any consent they give is deemed invalid by law. As such, they would be harmed if they gave consent to an AI to take sexually explicit pictures and video of them.
One justification for a law prohibiting the use of AI to create kiddie porn is to foreclose a defense that, "the kid gave consent to the AI, not me!"
I do oppose the provisions that apply to fictitious images, or to images and videos of adults who could be reasonably mistaken for underage persons. That goes too far.
Umm, am I being Poe's Lawed here?
The answer's always 'maybe'.
Not with Michael.
If kids want child porn that's what our public school libraries are for. Just ask any librarian.
Dumb people will say everyone is harmed by them. But an image of someone who does not exist does not harm anyone...certainly not to the point where someone should be imprisoned over it.
And the images are not "artificially generated"...not even sure how that would be possible vs having them "real" generated. Maybe you mean non-human generated, but humans are still generating them, with the help of a computer, like they have been doing for 50 years now.
"applying contemporary statewide standards,"
I think the differences between San Francisco, LA, and Sacramento, and the rest of the state will make that a difficult concept to sell to a jury.
This is exactly the sort of thing prosecutors love/hate. They hate defendants having reasonable doubt. They love possession crimes.
To do a reasonably good job does the AI need to be trained on the real thing?
This might just be the question of the thread...
If that's true now, it won't be in the near future.
Not really. It knows the biology of a child and can extrapolate.
but does it "know" the child's gender identity?
I can't be bothered to read past the headline. Is this outright support for child porn? I can understand that at face value you can say no children are directly harmed if the images and actions are entirely generated digitally (or we could even include cartoons and such.) I can't make myself be open-minded enough or ideologically pure enough to say that this is ok or legal.
This shit feeds the hunger that creates victims. I'd like to see the creators of this shit dragged into the street and shot along with the rest of their pedophile friends.
Alternative theories say that flooding the market with fake stuff negates the need to generate real stuff.
Other alternative theories say government funds and authority aren't infinite, the Child Protection and Obscenity Enforcement Act of 1988 passed under the Anti-Drug Abuse Act/Bill of 1988, and busting people for AI CP to bring about a CP-free America makes about as much sense as busting everyone with green, leafy type material in baggies to enact the drug-free America policy.
You don't have to like drug users or sex workers or trannies or people who've never touched children but view AI CP, I generally don't. You're free to have your local PD chase them off any given street corner, out of libraries, and crack their skulls for trying to bring it into schools. But, again, when you're going house to house looking through everyone's closets to find evidence of a law you know they've broken, you've become something far worse and far more threatening than some guy who skins raccoons alive in his shed or likes to dress up like a woman and watch iCarly in the privacy of their own home.
What "need" is there to generate real stuff in the first place?
Demand?
There's a demand for slaves. Should a market be provided to satisfy it?
Well, I mean we have robots that sweep our floors now and I believe “robot” means “slave” in some Slavic language. So, yeah, flooding the market with fake slaves negates the need for real ones.
Human slaves. There's a demand for human slaves. And let's even say cloning technology has advanced to the same degree AI has. Should we start cloning humans for the express purpose of providing a slave market?
There's not a specific demand for human slaves though. The demand is for work to be done for free. Robot vacuums, self checkouts, fast food apps and kiosks, and other automations all satisfy the demand without the need for human slaves.
The demand is for human slaves. Stop changing the terms of the argument.
If you say so...
I believe the parallel argument being made here would make that answer no. A cloned human is still a human. A human slave is deprived of rights and therefore violates the non-aggression principle. A robot that resembles a human with an AI "brain" would be OK though.
It’s coming if the democrats are allowed to exist. They’ve been pro slavery for their entire 200 year history.
Why would that be OK? Who gets to define "human" and on what basis?
I guess God, or nature. Subject to change as AI advances I suppose.
According to certain politicians, yes, we need slaves.
https://reason.com/2024/01/26/josh-hawley-thinks-the-white-house-can-force-an-aluminum-plant-to-stay-open/
That's debatable. People are more productive when they are earning based on their own production, as long as there's a free market for labor and skills. Incentivizing people to increase their earnings by being more productive leads to better economic outcomes for both the worker and the employer. In many ways, the South was hamstrung by slavery, held back from economic innovation and the best usage of resources. But the wealthy in the South were heavily invested in slaves, and the whole model was constructed around it, so actually breaking from a slave labor model was always going to require some massive short-term expenditure or losses.
IG Farben discovered something similar at Buna Werke, where German civilians were much more cost effective than the prison labor they were renting from Auschwitz for 3 marks a day.
Isn't that our current situation? Following the NAP and current norms of society, you cannot force involuntary servitude, but you can encourage that servitude with incentives like a wage, time off and the title of employee rather than slave. You can also have a non-human slaves (robot or animal).
Elon Musk is solving that with his robot.
Really terrific bit from C.S. Lewis - can you imagine a bunch of people gathered around a covered dish throwing dollar bills and cat-calling as the lid is slowly raised to reveal…. a bacon-wrapped pot roast?
At the time he wrote that, it was truly an absurd thing. You’d think there’s something fundamentally wrong with the appetite for food if a society did such a thing. Ironically, we do that now.
In our sex saturated culture, you’d think that our appetites would be sated, but what has actually happened is we crave it more… like a drug addict craving his next hit. And just like the drug market, the food and sex-scape are saturated with cheap knock offs that satiate for very short periods of time, but leave us hungrier still, yet blind to our hunger for something better, more real, more genuine, and of higher quality.
So no, saturating the market with the fake will increase the desire for the real. That will increasingly be a hole filled with more degeneracy, not less.
A person sexually attracted to underage people views computer generated images of people who do not exist, masturbates and falls asleep. Who is the victim here? The libertarian view is that a crime requires a victim. All other "crimes" are crimes against the state. The state cannot be a victim. I get that the idea is icky. But it seems logical to me that if these people can satisfy their perverse urges looking at fake imagery at least a significant percentage will not take the risk of seeking a real victim. If they do, lock them up. If not, leave them alone.
It gets even worse than that - the punishments for possessing AI-generated or animated CP are the same as for real CP, and all these punishments are magnitudes greater than those for actually raping a child. The system perversely encourages actual rape over viewing images.
Yup, it's exactly backwards.
True. This makes no sense at all.
The hyper moralistic Left just loves to prosecute people for thought crimes…although the Right isn’t much better there.
Fake CP is super creepy and disturbing, and even people in this comments section want anyone who creates it to be shot, even if it is just drawings of Simpson characters in simulated cartoon sex. It makes them uncomfortable and angry, so they want someone to lose their life over it… or their freedom, at the very least.
That is humanity for you, and there have always been people like this. They are low-IQ religious fanatics… nothing new. At least a few others here can think more rationally about an issue that’s not even especially complex.
This shit feeds the hunger that creates victims.
Now do violent video games.
I find it funny that DOOM and Mortal Kombat came out in the 90s, but since then we've been in a lull of physical violent crime the world has never seen before. Kids are too busy inside playing video games to get knifed in an alley. Meanwhile, porn explodes on the internet and the sex crimes rate has absolutely cratered because people are fapping at home instead of getting drunk in bars and touching each other.
I find it funny that DOOM and Mortal Kombat came out in the 90s, but since then we’ve been in a lull of physical violent crime the world has never seen before.
Until 2020.
Dirty, evil, sinful sex is what so many people have guilt and shame over. So the penalties for exploiting imaginary non-existent people should be even higher there.
I can’t be bothered to read past the headline. Is this outright support for child porn? I can understand that at face value you can say no children are directly harmed if the images and actions are entirely generated digitally (or we could even include cartoons and such.) I can’t make myself be open-minded enough or ideologically pure enough to say that this is ok or legal.
If you don't recognize a harm being done, then I wonder under what theory of law you can say it shouldn't be legal.
We can offer condemnation and moral opposition to something without making it illegal. We can tell people we don't think this is okay, that if they're making or viewing AI child porn they should seek help, but there shouldn't be an enforcement mechanism for it. We should only use the law to redress actual harms, not enforcement of moral standards. There are other mechanisms for moral arguments to be made.
It is. Sullum really should kill himself. Adults who want child porn are bad; it is not a case of "if it's AI, it's okay." These are very mentally unstable, deranged people. And yes, Sullum supports them.
Not everything I think is wrong should be illegal. Show me the victim.
And the flip side as well, I can suitably deal with things I believe to be wrong without necessarily involving The State.
Right. The bad should not be prohibited, and the good should not be mandatory. I believe seatbelts are a nearly universal positive. I don’t believe people should be fined for not wearing them.
True. But this is one of the things that should.
Buttplug being a good example. People like Pluggo should be eradicated, not accommodated.
Funny thing is that, outside of this silly forum, you will eradicate yourself by being undesirable in the real world. The rest of society just has to stay the course, and you will continue to waste your life away online and endure the angry loneliness. It's quite satisfying to know that your bad behavior is a reflection of your pathetic future.
Hey Pluggo, or is this Pedo Jeffy? Still sore that I rip into you for being a child rapist/groomer? Most people actually like me. Unlike you. Whichever child molesting monster you are, no one will ever love you.
Maybe you should just do the right thing and kill yourself.
"I can’t make myself be open-minded enough or ideologically pure enough to say that this is ok or legal."
It's irrelevant if it's gross, or distasteful, or feeds a repugnant peccadillo. Is a child exploited or assaulted?
No. Pixels aren't people. Nobody is being hurt. The fact that someone has repellent fetishes is irrelevant.
The real stuff? That's a totally different story.
Q: How do you know the AI generated porn star is under age?
A: When you ask her how old she is, she holds up 12 fingers.
What you did there, I saw it.
I laughed.
And seven of them are on one hand.
Hey Reason, this is why we mock you for taking Taylor Lorenz seriously. This is how you talk about Taylor Lorenz.
"Taylor Lorenz is 'high camp'"
I'm using that.
Holding a lapel mic in your hand and talking into it on TikTok is peak Taylor Lorenz.
Isn't she the cunt who had her uncle "the guy who owns the Wayback Machine" remove all instances of her after someone pointed out her being a hypocrite?
A-yup.
Assembly Bill 1831, introduced by Assemblymember Marc Berman (D–Palo Alto) on January 12, aims to expand the state's definition of child pornography to include "representations of real or fictitious persons generated through use of artificially intelligent software or computer-generated means, who are, or who a reasonable person would regard as being, real persons under 18 years of age, engaging in or simulating sexual conduct."
The bolded parts are problematic.
Taking sexually explicit pictures of naked 14-year-old persons would be no more justifiable if AI were used to do it.
You messed up the first one:
“real or fictitious persons”
You want to hang a framed picture of Bart Simpson's wang on your bedroom wall? Have at it.
Try and put it in our public library or school or get someone to pose for you, like Bart, or have an AI try and generate the nudes of a little kid from the photos you snapped from the bushes and I’ll be among the volunteers if they ask for someone to tear your thumbs off.
Public libraries and schools have great discretion as to what they choose to display inside.
Their governance varies widely State to State and library to library and, no, for the most part, they don’t. People can play retard with the “It’s not an Adult Bookstore, It’s a Public Library!” or "Hey you got your Adult Bookstore in my Public Library/You got your Library in my Adult Bookstore!" bullshit but, like the idiocy of traipsing the line with male and female and acting like female impersonator strippers aren’t breaking the law, that’s more about people demonstrating their ability to willfully retard themselves and harass others rather than any actual ambiguity in/of the law.
"have an AI try and generate the nudes of a little kid from the photos you snapped from the bushes"
Creepy as fuck but I'm on the fence on this one. The kid isn't being hurt. It's no different than if he drew a picture based on the photos with a pencil.
Lucky us, the pencil drawing is illegal now too!
/sarc
Creepy as fuck but I’m on the fence on this one.
Implication being the bushes aren’t the photographer's, no party has consented, and there’s an expectation of privacy. Any one of the three and they wouldn’t have to hide in the bushes.
You really don't know what AI is, do you?
But what about people drawing child pornography with magic markers? Should we ban magic markers? What about pens and pencils?
Except for the possible example of tracing over an image generated by a camera obscura, it would not be child pornography.
Why would a person be drawing that?
Because he wants to, and if he does so in the privacy of his home, it's no one else's business.
Is that the same reason we cover up the manifestos of trans school shooters? Respect their privacy? It's nobody's business?
Even when their fantasies start manifesting their way into reality?
"representations of real or fictitious persons generated through use of artificially intelligent software or computer-generated means, who are, or who a reasonable person would regard as being, real persons under 18 years of age, engaging in or simulating sexual conduct."
Anyone, feel free to name one single valuable or redeeming characteristic of what's described there.
Not relevant to whether it should be legal. People do a lot of things I consider abhorrent that I don't believe should be illegal.
I'm not asking a question about legality. I'm asking you, Reason, anyone else who finds anything defensible about this - to put in writing how much and to what extent child pornography should be considered acceptable.
And if your position is something other than "not at all" - I'm curious as to the reason why.
Pay no attention to the guy in the blue windbreaker. As long as you're not wearing a MAGA hat, he won't care.
Well the moral principle at play here isn't one of child pornography or pedophilia, it's one of liberty. My defense is that anything you do in private that harms nobody else is none of my business, whether I find it tasteful or otherwise. Child pornography creates a victim. AI-generated images of fake children are victimless, even if I think the person who wants such things has a mental issue.
Regardless, the principle of liberty trumps any stance of taste. You have and should have the liberty to do anything that does not infringe on the liberty of others. If you want to turn off the plumbing in your house and start storing your own shit in buckets in the basement crawlspace, go for it. I don't consider that a valuable goal and would advise against doing it, but it's also none of my fucking business.
So would it be accurate to describe you as a supporter/defender of child pornography, so long as it's victimless? Would you object to me characterizing you that way?
Also, define "victimless." I'm thinking of a context in which a child pornographer uses an AI directed to create material from samples based on your own children. Like, say he sits in his panel van all day long outside your child's school, formulates a good description of them, then goes back home and directs his AI to use them as a basis for generating child pornography. Which he also distributes to others for sale.
Distasteful, certainly. But it's victimless, right? No harm has occurred. In fact, maybe it's even praiseworthy - an act of liberty exercised freely. Would you take issue with any of that?
Also, define “victimless.” I’m thinking of a context in which a child pornographer uses an AI directed to create material from samples based on your own children. Like, say he sits in his panel van all day long outside your child’s school, formulates a good description of them, then goes back home and directs his AI to use them as a basis for generating child pornography. Which he also distributes to others for sale.
This is changing the variables involved to where there's more perceivably a victim. Certainly, if the likeness is close enough, there's a violation of privacy and a specific victim is being exploited, especially if the image is distributed. But that's a different question than the general issue.
Also, call me whatever you like. My positions speak for themselves, your characterizations of them are meaningless to me.
Why is there a violation of privacy? They just drew something that happened to look like someone you care about. You're the one with the hang-up about it. It wasn't actually that person. So, it's cool right?
Well in your example it isn't something that "happens" to look like someone else, it was specifically intended to look like a particular person. You also added the element of commercial distribution.
These 2 elements make it actionable under current law, whether it's a child or an adult, pornographic or not.
So as long as the child pornography doesn't actually resemble any actual child on earth that you recognize, you're OK with it?
My positions speak for themselves, your characterizations of them are meaningless to me.
Well, as it stands ATM - your position is that you're in favor of the production of child pornography in defined circumstances.
So now go back to the earlier question: how much and to what extent should child pornography be considered acceptable, and why?
He's already answered that in detail, sea lion.
It isn't child pornography since there is no child involved in production.
This is obvious enough to you, which is why you keep moving the goal posts to try and paint anyone that doesn't agree with you as an exploiter of children, when the image in question is of no person living or dead.
Let's reverse that and ask why you're in favor of pencil drawings created entirely from someone's imagination suddenly becoming outright illegal, with decades of prison and a permanent position on a watch list involved.
Because, know it or not, that is the position you are staking out.
Is he Pedo Jeffy, playing some bizarre semantical game?
Let’s reverse that and ask why you’re in favor of pencil drawings created entirely from someone’s imagination suddenly becoming outright illegal, with decades of prison and a permanent position on a watch list involved.
I've never expressed being in favor of that. In fact, if your mind is compelling you to pencil draw sexually explicit imagery of children, then someone should probably have a talk with you. With orderlies present.
No, you didn't come out and say it since that would have been honest.
What I did read was your rationalization, which would indicate that you must be in favor of that IF you want to be consistent in your application of your ideals.
One need not be in favor of a thing to say it shouldn't be illegal. That you continually fail to understand that means you are either a child or have the mental capacity of a child. Take your pick, I guess.
There are video games that depict the killing of people that are rendered in a realistic fashion. Should this be outlawed as murder is illegal even though no real people are being killed?
Again, we're not talking about legality. That's a copout people like to use to avoid the actual merit of the thing being discussed.
Talk about the thing being discussed. Let's say I come up with a game where you can describe your schoolmates and school and an AI generates a realistic depiction of them, at which point you put on a VR set to go slaughter (and rape, because sure why not) them in a realistic sensory-experience fashion.
Tell me the merits of such a thing.
No, when it comes to a discussion about a proposed law, the moral righteousness of the thing in question is what is actually irrelevant. Your attempt to derail and force people into a position is inconsequential.
The issue here is one of policy. And when it comes to policy, the proper answer, as far as I’m concerned, is the maximum protection of rights. That means as long as there’s no infringement of others’ rights - in other words, no victim is created - then it falls within your own liberty. I loathe drunk drivers, but if you own a farm and decide to drunkenly drive your pick-up within the boundaries of your own property, you have only exercised your own liberty. That’s my singular guiding moral principle, personal liberty, and the purpose of government is not to instill morality but to protect individual rights.
We're not talking about a proposed law.
We're talking about the alleged merits of child pornography - real or AI-generated. Are there any?
"We’re talking about the alleged merits of child pornography ...AI-generated. Are there any?"
No.
But as long as they don't exploit or harm an actual child it doesn't matter.
I also think there's no merit to smoking weed, ass sex, or taint piercing but they shouldn't be illegal.
You are conflating two things and declaring them to be the same, yet have made no cogent argument that actually draws any parallels between two entirely dissimilar things.
You want to argue the morality of the situation, yet you can’t even illustrate your own moral position on this with facts. Feel free to go in depth on why child pornography is as demonized as it is. I think you’ll find that all the moral underpinnings of that position are simply not present in AI generated art.
If you feel that imaginary depictions of disgusting actions are the same, morally speaking, as entirely real disgusting actions you’re going to have a lot of explaining to do about literally all of modern art and entertainment.
And just for the record, there doesn't need to be any moral value to this in the first place. If art needs to have some moral reason to exist, you're an out and proud authoritarian that would make the Soviets proud. That was literally their position on art.
So it’s your default position that anything without objective merits should be made illegal?
But as long as they don’t exploit or harm an actual child it doesn’t matter.
So you don’t have a problem with sexually explicit imagery of children then, correct? Your only problem is with child abuse/exploitation, NOT child pornography. You’re in support of child pornography in certain circumstances, correct?
If you feel that imaginary depictions of disgusting actions are the same, morally speaking, as entirely real disgusting actions you’re going to have a lot of explaining to do about literally all of modern art and entertainment.
Is that what child pornography is? Art and entertainment?
So it’s your default position that anything without objective merits should be made illegal?
I’m not saying anything whatsoever about legality. I’m simply trying to understand why people are in favor of creating/disseminating sexually explicit imagery of children. Strangely, they keep trying to change the subject.
ML: harm an actual child it doesn’t matter.
AT: You’re in support of child pornography in certain circumstances, correct?
Did you miss that part of his post or just decide to ignore it to try and play gotcha?
I didn't miss it.
Do you support the creation of sexually explicit materials involving children, or not?
I'm not the one playing gotcha games. Folks keep trying to hang their hat on this notion of a child being harmed. We're not talking about a harmed child. We're talking about child pornography; about sexually explicit materials involving children.
Folks here seem to be in support of its legality, yet they don't seem to have enough confidence in their position to articulate it. Will you?
Child pornography, by normal people’s definition, would have to involve living, breathing children. Otherwise it’s cartoon/animated pornography, which is clearly not the same thing.
This is what everyone keeps pointing out to you because there is a distinction and a qualitative difference.
Otherwise it’s cartoon/animated pornography, which is clearly not the same thing.
Which you support generating? Why?
Tell me the merits of such a thing.
No one needs to tell you the merits of such a thing. It doesn't need merits. It is well and truly depraved. HOWEVER...
It harms no one and therefore there is no reason to prohibit it.
No, actually, you do need to articulate the merits of a thing if you’re going to defend it. Especially when we’re talking about sexually explicit imagery of children.
You denounce it as depraved, but then defend the depravity. Why? What's your reasoning for that? You think depravity has a place in the world? If so, what and why?
Wait, do you think the 3d images made by computers are real people?
You didn't answer the question.
Sexually explicit imagery of children. Made by computers or anyone/anything else. Defend its tolerance.
Let me bounce your question back at you. What are the merits of romantic-comedy movies? Of the Super Bowl? Of Taylor Swift concerts? Of 'The Big Bang Theory'?
Their value consists in the fact that people want them and are willing to pay for them. So with kiddie-porn. You and I can't imagine wanting it, and are disgusted by the fact that some people do, but other people might be just as appalled by some of the things that we like.
The only justification for a ban on child pornography is that actual children experience real harm in the course of its production or distribution. Otherwise, it's simply a matter of the majority imposing its taste on the minority.
Their value consists in the fact that people want them and are willing to pay for them.
So you respect that there's a market for sexually explicit imagery of children, and you think that market should be encouraged/defended in creating and disseminating its content?
we’re not talking about legality.
Everyone here except you is talking about legality.
I'm pretty sure he is neither a defender nor a supporter. You added the "Which he also distributes to others for sale." to bolster your view. Twenty years ago, in all states, my one plant pot would be 20 years. Now in many; it is only discouraged by municipalities that prefer me to buy taxed pot. There is no injured party, and there is no coverage of the law. Make up your own acts to provide your view of the law. Some don't. 5-10 cops a month are arrested for child sex crimes not CP. In Septemeber, it was 20. They almost all got lighter sentences than a "citizen".
Edited to proper place.
I’m pretty sure he is neither a defender nor a supporter.
Are you though? Because I'm not seeing anywhere where folks are saying anything close to the effect of, "Child pornography shouldn't be a thing that we allow in any circumstances."
Instead, it looks like they're coming up with reasons to support/defend it. Supporting/defending sexually explicit imagery of children on its own merits. Which they refuse to articulate.
It is never, ever, under any circumstances, okay. It’s not okay in cartoon form or AI-generated realism. I won’t go into the consequences when some depraved person gets “bored” without a life-form body.
Sullum is a sick depraved fuck. Libertarians have a reputation as moral relativists, it’s true.
It's not a question of what extent, CP should be considered "acceptable".
What is being discussed here is what is considered, and made illegal. What will get someone arrested and locked in a cage, for years or decades.
The bottom line is that victimless crimes should not be crimes, in a legal sense. And there needs to be a real victim, not an imaginary one that is "possible," tangentially... at some point in the distant future.
What is being discussed here is what is considered, and made illegal. What will get someone arrested and locked in a cage, for years or decades.
Should someone who produces and disseminates sexually explicit imagery of children be arrested and locked in a cage for years or decades?
Meaning you have to address the question of whether there is or is not a problem with producing sexually explicit imagery of children; is there or is there not a problem with disseminating sexually explicit imagery of children?
Is there? That's a yes or no question.
The answer is yes, but you still don't understand why.
AI-generated "children" are not children, therefore there can be no AI-generated "sexually explicit imagery of children".
AI-generated scenes of your spouse are not your spouse (or mother, or sister, pick whoever means the most to you), therefore deepfakes of your spouse (and kids) riding a train won't bother you, right?
It's not the same thing. But it's functionally similar enough to have the same effect and serve the same purpose. And that line is important, right? Because the AI-gen gangbang of a likeness of your wife and parents and children is perfectly OK and in no way objectionable. Just so long as no actual harm occurred.
That's a pretty transparent attempt to pull emotional strings in lieu of making a rational argument. Are you at all ashamed?
A deepfake of a particular person can cause harm to that particular person; a deepfake (if that's the right term) of a "person" who does not exist cannot cause harm to any person. That is a fundamentally different "effect".
" . . . or who a reasonable person would regard as being . . . "
We are officially out of reasonable persons.
Deja vu ... Marc Berman [D]? "Those filthy religious Democrats trying to legislate their religion on all of us?", said none of those hypocritical Democrats yet. Still waiting.... And waiting...
This influencer is less than one year old:
https://www.instagram.com/the.natalia.novak/
How old she is "supposed to be" is arbitrary. She could be "supposed to be" 15 for all anyone knows, and the creator is lying when they say '21.' If you actually think the government is going to be measured and responsible when being given the power to determine such things...
"It's the pixels"
It's a simple question Jacob: Should child porn be legal or not?
Laws with exceptions (like art and AI) make for poor laws.
https://yourlogicalfallacyis.com/begging-the-question
Ok, so exceptions it is.
It's not child porn if it doesn't exploit an actual child. It's that simple.
It can be gross, detestable, and unhealthy, but no people are being exploited or hurt, which is the reason why actual child pornography is illegal. There is no victim with the fake stuff.
If we ban fictional representations of the act it would be because it offends our sensibilities, which is a very slippery slope.
If there can be tobacco products without tobacco, there can be child pornography without a child.
If, indeed.
We ban it because it fails the duck test.
This isn't hard: the purpose of this art is to be an accurate substitute, nothing more noble than that.
But it doesn't fail the duck test. It looks like a duck - that's it. There's no actual animal present to judge.
Just accept another step on the path to prosecuting thought crime.
Not on the path...already there, and been there for awhile now.
Who exactly is the child in AI porn? "The person depicted may look younger, but that person is actually 18." How does that not immediately settle the issue? How do you prove that someone who doesn't exist is actually younger than the person who created the fictional person says they are? And how can you possibly trust the government to make that determination fairly?
Yes, it's very simple. Child porn should be illegal.
And AI generated images, and other images, manufactured without a camera, are not child porn.
The only libertarian position is that the mere possession of an image should never be a crime. If the possessor of an image did not commit aggression against a child by producing an image, did not encourage aggression against children by buying or trading for the image, and did not distribute it to others for profit, then possession of the image harmed no one else and no criminal sanction is justified.
If possession of AI generated child porn becomes illegal, how long before the FBI starts generating its own for the purpose of entrapping and then arresting people? A month? A week? Two days? The only reason they don't use real child porn to do so now is because it would involve victimizing a minor. Remove that obstacle and for sure they would.
I can see someone they want to entrap entering "A deer in a forest glade", the AI generating a fake nude kiddie pic instead, and the FBI knocking on their door for generating it at home.
And before you say "That's ridiculous!", remember the last six years.
Child pornography legality has to do with the child in the image, not the image itself. If no child was involved, it is legal. Pornography, in and of itself, is Constitutional.
What I could never understand is why anybody would want to watch children having sex. I can't stand to watch them play on a swing set. I can't stand kids, period; let's just ban them outright and replace them with AI-generated children who will only be seen if you're within the screen's viewing angle. And keep their screeching little voices on your earbuds so we don't have to listen to it.
All paraphilias are like that. They're incomprehensible to anyone but the sufferers.
True. I see children as grubby little bags of snot. And in no way sexually attractive. Let alone wanting to touch one. And I certainly do not want them touching me. I even hate being forced to look at ‘adorable’ pictures of friends children.
A related question: what happens when a puberty blocker causes a young adult to have the physical appearance of a child? Do we stop the clock on the day the blocker was administered and give the young person the protection of laws against child sex and pornography?
There's possibly some mental incapacity grounds there...
Constitutionality is never a consideration for California legislation.
Or logic. Or honesty.
What if we legalized AI kiddie porn, but only with Biden's head on all the naked bodies?
Not content-neutral, sorry.
Pedophilia and cannibalism are the common examples of the “unthinkable” extreme of the Overton Window, describing societal and cultural change (unthinkable, radical, acceptable, sensible, popular, policy). By stretching the definition of child pornography to include AI-generated images, wherein no person is involved, let alone harmed, California takes a giant leap toward realizing the sci-fi world of Thought Crime as policy.
I'm intrigued by the PROTECT act's ban on visual depictions of minors "engaging in sadistic or masochistic abuse".
Let's suppose some enterprising reporter infiltrates an orphanage or other facility for children, and obtains photographs or video of staff members gratuitously beating their young wards, just because said staff members felt like it and enjoyed doing so. If their exposé included such images, would that leave them vulnerable to prosecution under such a law?
Is that even a thing?
I'll do you one better: A security camera catches one child hitting another child. Is the footage now illegal child porn?
This provision seems constitutionally problematic in light of the U.S. Supreme Court's holding that the First Amendment bars legislators from criminalizing the mere possession of obscene material.
When did they make such a ruling? I don't think they did. Rather, they have narrowed the definition of "obscenity" so much that almost nothing qualifies. Meanwhile, they have created a separate exception to the First Amendment for child pornography that makes it prohibitable even if not legally "obscene". But that exception is supposed to be narrow, dealing only with material depicting the sexual abuse of actual children.
I didn't read the comments, so somebody probably already said this, but I wanted to type it before I forgot. So, it seems to me that there are two reasons to ban child pornography: the first is moral disgust; the second is that, in order to make a film of a child being assaulted, the child must be assaulted. The first reason, moral disgust, is probably not a good reason for laws against viewing it, if one is concerned about liberty. The second reason, defending children, is reason enough, IMHO, to dedicate lots of resources to that fight. So, if the footage can be produced without harming any actual children, where are we left? And where does that leave our culture - child pornography legal and common because it's fake?
So, if the footage can be produced without harming any actual children, where are we left?
With a lot of perverts, apparently. Or, at least, people who choose that perversion as their hill to die on.
And where does that leave our culture- child pornography legal and common because it’s fake?
They don't seem to mind it with meat. They'll take a lab-generated product meant to simulate meat, without actually harming any animals, and call it a hamburger.
It's not a hamburger, obviously - but they still want to call it one. Meaning they recognize that, despite its artificial nature, it's intended to be the same thing. "A healthy substitute," if you will.
Which now has them forced into rationalizing "healthy substitutes" for child pornography as somehow legit, simply because it doesn't involve real children. But then they still have a problem if that substitute looks like their child. Go figure. These folks aren't known for consistency or making sense.
I like how you just wave your wand and "these folks" who "have a problem if that substitute looks like their child" magically materialize, despite no one in the comment section above taking your bait.
Face it, you have no interest in discussing liberty; you're not a libertarian. Why are you here?
I'm Chris Hansen. And you're all in a lot of trouble.
I might remind some of the more ridiculous morally-outraged members of the commentariat that 'child' includes people up to the age of 17.
Who's interested in children having sex? Well, based on the success of shows like Beverly Hills 90210 (and its successors), lots of people. Sure, they don't explicitly show it, but they want you to *think about it*. And sure, they aren't actual minors, but they play minors on TV.
(Or any of the dozens of slasher flicks with high school characters who get naked on camera - yeah, all the actors are adults, but the fantasy is they're high school kids having sex. Or Fast Times at Ridgemont High for that matter.)
And if the fantasy is they're high school kids having sex, that means a reasonable person could perceive them as being high school kids. (Seriously, our culture has an obsession with and glorification of high school sexual experience, and our movies and TV shows reflect that at us.)
The same logic that declares AI-generated images to be child porn would label Fast Times at Ridgemont High as child porn too. (And remember, we're talking about any image that a reasonable person could perceive as a minor. 17yos don't look that different from 18yos.)
Meanwhile, my sex ed book from a *church* class when I was a child had drawings of (what looked like) naked 13-14yo girls and boys, because it was designed for 13-14yos, and part of the point of a sex ed class is because *children are curious about what bodies look like*. (These drawings were not prurient or sexy, but the same people who want to declare AI images to be child porn would probably declare these drawings child porn too).
Exhibit A on why libertarianism is DOA.