Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.
Back in my day we just had to use our own imagination.
Can’t afford this much cheese today to find just the right slice for every bikini photo…
Maybe let’s assume all digital images are fake and go back to painting. Wait… what if children start painting deepfakes?
Deepfakes might end up being the modern version of a bikini. In the olden days, people wore these to the beach. Wearing less was scandalous, a sign of moral decay. Yet now we wear much less.
Our grandchildren might simply not give a damn about their nudity, because it is assumed that everyone is deepfaking everyone.
These are all worn voluntarily. This issue isn’t about the equivalent of scandalously clad young girls; it’s like girls being involuntarily stripped of their clothing by their classmates. It’s not about modesty standards, it’s about sexual abuse.
It can be both. The cornerstone of why nudity can be abused is that society makes it shameful to be bare. If, some generations from now, people can just shrug and not care, that is one less tool an abuser can use against people.
In any case, I am of the mind that people of my generation might be doing their own version of the Satanic Panic, or the reaction against rap music. For better or worse, older people cannot relate to the younger.
Unless it is used to pretend it is a real video and circulated for denigration or blackmail, it is not at all like assault. And deepfakes do not have the distinctive features hidden under your clothes, so it is possible to debunk them if you really have to.
anyone using any kind of AI either doesn’t know how consent works, or they don’t care about it.
a horrifying development in the intersection of technofascism and rape culture
Any AI? Every application? What kind of statement is this?
AI models (unless you’re training your own) are usually trained on data the companies do not have a licence to use. The companies training these models are also notorious for ignoring robots.txt and other measures websites use to stop bots from trawling their data.
Like in crypto, most people in AI are not nerds, just criminal scum.
For example, Louisiana mandates a minimum five-year jail sentence no matter the age of the perpetrator.
That’s just stupid on its face. A thirteen year old boy is absolutely gonna wanna see girls in his age group naked. That’s not pedophilia; it’s wanting to see the girls he fantasizes about at school every day. Source: I was a thirteen year old boy.
It shouldn’t be treated the same as when an adult man generates it; there should be nuance. I’m not saying it’s ok for a thirteen year old to generate said content: I’m saying tailor the punishment to fit the reality of the differences in motivations. Leave it to Louisiana to once again use a cudgel rather than sense.
I’m so glad I went through puberty at a time when this kind of shit wasn’t available. The thirteen year old version of me would absolutely have got myself in a lot of trouble. And depending on what state I was in, seventeen year old me could have ended up listed as a sex predator for sending dick pics to my gf, cause I’d have produced child pornography. God, some states have stupid laws.
In general, even up here in woke-ville, punishments have gotten a lot more strict for kids. There’s a lot more involvement of police, courts, jail. As a parent it causes me a lot of anxiety - whatever happened to school being a “sandbox” where a kid can make mistakes without adult consequences, without ruining their lives? Did that ever exist?
I can already picture that as an Onion headline:
New York Renames State to ‘WokeVille’. NYC to follow.
it existed if society liked you enough.
fascists just have a habit of tightening that belt smaller and smaller, is what’s going on.
Punishment for an adult man doing this: Prison
Punishment for a 13 year old doing this: Publish his browsing and search history in the school newsletter.
13 year old: “I’ll just take the death penalty, thanks."
As a father of teenage girls, I don’t necessarily disagree with this assessment, but I would personally see to it that anyone making sexual deepfakes of my daughters is equitably and thoroughly punished.
There is a difference between ruining the life of a 13 year old boy for the rest of his life with no recourse and no expectations.
Vs scaring the shit out of them and making them work their ass off doing an ass load of community service for a summer.
ruining the life of a 13 year old boy for the rest of his life with no recourse
And what about the life of the girl this boy would have ruined?
This is not “boys will be boys” shit. Girls have killed themselves over this kind of thing (I have personal experience with suicidal teenage girls, both as a past friend and as a father).
I don’t think it’s unreasonable to expect an equivalent punishment that has the potential to ruin his life.
Fake pictures do not ruin your life… sorry…
Our puritanical / 100% sex culture is the problem, not fake pictures…
It is not abnormal to see different punishment for people under the age of 18. What’s needed is good education about sex and what sexual assault does to its victims (same with guns, drugs including alcohol, etc.).
You can still course correct the behaviour of a 13 year old. There is also a difference between generating the porn and abusing it by sharing it etc.
The girls should be helped and the boys should be punished, but mainly their behaviour needs to be corrected.
Yes, absolutely. But with recognition that a thirteen year old kid isn’t a predator but a horny little kid. I’ll let others determine what that punishment is, but I don’t believe it’s prison. Community service maybe. Written apology. Stuff like that. Second offense, ok, we’re ratcheting up the punishment, but still not adult prison.
In a properly functioning world, this could easily be coupled with particular education on power dynamics and a lesson on consent, giving proper attention to why this might be more harmful to her than to him.
Of course, so long as we’re in this hypothetical world, you’d just have that kind of education be a part of sex ed or the like for all students to begin with. But as we’re in this world, and that’s Louisiana…
I did say equitable punishment. Equivalent. Whatever.
A written apology is a cop-out for the damage this behaviour leaves behind.
Something tells me you don’t have teenage daughters.
No kids. That’s why I say others should write the punishments. A written apology wasn’t meant as the only punishment. It was in addition to community service and other stipulations.
probably because there’s a rapist in the white house.
To add to that: I live in a red area, and since the election I’ve been catcalled much more. And it’s weird too, cus I’m middle aged… I thought I’d finally disappear…
the toxic manosphere/blogosphere/whatever it’s called has done so much lifelong damage
Honestly I think we need to understand that this is no different to sticking a photo of someone’s head on a porn magazine photo. It’s not real. It’s just less janky.
I would categorise it as sexual harassment, not abuse. Still serious, but a different level
A school setting generally means it involves underage individuals, which makes any content using them CSAM. So in effect, the “AI” companies are generating a ton of CSAM and nobody is doing anything about it.
Disagree. Not CSAM when no abuse has taken place.
That’s my point.
If someone put a camera in the girls’ locker room and distributed photos from that, would you consider it CSAM? No contact would have taken place so the kids would be unaware when they were photographed, is it still abuse?
If so, how is the psychological effect of a convincing deepfake any different?
Taking secret nude pictures of someone is quite a bit different than…not taking nude pictures of them.
It’s not CSAM to put a picture of someone’s face on an adult model and show it to your friend. It’s certainly sexual harassment, but it isn’t CSAM.
How is it different for the victim? What if they can’t tell if it’s a deepfake or a real photo of them?
It’s absolutely sexual harassment.
But, to your question: you can’t just say something has underage nudity when the nudity is of an adult model. It’s not CSAM.
Yes, finding out that your peers have been sharing deepfake pornography of you is absolutely fine and a normal thing for young girls to go through in school. No girls have ever killed themselves because of this exact sort of thing, surely. This definitely will not add in any way to how women and girls are made to feel entirely, disgustingly dehumanized by every man or boy in their lives: groups of men and boys reducing them and their bodies down to vivid sexual fantasies that they can quickly generate photorealistic images of.
If the person in the image is underage, then it should be classified as child pornography. If the woman whose photo is being used hasn’t consented to this, then it should be classified as sexual exploitation.
Women and girls have faced degrees of this kind of sexual exploitation by men and boys since the latter half of the 20th century. But this is a severe escalation in that behavior. It should be illegal to do this and it should be prosecuted when and where it is found to occur.
Lawmakers are grappling with how to address …
Just a reminder that the government is actively voting against regulations on AI, because obviously a lot of these people are pocketing lobbyist money
Oh I just assumed that every Conservative jerks off to kids
Get some receipts and that will be a start.
We’re at 56 pages of this now for a nice round count of 1400 charges
So far as I am aware all of these are publicly searchable court cases
God I’m glad I’m not a kid now. I never would have survived.
I don’t fully understand how this technology works, but if people are using it to create sexual content of underage individuals, doesn’t that mean the LLM would need to have been trained on sexual content of underage individuals? Seems like going after the company and whatever its source material is would be the obvious choice here.
I agree with the other comments, but wanted to add how deepfakes work to show how simple they are, and how much less information they need than LLMs.
Step 1: Basically you take a bunch of photos and videos of a specific person, and blur the face out in all of them.
Step 2: This is the hardest step, but still totally feasible for a decent home computer. You train a neural network to un-blur all those faces for that person. Now you have a neural net that’s really good at turning blurry faces into that particular person’s face.
Step 3: Blur the faces in photos/videos of other people and apply your special neural network. It will turn all the blurry faces into the only face it knows how to make, often with shockingly realistic results. (Toy sketch below.)
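To show just how little machinery that takes, here’s a toy sketch of those three steps in PyTorch. To be clear, this is purely illustrative, not a real pipeline: the tiny UnblurNet, the average-pool “blur”, and the random tensors standing in for a dataset of aligned face crops are all stand-ins I made up for the demo. Real tools add face detection and alignment, much bigger networks, and far longer training.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class UnblurNet(nn.Module):
    """Tiny conv encoder/decoder that learns to map blurred 64x64 face
    crops of ONE person back to the sharp originals. (Toy model.)"""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 64x64 -> 32x32
            nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 32x32 -> 16x16
            nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),  # 16 -> 32
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),   # 32 -> 64
            nn.Sigmoid(),  # pixel values back in [0, 1]
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

def blur(imgs, k=9):
    # Step 1: destroy the identity. A box blur via average pooling is a
    # crude stand-in for whatever blurring you'd apply to the face crops.
    return F.avg_pool2d(imgs, k, stride=1, padding=k // 2)

# Step 2: train the net to undo the blur, using sharp face crops of one
# specific person. torch.rand here is a placeholder for a real dataset.
faces = torch.rand(16, 3, 64, 64)
model = UnblurNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(200):
    opt.zero_grad()
    loss = F.mse_loss(model(blur(faces)), faces)  # reconstruct sharp face
    loss.backward()
    opt.step()

# Step 3: blur a DIFFERENT person's photo and run it through. The net can
# only reconstruct the one face it was trained on, so that's what you get.
someone_else = torch.rand(1, 3, 64, 64)
swapped = model(blur(someone_else))
```

The key point is in step 3: because the network has only ever been trained to reconstruct one identity, any blurry face you feed it comes back out as that person’s face. That’s the whole trick, and it needs a few hundred photos of the target, not an internet-scale training set.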