Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

    • atomicorange@lemmy.world · 10 days ago

      If someone put a camera in the girls’ locker room and distributed photos from it, would you consider that CSAM? No contact would have taken place, and the kids would be unaware they were photographed. Is it still abuse?

      If so, how is the psychological effect of a convincing deepfake any different?

      • BombOmOm@lemmy.world · 10 days ago

        Taking secret nude pictures of someone is quite a bit different from…not taking nude pictures of them.

        It’s not CSAM to put a picture of someone’s face on an adult model and show it to your friend. It’s certainly sexual harassment, but it isn’t CSAM.

        • atomicorange@lemmy.world · 10 days ago

          How is it different for the victim? What if they can’t tell if it’s a deepfake or a real photo of them?

          • BombOmOm@lemmy.world · 10 days ago

            It’s absolutely sexual harassment.

            But, to your question: you can’t claim an image contains underage nudity when the nudity depicted is an adult model’s body. It’s not CSAM.