AI-generated child sex imagery has every US attorney general calling for action: "A race against time to protect the children of our country from the dangers of AI."

  • ThrowawayOnLemmy@lemmy.world · ↑20 ↓12 · 10 months ago

    But aren’t these models built from source material? I imagine if you want CP AI, you need actual CP to train it, no? That definitely would be a problem.

    • DLSchichtl@lemmy.world · ↑37 ↓2 · 10 months ago

      Not necessarily. You could mix PG pics of kids with a Lora trained on pubescent looking but otherwise legal naughty bits. At least that would be the sane way to do it. But hey, world’s full of insane people.

    • Rivalarrival@lemmy.today · ↑20 ↓1 · 10 months ago

      No, you can use a genetic algorithm. You have your audience rate a legal, acceptable work. You present the same work to an AI and ask it to manipulate traits, and provide a panel of works to your audience. Any derivative that the audience rates better than the original is then given back to the AI for more mutations.

      Feed all your mutation and rating data to an AI, and it can begin to learn what the audience wants to see.

      Have a bunch of pedophiles doing the training, and you end up with “BeyondCP”.