Microsoft CEO calls for tech industry to ‘act’ after AI photos of Taylor Swift circulate on X::Satya Nadella spoke to Lester Holt about artificial intelligence and its ability to create deepfake images of people. After the pictures of Taylor Swift circulated, he called for action.

  • Andy@slrpnk.net
    11 months ago

    I think everyone’s still trying to get some handle on this, but it seems like it’s mostly an issue with scale.

People have been making and sharing photo manips of celebrities naked for about 20 years. This is just noteworthy because there’s so much of it, produced so quickly and so publicly.

I think we should all have the right to create images – including sexual ones – freely. And I think the subjects of those images (especially Emma Watson and Natalie Portman; I don’t think anyone has been photoshopped into porn as much as those two) deserve to live their lives without having to see it or think about it.

I don’t think it’s necessarily a problem that has an easy legal solution, but a good start might be just recognizing this social contract: if you want to fantasize about anyone – whether it’s a celebrity or someone you take classes with – understand how gross it is for them to have to know about it, and keep it discreet.

    • fidodo@lemmy.world
      11 months ago

      Not just scale but also accessibility. Now anyone can make these without having a specialized skill.

I don’t think any laws targeting deepfakes should treat them as prohibited material; making something like mere possession illegal would be invasive of freedom of speech and privacy rights.

Instead it should be treated as harassment. At a bare minimum, we should outlaw the situation where someone targets a person they know and distributes pictures of them to other people they know. For example, if someone were to create deepfake porn of a classmate and distribute it to other classmates, that should not be allowed to happen.

When it comes to a person of note, where it’s happening more in the background, I feel that’s more of a grey area, since it’s not necessarily a direct attack: the creator knows neither the celebrity nor their social circle. But I think there’s a big difference between it being distributed on a social media platform where the celebrity is present or discussed, versus a porn site.