• NuXCOM_90Percent@lemmy.zip
    26 days ago

    I mean, have you seen Gadget?

    But also… that is kind of the point. Adobe, like basically every company that isn’t a porn company, doesn’t care about the revenue from porn. And the companies that DO care about that revenue are constantly fighting piracy.

    There are some Patreon artists who make bank getting their Source Filmmaker on, but they amount to a handful of licenses at best.

    • EldritchFeminity@lemmy.blahaj.zone
      25 days ago

      That’s what I was thinking. Apart from the porn locked up in the Disney vault, big companies aren’t in the business of making porn, and the companies that are aren’t going to be interested in deepfakes. The people using Photoshop to create porn are small fry to Adobe. Deepfake porn has been around as long as photo manipulation has, and Adobe hasn’t cared before.

      Bearing that in mind, I don’t think this policy has anything to do with AI deepfakes or porn. I think it’s more likely to be about some new revenue source, like farming data for LLM training or something. They could go the Tumblr route and use AI to censor content, but considering Tumblr couldn’t tell the difference between the Sahara Desert and boobs, that approach is one fuck-up involving a major company away from being litigation hell. The only way deepfakes make sense as Adobe’s motivation is if they believe governments are going to start holding them liable for the content people make with their products.