• Knighthawk 0811@lemmy.one · 1 year ago

    Artists who don’t like this are no longer allowed to look at previous artists’ works before making their own. Fan fiction is certainly not allowed.

    • Andreas@feddit.dk · 1 year ago

      Well… yes? Fan fiction is not allowed to be published commercially and creators have successfully filed copyright takedowns to remove fan fiction, although most creators choose not to because it’s free advertising. But AI art models are being used commercially (charging for usage, AI-generated art can be used in products) despite being trained on art they don’t have the rights to.

    • ZILtoid1991@kbin.social · 1 year ago

      Bad comparison. The scale is way greater. I can’t go through multiple terabytes of reference images.

      As an artist, that’s my issue with the current models: our work shouldn’t be opted in by default, with only questionable opt-out options.

      • Knighthawk 0811@lemmy.one · 1 year ago

        But that’s exactly how the models are being trained: by manually going through all the images and describing them.

        And you can’t really complain that computers go through data at a greater scale than humans… that’s the whole point. That argument doesn’t hold.

        • zalack@kbin.social · 1 year ago

          It’s worth entertaining the observation that these learning models piggyback off underpaid human effort to funnel wealth towards the 1%.

          It’s just that the solution isn’t to stop the tech — that never works. It’s out in the world now.

          The real solution is much harder. We need to overhaul our economy and put in mechanisms to recirculate wealth downwards.