• Chozo@fedia.io · 4 months ago

    Piracy isn’t the issue; I’m not sure if we’re referencing different things here.

    How the developers came to possess the training material isn’t being called into question - it’s whether or not they’re allowed to train an AI with it, and whether doing so constitutes copyright infringement. And currently, the way in which generative AI works does not cross those legal boundaries, as written.

    The argument the RIAA wants to make is that using copyrighted material for the purposes of training software extends beyond the protections of fair use. I believe their argument is that - even if acquired otherwise legally - acquiring music for the explicit purpose of making new music would be considered a commercial use of the material. Basically like the difference between buying an album to listen to with your headphones or buying an album to play for a packed concert hall, suggesting that the commercial intent behind acquiring the music is what makes it illegal.

    • FlowVoid@lemmy.world · 4 months ago

      This is the basis for the RIAA claims, which sure sounds like piracy:

      On information and belief, similar to other generative AI audio models, Suno trains its AI model to produce audio output by generally taking the following steps: a. Suno first copies massive numbers of sound recordings, including by “scraping” (i.e., copying or downloading) them from digital sources. This vast collection of information forms the input, or “corpus,” upon which the Suno AI model is trained.

      There is no evidence the AI devs bought any music, for any use. Quite the opposite:

      Antonio Rodriguez, a partner at the venture capital firm Matrix Partners, explained that his firm invested in the company with full knowledge that Suno might get sued by copyright owners, which he understood as “the risk we had to underwrite when we invested in the company.” Rodriguez pulled the curtain back further when he added that “honestly, if we had deals with labels when this company got started, I probably wouldn’t have invested in it. I think they needed to make this product without the constraints.” By “constraints,” Rodriguez was, of course, referring to the need to adhere to ordinary copyright rules and seek permission from rightsholders to copy and use their works.

      • Chozo@fedia.io · 4 months ago

        I don’t think that’s the basis of their argument.

        The RIAA alleges that the generators used the record labels’ songs to illegally train the models since they didn’t have the rights holders’ permission to use the recordings. But whether the companies needed that permission is unclear. AI companies have argued that the use of training data is a case of fair use, meaning they are allowed to use the recordings with impunity.

        Emphasis mine. Their concern is that the music was used for commercial purposes, not how the music came into their possession. Web scraping is already legal; that’s never been a piracy issue.

        • FlowVoid@lemmy.world · 4 months ago

          Courts have found that scraping data from a public website is legal, because data itself is not protected by copyright. But copying protected works without permission is generally illegal; using a scraper doesn’t change that.

          If the defendants in this case admit using RIAA works, then they will probably try to argue fair use. At that point their product will become relevant, including its commercial nature. This will weigh against them, because their songs directly compete against RIAA songs. In fact, that’s why artists who include samples in their work usually obtain permission first.