Wikipedia has a new initiative called WikiProject AI Cleanup. It is a task force of volunteers currently combing through Wikipedia articles, editing or removing false information that appears to have been posted by people using generative AI.

Ilyas Lebleu, a founding member of the cleanup crew, told 404 Media that the crisis began when Wikipedia editors and users began seeing passages that were unmistakably written by a chatbot of some kind.

  • schizo@forum.uncomfortable.business
    link
    fedilink
    English
    arrow-up
    248
    arrow-down
    10
    ·
    2 months ago

    Further proof that humanity neither deserves nor is capable of having nice things.

    Who would set up an AI bot to shit all over the one remaining useful thing on the Internet, and why?

    I’m sure the answer is either ‘for the lulz’ or ‘late-stage capitalism’, but still: historically humans aren’t usually burning down libraries on purpose.

    • poszod@lemmy.world
      link
      fedilink
      English
      arrow-up
      116
      ·
      2 months ago

      State actors could be interested in doing that. Same with the internet archive attacks.

    • Schmoo@slrpnk.net
      link
      fedilink
      English
      arrow-up
      103
      arrow-down
      5
      ·
      2 months ago

      historically humans aren’t usually burning down libraries on purpose.

      How on earth have you come to this conclusion.

    • Regrettable_incident@lemmy.world
      link
      fedilink
      English
      arrow-up
      14
      arrow-down
      1
      ·
      2 months ago

      historically humans aren’t usually burning down libraries on purpose.

      Sometimes they are, Baghdad springs to mind, I’m sure there are other examples. And this library is online so there’s less chance of getting caught with a can of petrol and a box of matches.

      Then there’s every authoritarian regime that tries to ban or burn specific types of books. What we’re seeing here could be more like that - an attempt to muddy the waters or introduce misinformation on certain topics.

    • Wrench@lemmy.world
      link
      fedilink
      English
      arrow-up
      10
      arrow-down
      1
      ·
      2 months ago

      Because basement losers can’t conquer and raze libraries to the ground.

      The internet has shown that assumed anonymity result in people fucking with other people’s lives for the hell of it. Viruses, trolling, etc. This is just the next stage of it because of a new easy to use tool.

    • endofline@lemmy.ca
      link
      fedilink
      English
      arrow-up
      7
      arrow-down
      3
      ·
      2 months ago

      It’s not about on purpose but usually most people don’t care about what’s not in their interest. Today interests are usually quite shallow what tiktok shows quite well. Libraries do require money for operating. Even internet archive and wikipedia

    • rsuri@lemmy.world
      link
      fedilink
      English
      arrow-up
      4
      ·
      2 months ago

      Yeah but the other thing about humanity is it’s mostly harmless. Edits can be reverted, articles can be locked. Wikipedia will be fine.

      • kent_eh@lemmy.ca
        link
        fedilink
        English
        arrow-up
        14
        ·
        edit-2
        2 months ago

        Edits can be reverted, articles can be locked.

        Sure, but the vandalism has to be identified first. And that takes time and effort.

      • LarmyOfLone@lemm.ee
        link
        fedilink
        English
        arrow-up
        3
        arrow-down
        7
        ·
        edit-2
        2 months ago

        Wikipedia relies on sources, and humans choosing the sources like newspapers. And those newspapers are more and more inside a “bubble” that rejects any evidence or reporting presented by a competing bubble.

        Right now wikipedia is covering up one of the greatest acts of mass murder of our times, because the newspapers are covering it up, or rejecting evidence because it’s by the “enemy”. Part of this is a defensive posture against AI bots and enemy disinformation.

    • Petter1@lemm.ee
      link
      fedilink
      English
      arrow-up
      3
      arrow-down
      2
      ·
      edit-2
      2 months ago

      Maybe a strange way of activism that is trying to poison new AI models 🤔

      Which would not work, since all tech giants have already archived preAI internet

      • schizo@forum.uncomfortable.business
        link
        fedilink
        English
        arrow-up
        8
        ·
        2 months ago

        Ah, so the AI version of the chewbacca defense.

        I have to wonder if intentionally shitting on LLMs with plausible nonsense is effective.

        Like, you watch for certain user agents and change what data you actually send the bot vs what a real human might see.

        • Dragonstaff@leminal.space
          link
          fedilink
          English
          arrow-up
          3
          arrow-down
          1
          ·
          2 months ago

          I suspect it would be difficult to generate enough data to intentionally change a dataset. There are certainly little holes, like the glue pizza thing, but finding and exploiting them would be difficult and noticing you and blocking you as a data source would be easy.

        • T156@lemmy.world
          link
          fedilink
          English
          arrow-up
          1
          ·
          2 months ago

          I have to wonder if intentionally shitting on LLMs with plausible nonsense is effective.

          I don’t think so. The volume of data is too large for it to make much of a difference, and a scraper can just mimic a human user agent and work that way.

          You’d have to change so much data consistently across so many different places that it would be near-impossible for a single human effort.

    • weeeeum@lemmy.world
      link
      fedilink
      English
      arrow-up
      1
      ·
      2 months ago

      Its because there’s no accountability for cybercrimes. If humans always had a button to burn down libraries, I’m sure they would have. Instead they had to put themselves in harms way to do such things.

      People do things cause they can, and fucking with Wikipedia is apparently simple.