It is interesting how easy this has become with a popular service like Prime Voice AI, and when you realise that many organisations use voice recognition for authenticated access to their systems, you can see where the risks come in. Like most technology, it has plenty of upsides, but it opens up the negatives as well. As Steve points out in his commentary in the linked article, bad actors are often quicker than anyone else to take advantage of these new developments.

No end in sight for the upward trajectory of careers in security and vulnerability consulting.

#technology #security #voicecloning

  • CasualTee@beehaw.org · 1 year ago

    Not only organisations, but everyone really. The elderly are already massively targeted by scammers. Now, on top of that, scammers can find a voice sample of a child or grandchild on social media and use it to ask for money over the phone, or even via video call with deepfakes. And they can do that at scale.

    • Corkyskog@sh.itjust.works · 1 year ago

      You don’t even need social media. Call person A and keep them on the phone for 15 seconds to get a snippet of their voice. Feed it through the AI, then use that to call person B and extort them, or commit whatever crime you were planning.