- cross-posted to:
- tech@lemmit.online
Thanks to rapid advancements in generative AI and a glut of training data created by human actors that has been fed into its AI model, Synthesia has been able to produce avatars that are indeed more humanlike and more expressive than their predecessors. The digital clones are better able to match their reactions and intonation to the sentiment of their scripts—acting more upbeat when talking about happy things, for instance, and more serious or sad when talking about unpleasant things. They also do a better job matching facial expressions—the tiny movements that can speak for us without words.
But this technological progress also signals a much larger social and cultural shift. Increasingly, so much of what we see on our screens is generated (or at least tinkered with) by AI, and it is becoming more and more difficult to distinguish what is real from what is not. This threatens our trust in everything we see, which could have very real, very dangerous consequences.
“I think we might just have to say goodbye to finding out about the truth in a quick way,” says Sandra Wachter, a professor at the Oxford Internet Institute, who researches the legal and ethical implications of AI. “The idea that you can just quickly Google something and know what’s fact and what’s fiction—I don’t think it works like that anymore.”
It’s noble how many of you are willing to get philosophical about the rise of deep fakes freeing us from puritan beliefs and readdressing the concept of truth.
While completely fucking ignoring the harassment and extortion enabled by deepfakes. Y’all want to get high-minded about YOUR right to free speech using OTHER people’s bodies as a gateway to some utopia, while playing dumb about the fact that this is just another form of misogynistic abuse. If it truly is just something you are doing in the privacy of your own home, why the fuck do you need other people’s media?
Your ideals are built upon, YET AGAIN, women taking one for the team. The “truth” is impossible to know, so YOLO, let’s turn any woman who made the mistake of being photographed into porn. Her consent doesn’t matter between the privacy of me and my dataset, even if I do upload it and blackmail her a lil’.