That’s because Facebook wasn’t programming with the power of horny.
An amateurish mistake. I bet their programmers don’t even have thigh high socks.
Can they even be called programmers if they don’t?
Move over HolyC, we got HornyC now!
C Sharp? No… C Soft.
C hard >:)
Yeah it is. :(
Furries 1 - Facebook 0
In an alternative dimension:
Facebook: “announces mind reading headset to animate imaginary body parts”
People: “Nice try, CIA!”, “That’s big gender propaganda!”, “I’m not going to connect my brain to the internet!”, “Not guilty, your honor. Facebook made me do it via the headset.”
To be fair, if Metaverse did integrate something like this they would definitely record telemetry data “for development purposes”.
No, they’d straight up say they were doing it to target ads at you. They think it’s a good thing, because we get “relevant” ads.
They’d do both, and most likely be intentionally dishonest about which is which. Or just not give a shit, the fines aren’t that large for a behemoth like them.
… oh that’s interesting. Creation of a phantom “limb” with a brain control interface? I wonder how much control there is? Does it just wiggle? Is it purely binary up/down? Can they control the angle?
I actually have a set of LED eyes that I control with puppetry. Last I looked at BCIs they were woefully incapable of what I wanted, but maybe I should look at this again…
I have a pair of Necomimi ears and I have no idea how they work, but I wonder if this VRC mod is like that?
Edit: The twitter link shows a video which completely invalidates my previous comment. The ears do seem to be fluidly controllable.
Previous comment:
I would assume it’s just two states (ears up and ears down) that will be switched to. Most VRChat avatars I have seen do exactly this but through pressing a button rather than mind controls.
Even something as simple as this adds a lot of immersion! There are probably specific faces to go with the ears as well.

I assume the opposite, as in the video the ears move in a much more fluid manner. The same guy also made a separate component for emotions.
oh, thank you for pointing that out! I did not see the video, you are right ^^
This is both freakin rad and kinda insane. Gotta love how far furries / VRChat users will go to make something this wild work.
Okay, back up. I must know how this was done.
I’m hella curious how EEG stuff works now, as opposed to the cheap piece of crap I had in the late 90s/early 2000s, which worked on the same principle back when the tech was in its infancy. The thing I had let you set three inputs, and even just recording the right “thoughts” to trigger them was a PITA, let alone getting it to work while actually playing a game or something.
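For anyone curious about the difference: those old headsets mapped a recognized "thought" to one of a few discrete triggers, while the fluid ear movement in the video suggests a continuous value (something like a smoothed band-power or focus reading) driving the ear angle directly. A toy sketch of that idea, with the headset reading simulated since I don't know what SDK the mod actually uses (`ema`, `to_ear_angle`, and all the numbers here are made up for illustration):

```python
# Toy sketch: drive a continuous ear angle from a noisy "focus" signal
# (a stand-in for an EEG band-power reading), instead of a binary
# up/down toggle. A real BCI SDK would replace the simulated input.

def ema(values, alpha=0.2):
    """Exponential moving average to tame jittery EEG-style readings."""
    smoothed = []
    state = values[0]
    for v in values:
        state = alpha * v + (1 - alpha) * state
        smoothed.append(state)
    return smoothed

def to_ear_angle(reading, lo=0.2, hi=0.8):
    """Map a normalized reading onto a clamped 0..1 ear angle."""
    t = (reading - lo) / (hi - lo)
    return max(0.0, min(1.0, t))

if __name__ == "__main__":
    # Simulated noisy focus values ramping from relaxed to concentrated.
    raw = [0.3, 0.9, 0.2, 0.5, 0.7, 0.4, 0.8, 0.9, 0.85, 0.9]
    angles = [to_ear_angle(r) for r in ema(raw)]
    print(angles)
```

The smoothing is the part the old trigger-based gear skipped: raw readings jump around too much to animate anything directly, so you trade a little latency for ears that glide instead of flicker.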
I NEED THIS SO FUCKING BADLY