Some light sneerclub content in these dark times.
Eliezer compliments Musk on the creation of Community Notes, a project that predates Musk's takeover of Twitter by a couple of years (see the join date: https://twitter.com/CommunityNotes ).
In response, Musk admits he never read HPMOR and suggests a watered-down Turing test involving it. Eliezer, in turn, invents HPMOR wireheads.
God I cannot believe that Harry Potter fanfiction is such a big part of Rationalism.
I made the mistake of reading too many comments down in a Reddit thread and an ephebophile (read: pedophile) showed up. I looked at their profile and was just totally blindsided by Harry Potter and the Methods of Rationality apparently being their second favorite book ever. I think their first was Lolita, but I don't remember.

I've heard that (maybe?) SBF had Harry Potter orgies, detailed on his girlfriend's cryptofascist tradwife Tumblr blog. I know I shouldn't, but I kind of want to see if I can dig around the Internet Archive and find out how much of that really happened. Just making myself suffer for fun.
ITT:
All the more so if it can create a high-quality Harry Potter VR Universe that expands infinitely with NPCs powered by AI that is infinitely more interesting than the normal world is.
This is the future LWers want.
I'm reminded of a My Little Pony singularity fanfiction (Friendship is Optimal) that I read back when I had poor taste. An AI for a pony MMORPG goes rogue and converts everyone into digital ponies to maximize happiness, but with a pony theme. The victims live out impossibly long, but ultimately superficial, lives doing pony stuff, and goodness gracious, why is there such a weird relationship between rationalists and fanfiction writers?
they’re both extremely online. next question
It’s the combination of big imaginations and little real-world experience. In Friendship is Optimal, the AGI goes from asking for more CPUs to asking for information on how to manufacture its own CPUs, somehow without involving the acquisition of silicon crystals or ASML hardware along the way. Rationalist writers imagine that AGI will somehow provide its own bounty of input resources, rather than participating in the existing resource economy.
In reality, none of our robots have demonstrated the sheer instrumentality required to even approach this sort of doomsday scenario. I think rationalists have a bit of the capitalist blind spot here, imagining that everything and everybody (and everypony!) is a resource.
Whatever, I’ll be a pony. Where do I sign up?
Pleasure Island, from Pinocchio. You gotta ask for the pony pass though, or else you're just gonna get turned into a donkey. To reverse the transformation you gotta go to the Island of Dr. Moreau.