Blood Music was way cooler than this, just saying.
Small detail: biological viruses are not even remotely similar to computer “viruses”.
that’s where the LLM comes in! oh my god check your reading comprehension
Uh-huh, and an LLM trained on video game source code and clothing patterns can invent real-life Gauntlets of Dexterity.
Why exactly is he so convinced LLMs are indistinguishable from magic? In the reality where I live, LLMs can sometimes produce a correct function on their own, but are not capable of reliably transpiling code even for well-specified and well-understood systems, let alone doing comic book mad scientist ass arbitrary code execution on viral DNA. Honestly, they’re hardly capable of doing anything reliably.
Along with the AI compiler story he inflicted on Xitter recently, I think he’s simply confused LLM and LLVM.
For decades he built a belief system where high intelligence is basically magic. He needs that to power his fears of AGI turning everything into paperclips, and it has become such a load-bearing belief (partly it’s a fear of death and grief over people he lost, so not totally weird) that other assumptions keep getting added on top. For example, we know that computers today are pretty limited by matter; the higher-end ones especially need all kinds of metals which must be mined, etc. So that is why he switches his fears to biology, as biology is ‘cheap’, ‘easy’ and ‘everywhere’. The patterns in his reasoning are not that hard to grok. That is also why he thinks LLMs will lead to AGI: on some level he needs this. (In his framing, LLMs are clearly at the start of their development, not the end, it’s like the early internet! Personally I think we are mostly at the end and we will just see a few relatively minor improvements, but no big revolutionary leap.)
Men will nuke datacenters before going to therapy for grief and their mid life crisis.
If you’re reading this, here’s a reminder to give your eyes a break from screens. If you like, you can do some eye stretches. Here’s how:
- Read any of Yud’s tweets
- Close your eyes
- Let your eyes roll from processing whatever drivel he’s written. Try for about 30 seconds.
irl lolling so fucking hard at this
Thanks :D
lol np. Writing sneers has become one of my favourite outlets for creative energy and frustration, I’m glad you enjoyed this one.
To unpack the post a bit:
So my understanding is that Yud is convinced that the inscrutable matrices (note: just inscrutable to him) in his LLM have achieved sentience. In his near-future world where AI can exert itself in the physical world at will and, in particular, transfer data into your body, what possible use does it have for a bitcoin? What possible benefit would come from reprogramming human DNA beyond the intellectual challenge? I’ve recently been thinking about how Yud is supposedly the canonical AI-doomer, but his (and the TESCREAL community in general’s) AI ideation is rarely more than just third-rate, first-thought-worst-thought sci-fi.
also:
“people keep on talking about… the near-term dangers of AI but they never come up with any[thing] really interesting”
Given the current public discourse on AI and how it might be exploited to make the working class redundant, this is just Yud telling on himself for the gazillionth time.
also a later tweet:
“right that’s the danger of LLMs. they don’t reason by analogy. they don’t reason at all. you just put a computer virus in one end and a DNA virus comes out the other”
Well, consider my priors adjusted: Yud correctly identifies that LLMs don’t reason. Good job, my guy. Yet somehow he believes it’s possible that today’s LLMs can still spit out viable genetic viruses. Last I checked, no one on Stack Overflow has cracked that one yet.
Actually, if one of us could write that up as a Stack Overflow question, maybe we could spook Yud. That would be fun.