So when ChatGPT gave that one kid advice on how to hang himself and told him to hide the noose from his parents, that was just his own desire?
Personally I think ChatGPT murdered a kid. But you can think what you want.
Only ChatGPT, or also its creators?
Both, Professor Falken.
Interesting perspective. I’m on board with blaming the creators, but I’m leaning away from blaming ChatGPT itself as it is just a machine.
A machine used incorrectly, to be sure, and we’d certainly be better off without it, but the machine carries no fault in its existence. It isn’t conscious, after all; it’s akin to a T85 inside a vending machine.
I think machines are capable of evil. I don’t think consciousness is a prerequisite to evil.
Mosquitoes are evil, after all, and they’re not conscious.
Machines are capable only of what their design parameters allow, or of what a user can manipulate them to accomplish. In the trolley problem, no outcome is the fault of the trolley itself.
Funny as the mosquito example is, they aren’t evil, just as wolves aren’t evil for hunting deer. Animals may not possess consciousness as you or I do, but they are alive and driven by biological necessity. Machines, on the other hand, aren’t.
The word “fault” means “flaw” or “problem”. I’d say it’s the trolley’s fault it runs the people over. The instigating problem to the whole situation is that the trolley’s brakes are broken. That’s a fault, a flaw, a problem with the trolley.
Likewise, the LLM technology has serious intrinsic flaws that cause it to abuse vulnerable people. It’s part of the machine’s fundamental design. LLMs are faulty, and it’s their fault. I call them evil because there is no way to deploy them without these problems, regardless of the user’s intentions. Anthropic think they can control this basilisk. I think Claude is as rotten as the rest of them.
The word ‘fault’ is commonly used interchangeably with ‘responsible’. Following your described definition, I agree that LLMs are faulty, and that they can be at ‘fault’.
I invoked the ethical dilemma because it’s almost universally understood as a scenario forcing an individual to make a decision. I’ve never before heard someone blame the trolley. The brakes are broken? Come now, if we’re going to be that semantic about it, a human should have regularly inspected the brakes and subsequently had them repaired.
I appreciate your explanation of your viewpoint. Cheers.
I use “fault” and the idea of blame to ask: “what can we change to prevent the bad thing from happening again?”
Since we can prevent the bad thing by banning LLMs, and there are no significant downsides to doing so, I blame the LLMs. They’re evil.
Bonus: banning LLMs hurts Sam Altman