Stability AI, the UK company behind Stable Diffusion, told the BBC it “prohibits any misuse for illegal or immoral purposes across our platforms, and our policies are clear that this includes CSAM”.
And who in the real world expects those people to obey these terms of service?
To their credit, they've always had a safety checker on by default; it just isn't very good and returns a lot of false positives, so it quickly became standard practice to bypass it.
Terms of service have always been about protecting the company from legal liability rather than anything else.