Topic: Paedophiles: Illegal trade in AI child sex abuse images exposed
darkmaian23 06/28/23 10:01:31 AM #43:

The dataset used to train Stable Diffusion is thought to be free of all forms of illegal imagery (setting aside, of course, the fact that nobody specifically consented to or was paid for the use of their content in training the model, but that's a matter of perspective). It can be used to create things it was not specifically trained on; otherwise it would have no value.

Then there are other models, which are almost invariably just the Stable Diffusion base model with more training data added, or amalgamations of several such models (a sketch of how such merges work appears below). When people claim they are using Stable Diffusion ethically because they trained a "model" on "their" art, this isn't actually true: for the image generation to work, it still needs those billions of other data points. You're just adding extra detail on top.

Depending on the models used and any additional keywords added behind the scenes, it can be surprisingly easy for an AI image generator to give you photos of people you didn't ask for: a different age, a different gender, a different breast size, different hair, and so on. Some popular models will apparently spit out porn in response to innocuous keywords in the prompt. Even the paid services with huge user bases have trouble not serving people porn or going wildly off subject. So in short, a person might be exposed to objectionable imagery while using AI without intending or wanting it. This is one danger of harshly criminalizing disturbing pictures that come out of an AI.

The next issue is that, in many places, things like cartoon porn are either granted a free-speech exception or are generally not pursued. Is it morally correct or socially useful to spend time investigating objectionable images of fictional people? What if they are:

- cartoons?
- 3D models?
- AI-generated people from a standard model with no porn?
- AI-generated people from a model that includes legal porn?
- AI-generated people from a model that includes illegal porn? (In the context of regular art, it has been argued that AI models don't copy anything because they retain very little data per image trained; a back-of-the-envelope version of that argument appears below.)

Does intent count (i.e., you got a disturbing image deliberately versus by accident)? How about distribution versus private use? I think everyone is going to have different answers to these questions.

Personally, I've wondered since the beginning about the morality of using AI to generate any porn. The base Stable Diffusion model was trained on billions of images scraped from the web. If the training had been opt-in only, I feel fairly certain that few people would have consented to the use of their photos to make fake people, let alone fake people engaged in sex acts.

---
Cuteness is justice! It's the law.
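A minimal sketch of the kind of checkpoint merge mentioned above, assuming PyTorch-format Stable Diffusion checkpoints that keep their weights under a "state_dict" key; the file names and the 0.7/0.3 mix ratio are hypothetical:

```python
import torch

# Hypothetical checkpoint files; community "merged" models are produced
# the same way, as a weighted average of two sets of weights.
base = torch.load("sd-v1-5.ckpt", map_location="cpu")["state_dict"]
other = torch.load("some-finetune.ckpt", map_location="cpu")["state_dict"]

alpha = 0.7  # weight given to the base model (hypothetical ratio)
merged = {}
for key, tensor in base.items():
    if key in other and other[key].shape == tensor.shape:
        # Linear interpolation between the two sets of weights.
        merged[key] = alpha * tensor + (1 - alpha) * other[key]
    else:
        # Tensors missing from one checkpoint are carried over unchanged.
        merged[key] = tensor

torch.save({"state_dict": merged}, "merged.ckpt")
```

The merged file still contains everything the base model learned; the fine-tune only nudges the weights, which is why the "I trained my own model" framing is misleading.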
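As for keywords added behind the scenes: many front ends silently wrap the user's prompt before it ever reaches the model. A toy illustration of the mechanism, with entirely made-up keyword lists:

```python
# Hypothetical hidden wrappers; real services use their own undisclosed lists.
HIDDEN_PREFIX = "masterpiece, best quality, photorealistic, "
HIDDEN_NEGATIVE = "lowres, bad anatomy, watermark"

def build_prompts(user_prompt: str) -> tuple[str, str]:
    """Return the (positive, negative) prompts actually sent to the model."""
    return HIDDEN_PREFIX + user_prompt, HIDDEN_NEGATIVE

positive, negative = build_prompts("a woman standing on a beach")
print(positive)  # the model never sees the user's words unmodified
```

Combined with a heavily fine-tuned model, even a bland prompt can land in parts of the model's output space the user never asked for.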
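And the "very little data per image" point as rough arithmetic, using approximate public figures for Stable Diffusion v1 (both counts are assumptions, not exact values):

```python
# Approximate public figures for Stable Diffusion v1 (assumptions).
unet_params = 860_000_000        # ~860M parameters in the denoising UNet
bytes_per_param = 2              # fp16 storage
training_images = 2_300_000_000  # ~2.3B images in the LAION-2B(en) dataset

bytes_per_image = unet_params * bytes_per_param / training_images
print(f"~{bytes_per_image:.2f} bytes of weights per training image")
# ~0.75 bytes per image: far too little to store copies of the images,
# which is the basis of the "models don't copy anything" argument.
```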