In using Stable Diffusion for a DnD-related project, I’ve found that it’s actually weirdly hard to get it to generate people (of either sex) who aren’t attractive - I wonder if it’s a bias in the training material, or a deliberate bias introduced into the models because most people want attractive people in their AI pics
It’s trained on professionally taken photos. Professional photographers tend to prefer taking photos of attractive subjects.
Yeah all of my professional photographers hate me.
That’s true, but it’s not like ugly people don’t get photographed - ultimately a professional photographer is going to take photos of whoever pays them to do so. That explanation accounts for part of the bias, I think, but not all of it
If I got pictures taken by a photographer, I would not allow them to be used as training data. I don’t even like looking into a mirror. Maybe that’s part of why there are fewer pictures of ugly people to train on.
I would guess that ugly people are less likely to commission photos.
They were created via a prompt, and that prompt probably included some tags to make them more attractive. It’s standard practice to put tags like “ugly” and “deformed” into the negative prompt just to keep the hands and facial features from going wonky.
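For the curious, here’s a minimal sketch of that workflow - the subject and tags are made up for illustration, and the helper function is mine, not part of any tool. In the Hugging Face `diffusers` library, the resulting strings are what you’d pass as the `prompt` and `negative_prompt` arguments of a pipeline call:

```python
# Sketch only: assemble a prompt and a negative prompt from tag lists.
# Subject and tags are illustrative, not a recommendation.

def build_prompts(subject, quality_tags, negative_tags):
    """Join a subject and tag lists into comma-separated prompt strings."""
    prompt = ", ".join([subject] + quality_tags)
    negative_prompt = ", ".join(negative_tags)
    return prompt, negative_prompt

prompt, negative = build_prompts(
    "portrait of a dwarf blacksmith",
    ["detailed", "sharp focus"],
    ["ugly", "deformed", "extra fingers"],  # steer away from wonky anatomy
)
print(prompt)    # portrait of a dwarf blacksmith, detailed, sharp focus
print(negative)  # ugly, deformed, extra fingers
```

With `diffusers`, these would typically end up in something like `pipe(prompt=prompt, negative_prompt=negative)`.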
There are no elderly women, no female toddlers, and so forth either. Presumably just not what whoever generated this was going for. You can get those from many AI models if you want them.
Battleship coordinates: (B10). Also, (I4) looks a lot like my niece. I really think it depends on your definition of “average”, though. But as @fubo indicated, there are 0% black people in this photo. There’s some vaguely Asian, roughly Middle Eastern looking, sort of South American, and whatever that is going on in (M8). But there are distinctly zero black people pictured.
Now I’m curious. What’s average-looking to you if none of these are?
Older and fatter for a start.
Are you American?
What country do you live in btw?
France, and 75% of people are not fatter
I wouldn’t be so confident, buddy. 25.3% of French people are obese. That’s not much lower than America, which is at 28.8%. You ain’t much better.
No way
These people are (almost) all young and attractive, with flawless skin
deleted by creator
Not exactly absolution
That’s what pops into your head first?
It’s one thing that strikes me as kinda odd.
But then, the other day I was messing around with an image generation model and it took me way too long to realize that it was only generating East Asian-looking faces unless explicitly instructed not to.
Every model is going to have something as its “average case.” If you want the model to generate something else you’ll have to ask it.
Models generally trend towards one thing. It’s hard to create a generalized model from a mathematical standpoint. You just have to say what you want.
Yep, but you can’t just say “diversity plz”.
Matrix prompt with | white | black | brown | asian
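One way to read the matrix-prompt suggestion: expand a single template into one prompt per variant, then generate an image for each. A minimal sketch - the `{a|b|c}` brace syntax here is invented for the example; AUTOMATIC1111’s actual prompt-matrix script takes the `|`-separated parts directly:

```python
import itertools
import re

def expand_choices(template):
    """Expand every {a|b|c} group in a template into all combinations."""
    groups = re.findall(r"\{([^{}]*)\}", template)
    options = [group.split("|") for group in groups]
    prompts = []
    for combo in itertools.product(*options):
        prompt = template
        for choice in combo:
            # Replace the leftmost remaining {...} group with this choice.
            prompt = re.sub(r"\{[^{}]*\}", choice, prompt, count=1)
        prompts.append(prompt)
    return prompts

for p in expand_choices("photo of a {white|black|brown|asian} person"):
    print(p)
# photo of a white person
# photo of a black person
# photo of a brown person
# photo of a asian person
```

Each expanded string would then be sent to the model as its own prompt, which forces the variation instead of hoping the model samples it.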