cyu@sh.itjust.works to Technology@lemmy.world · English · 1 year ago
All women pictured are A.I. generated (files.catbox.moe)
76 comments
setVeryLoud(true);@lemmy.ca · 1 year ago
Points to a limited dataset consisting mostly of white people.

kromem@lemmy.world · 1 year ago
Without knowing the prompt vs. the model, it's impossible to say which side is responsible for the lack of variety. Many modern models are actually very good at reversing sample biases in the training set.
Good point