cyu@sh.itjust.works to Technology@lemmy.world · English · 2 years ago
All women pictured are A.I. generated (files.catbox.moe)
184 points (240 up, 56 down) · 79 comments · cross-posted to: technology@lemmy.ml
versionist@lemmy.world · 2 years ago · +32/-5
That's a lot of white chicks.

cyberpunk2350@lemmy.world · 2 years ago · +25/-1
Really, it's like 6, copy-pasted over and over.

spammedevito@kbin.social · 2 years ago · +9/-1
Yeah, a lot of the faces look very, very similar!

(unattributed reply)
So?

setVeryLoud(true);@lemmy.ca · 2 years ago · +7
Points to a limited dataset consisting mostly of white people.

kromem@lemmy.world · 2 years ago · +4
Without knowing the prompt as well as the model, it's impossible to say which side is responsible for the lack of variety. Many modern models are actually very good at reversing sampling biases in the training set.

(unattributed reply)
Good point.
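kromem's point, that the same model can yield either homogeneous or varied faces depending on how it is prompted, can be made concrete with a short sketch. This is a minimal illustration only, assuming the Hugging Face diffusers library and the runwayml/stable-diffusion-v1-5 checkpoint; the checkpoint choice and the prompts are illustrative and not taken from the thread.

```python
# Minimal sketch: the same weights, two prompting strategies.
# Assumes the Hugging Face "diffusers" library; checkpoint and
# prompts are illustrative choices, not anything from the thread.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # hypothetical checkpoint choice
    torch_dtype=torch.float16,
).to("cuda")

# Underspecified prompt: demographic variety is left entirely to the
# model's learned priors, i.e. whatever its training data skewed toward.
vague = pipe(["portrait photo of a woman"] * 8).images

# Same model, with the prompt explicitly steering variety: the spread of
# outputs now reflects the prompt, not just the training-set distribution.
steered = pipe([
    f"portrait photo of a {d} woman"
    for d in ("South Asian", "East Asian", "Black", "Latina",
              "Middle Eastern", "white", "Indigenous", "Southeast Asian")
]).images
```

Since both batches come from identical weights, a page of near-identical faces tells you little on its own: the uniformity could stem from training-set skew, or simply from an underspecified prompt that let the model fall back on its most common training examples. That is why, as kromem says, you need to see the prompt as well as the model before assigning blame.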