20 years after Mark Zuckerberg’s infamous ‘hot-or-not’ website, developers have learned absolutely nothing.
Two decades after Mark Zuckerberg created FaceMash, the infamously sexist “hot-or-not” website that served as the precursor to Facebook, a developer has had the bright idea to do the exact same thing—this time with all the women generated by AI.
A new website, smashorpass.ai, feels like a sick parody of Zuckerberg’s shameful beginnings, but is apparently meant as an earnest experiment exploring the capabilities of AI image recommendation. Just like Zuck’s original site, “Smash or Pass” shows images of women and invites users to rate them with a positive or negative response. The only difference is that all the “women” are actually AI generated images, and exhibit many of the telltale signs of the sexist bias common to image-based machine learning systems.
For starters, nearly all of the imaginary women generated by the site have cartoonishly large breasts, and their faces have an unsettling airbrushed quality that is typical of AI generators. Their figures are also often heavily outlined and contrasted with backgrounds, another dead giveaway for AI generated images depicting people. Even more disturbing, some of the images omit faces altogether, depicting headless feminine figures with enormous breasts.
According to the site’s novice developer, Emmet Halm, the site is a “generative AI party game” that requires “no further explanation.”
“You know what to do, boys,” Halm tweeted while introducing the project, inviting men to objectify the female form in a fun and novel way. His tweet debuting the website garnered over 500 retweets and 1,500 likes. In a follow-up tweet, he claimed that the top 3 images on the site all had roughly 16,000 "smashes."
Understandably, AI experts find the project simultaneously horrifying and hilariously tone-deaf. “It's truly disheartening that in the 20 years since FaceMash was launched, technology is still seen as an acceptable way to objectify and gather clicks,” Sasha Luccioni, an AI researcher at HuggingFace, told Motherboard after using the Smash or Pass website.
One developer, Rona Wang, responded by making a nearly identical parody website that rates men—not based on their looks, but how likely they are to be dangerous predators of women.
The sexist and racist biases exhibited by AI systems have been thoroughly documented, but that hasn’t stopped many AI developers from deploying apps that inherit those biases in new and often harmful ways. In some cases, developers espousing “anti-woke” beliefs have treated bias against women and marginalized people as a feature of AI, and not a bug. With virtually no evidence, some conservative outrage jockeys have claimed the opposite—that AI is “woke” because popular tools like ChatGPT won’t say racial slurs.
The developer’s initial claims about the site’s capabilities seem to be exaggerated. In a series of tweets, Halm claimed the project is a “recursively self-improving” image recommendation engine that uses the data collected from your clicks to determine your preference in AI-generated women. But the current version of the site doesn’t actually self-improve: using it long enough results in many of the images repeating, and Halm says the recursive capability will be added in a future version.
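For readers unfamiliar with the jargon, a click-driven recommender of the kind Halm describes is usually just an online preference model that is updated after every rating. The sketch below is purely hypothetical; none of these function names or details come from the actual site. It shows a minimal online logistic-regression update over image feature vectors, the simplest way such a "learns from your clicks" loop is typically built:

```python
from math import exp

# Hypothetical sketch of a click-feedback recommender, assuming each image
# is described by a numeric feature vector. A "smash" (positive rating)
# nudges the weight vector toward that image's features; a "pass" nudges
# it away. Illustrative only, not the site's actual code.

def sigmoid(z):
    return 1.0 / (1.0 + exp(-z))

def update_preferences(weights, features, smashed, lr=0.1):
    """One online logistic-regression step on a single smash/pass click."""
    pred = sigmoid(sum(w * f for w, f in zip(weights, features)))
    error = (1.0 if smashed else 0.0) - pred
    return [w + lr * error * f for w, f in zip(weights, features)]

def rank_candidates(weights, candidates):
    """Serve first the images the model currently predicts the user prefers."""
    return sorted(candidates,
                  key=lambda f: sum(w * x for w, x in zip(weights, f)),
                  reverse=True)
```

A genuinely "self-improving" site would re-rank or re-generate its candidate pool with the updated weights after each click; by the article's account, Smash or Pass does not yet do anything like this.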
It also hasn't gone over well with everyone on social media. One blue-check user responded, "Bro wtf is this. The concept of finetuning your aesthetic GenAI image tool is cool but you definitely could have done it with literally any other category to prove the concept, like food, interior design, landscapes, etc."
Halm could not be reached for comment.
“I’m in the arena trying stuff,” Halm tweeted. “Some ideas just need to exist.”
Luccioni points out that no, they absolutely do not.
“There are huge amounts of nonhuman data that is available and this tool could have been used to generate images of cars, kittens, or plants—and yet we see machine-generated images of women with big breasts,” said Luccioni. “As a woman working in the male-dominated field of AI, this really saddens me.”
This is beside the main point, but it jumped out at me and bothered me the instant I read it:
One blue-check user responded…
Who cares if they have a check or not? Any asshole can just buy one. Blue checks haven’t been “special” for a while now.
The article raises some important issues, but it’s undermining its own tone. Since the article is driven by moral indignation, why make special emphasis of a quote from someone who’s helping support Musk’s cesspool of hate?
For starters, nearly all of the imaginary women generated by the site have cartoonishly large breasts
That wasn't my experience when I went there just now. I think maybe it learned from the author's preferences more than the author realises.
I went there and clicked "pass" on everything and it generated a range of different body types of AI women. There were also way more heads without bodies than bodies without heads.
Nah, the first time I opened the website was after reading this comment. Spent a few minutes passing on every image with large breasts and it's still the majority of them.
I actually did it a second time and eventually it stopped serving images, so my guess is it's actually a shallow pool of images that it just shows to everyone while collecting the data, with no adapting to results. That, or the website got hug-of-death'd.
But see they're made up. If the characters aren't real, who could possibly suffer the effects of media intentionally objectifying women or otherwise reducing a group of people into caricatured stereotypes- ohhhhhhhhhhhhh
This type of "party game" is still at its core objectifying women. They may be generated images but the whole project is aimed at passing judgement on women you would rate as fuckable or not. It's encouraging behaviour that makes women feel uncomfortable or unsafe.
This type of objectifying isn't exclusive to this project. Groups of men will rate and objectify women casually and frequently. I've worked in the trades and have been surrounded by such talk from men. The more normalized this type of behaviour is, the easier it is to consider women as less than human. Feeling like a replaceable tool with no sense of self or sense of worth is dehumanizing.
They could have chosen to base this project on just about anything else in our world. We have animals, nature, technology and so much more to try this kind of thing out on. Yet, what seems like another "tech bro" idea was focused on hyper sexualizing and objectifying women as if they were just another thing for men's entertainment.
Simply, it's gross behaviour. Just because they are generated images does not make it any less gross or acceptable. People are not objects for another person's amusement and we should not encourage such behaviour.
This is like dunking on DeviantArt because it has artists who make cheesecake pictures of ladies. I'm not saying it's something I personally enjoy, but who am I to tell others what art they should enjoy?
The only way for this to be consistent is if you believe authorial intent or real practical effects on an audience have no bearing on the properties of a piece of media.
Jumping into the feminism community to challenge a fairly core tenet of feminism is a bad take. I'm removing this because it was a comment made clearly in bad faith. You're expected to be nice on our instance, do better in the future.
I'm just curious if we could collect demo data on the raters and have that train the generation algorithm. Could we find regional, statistically significant differences in aesthetic preferences? Would we be able to trace the cultural influence of different groups by their preferences?
Do some groups prefer specific shapes/sizes/colors? What's the most predictive feature of perceived attractiveness across different groups?
I dunno, sounds like some interesting research into the visual aspect of attraction and also implicit biases.
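The simplest version of that analysis wouldn't even need machine learning: given per-group "smash" counts for a single image, a two-proportion z-test tells you whether two rater groups differ significantly in how often they rate it positively. A rough sketch, with the statistic implemented from scratch in Python (all group labels and counts would be invented for illustration):

```python
from math import sqrt, erf

# Illustrative two-proportion z-test: do two rater groups "smash" a given
# image at significantly different rates? Any numbers fed in are made up.

def two_proportion_ztest(smashes_a, n_a, smashes_b, n_b):
    """Return (z, two_sided_p) under H0: both groups have the same smash rate."""
    p_a, p_b = smashes_a / n_a, smashes_b / n_b
    pooled = (smashes_a + smashes_b) / (n_a + n_b)
    se = sqrt(pooled * (1.0 - pooled) * (1.0 / n_a + 1.0 / n_b))
    z = (p_a - p_b) / se
    # Standard normal CDF via erf, then a two-sided p-value.
    p_value = 2.0 * (1.0 - 0.5 * (1.0 + erf(abs(z) / sqrt(2.0))))
    return z, p_value
```

With, say, 80/100 positive ratings in one group versus 50/100 in another, the test rejects equal preferences at any conventional threshold. Real research of this kind would also need informed consent, careful demographic handling, and corrections for testing many images at once.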
I think I may have participated in research like this once. It collected a lot of demographic data on me and then I had to rate a bunch of people and in the next section make value judgements about whether each of a series of people looked like they were likely to be smart, trustworthy, bad tempered etc. It was all on a timer.
Besides "pure" psychology, aesthetics research has been dealing with these kinds of questions for a while.
It covers not only the visual features of humans that people find attractive, but also stuff like music and visual art.
It's not my field and it has been some time since I read good literature on it, which is why I won't give you any possibly erroneous summaries. But I am sure this has been investigated and is still a topic of active research, so take my comment as a pointer.
First hit, when searching for:
"cultural differences in visually appealing facial features"
Yeah! That's an interesting unintended use for something like this. A dumb sexist app might turn out to be a useful tool for psychologists and sociologists. (... and marketing people)
This is most likely a use like CAPTCHA, which Google has used to train visual algorithms and AI. I imagine the intent here is to gather a bunch of data on sexual preference and use it to make targeted ads towards men.
What a dumb article, it reads like a few women feeling inadequate and threatened by fictional ones.
What is even the point? That it enforces an increasingly superficial society? Go on Tinder for five minutes and you will get that experience in full and then some, just with real people instead of fictional ones.
And this human being here is the prophet of all female human beings and can speak on behalf of all of them.
I myself find it a big turn off for a person to be so insecure about what defines them as attractive, besides their visual appearance.
I originally posted the following comment as a reply to another comment that has now been removed. I'm reposting it as I think it still has value to the current conversation under this post.
This type of "party game" is still at its core objectifying women. They may be generated images but the whole project is aimed at passing judgement on women you would rate as fuckable or not. It's encouraging behaviour that makes women feel uncomfortable or unsafe.
This type of objectifying isn't exclusive to this project. Groups of men will rate and objectify women casually and frequently. I've worked in the trades and have been surrounded by such talk from men. The more normalized this type of behaviour is, the easier it is to consider women as less than human. Feeling like a replaceable tool with no sense of self or sense of worth is dehumanizing.
They could have chosen to base this project on just about anything else in our world. We have animals, nature, technology and so much more to try this kind of thing out on. Yet, what seems like another "tech bro" idea was focused on hyper sexualizing and objectifying women as if they were just another thing for men's entertainment.
Simply, it's gross behaviour. Just because they are generated images does not make it any less gross or acceptable. People are not objects for another person's amusement and we should not encourage such behaviour.
This is really just a symptom of people seeing women as objects or as a collection of features.
I try to see the positive, at least it's not photos of actual people. But it is a sad outlook on our society.
Just a side note:
Those are generated from a data set of real people on which the AI models are trained. It is, to some degree and with specific AI models, possible to reconstruct the original photos of the training set. This has raised a lot of concerns, privacy among them.
It would be nice if men learned that attraction doesn't have to mean objectification, and that real women are way better than a cobbled together Frankenstein "perfect" monster woman.
I mean, 99% of these men would have zero chance with a woman half as attractive. They seriously need to start figuring out what WOMEN find attractive instead of wasting their time with empty fantasies if they want to get a real relationship someday.
I really don't like this idea that "men should figure out what women find attractive". This goes against the idea of being natural - it puts useless pressure on men who are not able to find a partner, as the implicit message is really "You could not find a partner because you don't know what women find attractive".
I mean, if I were to say the same sentence but with the roles reversed "women should figure out what men find attractive" you would most probably call me a sexist. See the problem?
Here is what all men should know: attractiveness is a matter of taste. As long as the guy is healthy and respectful, eventually he will find someone. Knowing that, he should be confident and not afraid to ask for dates.
Nobody said anything about having a chance with someone that attractive lol.
"Yes, I find this woman attractive"
YOU DISGUSTING PIG! YOU'RE A PERVERT!
That's you. That's what you sound like.
Objectifying women is objectively bad. But ranking women who don't even exist in terms of attractiveness isn't hurting anyone. Honestly it just sounds like you're upset with male sexuality existing.
Exactly. That propagates and fosters what "conventionally attractive" implies, and imposes unrealistic or unhealthy expectations upon women.
Properties which are completely natural are considered "ugly", e.g. hair on legs, arms and armpits, or a "normal" and perfectly healthy amount of body fat. Women start developing severe mental health issues from a young age and, in extreme cases, risk their lives trying to fit those images.
This is, however, not only a problem of AI generated images alone, but a problem of society and media in general.
Furthermore, this is not just the case for women but also for men, although it is more prevalent among women, if I am not mistaken about the numbers. It is a problem with a history almost as old as humanity itself.