20 years after Mark Zuckerberg’s infamous ‘hot-or-not’ website, developers have learned absolutely nothing.
Two decades after Mark Zuckerberg created FaceMash, the infamously sexist “hot-or-not” website that served as the precursor to Facebook, a developer has had the bright idea to do the exact same thing, except this time all of the women are generated by AI.
A new website, smashorpass.ai, feels like a sick parody of Zuckerberg’s shameful beginnings, but is apparently meant as an earnest experiment exploring the capabilities of AI image recommendation. Just like Zuck’s original site, “Smash or Pass” shows images of women and invites users to rate them with a positive or negative response. The only difference is that all the “women” are actually AI generated images, and exhibit many of the telltale signs of the sexist bias common to image-based machine learning systems.
For starters, nearly all of the imaginary women generated by the site have cartoonishly large breasts, and their faces have an unsettling airbrushed quality that is typical of AI generators. Their figures are also often heavily outlined and contrasted with backgrounds, another dead giveaway for AI generated images depicting people. Even more disturbing, some of the images omit faces altogether, depicting headless feminine figures with enormous breasts.
According to the site’s novice developer, Emmet Halm, the site is a “generative AI party game” that requires “no further explanation.”
“You know what to do, boys,” Halm tweeted while introducing the project, inviting men to objectify the female form in a fun and novel way. His tweet debuting the website garnered over 500 retweets and 1,500 likes. In a follow-up tweet, he claimed that the top 3 images on the site all had roughly 16,000 “smashes.”
Understandably, AI experts find the project simultaneously horrifying and hilariously tone-deaf. “It’s truly disheartening that in the 20 years since FaceMash was launched, technology is still seen as an acceptable way to objectify and gather clicks,” Sasha Luccioni, an AI researcher at Hugging Face, told Motherboard after using the Smash or Pass website.
One developer, Rona Wang, responded by making a nearly identical parody website that rates men—not based on their looks, but how likely they are to be dangerous predators of women.
The sexist and racist biases exhibited by AI systems have been thoroughly documented, but that hasn’t stopped many AI developers from deploying apps that inherit those biases in new and often harmful ways. In some cases, developers espousing “anti-woke” beliefs have treated bias against women and marginalized people as a feature of AI, and not a bug. With virtually no evidence, some conservative outrage jockeys have claimed the opposite—that AI is “woke” because popular tools like ChatGPT won’t say racial slurs.
The developer’s initial claims about the site’s capabilities seem to be exaggerated. In a series of tweets, Halm claimed the project is a “recursively self-improving” image recommendation engine that uses the data collected from your clicks to determine your preference in AI-generated women. But the current version of the site doesn’t actually self-improve: use it for long enough and many of the images start repeating, and Halm says the recursive capability will be added in a future version.
The site also hasn’t gone over well with everyone on social media. One blue-check user responded, “Bro wtf is this. The concept of finetuning your aesthetic GenAI image tool is cool but you definitely could have done it with literally any other category to prove the concept, like food, interior design, landscapes, etc.”
Halm could not be reached for comment.
“I’m in the arena trying stuff,” Halm tweeted. “Some ideas just need to exist.”
Luccioni points out that no, they absolutely do not.
“There are huge amounts of nonhuman data that is available and this tool could have been used to generate images of cars, kittens, or plants—and yet we see machine-generated images of women with big breasts,” said Luccioni. “As a woman working in the male-dominated field of AI, this really saddens me.”
This is like dunking on DeviantArt because it has artists who make cheesecake pictures of ladies. I’m not saying it’s something I personally enjoy, but who am I to tell others what art they should enjoy?
The only way for this to be consistent is if you believe authorial intent or real practical effects on an audience have no bearing on the properties of a piece of media.
As long as it’s fiction, it’s okay?
Unless the author writes an essay to accompany their piece, I think any conclusion you make about authorial intent is speculative. A beefcake pic of a guy in speedos lifting weights could be sexual, or maybe the artist is doing a study in human musculature? Heck if I know.
Effects on the audience, I’m not sure I understand that. It’s up to the audience to decide whether they like something, or not, or whether they are happy with whatever “effects” it has on them. The effect most are interested in is “pleasure”, I think. If one doesn’t like the pics, one is not in the audience for that art.
If one wants to make the argument that folks shouldn’t look at cheesecake or beefcake pics, because they create some sort of problem for the viewer, the onus is on the claimant to win the hearts and minds of the audience. As long as all parties are consenting adults making informed decisions, I don’t see the issue.
I do concur that it could be “sexist” in the same sense that anybody discriminating based on sexual preference is sexist, but I’m not sure that is wrong. Someone who prefers lady types as sexual partners may prefer to look at cheesecake pics of lady types, I guess, and that’s technically sexist because they’re choosing those pics based on lady characteristics.
Now if you want to argue that such pics have downstream effects on a vulnerable/disempowered population, that would be a different argument.
We have no control over who we are attracted to sexually (or not at all), but we do have control over how we interact with the world. Who you are attracted to cannot be sexist, racist, etc. because there is no intention - it merely is. Being attracted and choosing to objectify someone are two very distinct processes because one involves intention. Discrimination is also an act of intent.
What “someone” is being objectified in this case?
I’m merely explaining why it is not analogous and why attraction cannot be considered bigoted. Anything that involves intent can be criticized for bigotry if it is present.
That’s fair. Thank you.
That is literally the argument in question about this whole post. 🤦
Your rant about not being able to do any rhetorical analysis without an author spelling it out for you is really not my problem. Maybe don’t criticize it if you have no practice doing it in the first place.
Your willful misunderstanding of how objectification in fiction can ever be any more problematic than “discrimination based on sexual preferences” is just… Wow.
I can only respond to the complaint you made:
… not the one you imagine you made.
To be clear, I disagree with this:
To clarify, I don’t think the author’s intent really matters in art. It can be useful context, if one is interested in that sort of thing.
In this case, the images have no “author”, they’re a machine output, so I’m not sure how you think authorial intent figures in this.
EDIT: My mistake, I’m mixing up responses. I should further clarify that, in the case of cheesecake/beefcake pics on DeviantArt (the example I gave), there clearly is an author/artist. But ultimately I’m still not sure it matters what their intent is. Do they like drawing lingerie as an artistic subject, or do they like drawing ladies for sexual titillation? I’m not sure there is any moral imperative on the viewer to care.