‘Untrained Eyes’ puts an AI spin on looking at yourself in the mirror
What if you stood in front of a mirror and saw someone who barely looked like you? That’s exactly what happens in Untrained Eyes, an interactive sculpture debuting today at the Engadget Experience, a one-day event showcasing exhibitions that mix art with technology. Untrained Eyes, created by conceptual artist Glenn Kaino and actor Jesse Williams (Grey’s Anatomy), doesn’t require a headset. Instead, the project uses your face, a mirror, a Kinect and machine learning to show you pictures of people you may look like — or not.
Sometimes you won’t get a person who resembles you in any way, but that’s the entire point of Untrained Eyes. When Kaino and Williams set out to make this project, it was always with the intention of shedding light on the inherent flaws of artificial-intelligence algorithms, particularly those used in image-search databases. The experience itself works effortlessly: you walk up to the installation, wave at the mirror and, within a few seconds, you’re presented with an image of your alleged doppelgänger. The images on display are pulled from a curated dataset and “matched” to your appearance based on your facial attributes.
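The article doesn’t reveal how the installation’s matching actually works, but the core idea — find the closest face in a curated dataset based on facial attributes — can be sketched as a nearest-neighbor search over feature vectors. Everything below (the function names, the cosine-similarity metric, the toy data) is an illustrative assumption, not the project’s real code:

```python
# Hypothetical sketch of a face-matching step: compare a visitor's
# face-attribute vector against a curated dataset and return the
# closest entry. The metric and data shapes here are assumptions.
from math import sqrt

def cosine_similarity(a, b):
    # Cosine of the angle between two feature vectors: 1.0 = identical direction
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def best_match(query, dataset):
    # Pick the dataset entry whose feature vector is most similar to the query
    return max(dataset, key=lambda entry: cosine_similarity(query, entry["features"]))

# Toy "curated dataset": each entry pairs an image label with a feature vector
dataset = [
    {"name": "person_a", "features": [0.9, 0.1, 0.3]},
    {"name": "person_b", "features": [0.2, 0.8, 0.5]},
]
visitor = [0.85, 0.15, 0.25]  # attributes extracted from the camera feed
print(best_match(visitor, dataset)["name"])  # → person_a
```

With a small or skewed dataset, the “closest” match can still be far from the query — which is exactly the kind of failure the installation is designed to surface.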
In its current iteration, Untrained Eyes features five mirrors, which wasn’t the original plan. Kaino said that, toward the end of development, he realized the installation would be better with more than a single mirror, so that people could see each other’s reactions to their image results. And you can definitely see the difference when someone gets a picture of Brad Pitt, as opposed to another human being who’s, well, less attractive. People keep going back in front of the mirror, waving and waiting until they get an image of someone they’re satisfied with.
An Engadget editor gets an unlikely match.
Regardless of the results, Kaino wants Untrained Eyes to make everyone think about the bias of image searches on the internet, be it on Google or other platforms like it. For example, he pointed to the fact that when you search Google for “men,” most of the results are pictures of white men. Then there was the time in 2015 when Google Photos mistakenly labeled black people as “gorillas.” These are just two instances where machine learning has failed. “If there’s anyone that could have an infinite dataset of everyone in the world, it would be Google,” Kaino said, “and even then they have massive failures.”
Ultimately, those failures served as inspiration for Kaino and Williams to create Untrained Eyes. The reward has been the effect it has on people’s insecurities when they see “themselves” in the mirror. “The paradox is, once you see yourself — even when people get matches that are close to them, they immediately start distancing themselves [from the mirror],” Kaino said. “They might be happy with it, but they’re like, ‘Oh, but my hair is a little bit better than that person,’ or ‘Those aren’t my eyes, but it’s good enough.’ There’s an immediate distancing that happens despite any of the gratification.”
I, for one, know I felt much better when I saw Johnny Depp in my Untrained Eyes mirror and not Salt Bae.
Untrained Eyes was made possible through funding from the Engadget Alternate Realities grant program, established in May 2017. It debuted, along with four other prize-winning immersive-media projects, at the Engadget Experience on November 14th, 2017.