People on Twitter are justifiably outraged after a user noticed “brassiere” was a searchable category in the Photos app on iPhones.
Searching that term returns photos that are, true to name, mostly pictures of users wearing bras, bikinis, lingerie or nothing at all. As Dami Lee points out for The Verge, the fact that “brassiere” is a category may not be the most concerning aspect of this weird situation. Rather, it’s that so many people didn’t know their images were being categorized like this. While most of us know that you can search for photos on iPhones by location and faces, surprisingly few people (our entire editorial team included) realized that you can search for “things” via a tiny icon that opens up a search bar.
And the “things” you can search for are, much like brassiere, archaic, gendered, and weirdly specific.
Developer Kenny Yin created a comprehensive list of the iPhoto and Photos app search terms in a blog post last year. The search terms autocomplete from a fixed list, meaning that if you searched for images of boxers, even knowing you had them on your phone, the term wouldn’t be recognized by the software.
According to his list, you can search for a myriad of “ladies’” undergarments thanks to available search terms like Brassiere, Bandeau, Bandeaus, Bra or Brassieres. But the list fails to include any gender-neutral undergarments, like, I don’t know, just “underwear,” or any traditionally male items such as boxers or briefs. No, the only search term that summons half-dressed photos is the strangely antiquated “Brassiere.”
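To see why a term like “boxers” comes up empty, it helps to picture how a closed-vocabulary search works. The sketch below is a hypothetical illustration, not Apple’s actual implementation: the vocabulary and photo labels are invented, and the point is simply that a query outside the fixed list returns nothing, no matter what is in the library.

```python
# Hypothetical sketch of closed-vocabulary photo search.
# The vocabulary, filenames, and labels below are invented for
# illustration; Apple's real category list and matching logic
# are not public.

SEARCH_VOCABULARY = {"brassiere", "bandeau", "bra", "tempest", "edifice"}

# Labels a recognition model might attach to each photo. Since the
# model only knows categories from the fixed list, a photo of boxers
# never receives a matching label in the first place.
photo_labels = {
    "IMG_001.jpg": {"brassiere"},
    "IMG_002.jpg": {"edifice"},
    "IMG_003.jpg": set(),  # boxers photo: no recognized category
}

def search(query):
    """Return photos matching the query, but only if the query is
    in the app's fixed search vocabulary."""
    term = query.lower()
    if term not in SEARCH_VOCABULARY:
        return []  # unknown term: no results, even if photos exist
    return [name for name, labels in photo_labels.items() if term in labels]

print(search("brassiere"))  # ['IMG_001.jpg']
print(search("boxers"))     # []: the term is outside the vocabulary
```

Under this model, the search surface is only as inclusive as the label list someone chose to ship.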
The full list of available search terms is outdated and bizarre, and many words seem like they’re pulled from another century. Photos of a blizzard were categorized, Shakespeare-like, as “Tempest” in my phone. A photo of my computer appears under “Apparatus,” buildings are “Edifices,” and some purple flowers are “Efflorescences.” Add to that “Camelopard,” a dictionary-defined archaic noun for giraffe, and the very old terms for magic, “Legerdemain” and “Thaumaturgy.” (Fun side note: the search function also categorizes my dachshunds as “Badger Dogs” and some cats as “Adult Cats.”)
That Apple is sorting images through recognition software isn’t a secret—it’s actually part of the company’s selling point for improving user experience. But for some reason, the ability to search through our photos using keywords seems to have slipped under everyone’s radar.
Apple’s website says that on the iPhone and Photos app you can “Easily search your images for things like cars or trees or snow.”
“You can even ask Siri to help look for photos, with commands like ‘Find my pictures with dogs in them.’” The feature is no doubt intended to help users find photos without having to slog through their entire library. But as is true with so many modern technologies, the things that are intended to make our digital lives more seamless often come with a high, and in this case uncomfortable, price tag.
This raises questions of security and how easy it could be for someone else to find your photos categorized in this way, too. Apple says that when face recognition and object detection happen, it’s all within the device: “Which means your photos are yours and yours alone,” according to its website. But as The Verge’s Lee argues, people should be more concerned that these same, searchable photos might be on Google Photos, which stores photos “on the cloud, in Google’s servers.”
The emphasis on “Brassiere” brings up a strange, and surprisingly hard to figure out, question: where do these search terms come from? The rabbit hole of machine learning, object recognition and image captioning is dense and complicated, but somewhere along the line, someone decided what to include—or, what not to include—in these search terms.
Justin Johnson, a PhD student in the Stanford Vision Lab who studies deep learning and computer vision told me that while he’s not sure where Apple got this specific list of terms, there’s a fair amount of overlap between their list and the categories used in the ImageNet challenge.
“The ImageNet challenge was an annual computer vision competition from 2010 to 2017,” he wrote over email, adding that in one of the competition tracks researchers built a “computer vision system to classify images into one of the 1,000 categories in the list above.”
The list isn’t a perfect match, but there are many overlaps including, as Johnson pointed out, a “brassiere, bra, bandeau” category. “ImageNet does contain a category for bra / brassiere / bandeau, so it seems plausible to me that Apple inherited these categories in some way from the ImageNet dataset,” he said.
“Since this is such a commonly used dataset in the research community, it’s pretty common to see ImageNet categories show up in a lot of different computer vision systems,” Johnson told me.
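Johnson’s point about inherited categories is easier to see with ImageNet’s structure: each of its 1,000 classes is a WordNet “synset,” a set of synonyms filed under one ID, which is how “brassiere,” “bra” and “bandeau” all land in a single category. The snippet below is a toy illustration with a hand-picked pair of classes (treat the specific IDs as illustrative, not an authoritative excerpt of the dataset); it shows how any system inheriting those classes gets every synonym in the set, and none of the terms the set omits.

```python
# Toy illustration of ImageNet-style synsets: each class is a set
# of synonyms sharing one WordNet-style ID. The two entries below
# are illustrative examples, not an excerpt of the full 1,000-class
# list, and the lookup code is hypothetical.

SYNSETS = {
    "n02837789": ["brassiere", "bra", "bandeau"],
    "n02835271": ["bicycle-built-for-two", "tandem bicycle", "tandem"],
}

# Invert the mapping so every synonym resolves to its synset ID.
# This is why searching "bra" and "brassiere" hits the same category.
term_to_synset = {
    term: synset_id
    for synset_id, terms in SYNSETS.items()
    for term in terms
}

print(term_to_synset["bra"])                         # same ID...
print(term_to_synset["brassiere"])                   # ...as this one
print("boxers" in term_to_synset)                    # False: no synset covers it
```

If Apple’s category list really did descend from ImageNet, the gaps would be inherited along with the synonyms: a concept absent from the original class list simply never becomes searchable.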
While that helps us understand where Apple’s list may have pulled its vocabulary from, it doesn’t necessarily solve the mystery of why the words are so old and gendered.
And a potentially bigger problem persists: so many people, myself included, don’t know the extent of how our most-used devices function. To spend so much time with, and become so intimate with, a literal object, yet fail to understand the ways it categorizes, names and stores information about us, seems not only scary but perhaps naive.