Apple isn’t saving bra photos — but your iPhone is looking at boobs

The world discovered yesterday that the iPhone, which sorts pictures into categories, has a folder specifically for ‘brassiere’ pictures. Much pearl-clutching ensued, but the truth might be even more scandalous: the category may actually be sorting photos by the bare skin associated with your brassieres, not the garments themselves.

Here’s the tweet that stoked the fire:

Several people tried the search themselves and posted the results: lots of bra, bikini, and NSFW pics — and everyone was galled that Apple was ‘saving’ their suggestive pics. Personally, I was just shocked that someone besides me still used the word “brassiere.”

It came to light that the category has existed for more than a year; this is just the first time anyone noticed enough to get mad. In fact, the entire list of categories is available to browse on Medium.

The list contains the words ‘brassiere,’ ‘girdle,’ and ‘corset.’ It also has such categories as ‘marjoram,’ ‘legerdemain,’ ‘aqualung,’ and ‘thaumaturgy,’ but no generic ‘underwear’ category, so clearly the people who designed this were big on narrow niches.

That said, the image recognition algorithm that categorizes the pics might be more attuned to your skin than your clothes.

Going just by my own phone, I have 7 photos that fit under the “Brassiere” category. Spoiler: none of them show my actual bras. In fact, the only feature they have in common is not what I’m wearing, but what I’m showing. Namely, in all of them you can see my cleavage and collarbone, and, in one, my ribs.

For example, here’s an old Halloween selfie in which I’m working on my Lara Croft costume. No bra:

Here’s a non-Halloween selfie. No bra:

Finally, a necklace I got for Christmas in 2009 (yes, the one from Lord of the Rings). No bra:

I don’t think it’s a deliberate move on Apple’s part, though. I suspect the machine learning that sorts these pics is keying on the visible skin around the lingerie just as much as on the article of clothing itself.

I’m also not going to put much stock in the iOS categories, if only because they seem a bit unreliable. For example, I found a picture of my printer cartridges under the category “Gin Mill.” The brassiere category also somehow missed the actual photos of me in a bikini top.

Regardless, while I don’t think people need to worry about Apple collecting their nudes — the machine learning runs entirely on your phone, and nothing needs to be uploaded to iCloud — I do think it’s a good reminder that your phone remembers what it sees. It all depends on what you show it.

h/t The Guardian
