Twitter users were alarmed by a feature of Apple's iOS.
Let's face it: many people out there have at least a few selfies buried deep in their smartphones that are, shall we say, more than a little revealing. In a world where Snapchat and Instagram flirtation sometimes takes priority over real-world human interaction, the likelihood that you'll find at least one nude on someone's mobile device is higher than ever. With this in mind, what if Apple was automatically storing photos of this nature in a special folder in your iPhone? A viral tweet made that assertion earlier this week, sending many female Twitter users into a panic.
As the above post noted, typing "brassiere" or a related search keyword into the Photos app surfaces a whole bunch of matching results, courtesy of the image recognition technology Apple introduced to the ecosystem back in 2016. Even Chrissy Teigen confirmed the user's findings with a screengrab of her own, showing several pictures of her in various states of undress. You can see what she tweeted below.
However, as The Verge noted, the categorized photos that come up as search results are only indexed locally on your device. That means your pictures won't show up when other people run those kinds of keyword searches, nor are they aggregated in some large cloud-based folder, outside of whatever service you already use to sync those photos. Apple isn't the only company applying this kind of machine-learning technique to photos, either: Google Photos has apparently been doing the same thing for a while now.
What is perhaps more interesting is the decision-making behind which keywords are included and which are not. For example, the terms "boxers," "briefs," "underwear," "penis" and "dick" return no results when searching photos on iOS, but "brassiere" and derivatives like "bra" do. Apple has not publicly explained its reasoning one way or the other.