The artificial intelligence used to sort photos on the iPhone has caused minor panic among some Twitter users.
Earlier this week, Twitter user @ellieeewbu posted her concerns after discovering a search for the word “brassiere” returned images of bras and clothes straps saved on her camera roll.
Model Chrissy Teigen also found similar results when she tried it, posting a screenshot of what she was shown.
Quickly spreading on social media, the reports caused alarm among some iOS users, who worried that Apple was scanning and organising their intimate photos in the background.
However, others quickly noted that the search function was not singling out intimate photos; it could be used to identify a wide range of objects and scenes.
At the centre of the incident is image recognition technology Apple introduced over a year ago that has been trained to recognise more than 4,000 different objects, faces and scenes.
“Advanced face recognition and computer vision technology lets you search your photos by who and what’s in them,” the technology giant says on its website.
Apple also points out that no folders are created based on these searches, and that all image recognition happens solely on the user’s phone, not anywhere else.
“When you search your photos, all of the face recognition and scene and object detection are done completely on your device,” the company’s website says.
So, it seems “brassiere” is one of the words – and one of the objects – iPhones have been taught to recognise.
Google Photos has had similar technology in place since 2015. Google uploads users’ images to its own cloud and carries out most of its image recognition there, using the photos to further train its AI.
This latest surprise – and slightly creepy – feature discovery shows just how big a part artificial intelligence and machine learning are beginning to play in mainstream gadgets.
Apple has not commented on the bra issue.