Machine learning on the iPhone gains another useful feature. The system can already identify objects in a photo, and you can search for them in the Photos app.
With iOS 16, Apple is expanding these capabilities again. Subjects can now be lifted out of an image: automatically cut out and dragged into another app. The iPhone recognizes whether it is looking at buildings, mountains, people, objects, or animals, and isolates them accordingly.
In the keynote, Apple’s Siri chief Robby Walker can be seen dragging a dog out of a photo and into an iMessage.
iOS 16 cuts out objects and people
In the iOS 16 beta, the feature is still a bit finicky. A light touch and hold on the subject is enough to trigger it, but if you hold your finger down too long, the photo’s Live Photo playback starts instead. The coming betas will certainly bring some improvements here.
The object recognition itself is very good. The system even recognizes furry animals and cuts them out cleanly. The algorithm has almost no trouble with subjects that have clearly defined edges, especially in photos that contain depth information.
Of course, the iPhone 13 Pro with its LiDAR sensor has a clear advantage here. But even portrait photos without it are recognized well within their limits.
So what is this feature for? Sending animal pictures via iMessage may well have played an important role in its development. But cutting out subjects has always been a popular editing technique, for example to place them on a different background.
Certainly the feature is not perfect, and maybe it never will be. For simple image editing, however, it is more than sufficient.
Photos: Apple and Apple Talk