This story is part of CNET’s full coverage of Apple’s annual developer conference.
Apple’s iOS 16 will include lots of new iPhone features. But one feature really caught my attention despite occupying less than 15 seconds of the keynote.
The feature hasn’t been given a name, but here’s how it works: You tap and hold on a photo to separate the subject of the image, such as a person, from the background. If you keep holding, you can then “lift” the cutout from the photo and drag it into another app to post it, share it, or make a collage, for example.
Technically, the tap-and-lift photo feature is part of Visual Lookup, which first launched with iOS 15 and can recognize objects in your photos such as plants, food, landmarks and even pets. In iOS 16, Visual Lookup lets you lift that object out of a photo or PDF by doing nothing more than tapping and holding.
Robby Walker, Apple’s senior director of Siri Language and Technologies, demonstrated the new tap-and-lift tool over a photo of a French bulldog. The dog was “cropped” from the photo, then dragged and dropped into a message text field.
“It’s like magic,” Walker said.
Apple sometimes overuses the word “magic,” but this tool does sound impressive. Walker was quick to point out that the effect is the result of an advanced machine learning model, accelerated by Core ML and Apple’s Neural Engine to perform 40 billion operations in a second.
Knowing the amount of processing and machine learning required to cut a dog out of a photo delights me to no end. Too often, new phone features are expected to be revolutionary or to solve a serious problem. I guess you could say the tap-and-hold tool solves the problem of removing the background from a photo, which, at least for some people, could be serious business.
I couldn’t help but notice the similarity to another iOS 16 photo feature. On the lock screen, the photo editor separates the foreground subject from the background of the photo used for your wallpaper. So lock screen elements like time and date can be layered behind your wallpaper subject but in front of the photo background. It looks like the cover of a magazine.
I couldn’t try out the new visual search feature, so I instead watched the part of the WWDC keynote where this French bulldog is pulled out of his picture over and over again. If you have a spare iPhone to try it out, aand a public beta of iOS 16 will be released in July.