Apple has quietly built an automated Photoshop into iOS 16

The dazzling new iPhone lock screen designs in iOS 16 may have grabbed all the headlines at WWDC 2022, but behind them lies a feature that's highly unusual for Apple: Photoshop-style editing skills.

Apple’s AI tools have traditionally been focused on helping you take great iPhone photos, rather than edit them. But a new ‘Visual Look Up’ feature, which you’ll be able to find in the Photos app and across iOS 16, lets you tap on a photo’s subject (for example, a dog) and lift it out of the shot to be pasted somewhere else, such as into Messages.

That may not sound too spectacular, but the unnamed feature – which has echoes of Google’s ‘Magic Eraser’ for Pixel phones – will be a significant addition to iPhones when it lands in the software update later this year. Apple usually leaves these kinds of tricks to the best photo editing apps, but it’s now dabbling with automated Photoshop skills.

Just a few years ago, cutting out a complex subject in a photo was the preserve of Photoshop nerds. But Apple says its Visual Look Up feature, which also automatically serves up info on the subject you tap on, is based on advanced machine learning models.

Simply lifting a French Bulldog from a photo’s background is, Apple says, powered by a model and neural engine that performs 40 billion operations in milliseconds. That horsepower requirement is why the feature will only be supported on the iPhone XS and later models.
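For the curious, Apple hasn’t exposed the Visual Look Up ‘lift subject’ pipeline itself as a public API in iOS 16, but its Vision framework already ships a related on-device segmentation request that hints at how this kind of cut-out works. Here’s a minimal Swift sketch using `VNGeneratePersonSegmentationRequest` (a real, shipping API for people specifically, not the general-subject model the article describes):

```swift
import Vision

// A minimal sketch of on-device subject masking with Apple's public Vision
// framework. This is NOT the Visual Look Up pipeline itself - just a related
// API that produces a per-pixel foreground mask for people, entirely on device.
func personMask(from image: CGImage) throws -> CVPixelBuffer? {
    let request = VNGeneratePersonSegmentationRequest()
    request.qualityLevel = .accurate          // favor mask quality over speed
    request.outputPixelFormat = kCVPixelFormatType_OneComponent8

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])

    // The result is a grayscale mask: bright pixels mark the subject,
    // dark pixels the background.
    return request.results?.first?.pixelBuffer
}
```

From there, an app would typically composite the mask against the original image (for instance with Core Image’s `CIBlendWithMask` filter) to ‘lift’ the subject onto a new background.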

Beyond the Photos app, the feature will apparently also work in Quick Look, which lets you quickly preview images in apps. There are also echoes of it in iOS 16’s new customizable lock screens, which can automatically place elements of a photo in front of your iPhone’s clock for a more modern look.

Right now, the feature is limited to letting you quickly cut out and paste subjects in photos, but Apple clearly has an appetite for building Photoshop-style tools into its iPhones. And iOS 16 could just be the start of its battle with the likes of Adobe and Google when it comes to letting you quickly tweak and edit your photos.


Analysis: The AI-powered editing race heats up

(Image: a phone screen showing Google’s Magic Eraser tool. Credit: Google)

Photoshop and Lightroom will always be popular among pro photographers and keen hobbyists, but we’re starting to see tech giants bake automated equivalents of Adobe’s most popular tools into their operating systems.

Just last month Google announced that its Magic Eraser tool, available on Pixel phones, now lets you change the color of objects in your photos with just one tap. This new feature joined the tool’s existing ability to remove unwanted objects or people from your photos.

Apple hasn’t quite gone that far with Visual Look Up’s new feature, which is more like Photoshop’s ‘Select Subject’ tool than Google’s take on the healing brush. But the iOS 16 upgrade is a significant one in the context of the wider race to build the ultimate mobile editing skills for point-and-shoot photographers.

There’s no reason why Apple couldn’t extend the concept to let you, for example, select and replace a drab sky with a more dramatic one. This ‘Sky Replacement’ feature is one we’ve recently seen come to Photoshop and other AI desktop photo editors, and today’s smartphones certainly have the processing power to pull it off.

Of course, Adobe won’t stand idly by and let Apple and Google eat its editing lunch, even if Apple appears to be coming at it sideways. By baking these technologies into core features like the new iOS 16 lock screen, Apple makes them part of not just an iPhone stock app, but the core OS. That’s trouble for Adobe, but good news for anyone who doesn’t want to learn, or pay for, Photoshop.