The iPhone 16’s Apple Intelligence Revolutionizes the Camera and Photos App
The new Apple Intelligence feature on the iPhone 16 has been built “from the ground up” to understand language and images.
Craig Federighi introduced the feature at Apple's Glowtime event today (Monday), which saw the launch of the iPhone 16 and 16 Plus, the first iPhones fully integrated with Apple Intelligence.
Relive Memories
Apple users have already been able to search for objects in Photos, but with Apple Intelligence they will be able to get more specific.
For example, users can describe the photo they want to find on their phone in natural language. The feature extends to videos, letting the user jump to a "specific moment" in a clip.
Users can also create a movie with AI: it will automatically find relevant photos and videos and "smartly" arrange them into a storyline.
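Apple has not said whether this natural-language search will be opened up to third-party developers; today, apps can only query the photo library by structured metadata through PhotoKit. As a rough, purely illustrative sketch of that existing plumbing (the function name and date-range approach below are invented for the example), a developer might fetch everything shot during a particular outing like this:

```swift
import Photos

// Illustrative only: Apple Intelligence's natural-language photo search is not a
// public API. This uses the existing PhotoKit framework to fetch images from a
// date range, a much cruder stand-in for describing a photo in plain language.
func fetchPhotos(from start: Date, to end: Date, completion: @escaping ([PHAsset]) -> Void) {
    PHPhotoLibrary.requestAuthorization(for: .readOnly) { status in
        guard status == .authorized || status == .limited else {
            completion([])
            return
        }

        let options = PHFetchOptions()
        options.predicate = NSPredicate(format: "creationDate >= %@ AND creationDate < %@",
                                        start as NSDate, end as NSDate)
        options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: true)]

        let result = PHAsset.fetchAssets(with: .image, options: options)
        var assets: [PHAsset] = []
        result.enumerateObjects { asset, _, _ in assets.append(asset) }
        completion(assets)
    }
}
```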
Visual Intelligence
Using the new touch-sensitive Camera Control button, users can point the iPhone 16's camera at a subject to pull up specific information about it.
Apple gave the examples of pointing the camera at a restaurant or an event poster; in the restaurant's case, the user can bring up its opening hours and reviews.
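Apple did not explain the machinery behind Visual Intelligence on stage. For a rough sense of one small piece of it, reading the text off something like an event poster, developers can already turn to Apple's existing Vision framework. The Swift sketch below is illustrative only and is not Apple's implementation; the function name is invented for the example.

```swift
import Vision
import UIKit

// Illustrative sketch: uses the existing Vision framework to read text from a
// captured image, roughly the kind of step a "point the camera at a poster"
// feature would need. Apple's actual Visual Intelligence pipeline is not public.
func extractPosterText(from image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([])
        return
    }

    let request = VNRecognizeTextRequest { request, error in
        guard error == nil,
              let observations = request.results as? [VNRecognizedTextObservation] else {
            completion([])
            return
        }
        // Keep the best candidate string for each detected line of text.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(lines)
    }
    request.recognitionLevel = .accurate

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```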
Image Playground
Image Playground is a new dedicated app whose capabilities will also be integrated into other apps such as Messages.
It works like an AI image generator, allowing users to generate different types of images based on a prompt. Notably, Image Playground doesn’t support photorealistic generation. Instead, it has three styles: Animation, Illustration, and Sketch.
It can also be used to create novel emojis by typing in a prompt.
Other Apple Intelligence Features
Away from the visual side of things, Apple says its AI feature is “grounded in personal context” thanks to a unique integration of hardware and software.
Apple Intelligence will be found across a multitude of apps and can be used to help write emails or private messages.
It will also improve Siri — Apple’s much-maligned personal assistant — by giving the user “step-by-step guidance on how to do something with your iPhone”. Siri will be able to “tap into your personal context” while gaining “on-screen awareness”; for example, taking prompts from messages and opening a relevant app.
Users will be able to use AI-enhanced Siri to send photos, for example: “Hey Siri, send Erica the photos from Saturday’s BBQ.”
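Requests like that one depend on apps exposing their actions to Siri, and the established mechanism for that in iOS is Apple's App Intents framework. The Swift sketch below is a minimal, hypothetical intent of that kind; the type and parameter names are invented for the example, and the photo-finding intelligence Apple demoed is not shown.

```swift
import AppIntents

// Illustrative sketch: a simplified App Intent of the kind Siri can invoke on
// a user's behalf. The type and parameter names here are hypothetical; the
// actual photo-sharing flow Apple demoed relies on its own system features.
struct SendPhotosIntent: AppIntent {
    static var title: LocalizedStringResource = "Send Photos"
    static var description = IntentDescription("Sends photos from an event to a contact.")

    @Parameter(title: "Recipient")
    var recipientName: String

    @Parameter(title: "Event")
    var eventName: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real implementation would look up the contact and the matching
        // photos, then hand them off to a share or send flow.
        return .result(dialog: "Sending your \(eventName) photos to \(recipientName).")
    }
}
```

In practice, the hard part Apple demoed, resolving a phrase like "Saturday's BBQ" to the right set of photos, happens on Apple's side; an intent like this simply gives Siri a well-defined action it can call.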
The Siri feature will be available in beta from next month. Localized English versions, such as those for the U.K. and Australia, will start rolling out in December, and next year it will add other languages including Chinese, French, Japanese, and Spanish.
Meanwhile, the Private Cloud Compute feature will give users access to larger AI models that are more powerful than the ones installed on the iPhone 16. But Apple insists the user's data will be protected and never stored.