It seems the iPhone camera is about to get a whole lot smarter. A leaked internal version of iOS 11 has revealed a new feature coming to iPhones called “SmartCam,” which appears to automatically optimize your shot based on what the camera sees you photographing.
SlashGear reports that iOS developer Guilherme Rambo spotted the SmartCam references when an internal version of iOS 11.0.2 meant for HomePod test units was sent out to outside developers.
iOS 11 (or the next iPhone) will have something called SmartCam. It will tune camera settings based on the scene it detects pic.twitter.com/7duyvh5Ecj
— Guilherme Rambo (@_inside) August 2, 2017
Based on the references seen in the leaked code, SmartCam will apparently adjust your iPhone camera settings and features after recognizing things in your shot. It may be able to freeze motion and trigger precapture when you’re photographing pets and babies, for example. When you’re shooting a bright scene or a sunrise/sunset, it may be able to automatically capture a low-light HDR photo.
Scenes mentioned in the leaked code include Baby, BrightStage, Document, Fireworks, Foliage, Pet, PointLightSources, QR, Sky, Snow, Sport, and SunsetSunrise.
These are scenes and subjects that can be difficult to photograph well with a point-and-shoot camera, so it makes sense that Apple would want to be able to identify them and help people use better settings for better results.
It seems Apple is working hard to integrate computational photography and machine learning technologies into its iPhone camera, which is among the most widely used cameras in the world today.
It’s unknown at this point whether SmartCam will roll out to existing iPhones when iOS 11 arrives later this year, or whether it’s software designed specifically for future generations of iPhone cameras. You may remember that the now-ubiquitous Panorama mode was also discovered by developers hidden inside iOS code back in 2011.