Apple’s New API Turns Your iPhone Into a Pet-Tracking 360-Degree Camera

Apple Vision updates with Animal Tracking API

As more people return to the office, demand is growing for ways to monitor pets remotely. Per documentation on Apple’s website, developers can pair new pet-tracking features with motorized iPhone stands to help users keep an eye on their furry friends from afar.

“Pandemic pups” are creating numerous issues for new pet owners who adopted furry friends during the COVID crisis. Animals are getting into trouble when their owners aren’t home, especially since many of these pets joined households where someone was nearly always present. Other pet owners are struggling with separation anxiety of their own.

As reported by The Verge, Apple aims to let developers capitalize on increased pet ownership by providing access to an “Animal Body Pose” application programming interface (API).

This API provides the tools needed to identify and track animals with an iPhone camera, including the ability to recognize a pet’s pose. For example, it can detect when a pet is sitting down, sleeping, or begging for food.
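For developers, the workflow follows Vision’s familiar request-and-handler pattern. The short Swift sketch below shows what running the new request against a single image might look like; the image URL and the 0.3 confidence cutoff are placeholder assumptions, not values from Apple’s documentation.

    import Vision

    // Minimal sketch (iOS 17+): run the animal body pose request on one image.
    // `imageURL` is a placeholder; the handler also accepts a CGImage,
    // CVPixelBuffer, or individual video frames.
    func detectAnimalPose(in imageURL: URL) throws {
        let request = VNDetectAnimalBodyPoseRequest()
        let handler = VNImageRequestHandler(url: imageURL)
        try handler.perform([request])

        // Each observation corresponds to one detected animal (cat or dog).
        for observation in request.results ?? [] {
            // `.all` returns every recognized joint for this animal.
            let joints = try observation.recognizedPoints(.all)
            for (name, point) in joints where point.confidence > 0.3 {
                // Locations are normalized image coordinates (lower-left origin).
                print("\(name): \(point.location), confidence \(point.confidence)")
            }
        }
    }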

In a video about the Animal Body Pose updates to Vision, Apple’s Nadia Zouba describes an all-too-familiar situation:

“Animal Body Pose can be used in many possible applications. For example, imagine you left your cat and dog at home alone and you spent the day at work. When you are back from work, you find a mess in your house. Don’t worry. The Vision Framework can help you figure out what happened.”

Vision has previously been able to identify and place a bounding box around non-human animals, but users have asked for more functionality and more sophisticated pose recognition. The new API supports cats and dogs and can detect 25 animal body landmarks, including an animal’s tail and ears.

“The Animal Body Pose API is available in Vision starting iOS 17, iPadOS 17, tvOS 17, and macOS Sonoma. The input to Animal Body Pose can be an image or video. After creating and processing the request in Vision, the request will produce a collection of joints which will define the skeleton of the animal. For Animal Body Pose, six joint groups have been defined,” Zouba explains.

The six joint groups are: head (ears, eyes, and nose), forelegs (front legs), hindlegs (back legs), trunk (neck), tail (three tail joints), and “all,” a group that includes every characterized joint.
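In code, those groups are what a developer queries on each animal observation. A rough continuation of the sketch above, with an arbitrary 0.5 confidence cutoff:

    import Vision

    // Sketch: report how many joints in each group were confidently detected
    // for a single animal observation. The 0.5 cutoff is an arbitrary choice.
    func summarizeJointGroups(of observation: VNAnimalBodyPoseObservation) throws {
        let groups: [VNAnimalBodyPoseObservation.JointsGroupName] = [
            .head,      // ears, eyes, and nose
            .forelegs,  // front legs
            .hindlegs,  // back legs
            .trunk,     // neck
            .tail       // three tail joints
        ]

        for group in groups {
            let points = try observation.recognizedPoints(group)
            let confident = points.values.filter { $0.confidence > 0.5 }
            print("\(group): \(confident.count) of \(points.count) joints detected")
        }
    }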

Paired with Apple’s new DockKit framework, the Vision improvements can drive mechanized iPhone stands, enabling an iPhone to physically pan and track a subject, including a cat or dog.
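The article doesn’t show that pairing in code, and the DockKit names below (DockAccessoryManager, accessoryStateChanges, setSystemTrackingEnabled) are assumptions based on Apple’s developer material, so treat this as a rough sketch rather than a drop-in implementation:

    import DockKit

    // Rough sketch: watch for a motorized stand being attached and rely on
    // DockKit's built-in subject tracking. The API names here are assumptions
    // and may differ from the shipping framework.
    func watchForDockedStand() async throws {
        // System tracking (the stand physically following detected subjects) is
        // reportedly on by default; enabling it explicitly here is illustrative.
        try await DockAccessoryManager.shared.setSystemTrackingEnabled(true)

        for await stateChange in DockAccessoryManager.shared.accessoryStateChanges {
            // A pet-cam app could start its camera session and Vision requests
            // once an accessory reports that the iPhone is docked.
            print("Dock accessory state changed: \(stateChange)")
        }
    }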

As The Verge observes, Apple isn’t pitching pet tracking as a feature of iOS 17. Still, the improvements to Vision in iOS 17 and other new Apple operating systems enable new pet tracking features, even across a 360-degree field of view when paired with a DockKit-enabled mount.

It’ll take a bit more than an iPhone and a motorized stand to capture truly distinctive dog portraits, but the improvements to Vision sound promising for developers and will undoubtedly result in new apps that help pet parents connect with their animals in new and exciting ways.


Image credits: Apple
