In collaboration with ETH Zurich, MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) has developed a drone that simplifies aerial tracking by removing humans almost entirely from the equation.
Aerial tracking shots normally require a team of skilled camera and drone operators to handle collision avoidance and framing. This system aims to change that with intelligent automation.
A director can use the system to specify the viewing angle, the target's position on screen, and the size of the target's face in the frame, and the camera-equipped drone will stay locked on using those parameters while also avoiding obstacles. The parameters can even be changed on the fly, and the drone adjusts its flying position accordingly.
The demo video shows how the drone deals with multiple actors moving through a scene, anticipating collisions and dodging out of the way while keeping the subject perfectly framed in the shot.
The new system also allows the director to weight different elements in the scene, as the news release from MIT explains:
Usually, the maintenance of the framing will be approximate. Unless the actors are extremely well-choreographed, the distances between them, the orientations of their bodies, and their distance from obstacles will vary, making it impossible to meet all constraints simultaneously. But the user can specify how the different factors should be weighed against each other. Preserving the actors’ relative locations onscreen, for instance, might be more important than maintaining a precise distance, or vice versa. The user can also assign a weight to minimize occlusion, ensuring that one actor doesn’t end up blocking another from the camera.
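The trade-off described above can be sketched as a weighted cost function: each framing constraint (on-screen position, distance to the subject, occlusion) contributes an error term scaled by its user-assigned weight, and the drone favors the camera pose with the lowest total cost. This is only an illustrative sketch; the function and parameter names below are hypothetical and are not taken from the CSAIL/ETH system.

```python
# Hypothetical illustration of weighting competing framing constraints.
# Names ("screen_x", "distance", "occlusion") are invented for this sketch.

def framing_cost(measured, desired, weights):
    """Weighted sum of squared deviations from the desired framing.

    measured / desired: dicts mapping constraint names to scalar values.
    weights: relative importance the director assigns to each constraint.
    """
    return sum(
        weights[k] * (measured[k] - desired[k]) ** 2
        for k in desired
    )

# Prioritise on-screen position over exact distance to the subject.
weights = {"screen_x": 10.0, "distance": 1.0, "occlusion": 10.0}
desired = {"screen_x": 0.5, "distance": 3.0, "occlusion": 0.0}

# Two candidate camera poses the drone could fly to:
pose_a = {"screen_x": 0.5, "distance": 4.0, "occlusion": 0.0}  # centred, but too far
pose_b = {"screen_x": 0.9, "distance": 3.0, "occlusion": 0.0}  # right distance, off-centre

# With these weights, the off-centre pose costs more, so pose_a wins.
best = min([pose_a, pose_b], key=lambda m: framing_cost(m, desired, weights))
```

Flipping the weights (say, `distance: 10.0` and `screen_x: 1.0`) would make the system prefer `pose_b` instead, which is exactly the kind of director-controlled trade-off the release describes.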
The system will be presented at the end of the month at the International Conference on Robotics and Automation in Singapore. The research group hopes to make drone cinematography more accessible, simpler, and more reliable.
Hopefully the new technology will filter down to consumer products and eventually help to eliminate human error in drone piloting altogether.