New Research Shows Drone Can Shoot Video Based on Desired Emotion

Making a movie with a drone is no small task. First, it takes skill to fly the expensive equipment smoothly without crashing. Once you have mastered flying, there are panning speeds, camera angles, trajectories and flight plans to sort out.

With all the sensors and processing power embedded in a drone and its camera, there has to be a better way to capture the perfect shot.

‘Sometimes, you just want to tell the drone to make an exciting video,’ said Rogerio Bonatti, a Ph.D. candidate in Carnegie Mellon University’s Robotics Institute.

Bonatti was part of a team from CMU, the University of Sao Paulo and Facebook AI Research that developed a model enabling a drone to shoot a video based on a desired emotion or viewer reaction. The drone uses camera angles, speeds and other parameters to generate a video that could be enjoyable, calm or scary, depending on the filmmaker’s instructions.

The team presented their paper on the research at the 2021 International Conference on Robotics and Automation, and the presentation can be viewed on YouTube.

‘We are learning how to map semantics, like a word or emotion, to the motion of the camera,’ Bonatti said.

The researchers collected hundreds of videos to get data on what makes a video evoke a certain emotion. A few thousand viewers then watched 12 pairs of videos and gave them scores based on how the videos made them feel.
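As an illustration only, and not the team’s actual pipeline, here is a minimal sketch of how paired viewer ratings might be aggregated into per-clip emotion scores. The data layout, clip names and 1–7 scale are all assumptions made for the example:

```python
from collections import defaultdict

# Hypothetical ratings: each viewer watches a pair of clips and rates how
# strongly each clip evokes a target emotion on a 1-7 scale.
# This layout is invented for illustration, not taken from the paper.
ratings = [
    {"clip_a": "chase_01", "clip_b": "lake_02",  "emotion": "exciting", "score_a": 6, "score_b": 2},
    {"clip_a": "chase_01", "clip_b": "cliff_03", "emotion": "exciting", "score_a": 5, "score_b": 4},
    {"clip_a": "lake_02",  "clip_b": "cliff_03", "emotion": "calm",     "score_a": 7, "score_b": 3},
]

def average_emotion_scores(ratings):
    """Average all viewer scores for each (clip, emotion) pair."""
    totals = defaultdict(list)
    for r in ratings:
        totals[(r["clip_a"], r["emotion"])].append(r["score_a"])
        totals[(r["clip_b"], r["emotion"])].append(r["score_b"])
    return {key: sum(vals) / len(vals) for key, vals in totals.items()}

if __name__ == "__main__":
    for (clip, emotion), score in sorted(average_emotion_scores(ratings).items()):
        print(f"{clip:10s} {emotion:10s} {score:.2f}")
```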

They used the data to train a model that directed the drone to imitate the cinematography corresponding to a particular emotion. If fast-moving, tight shots created feelings of excitement, the drone would use those same elements to make an exciting video when the user requested it.
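The paper’s mapping is learned from data, but the underlying idea of turning a requested emotion into concrete shot parameters a drone planner could execute can be sketched roughly as follows. The parameter names and values below are invented for illustration and are not the team’s learned model:

```python
from dataclasses import dataclass

@dataclass
class ShotParameters:
    """Camera and flight settings a planner could turn into a trajectory.
    Field names and values are illustrative, not from the paper."""
    speed_m_s: float       # forward speed of the drone
    distance_m: float      # distance kept from the subject
    tilt_deg: float        # camera tilt angle
    pan_rate_deg_s: float  # how quickly the camera pans

# A hand-written stand-in for a learned emotion-to-cinematography mapping:
# fast, tight, quickly panning shots for excitement; slow, wide ones for calm.
EMOTION_TO_SHOT = {
    "exciting": ShotParameters(speed_m_s=8.0, distance_m=3.0,  tilt_deg=10.0, pan_rate_deg_s=40.0),
    "calm":     ShotParameters(speed_m_s=2.0, distance_m=12.0, tilt_deg=25.0, pan_rate_deg_s=5.0),
    "scary":    ShotParameters(speed_m_s=6.0, distance_m=2.0,  tilt_deg=-5.0, pan_rate_deg_s=60.0),
}

def plan_shot(emotion: str) -> ShotParameters:
    """Return shot parameters for the requested emotion (defaults to calm)."""
    return EMOTION_TO_SHOT.get(emotion, EMOTION_TO_SHOT["calm"])

if __name__ == "__main__":
    print(plan_shot("exciting"))
```

In the actual system this lookup is replaced by a model trained on the viewer ratings, so the drone can also produce intermediate degrees of an emotion rather than a few fixed presets.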

‘I was surprised that this worked,’ said Bonatti. ‘We were trying to learn something incredibly subjective, and I was surprised that we obtained good quality data.’

The team tested their model by creating sample videos, such as a chase scene, and asked viewers for feedback on how the videos made them feel. Bonatti said the videos were also able to elicit different degrees of those emotions.

The aim of the team’s work is to improve the interface between people and cameras, whether by helping amateur filmmakers with drone cinematography or by providing on-screen directions on a smartphone to capture the perfect shot.

‘This opens the door to many other applications, even outside filming or photography,’ Bonatti said. ‘We designed a model that maps emotions to robot behavior.’

By Marvellous Iwendi.

Source: CMU