The coming paradigm shift in video stabilization testing
  • Knowledge Base
  • Thursday 19 August 2021

Over the years, video stabilization capabilities have grown increasingly powerful, creating super-stable video feeds that score highly on smartphone industry tests and benchmarks. But user expectations are now shifting towards a more realistic video experience. If you're involved in video stabilization testing, how will you take this realism trend into account? What does 'realistic' really mean, and how do you measure such a seemingly subjective quality? We'll share some insider insights from Imint CTO Johan Svensson to give you food for thought as you begin to address these questions.

How to go about defining realism in video

For the purposes of this discussion, we define realistic video as having the appearance of being filmed by a human. This is in contrast to the artificial feeling you get when certain camera movements are stabilized too perfectly. Instead of seeming robotic, movements should look like they were made by a professional human cameraperson. Johan explains why Imint has come to this conclusion.

'We've been intensively studying what type of video users prefer in various situations, and what we've found may surprise you. In defiance of the seemingly logical conclusion that more video stabilization is always better, professional users were less fatigued when following objects in a video with more realistic motion. Everyday users were also found to prefer video that appears as if it were shot by a professional cameraperson, like a Hollywood movie,' says Johan.

Panning - case in point for distinguishing realistic from robotic motions

Panning is an excellent example of a movement that stands out as either artificial or realistic in videos. The logical approach to panning is to take the shortest possible route from point A to point B. This results in a single, robotic motion along a straight line from start to end, maintained at the same speed throughout. Johan explains why this, in fact, is not how it works in real life and how Imint is addressing it:

'Take a closer look at a movie or watch a professional cameraperson at work. They are trained to follow the leading lines of the scenery, such as the horizon. It also feels less robotic if you make the accelerations and decelerations softer and keep the panning level even along the horizon. Considering how a human naturally turns their head, all of this feels more realistic. Our next-gen video stabilization algorithms now support horizontal panning, and in the future, AI could enable us to identify and follow other leading lines,' says Johan.
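To make the contrast concrete, here is a minimal, hypothetical sketch in Python (not Vidhance's actual algorithm or API; the function names and parameters are purely illustrative) of how a constant-speed pan differs from one with softened acceleration and deceleration that stays level along the horizon:

```python
import numpy as np

def linear_pan(start_deg, end_deg, num_frames):
    # "Robotic" pan: shortest route from A to B at constant angular speed.
    return np.linspace(start_deg, end_deg, num_frames)

def eased_pan(start_deg, end_deg, num_frames):
    # More natural pan: smoothstep easing softens the acceleration and
    # deceleration, similar to how a cameraperson (or a turning head)
    # ramps a pan in and out.
    t = np.linspace(0.0, 1.0, num_frames)
    ease = t * t * (3.0 - 2.0 * t)  # smoothstep: zero angular velocity at both ends
    return start_deg + (end_deg - start_deg) * ease

# Example: a 90-degree pan over 3 seconds at 30 fps.
# Only yaw changes; pitch and roll stay at zero so the pan remains
# level along the horizon, as described above.
frames = 90
yaw_robotic = linear_pan(0.0, 90.0, frames)
yaw_natural = eased_pan(0.0, 90.0, frames)
```

Plotting the two yaw curves shows the difference: the linear version starts and stops abruptly, while the eased version accelerates and decelerates gently, which is part of what makes footage feel as if it were shot by a human.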

The journey from pixel-perfect stabilization to a realistic user experience

Video stabilization testing has traditionally had both subjective and objective components, with the lab-based objective side normally taking precedence. But what if a smartphone camera deliberately reduces stabilization strength to create a more realistic experience? If users prefer this camera over one that scores higher in objective testing, then the subjective side of testing should take this user experience into account to provide a fair score. Johan sheds light on this shift in priorities from pixel-perfect stabilization to a realistic user experience:

'We've worked with some of the leading smartphone hardware manufacturers on the journey toward achieving pixel-perfect stabilization. Now that this goal has largely been achieved, some may see it as the end of the road - but not us. The journey ahead will transcend pixel-perfect and focus more on a realistic user experience supported by increasingly advanced AI capabilities. This paradigm shift will empower everyday consumers to film professional-quality video. We're aiming to get out in front of it with our next-gen Vidhance, and other video stabilization stakeholders, like test institutes, will want to hop on board this train before it's too late,' says Johan.

Let's redefine video stabilization testing together

At Imint, we are leading the transition to a new paradigm of how video stabilization is defined, tested, optimized and used. We're working on a new generation of our world-leading video enhancement platform Vidhance to create more realistic video experiences and enable more flexible optimizations. Learn more in our guide, 'Redefining video stabilization quality'.

Nobody yet has all the answers to questions like how video stabilization test criteria should be adapted, but let's work on them together. We want to share our knowledge and help craft meaningful test criteria for next-gen video stabilization. Contact Johan to continue the dialogue. For inspiration, insights and best practices for the next generation of video stabilization, subscribe to our newsletter.

