We're about to get super meta here. But not in the self-referential, 'that's so meta' way that entered the vernacular in the early 2010s, sometime after 'Inception' hit theaters. We're talking meta specifically in regard to metadata, i.e., data about data.

In early 2017, GoPro introduced GoPro Metadata Format, which open-sourced GoPro video metadata, sharing telemetry, temperature, image exposure and shutter speed, to name a few, in a dedicated MP4 'track' (much like an audio or video track within that file). In late 2017, we launched GoPro's first-ever custom processing chip, GP1, and HERO6 Black began to tap further into metadata capabilities.
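For the technically curious, here's a minimal sketch of how you might locate that metadata track in a clip. It assumes ffmpeg/ffprobe is installed and that the telemetry track carries the 'gpmd' codec tag used by GoPro's open-source GPMF tooling; the file name is hypothetical.

    import json
    import subprocess

    def find_gpmf_stream(path):
        """Return the index of the GPMF telemetry track in a GoPro MP4, or None."""
        # ffprobe lists every track (video, audio, data, ...) as a JSON "stream" entry.
        out = subprocess.run(
            ["ffprobe", "-v", "quiet", "-print_format", "json", "-show_streams", path],
            capture_output=True, text=True, check=True,
        ).stdout
        for stream in json.loads(out).get("streams", []):
            # Assumption: the telemetry data track is tagged with the 'gpmd' codec tag.
            if stream.get("codec_tag_string") == "gpmd":
                return stream["index"]
        return None

    if __name__ == "__main__":
        print(find_gpmf_stream("GOPR0001.MP4"))  # hypothetical file name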

And in 2018, *drumroll, please* we continued the meta momentum with HERO7 Black, the first GoPro with scene recognition as well as improved facial and smile recognition. These upgrades not only help with auto white balance (AWB), color rendering and color grading, but also improve the QuikStories auto-editing experience.

To help explain, we asked the GoPro experts for a relatable scenario in which this benefits users, and what better example than taking the waterproof HERO7 Black underwater.

The following words are by firmware architect Anandhakumar Chinnaiyan, who works out of the GoPro headquarters in San Mateo, Calif., and algorithm engineer Adrien Cariou, who works out of the GoPro Paris office.

Under the Sea: Metadata + Auto White Balance

Underwater photos and videos have been part of the GoPro DNA since Day 1. And starting with HERO5, we've been working toward making underwater footage easier than ever to capture. That means eliminating the need for color-correcting dive filters: less gear needed means less gear to forget.*

From an image engineer's point of view, the water acts like a blue filter, blocking most of the red light rays. This phenomenon intensifies with depth: The deeper you dive, the fewer red rays will hit your sensor. Dealing with this lack of red is one of the most challenging scenarios for our AWB algorithm to correct.
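As a rough illustration (and not GoPro's actual algorithm): red light underwater falls off roughly exponentially with depth, so a naive correction might boost the red channel by the inverse of that attenuation. The coefficients and gain cap below are invented ballpark values, purely to show the shape of the problem.

    import math

    # Illustrative Beer-Lambert-style attenuation: intensity ~ exp(-k * depth).
    # Red is absorbed far faster than green or blue; these coefficients are
    # made-up ballpark numbers, not measured or GoPro-calibrated values.
    ATTENUATION_PER_METER = {"r": 0.35, "g": 0.07, "b": 0.03}

    def channel_gains(depth_m, max_gain=8.0):
        """Naive per-channel white-balance gains to undo depth-dependent absorption."""
        gains = {}
        for ch, k in ATTENUATION_PER_METER.items():
            surviving = math.exp(-k * depth_m)          # fraction of light left at this depth
            gains[ch] = min(1.0 / surviving, max_gain)  # cap the boost to limit noise amplification
        return gains

    print(channel_gains(5.0))   # red gain grows quickly with depth
    print(channel_gains(10.0))  # around 10 m the cap kicks in, hence the dive-filter pro tip below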

Historically, AWB was applied after analyzing the color of the scene, but this approach can lead to a tricky defect called 'metamerism.' Metamerism is when a color is perceived in different ways depending on the light source. For example, a yellow color swatch in sunlight can appear to be the exact same color as a green color swatch under a warm artificial light source. This is simply due to the human brain and its interpretation, and we rarely (if ever) stop to consider how our brains, consciously and unconsciously, interpret color differently depending on ambient light sources.

When the human brain is removed from the equation and a machine is asked to make these distinctions, things get trickier. AWB can't recognize metamerism, which makes it particularly difficult to color-correct underwater scenes proactively with AWB and Auto Exposure.

Enter HERO7 Black armed with the GP1 chip and advanced metadata capabilities. Now, with this GoPro, as you reach lower depths and less light is available, image analysis and metadata influence AWB and Auto Exposure to deliver color-corrected content straight from the camera.

While this underwater scene is a favorite example here at GoPro, it's actually only one of six scenes that HERO7 Black metadata can differentiate. The others are indoor, urban, beach, snow and vegetation.

These scene classifications are baked into the AWB algorithm using metadata and rely on analyzing global parameters, such as exposure conditions, to deliver more accurate, natural colors straight from the GoPro.
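To make the idea concrete, here's a hedged sketch of how a scene label plus a global exposure cue could bias white-balance gains. The scene names follow the six classes mentioned above, but the gain values, the exposure_ev parameter and the blending heuristic are purely illustrative, not GoPro's firmware.

    # Purely illustrative per-scene white-balance gain priors (r, g, b).
    # The scene list matches the article; the numbers are invented placeholders.
    SCENE_WB_PRIORS = {
        "underwater": (2.2, 1.0, 0.8),
        "indoor":     (1.1, 1.0, 1.3),
        "urban":      (1.0, 1.0, 1.1),
        "beach":      (0.9, 1.0, 1.2),
        "snow":       (0.95, 1.0, 1.15),
        "vegetation": (1.05, 1.0, 1.0),
    }

    def awb_gains(scene, exposure_ev, blend=0.5):
        """Blend a neutral estimate with a scene prior; lean on the prior more in dim scenes."""
        prior = SCENE_WB_PRIORS.get(scene, (1.0, 1.0, 1.0))
        # In low light (low EV) the scene's color statistics are less trustworthy,
        # so weight the scene prior more heavily; this weighting is a toy heuristic.
        weight = blend if exposure_ev > 8 else min(1.0, blend + 0.3)
        return tuple(1.0 * (1 - weight) + p * weight for p in prior)

    print(awb_gains("underwater", exposure_ev=6))  # deep and dim: the prior dominates, red gets boosted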

Check out HERO7 Black here, and read more about open-sourced metadata here.

*Pro Tip Deep Dive: For all you Scuba Steves and Scuba Sallys out there who are diving to depths deeper than 10 meters, there are many variables that we cannot always predict and correct for: water color, changing light conditions and transmission. In this case, using a dive filter may still be necessary.
