Facebook predicts where you’ll look to improve 360 video
When you stream a regular video, it’s hard enough to deliver high-quality visuals to your screen without stutters or drops in quality. Now imagine the even greater difficulty of streaming a 360-degree video: you don’t know where your viewer might be looking at any given time. That’s where Facebook’s new view-prediction systems come in.
The company already keeps regular video sharp with a process it calls dynamic streaming, which concentrates the most pixels in your current field of view. Doing this in a 360-degree video, however, means knowing where you’ll be looking at any given moment. That’s easy for a human, as we all tend to know where the most interesting stuff is, but tough for a computer.
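To make the idea concrete, here is a minimal sketch of viewport-adaptive tile selection, the general technique behind schemes like this: split an equirectangular frame into a grid of tiles and stream only the tiles near the viewer's gaze at high quality. The function name, grid size, and field-of-view threshold are all illustrative assumptions, not Facebook's actual implementation.

```python
import math

def select_high_res_tiles(yaw_deg, pitch_deg, tiles_x=8, tiles_y=4, fov_deg=100):
    """Pick which tiles of an equirectangular frame get high resolution.

    The frame is split into a tiles_x by tiles_y grid; tiles whose centers
    fall within roughly half the field of view of the gaze direction are
    streamed at high quality, the rest at low quality.
    (Hypothetical sketch -- not Facebook's actual algorithm.)
    """
    high_res = set()
    for ty in range(tiles_y):
        for tx in range(tiles_x):
            # Tile-center direction: yaw in [-180, 180), pitch in [-90, 90)
            tile_yaw = (tx + 0.5) / tiles_x * 360.0 - 180.0
            tile_pitch = 90.0 - (ty + 0.5) / tiles_y * 180.0
            # Shortest angular difference in yaw (wraps around the sphere)
            dyaw = (tile_yaw - yaw_deg + 180.0) % 360.0 - 180.0
            dpitch = tile_pitch - pitch_deg
            if math.hypot(dyaw, dpitch) <= fov_deg / 2:
                high_res.add((tx, ty))
    return high_res

# Viewer looking straight ahead: only the front-facing tiles are high-res.
front = select_high_res_tiles(yaw_deg=0.0, pitch_deg=0.0)
```

The catch, and the reason prediction matters, is that tiles take time to arrive over the network: the player has to request high-quality tiles for where the viewer *will* be looking, not where they are looking now.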
Facebook has put together three strategies to address this. A gravitational view-prediction model that uses physics-style math and heat maps of where viewers actually look has increased resolution on VR devices by up to 39 percent. An AI model can assist by “intuiting” the most interesting part of a video. The company is also testing a new encoding technique, called content-dependent streaming, that improves resolution on non-VR devices like your smartphone by up to 51 percent.
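As a loose illustration of the heat-map idea, a predictor might extrapolate the viewer's current head motion and then nudge the result toward regions that past viewers watched heavily, as if popular content exerted a pull on the viewport. Everything here, including the function name, the `(yaw, weight)` heat-map format, and the `pull` constant, is an assumed simplification for illustration, not the model Facebook describes.

```python
def predict_yaw(current_yaw, yaw_velocity, heatmap, dt=1.0, pull=0.2):
    """Guess where the viewer will be looking dt seconds from now.

    heatmap: list of (yaw_deg, weight) pairs marking historically popular
    regions of the video (an assumed, simplified form of a view heat map).
    The prediction extrapolates current motion, then nudges the result
    toward high-weight regions. (Hypothetical sketch, not Facebook's model.)
    """
    predicted = current_yaw + yaw_velocity * dt
    for region_yaw, weight in heatmap:
        # Shortest signed angular distance from the prediction to the region
        delta = (region_yaw - predicted + 180.0) % 360.0 - 180.0
        predicted += pull * weight * delta
    # Wrap the result back into [-180, 180)
    return (predicted + 180.0) % 360.0 - 180.0

# Viewer panning right at 10 deg/s, with one popular region at 30 degrees:
# the guess lands past the pure motion extrapolation, pulled toward 30.
guess = predict_yaw(current_yaw=0.0, yaw_velocity=10.0, heatmap=[(30.0, 1.0)])
```

The streaming pipeline could then feed the predicted gaze into its tile selection, so high-quality tiles are requested ahead of where the viewer's head is actually turning.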
Facebook says that you’ll be able to watch 360-degree video at higher resolution, even when network conditions are rough. If you want a much deeper dive into how Facebook achieves this, be sure to read the post detailing a ton of technical information on each of these new approaches. For the rest of us, though, perhaps we can look forward to higher-quality streams on our VR headsets and flat screens in the near future.