Perceptually-efficient streaming of 360-degree edited video

Grant number: 18/23086-1
Support Opportunities: Regular Research Grants
Start date: December 01, 2020
End date: November 30, 2022
Field of knowledge: Engineering - Electrical Engineering - Telecommunications
Agreement: MCTI/MC
Principal Investigator: Marcelo Menezes de Carvalho
Grantee: Marcelo Menezes de Carvalho
Host Institution: Faculdade de Tecnologia. Universidade de Brasília (UNB)
Associated researchers: Mylene Christine Queiroz de Farias; Ravi Prakash
Associated scholarship(s): 21/05972-7 - Identifying natural scene cuts in 360-degree videos using a hybrid saliency computational model, BP.TT
20/05561-4 - Adaptive bit rate scheme for 360-degree edited videos, BP.TT

Abstract

The main goal of this project is to develop a new adaptive bit rate (ABR) algorithm for the efficient transmission of omnidirectional (360-degree) video over the Internet. In particular, the proposed ABR algorithm will be tailored to the streaming of edited 360-degree videos (e.g., TV shows, films), in which shots are separated by scene cuts. Today, most research work seems to assume that a 360-degree video is always a single continuous shot recorded by an omnidirectional camera. In this project, we anticipate new applications and uses of 360-degree video in which editing is exploited to create 360-degree videos with novel immersive experiences. The editing of 360-degree video therefore needs to be studied in its own right, and in this project we approach the editing of cuts using saliency models and the detection of points of interest in video sequences. By taking into account the detected points of interest and the saliency features of the images, and their impact on the overall perceptual quality of the consumed video, video editing can be done more intelligently, in a way that facilitates the work of the ABR scheme at the client side. In particular, by using the realigned field of view (FoV) at scene cuts, the pre-fetching of a video sequence can be anticipated more efficiently, easing the work of the ABR algorithm. The proposed ABR scheme will thus exploit pre-fetching both at scene cuts and between scene cuts (while the user is gazing through the scene). Moreover, besides the temporal dimension, the proposed ABR scheme will exploit the spatial dimension of 360-degree video through a perceptual quality-based approach not previously considered in the literature. At the end of this project, we intend to deliver a rich dataset of edited videos with accompanying psychophysical experiments that take saliency models and point-of-interest detection into account. We also plan to make our ABR algorithm for edited 360-degree video streaming available. This project involves collaborative work with two associated researchers, including a faculty member of the University of Texas at Dallas. (AU)
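
As an illustration of how such a client-side scheme might operate, the sketch below shows a greedy, tile-level bit rate allocator for a single 360-degree segment: tiles that fall inside the predicted field of view (e.g., realigned toward the most salient region at a scene cut) are upgraded first, subject to an estimated bandwidth budget. This is only a minimal sketch under assumed inputs (per-tile saliency scores, an FoV membership flag, and a hypothetical bit rate ladder); it is not the project's actual ABR algorithm.

# Illustrative sketch only; all names, parameters, and the bit rate ladder are hypothetical.
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Tile:
    tile_id: int
    saliency: float          # normalized saliency score in [0, 1]
    in_fov: bool             # inside the (realigned) field of view?
    ladder_kbps: List[int]   # available bit rates for this tile, ascending

def allocate_bitrates(tiles: List[Tile], budget_kbps: float) -> Dict[int, int]:
    """Greedy allocation: start every tile at its lowest rung, then spend the
    remaining budget on upgrades, ordered by FoV membership and saliency."""
    choice = {t.tile_id: 0 for t in tiles}                  # ladder index per tile
    spent = sum(t.ladder_kbps[0] for t in tiles)

    # Upgrade FoV tiles before out-of-view tiles; within each group, most salient first.
    order = sorted(tiles, key=lambda t: (not t.in_fov, -t.saliency))
    upgraded = True
    while upgraded:
        upgraded = False
        for t in order:
            i = choice[t.tile_id]
            if i + 1 < len(t.ladder_kbps):
                extra = t.ladder_kbps[i + 1] - t.ladder_kbps[i]
                if spent + extra <= budget_kbps:
                    choice[t.tile_id] = i + 1
                    spent += extra
                    upgraded = True
    return choice

if __name__ == "__main__":
    ladder = [300, 800, 2000]  # kbps per tile representation (hypothetical)
    tiles = [Tile(i, saliency=s, in_fov=f, ladder_kbps=ladder)
             for i, (s, f) in enumerate([(0.9, True), (0.6, True), (0.2, False), (0.1, False)])]
    # With a 4600 kbps budget, the most salient in-FoV tile reaches the top rung first.
    print(allocate_bitrates(tiles, budget_kbps=4600))

In a deployed scheme, the bandwidth budget would come from a throughput estimator and the FoV prediction from head-motion and saliency models; the sketch only captures the prioritization idea described above.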
