The Prediction of Saliency Map for Head and Eye Movements in 360 Degree Images
Zhu, Yucheng (1); Zhai, Guangtao (1); Min, Xiongkuo (1); Zhou, Jiantao (2)
Source Publication: IEEE Transactions on Multimedia
Abstract: By recording the whole scene around the capture device, virtual reality (VR) techniques can provide viewers with a sense of presence. To deliver a satisfactory quality of experience, there should be at least 60 pixels per degree, so the resolution of panoramas should reach 21600 × 10800. This huge amount of data places great demands on data processing and transmission. However, when exploring a virtual environment, viewers only perceive the content in the current field of view (FOV). Therefore, if we can predict head and eye movements, which are important viewer behaviors, more processing resources can be allocated to the active FOV. Conventional saliency prediction methods, however, are not fully adequate for panoramic images. In this paper, a new panorama-oriented model for predicting head and eye movements is proposed. Owing to the advantages of computation in the spherical domain, spherical harmonics are employed to extract features at different frequency bands and orientations. Related low- and high-level features, including rare components in the frequency and color domains, the difference between central and peripheral vision, visual equilibrium, person and car detection, and equator bias, are extracted to estimate saliency. To predict head movements, visual mechanisms including visual uncertainty and equilibrium are incorporated, and a graphical model and functional representation for the switch of head orientation are established. Extensive experimental results on a publicly available database demonstrate the effectiveness of our methods.
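The resolution figure quoted in the abstract follows from simple arithmetic: an equirectangular panorama spans 360° horizontally and 180° vertically, so at the stated threshold of 60 pixels per degree the required size is 360 × 60 by 180 × 60. A minimal sketch of this calculation (the function name is illustrative, not from the paper):

```python
def panorama_resolution(pixels_per_degree: int) -> tuple[int, int]:
    """Width x height (in pixels) of an equirectangular panorama
    covering 360 degrees horizontally and 180 degrees vertically
    at the given angular pixel density."""
    return 360 * pixels_per_degree, 180 * pixels_per_degree

# At the 60 pixels-per-degree threshold cited in the abstract:
width, height = panorama_resolution(60)
print(width, height)  # 21600 10800
```

At this resolution a single uncompressed 24-bit frame already exceeds 650 MB, which illustrates why restricting processing to the active FOV matters.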
Keywords: 360 degree; center and peripheral vision; head-eye motion; saliency; scanpath; spherical harmonics; VR
Citation statistics
Cited Times [WOS]: 6
Document Type: Journal article
Collection: University of Macau
Corresponding Author: Zhu, Yucheng
Affiliations:
1. Institute of Image Communication and Information Processing, Shanghai Jiao Tong University, Shanghai, China
2. State Key Laboratory of Internet of Things for Smart City, Department of Computer and Information Science, University of Macau, Macau, Macao
Recommended Citation
GB/T 7714: Zhu, Yucheng, Zhai, Guangtao, Min, Xiongkuo, et al. The Prediction of Saliency Map for Head and Eye Movements in 360 Degree Images[J]. IEEE Transactions on Multimedia, 2020, 22(9): 2331-2344.
APA: Zhu, Yucheng, Zhai, Guangtao, Min, Xiongkuo, & Zhou, Jiantao. (2020). The Prediction of Saliency Map for Head and Eye Movements in 360 Degree Images. IEEE Transactions on Multimedia, 22(9), 2331-2344.
MLA: Zhu, Yucheng, et al. "The Prediction of Saliency Map for Head and Eye Movements in 360 Degree Images." IEEE Transactions on Multimedia 22.9 (2020): 2331-2344.
