DML-iTrack-HDR
About the dataset
The DML-iTrack-HDR dataset is an eye-tracking dataset for HDR video. Ten HDR video
sequences were shown on a Dolby prototype HDR display, and the eye movements of the
viewers were recorded with a SensoMotoric Instruments (SMI) eye tracker. The videos
were watched by 18 individuals, and fixation density maps (FDMs) were obtained for
each frame.
The Dolby prototype HDR display used in our experiment consists of a 1024×768
projector at the back and a 40-inch full HD LCD panel at the front [1]. The
projector has a contrast ratio of 20000:1 and emits up to 15000 ANSI lumens. To
deliver an HDR viewing experience, the LCD panel is fed a color stream that is
calibrated against a luminance stream sent to the projector. The maximum brightness
achieved by this prototype HDR display is 2700 cd/m².
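The dual-modulation idea behind this display can be illustrated with a short sketch. The decomposition below is a simplified illustration under assumed parameters, not the actual driving algorithm of the Dolby prototype: it assumes the projector (backlight) carries roughly the square root of the target luminance and the LCD layer compensates for the remainder, in the spirit of [1].

```python
import numpy as np

def dual_modulation_split(hdr_rgb, peak_luminance=2700.0):
    """Illustrative split of a linear HDR frame (cd/m^2) into a
    projector (luminance) layer and an LCD (color) layer.

    hdr_rgb : H x W x 3 array of linear RGB values in cd/m^2.
    Returns (backlight, lcd), both normalized to [0, 1].
    """
    # Target luminance per pixel (Rec. 709 luma weights).
    luminance = hdr_rgb @ np.array([0.2126, 0.7152, 0.0722])

    # Backlight carries ~ the square root of the target luminance so that
    # neither modulator needs the full dynamic range on its own.
    backlight = np.sqrt(np.clip(luminance, 0.0, peak_luminance) / peak_luminance)

    # In the real system the backlight image is also low-pass filtered to
    # the projector's 1024x768 resolution; that step is omitted here.

    # The LCD layer compensates: target divided by the backlight luminance.
    lcd = hdr_rgb / (backlight[..., None] * peak_luminance + 1e-6)
    return backlight, np.clip(lcd, 0.0, 1.0)
```

The square-root split keeps each modulator within a modest contrast range while their product reproduces the full dynamic range, which is the general principle described in [1].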
HDR video sequences
The HDR video sequences used in the study were selected from three different
sources. Two videos are from the Technicolor HDR dataset used by the Joint
Collaborative Team on Video Coding (JCT-VC) [2]. They were created by Technicolor
using simultaneous low-exposure/high-exposure capture with a rig of two Sony F3 or
F56 cameras. The next six sequences were captured at Stuttgart Media University
(HdM) by Froehlich et al. [3], and the final two were captured at the University of
British Columbia with RED Scarlet-X cameras, which can capture a dynamic range of up
to 18 stops.
Content of the dataset
This dataset provides the following:
- Fixation positions.
- Fixation density maps. A continuous fixation density map (FDM) is computed by
convolving the fixation locations with a Gaussian filter; the sigma of the Gaussian
is most commonly set to approximately 1 degree of visual angle (a minimal sketch of
this computation follows the table below).
Video Clip | Source
Balloon | Technicolor [2]
Market | Technicolor [2]
Bistro01 | Froehlich et al. [3]
Bistro02 | Froehlich et al. [3]
Bistro03 | Froehlich et al. [3]
Carousel | Froehlich et al. [3]
Park | Froehlich et al. [3]
Fishing | Froehlich et al. [3]
Playground | University of British Columbia
Mainmall | University of British Columbia
Each clip entry is accompanied by a clip snapshot and an FDM image.
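As a rough illustration of how an FDM of this kind can be reproduced from the fixation positions, the following sketch accumulates the fixations of one frame into a hit map and blurs it with a Gaussian whose sigma corresponds to roughly 1 degree of visual angle. The frame size, viewing distance, and screen width below are assumed example values, not parameters taken from the dataset.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def fixation_density_map(fixations, frame_shape=(1080, 1920),
                         viewing_distance_cm=150.0, screen_width_cm=88.0):
    """Build a fixation density map (FDM) for one frame.

    fixations   : iterable of (x, y) fixation positions in pixels
    frame_shape : (height, width) of the frame -- assumed value
    Viewing distance and screen width are assumed example values used
    only to convert 1 degree of visual angle into pixels.
    """
    height, width = frame_shape

    # One degree of visual angle subtends 2 * d * tan(0.5 deg) cm at
    # viewing distance d; convert that to pixels for the Gaussian sigma.
    cm_per_degree = 2.0 * viewing_distance_cm * np.tan(np.deg2rad(0.5))
    sigma = cm_per_degree * (width / screen_width_cm)

    # Accumulate fixation hits on a per-pixel grid.
    fdm = np.zeros((height, width), dtype=np.float64)
    for x, y in fixations:
        col, row = int(round(x)), int(round(y))
        if 0 <= row < height and 0 <= col < width:
            fdm[row, col] += 1.0

    # Blur with a Gaussian of ~1 degree and normalize to [0, 1].
    fdm = gaussian_filter(fdm, sigma=sigma)
    if fdm.max() > 0:
        fdm /= fdm.max()
    return fdm
```

Per-frame maps produced this way are typically normalized so that they can be compared across frames or against the output of saliency models.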
Disclaimer
This dataset is free to use for any non-commercial purpose. Please cite the
following papers if you use our dataset:
E. Nasiopoulos, Y. Dong, and A. Kingstone, "Evaluation of High Dynamic Range Content Viewing Experience Using Eye-Tracking Data," 10th International Conference on Heterogeneous Networking for Quality, Reliability, Security and Robustness (QSHINE), Greece, Aug. 2014.
Y. Dong, E. Nasiopoulos, M. T. Pourazad, and P. Nasiopoulos, "High Dynamic Range Video Eye Tracking Dataset," 2nd International Conference on Electronics, Signal Processing and Communications (ESPCO), Greece, Nov. 2014.
A. Banitalebi-Dehkordi, Y. Dong, M. T. Pourazad, and P. Nasiopoulos, "A Learning-Based Visual Saliency Fusion Model for High Dynamic Range Video (LBVS-HDR)," 23rd European Signal Processing Conference (EUSIPCO), 2015.
Contact
To download the dataset, please contact Dr. Panos Nasiopoulos at
"panos [at] ece [dot] ubc [dot] ca".
References
[1] H. Seetzen, W. Heidrich, W. Stuerzlinger, G. Ward, L. Whitehead, M. Trentacoste, A. Ghosh and A. Vorozcovs, "High dynamic range display systems," ACM Transactions on Graphics (TOG), vol. 23, no. 3, pp. 760-768, 2004.