[1] SHI L, COPOT C, VANLANDUIT S. GazeEMD: detecting visual intention in gaze-based human-robot interaction[J]. Robotics, 2021, 10(2): 68.
[2] CHAUDHARY A K, NAIR N, BAILEY R J, et al. From real infrared eye-images to synthetic sequences of gaze behavior[J]. IEEE Transactions on Visualization and Computer Graphics, 2022, 28(11): 3948-3958.
[3] MIRSADIKOV A, GEORGE J F. Can you see me lying? investigating the role of deception on gaze behavior[J]. International Journal of Human-Computer Studies, 2023, 174: 103010.
[4] WANG S, OUYANG X, LIU T, et al. Follow my eye: using gaze to supervise computer-aided diagnosis[J]. IEEE Transactions on Medical Imaging, 2022, 41(7): 1688-1698.
[5] TU D, MIN X, DUAN H, et al. End-to-end human-gaze-target detection with transformers[C]//Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2022: 2192-2200.
[6] RECASENS A, KHOSLA A, VONDRICK C, et al. Where are they looking?[C]//Advances in Neural Information Processing Systems, 2015, 28.
[7] CHONG E, RUIZ N, WANG Y, et al. Connecting gaze, scene, and attention: generalized attention estimation via joint modeling of gaze and scene saliency[C]//Proceedings of the European Conference on Computer Vision, 2018: 383-398.
[8] CHONG E, WANG Y, RUIZ N, et al. Detecting attended visual targets in video[C]//Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2020: 5396-5406.
[9] FANG Y, TANG J, SHEN W, et al. Dual attention guided gaze target detection in the wild[C]//Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2021: 11390-11399.
[10] JIN T, YU Q, ZHU S, et al. Depth-aware gaze-following via auxiliary networks for robotics[J]. Engineering Applications of Artificial Intelligence, 2022, 113: 104924.
[11] LI Y, LIU M, REHG J M. In the eye of the beholder: gaze and actions in first person video[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2023, 45(6): 6731-6747.
[12] MIN K, CORSO J J. Integrating human gaze into attention for egocentric activity recognition[C]//Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision, 2021: 1069-1078.
[13] THAKUR S K, BEYAN C, MORERIO P, et al. Predicting gaze from egocentric social interaction videos and IMU data[C]//Proceedings of the 2021 International Conference on Multimodal Interaction, 2021: 717-722.
[14] TURKMEN R, NWAGU C, RAWAT P, et al. Put your glasses on: a voxel-based 3D authentication system in VR using eye-gaze[C]//Proceedings of the 2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops, 2023: 947-948.
[15] HU Z, YANG D, CHENG S, et al. We know where they are looking at from the RGB-D camera: gaze following in 3D[J]. IEEE Transactions on Instrumentation and Measurement, 2022, 71: 1-14.
[16] YANG X, XU F, WU K, et al. Gaze-aware graph convolutional network for social relation recognition[J]. IEEE Access, 2021, 9: 99398-99408.
[17] ZHUANG N, NI B, XU Y, et al. MUGGLE: multi-stream group gaze learning and estimation[J]. IEEE Transactions on Circuits and Systems for Video Technology, 2020, 30(10): 3637-3650.
[18] RECASENS A, VONDRICK C, KHOSLA A, et al. Following gaze in video[C]//Proceedings of the IEEE/CVF International Conference on Computer Vision, 2017: 1444-1452.
[19] LIAN D, YU Z, GAO S. Believe it or not, we know what you are looking at![C]//Proceedings of the 14th Asian Conference on Computer Vision, 2019: 35-50.
[20] MARIN-JIMENEZ M J, KALOGEITON V, MEDINA-SUAREZ P, et al. LAEO-Net: revisiting people looking at each other in videos[C]//Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2019: 3477-3485.
[21] SUMER O, GERJETS P, TRAUTWEIN U, et al. Attention flow: end-to-end joint attention estimation[C]//Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision, 2020: 3327-3336.
[22] FAN L, WANG W, HUANG S, et al. Understanding human gaze communication by spatio-temporal graph reasoning[C]//Proceedings of the IEEE/CVF International Conference on Computer Vision, 2019: 5724-5733.