Abstract
In this paper, we propose a new mobile human-computer interaction system based on eye gestures. The system aims to control and command mobile devices through the eyes, providing intuitive communication with these devices and flexible use in any context the user may be in. It operates on a real-time video stream captured from the front-facing camera, without requiring any additional equipment. The algorithm first detects the user's face and then their eyes. Eye gesture recognition is based on a fuzzy inference system. We deployed the algorithm on an Android-based tablet and asked 8 volunteers to test it. The results obtained show that the system is promising and competitive.
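To illustrate the fuzzy-inference step described above, the following is a minimal sketch of how a horizontal eye gesture (left, center, right) might be classified from a normalized pupil offset using triangular membership functions. The membership breakpoints, labels, and the max-membership decision rule are illustrative assumptions, not the paper's actual rule base.

```python
def tri(x, a, b, c):
    """Triangular membership function: rises from a to peak at b, falls to c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def classify_gesture(offset):
    """Classify a pupil offset in [-1, 1] by its highest membership degree.

    Breakpoints below are hypothetical, chosen only for illustration.
    """
    degrees = {
        "left":   tri(offset, -1.5, -1.0, -0.2),
        "center": tri(offset, -0.5,  0.0,  0.5),
        "right":  tri(offset,  0.2,  1.0,  1.5),
    }
    return max(degrees, key=degrees.get)
```

In a full Mamdani-style system (as supported by the FuzzyLite library cited below), these degrees would feed a rule base and a defuzzification step; the max-membership shortcut here stands in for that pipeline.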
References
Bulling, A., Gellersen, H.: Toward mobile eye-based human-computer interaction. IEEE Pervasive Comput. 9, 8–12 (2010)
Pino, C., Kavasidis, I.: Improving mobile device interaction by eye tracking analysis. In: Federated Conference on Computer Science and Information Systems, pp. 1199–1202, Wroclaw (2012)
Vaitukaitis, V., Bulling, A.: Eye gesture recognition on portable devices. In: Proceedings of the ACM Conference on Ubiquitous Computing, pp. 711–714, United States (2012)
Viola, P., Jones, M.J.: Robust real-time face detection. Int. J. Comput. Vis. 57, 137–154 (2004)
Miluzzo, E., Wang, T., Campbell, A.T.: EyePhone: activating mobile phones with your eyes. In: Proceedings of the Second ACM SIGCOMM Workshop on Networking, Systems, and Applications on Mobile Handhelds, pp. 15–20, India (2010)
Wood, E., Bulling, A.: EyeTab: model-based gaze estimation on unmodified tablet computers. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 207–210, USA (2014)
Skodras, E., Fakotakis, N.: Precise localization of eye centers in low resolution color images. J. Image Vis. Comput. 36, 51–60 (2015)
Drewes, H., De Luca, A., Schmidt, A.: Eye-gaze interaction for mobile phones. In: International Conference on Mobile Technology, Applications and Systems, pp. 364–371, Singapore (2007)
Elleuch, H., Wali, A., Alimi, A.M.: Smart tablet monitoring by a real-time head movement, eye gestures recognition system. In: International Conference on Future Internet of Things and Cloud, pp. 393–398, Spain (2014)
Zadeh, L.A.: Fuzzy sets. Inf. Control 8(3), 338–353 (1965)
OpenCV library. http://opencv.org/platforms/android.html
FuzzyLite. http://www.fuzzylite.com/
Acknowledgments
The authors would like to acknowledge the financial support of this work by grants from the General Direction of Scientific Research (DGRST), Tunisia, under the ARUB program. The research and innovation work was performed in the framework of a MOBIDOC thesis financed by the EU under the PASRI program.
Copyright information
© 2016 Springer International Publishing AG
Cite this paper
Elleuch, H., Wali, A., Samet, A., Alimi, A.M. (2016). A Real-Time Eye Gesture Recognition System Based on Fuzzy Inference System for Mobile Devices Monitoring. In: Blanc-Talon, J., Distante, C., Philips, W., Popescu, D., Scheunders, P. (eds) Advanced Concepts for Intelligent Vision Systems. ACIVS 2016. Lecture Notes in Computer Science(), vol 10016. Springer, Cham. https://doi.org/10.1007/978-3-319-48680-2_16
DOI: https://doi.org/10.1007/978-3-319-48680-2_16
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-48679-6
Online ISBN: 978-3-319-48680-2
eBook Packages: Computer Science (R0)