ABSTRACT
Head-mounted Mixed Reality (MR) systems enable touch interaction on any physical surface. However, optical methods (i.e., with cameras on the headset) have difficulty determining touch contact accurately. We show that a finger ring with an Inertial Measurement Unit (IMU) can substantially improve the accuracy of contact sensing, from 84.74% to 98.61% (F1 score), with a low latency of 10 ms. We tested different ring-wearing positions and tapping postures (e.g., with different fingers and finger parts). Results show that an IMU-based ring worn on the proximal phalanx of the index finger can accurately sense touch contact across most usable tapping postures. Participants also preferred wearing a ring, reporting a better user experience. Our approach can be combined with optical touch sensing to provide robust, low-latency contact detection.
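The core idea above — that the impact of a fingertip on a surface produces a sharp acceleration transient measurable by a finger-worn IMU — can be illustrated with a minimal sketch. The code below is not the paper's method; it is a simplified, hypothetical detector that thresholds the derivative of the accelerometer magnitude (jerk) and debounces nearby spikes. The function name, sampling rate, and threshold value are all illustrative assumptions.

```python
import numpy as np

def detect_contact(accel, fs=1000.0, jerk_threshold=50.0):
    """Flag touch-contact onsets in a 3-axis accelerometer stream.

    accel: (N, 3) array of acceleration samples in m/s^2.
    fs: sampling rate in Hz (assumed).
    jerk_threshold: magnitude of the acceleration derivative (m/s^3)
        above which a sample counts as an impact (hypothetical value;
        a real system would calibrate this per user and posture).
    Returns indices of detected contact onsets.
    """
    mag = np.linalg.norm(accel, axis=1)    # acceleration magnitude
    jerk = np.abs(np.diff(mag)) * fs       # finite-difference jerk
    spikes = np.flatnonzero(jerk > jerk_threshold)
    # Debounce: keep only the first sample of each spike burst (~20 ms)
    onsets = [i for k, i in enumerate(spikes)
              if k == 0 or i - spikes[k - 1] > int(0.02 * fs)]
    return onsets
```

Because the detector works on a short window of raw samples, its latency is bounded by the debounce interval plus one sample period, which is consistent with the low-latency budget the abstract reports; a deployed system would additionally fuse this signal with optical tracking to reject spikes from mid-air finger motion.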