DOI: 10.1145/3332165.3347947
Research Article

Accurate and Low-Latency Sensing of Touch Contact on Any Surface with Finger-Worn IMU Sensor

Published: 17 October 2019

ABSTRACT

Head-mounted Mixed Reality (MR) systems enable touch interaction on any physical surface. However, optical methods (i.e., cameras on the headset) have difficulty determining touch contact accurately. We show that a finger ring with an Inertial Measurement Unit (IMU) can substantially improve the accuracy of contact sensing, from 84.74% to 98.61% (F1 score), with a low latency of 10 ms. We tested different ring-wearing positions and tapping postures (e.g., with different fingers and finger parts). Results show that an IMU-based ring worn on the proximal phalanx of the index finger can accurately sense touch contact for most usable tapping postures. Participants also preferred wearing a ring, reporting a better user experience. Our approach can be combined with optical touch sensing to provide robust and low-latency contact detection.
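The abstract reports only the sensing result, not the detection algorithm itself, so the sketch below is purely illustrative rather than the authors' method: a minimal Python example, under the assumption that a ring-worn 3-axis accelerometer is streamed at a fixed rate and that a fingertip impact shows up as a sharp jump in acceleration magnitude. All names (Sample, TapDetector, JERK_THRESHOLD, REFRACTORY_S) and threshold values are hypothetical and not taken from the paper.

```python
# Illustrative sketch of IMU-based touch-contact detection (not the paper's algorithm).
# Idea: a fingertip impact produces a sharp acceleration transient that a ring-worn
# accelerometer can flag within a few samples, independent of any camera.

from dataclasses import dataclass
from math import sqrt

JERK_THRESHOLD = 40.0   # hypothetical per-sample change in |a| (m/s^2); tune per device
REFRACTORY_S = 0.05     # hypothetical: ignore new detections for 50 ms after a contact

@dataclass
class Sample:
    t: float            # timestamp in seconds
    ax: float
    ay: float
    az: float           # linear acceleration in m/s^2

class TapDetector:
    """Flags a touch contact when the acceleration magnitude jumps sharply."""

    def __init__(self):
        self.prev_mag = None
        self.last_contact_t = float("-inf")

    def update(self, s: Sample) -> bool:
        mag = sqrt(s.ax**2 + s.ay**2 + s.az**2)
        contact = False
        if self.prev_mag is not None:
            jump = abs(mag - self.prev_mag)
            if jump > JERK_THRESHOLD and (s.t - self.last_contact_t) > REFRACTORY_S:
                contact = True
                self.last_contact_t = s.t
        self.prev_mag = mag
        return contact

# Example: feed a synthetic 1 kHz stream with one impact spike at t = 10 ms.
if __name__ == "__main__":
    det = TapDetector()
    stream = [Sample(t=i * 0.001, ax=0.1, ay=0.0, az=9.8) for i in range(20)]
    stream[10] = Sample(t=0.010, ax=0.1, ay=0.0, az=60.0)  # impact transient
    for s in stream:
        if det.update(s):
            print(f"contact detected at t = {s.t * 1000:.0f} ms")
```

One plausible division of labor in the combined setup the abstract mentions is that the headset camera supplies the touch location while an IMU event of this kind gates the moment of contact, which is where the accuracy and latency benefit would come from.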


Supplemental Material

ufp8756pv.mp4 (mp4, 7 MB)
ufp8756vf.mp4 (mp4, 22.7 MB)
p1059-gu.mp4 (mp4, 509.6 MB)


Published in

UIST '19: Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology
October 2019, 1229 pages
ISBN: 9781450368162
DOI: 10.1145/3332165
Copyright © 2019 ACM


Publisher: Association for Computing Machinery, New York, NY, United States



Acceptance Rates

Overall acceptance rate: 842 of 3,967 submissions (21%)
