DOI: 10.1145/3209219.3209267

keynote

Interpreting User Input Intention in Natural Human Computer Interaction

Published: 03 July 2018

ABSTRACT

Human Computer Interaction (HCI) is about the exchange of information between humans and computers. Interaction between users and computers occurs at the User Interface (UI). Now that computers have become pervasive and are embedded in everyday things, UIs are a main source of value-added competitive advantage, and they should be more natural for users. Natural User Interfaces (NUIs) expand input beyond formal devices such as the mouse and keyboard to increasingly natural forms of interaction such as touch, speech, gestures, handwriting, and vision. Unlike speech, handwriting, and vision, which have been researched for decades and have recently been put into practical use, touch and gestures are tied to specific interaction tasks and remain understudied. This talk will introduce methods for modelling user input actions from data with random noise, enabling fast touch input and natural gestures.
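The decoding theme of the talk, recovering the intended input from noisy touch observations, can be sketched with a simple Bayesian word decoder in the spirit of the eyes-free typing systems listed in the references below. The snippet is a minimal illustration, not the method presented in the talk: the key coordinates, the noise parameter SIGMA, and the tiny lexicon are made-up values, and an isotropic Gaussian touch-noise model around each key center is assumed.

```python
import math

# Hypothetical key centers on a QWERTY layout (units of one key width); illustrative only.
KEY_CENTERS = {
    'q': (0.5, 0.5), 'w': (1.5, 0.5), 'e': (2.5, 0.5), 'r': (3.5, 0.5),
    't': (4.5, 0.5), 'y': (5.5, 0.5), 'u': (6.5, 0.5), 'i': (7.5, 0.5),
    'o': (8.5, 0.5), 'p': (9.5, 0.5),
    'a': (1.0, 1.5), 's': (2.0, 1.5), 'd': (3.0, 1.5), 'f': (4.0, 1.5),
    'g': (5.0, 1.5), 'h': (6.0, 1.5), 'j': (7.0, 1.5), 'k': (8.0, 1.5),
    'l': (9.0, 1.5),
    'z': (2.0, 2.5), 'x': (3.0, 2.5), 'c': (4.0, 2.5), 'v': (5.0, 2.5),
    'b': (6.0, 2.5), 'n': (7.0, 2.5), 'm': (8.0, 2.5),
}

SIGMA = 0.6  # assumed standard deviation of touch noise, in key widths

def log_touch_likelihood(point, letter):
    """Log-likelihood of a touch point given the intended letter,
    under an isotropic Gaussian noise model centered on the key."""
    cx, cy = KEY_CENTERS[letter]
    dx, dy = point[0] - cx, point[1] - cy
    return -(dx * dx + dy * dy) / (2 * SIGMA ** 2)

def decode(touch_points, lexicon):
    """Return the lexicon word that best explains a sequence of noisy touches,
    combining the touch likelihood with a toy word-frequency prior."""
    best_word, best_score = None, float('-inf')
    for word, prior in lexicon.items():
        if len(word) != len(touch_points):
            continue
        score = math.log(prior)
        score += sum(log_touch_likelihood(p, ch) for p, ch in zip(touch_points, word))
        if score > best_score:
            best_word, best_score = word, score
    return best_word

# Example: three noisy touches near 't', 'h', 'e' decode to "the".
lexicon = {'the': 0.6, 'tie': 0.2, 'toe': 0.2}
touches = [(4.7, 0.8), (6.2, 1.3), (2.9, 0.6)]
print(decode(touches, lexicon))  # -> 'the'
```

The same maximum-a-posteriori structure, a touch likelihood combined with a language-model prior, underlies statistical decoding for soft keyboards; a practical system would replace the toy lexicon and the fixed noise parameter with models fitted to user data.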

References

  1. Yiqin Lu, Chun Yu, Xin Yi, Yuanchun Shi, Shengdong Zhao: BlindType: Eyes-Free Text Entry on Handheld Touchpad by Leveraging Thumb's Muscle Memory. IMWUT 1(2): 18:1--18:24 (2017)
  2. Weinan Shi, Chun Yu, Xin Yi, Zhen Li, Yuanchun Shi: TOAST: Ten-Finger Eyes-Free Typing on Touchable Surfaces. IMWUT 2(1): 33:1--33:23 (2018)
  3. Xin Yi, Chun Yu, Mingrui Zhang, Sida Gao, Ke Sun, Yuanchun Shi: ATK: Enabling Ten-Finger Freehand Typing in Air Based on 3D Hand Tracking Data. UIST 2015: 539--548
  4. Xin Yi, Chun Yu, Weinan Shi, Yuanchun Shi: Is it too small?: Investigating the performances and preferences of users when typing on tiny QWERTY keyboards. Int. J. Hum.-Comput. Stud. 106: 44--62 (2017)
  5. Chun Yu, Ke Sun, Mingyuan Zhong, Xincheng Li, Peijun Zhao, Yuanchun Shi: One-Dimensional Handwriting: Inputting Letters and Words on Smart Glasses. CHI 2016: 71--82
  6. Mingyuan Zhong, Chun Yu, Qian Wang, Xuhai Xu, Yuanchun Shi: ForceBoard: Subtle Text Entry Leveraging Pressure. CHI 2018: 528

Published in

UMAP '18: Proceedings of the 26th Conference on User Modeling, Adaptation and Personalization
July 2018, 393 pages
ISBN: 9781450355896
DOI: 10.1145/3209219
General Chairs: Tanja Mitrovic, Jie Zhang
Program Chairs: Li Chen, David Chin

      Copyright © 2018 Owner/Author

      Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery, New York, NY, United States


      Qualifiers

      • keynote

      Acceptance Rates

UMAP '18 Paper Acceptance Rate: 26 of 93 submissions, 28%
Overall Acceptance Rate: 162 of 633 submissions, 26%

