
Study on intelligent autonomous navigation of avatar using hand gesture recognition

J.-S. Kim, K.-H. Park, J.-B. Kim, J.-H. Do, K.-J. Song, Z. Bien
SMC 2000 Conference Proceedings. 2000 IEEE International Conference on Systems, Man and Cybernetics. 'Cybernetics Evolving to Systems, Humans, Organizations, and their Complex Interactions' (Cat. No.00CH37166)  
In this paper, we present a real-time hand gesture recognition system that controls the motion of a human avatar based on pre-defined dynamic hand gestures in a virtual environment.  ...  To resolve this problem, we propose a new recognition method using intelligent techniques.  ...  features of hand gestures and a motion control algorithm of a human avatar using the recognition results.  ... 
doi:10.1109/icsmc.2000.885955 dblp:conf/smc/KimPKDSB00 fatcat:tj55a6bfsfdtvhbjarfrodyv54
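
The Kim et al. entry above describes recognized dynamic hand gestures driving the motion of an avatar. Purely as an illustration (not the paper's method), the sketch below shows the final mapping stage in Python, where a hypothetical recognizer label and confidence are turned into an avatar velocity command; the gesture labels and the `AvatarCommand` structure are assumptions.

```python
from dataclasses import dataclass

@dataclass
class AvatarCommand:
    """Velocity command for the avatar: forward speed (m/s) and turn rate (rad/s)."""
    forward: float
    turn: float

# Hypothetical mapping from recognized dynamic-gesture labels to motion commands.
# The label set is illustrative; the paper's actual gesture vocabulary may differ.
GESTURE_TO_COMMAND = {
    "swipe_forward": AvatarCommand(forward=1.0, turn=0.0),
    "swipe_back":    AvatarCommand(forward=-0.5, turn=0.0),
    "circle_left":   AvatarCommand(forward=0.0, turn=0.6),
    "circle_right":  AvatarCommand(forward=0.0, turn=-0.6),
    "open_palm":     AvatarCommand(forward=0.0, turn=0.0),  # stop
}

def command_for(label: str, confidence: float, threshold: float = 0.7) -> AvatarCommand:
    """Return a motion command for a recognized gesture, standing still when unsure."""
    if confidence < threshold or label not in GESTURE_TO_COMMAND:
        return AvatarCommand(0.0, 0.0)  # fail safe: no motion
    return GESTURE_TO_COMMAND[label]

if __name__ == "__main__":
    print(command_for("swipe_forward", 0.92))   # move forward
    print(command_for("circle_left", 0.40))     # below threshold -> stop
```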

Special Section: Best Papers from the 19th ACM Symposium on Virtual Reality Software and Technology (VRST 2013) Guest Editors' Introduction

Nadia Magnenat Thalmann, Daniel Thalmann
2014 Presence - Teleoperators and Virtual Environments  
The third paper, "A Study on High-Level Autonomous Navigational Behaviors for Telepresence Applications," by Wee Ching Pang, Gerald Seet, and Xiling Yao, presents a framework enabling navigational autonomy  ...  We envision that, in the absence of the real users (when they have to leave or do not want to perform a repetitive task), control of the robots can be handed to an artificial intelligence component  ... 
doi:10.1162/pres_e_00175 fatcat:kzdsbalpnzgnzevq7bxlbjjrni

Engagement with Artificial Intelligence through Natural Interaction Models

Sara Feldman, Steve DiPaola, Özge Nilay Yalçin
2017 EVA London 2017  
Our proposed conversational agent system makes use of the affective signals from the gestural behaviour of the user and the semantic information from the speech input in order to generate a personalised, human-like conversation that is expressed in the visual and conversational output of the 3D virtual avatar system.  ...  Our 3D conversational agent uses real-time, high-end (game-level) 3D rendering and animation in the form of natural speech lip sync, facial expressions and hand, arm and body gestures.  ... 
doi:10.14236/ewic/eva2017.60 dblp:conf/eva/FeldmanDY17 fatcat:6pazi3xtmjfazoc4znefkp273q
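
The Feldman, DiPaola, and Yalçin entry combines affective signals from gesture with semantic information from speech to shape the avatar's response. The following minimal sketch is one guess at how such a late fusion step could look; the valence/arousal fields, intent labels, and response templates are invented for illustration and are not taken from the paper.

```python
from dataclasses import dataclass

@dataclass
class GestureAffect:
    valence: float  # -1 (negative) .. +1 (positive), e.g. from gesture openness
    arousal: float  #  0 (calm)     .. 1 (excited), e.g. from gesture energy

@dataclass
class SpeechSemantics:
    intent: str   # e.g. "greet", "ask_help", "complain" (illustrative labels)
    text: str

def plan_response(affect: GestureAffect, semantics: SpeechSemantics) -> dict:
    """Pick wording and nonverbal style for the avatar from both input channels."""
    # Semantic channel selects *what* to say (hypothetical templates).
    templates = {
        "greet": "Hello! Nice to see you.",
        "ask_help": "Of course, let me walk you through it.",
        "complain": "I'm sorry about that. Let's fix it together.",
    }
    reply = templates.get(semantics.intent, "Could you tell me more?")

    # Affective channel selects *how* to say it: gesture amplitude, face, tempo.
    style = {
        "gesture_amplitude": 0.3 + 0.7 * affect.arousal,   # livelier user, livelier avatar
        "facial_expression": "smile" if affect.valence >= 0 else "concern",
        "speech_rate": 1.0 + 0.2 * (affect.arousal - 0.5),
    }
    return {"text": reply, "style": style}

if __name__ == "__main__":
    out = plan_response(GestureAffect(valence=-0.4, arousal=0.8),
                        SpeechSemantics(intent="complain", text="this keeps failing"))
    print(out)
```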

A Framework for Human-like Behavior in an immersive virtual world

Fons Kuijk, Sigurd Van Broeck, Claude Dareau, Brian Ravenet, Magalie Ochs, Konstantinos Apostolakis, Petros Daras, David Monaghan, Noel E O'Connor, Julie Wall, Ebroul Izquierdo
2013 18th International Conference on Digital Signal Processing (DSP)  
Rather than just being visualized in a 3D space, the virtual characters (autonomous agents as well as avatars representing users) in the immersive environment facilitate social interaction and multi-party  ...  to their expectations, based on their lifelong observations in the real world.  ...  ranging from transformation of data up to gesture recognition; note: it does not include interpretation.  ... 
doi:10.1109/icdsp.2013.6622826 dblp:conf/icdsp/KuijkBDROADMOWI13 fatcat:aryka2lofvcaznosez2qlydoqe

Artificial Intelligence Synergetic Opportunities in Services: Conversational Systems Perspective

Shai Rozenes, Yuval Cohen
2022 Applied Sciences  
Two case studies of service systems are presented to illustrate the importance of synergy.  ...  The paper shows that the literature related to the use of AI in service is divided into independent knowledge domains (silos) that are either related to the technology under consideration, or to a small  ...  An advanced chatbot with the abilities of face recognition and hand gesture recognition is described in Gopinath et al. [67].  ... 
doi:10.3390/app12168363 fatcat:6uptcmn2wbdgzgknzvcimuhrru

SIGVerse: A Cloud-Based VR Platform for Research on Multimodal Human-Robot Interaction

Tetsunari Inamura, Yoshiaki Mizuchi
2021 Frontiers in Robotics and AI  
validated the system's usefulness and its potential for the development and evaluation of social intelligence via multimodal HRI.  ...  interface for robot/avatar teleoperations.  ...  The gesture recognition functions required for intelligent robots that work with people include not only label recognition but also following a moving human and observing the gesture from an easy-to-view  ... 
doi:10.3389/frobt.2021.549360 pmid:34136534 pmcid:PMC8202404 fatcat:5nibhsqggzhbbg2kfxw7hnf4oy

Telepresence Social Robotics towards Co-Presence: A Review

Luis Almeida, Paulo Menezes, Jorge Dias
2022 Applied Sciences  
This work presents a literature review on developments supporting robotic social interactions, contributing to improving the sense of presence and co-presence via robot mediation.  ...  This survey aims to define social presence and co-presence, identify autonomous "user-adaptive systems" for social robots, and propose a taxonomy for "co-presence" mechanisms.  ... 
doi:10.3390/app12115557 fatcat:dn3tvcwbrfhjfnpzlr6ui3onqi

A Holographic Infotainment System For Connected And Driverless Cars: An Exploratory Study Of Gesture Based Interaction

Nicholas Lambert, Seungyeon Ryu, Mehmet Mulla, Albert Kim
2018 Zenodo  
The research focuses on the development of interactive avatars for this system and its gesture-based control system.  ...  This is a case study for the development of a possible human-centred means of presenting a connected or autonomous vehicle's On-Board Diagnostics through a projected 'holographic' infotainment system.  ...  Enable-Disable were also rated for Comfort, Ease of Use, Accuracy, and Memorability. Study Two: This examined the qualitative aspects of hand gestures being used in a vehicle environment.  ... 
doi:10.5281/zenodo.1474994 fatcat:7b5uj4z6qfhk7ok2uty4m575wm

A Perspective on Robotic Telepresence and Teleoperation using Cognition: Are we there yet? [article]

Hrishav Bakul Barua, Ashis Sau, Ruddra dev Roychoudhury
2022 arXiv   pre-print
Intelligent robotic systems are being deployed both in industrial and domestic environments. Telepresence is the idea of being present in a remote location virtually or via robotic avatars.  ...  With the Artificial Intelligence (AI) revolution already underway, we can see a wide range of robotic applications being realized.  ...  The right-hand side image shows a similar graph for different use cases. Fig. 4: Full-scale image of the robotic avatar "Asha" for Telepresence and Teleoperation.  ... 
arXiv:2203.02959v1 fatcat:yys66jpjsbc7lanhovzf7vjxzi

SIGVerse: A cloud-based VR platform for research on social and embodied human-robot interaction [article]

Tetsunari Inamura, Yoshiaki Mizuchi
2020 arXiv   pre-print
Through demonstration experiments at the competition, we show the usefulness and potential of the system for the development and evaluation of social intelligence through human-robot interaction.  ...  The platform also contributes by providing a dataset of social behaviors, which would be a key aspect for intelligent service robots to acquire social interaction skills based on machine learning techniques  ... 
arXiv:2005.00825v1 fatcat:ub77g6whcrg4jam44jd354lqly

Robot Avatar: A Virtual Tourism Robot for People with Disabilities

Chong Wing Cheung, Tai Ip Tsang, Kin Hong Wong
2017 Journal of Clean Energy Technologies  
We also propose to design a hand gesture recognition system with which the user can apply very simple gestures to control the motion of the robot and select his/her favorite views.  ...  Currently, the robot and finger gesture recognition system have been built and tested successfully. Data analysis of the video latency from the robot to the user has also been carried out.  ...  It includes a remote-control robot and a visual recognition system through which the user can control the robot using simple hand gestures.  ... 
doi:10.7763/ijcte.2017.v9.1143 fatcat:nw3vvahwpzfijm2ydfh2zci3oa
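
Cheung, Tsang, and Wong report analyzing video latency from the robot to the user. As a rough illustration of one common way such latency is estimated (timestamping frames at the sender and comparing on receipt, assuming roughly synchronized clocks), here is a short sketch; the frame payload format and function names are assumptions, not the authors' implementation.

```python
import time
from statistics import mean

def stamp_frame(frame_bytes: bytes) -> dict:
    """Robot side: attach a capture timestamp before sending a video frame."""
    return {"t_sent": time.time(), "frame": frame_bytes}

def measure_latency(packets) -> float:
    """User side: average one-way latency in milliseconds.

    Assumes the robot and user clocks are synchronized (e.g. via NTP);
    otherwise this measures clock offset plus transmission latency.
    """
    delays = [(time.time() - p["t_sent"]) * 1000.0 for p in packets]
    return mean(delays)

if __name__ == "__main__":
    # Simulate three frames that took roughly 120 ms to arrive.
    sent = [stamp_frame(b"\x00" * 10) for _ in range(3)]
    time.sleep(0.12)
    print(f"mean one-way latency: {measure_latency(sent):.1f} ms")
```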

Conferring human action recognition skills to life-like agents

Luc Emering, Ronan Boulic, Daniel Thalmann
1999 Applied Artificial Intelligence  
In this paper we present one skill: human action recognition. In contrast to human-computer interfaces (HCI) that focus on speech or hand gestures, we propose a full-body integration of the user.  ...  Most of today's virtual environments are populated with some kind of autonomous life-like agents.  ... 
doi:10.1080/088395199117379 fatcat:c6qudsdqmfeflk5s5ltvkl74fe

Artificial Techniques for Language Disorders

Eugenia Gkeka, Eleni Agorastou, Athanasios Drigas
2019 International Journal of Recent Contributions from Engineering, Science & IT  
This review focuses on artificial techniques, including artificial intelligence techniques and applications, robot technology, and serious games, that support the procedure of learning and teaching of language disorders and deficits as well as the evolution of speech.  ...  (AI) has its limitations as long as it is focused on a specific task such as robots, pattern recognition, voice, images, translation robots, or autonomous robots in delivery systems.  ... 
doi:10.3991/ijes.v7i4.11845 fatcat:qlwpdmubarhj5lxa4i4twzvmjq

A Study on High-Level Autonomous Navigational Behaviors for Telepresence Applications

Wee Ching Pang, Gerald Seet, Xiling Yao
2014 Presence - Teleoperators and Virtual Environments  
This paper discusses the development of higher-level, human-like navigational behaviors such as following, accompanying, and guiding a person autonomously.  ...  This allows the inhabitor to focus on interactions at the remote environment, rather than being engrossed in controlling robot navigation.  ...  On the other hand, no collisions occurred when the robotic avatar was operated in the autonomous configuration.  ... 
doi:10.1162/pres_a_00178 fatcat:5iyyezxudfevrdbojk2qkr6jfm
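
Pang, Seet, and Yao's paper concerns high-level behaviors such as following, accompanying, and guiding a person. Purely to illustrate the "follow" idea (hold a set distance behind a tracked person), here is a toy proportional controller in Python; the gains, velocity limits, and sensor conventions are assumptions and are unrelated to the authors' actual controller.

```python
import math

def follow_step(person_range_m: float, person_bearing_rad: float,
                desired_gap_m: float = 1.2,
                k_lin: float = 0.8, k_ang: float = 1.5,
                v_max: float = 0.7, w_max: float = 1.0):
    """One control step of a toy person-following behavior.

    person_range_m / person_bearing_rad: tracked person's position relative
    to the robot (range in meters, bearing in radians, 0 = straight ahead).
    Returns (linear_velocity, angular_velocity), both clamped to robot limits.
    """
    # Drive forward proportionally to how far we are beyond the desired gap.
    v = k_lin * (person_range_m - desired_gap_m)
    # Turn toward the person.
    w = k_ang * person_bearing_rad
    # Clamp to the robot's velocity limits.
    v = max(-v_max, min(v_max, v))
    w = max(-w_max, min(w_max, w))
    return v, w

if __name__ == "__main__":
    # Person is 3 m ahead, slightly to the left (10 degrees).
    print(follow_step(3.0, math.radians(10)))
```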

A Multimodal User Interface for an Assistive Robotic Shopping Cart

Dmitry Ryumin, Ildar Kagirov, Alexandr Axyonov, Nikita Pavlyuk, Anton Saveliev, Irina Kipyatkova, Milos Zelezny, Iosif Mporas, Alexey Karpov
2020 Electronics  
Among the main topics covered in this paper are the presentation of the interface (three modalities), the single-handed gesture recognition system (based on a collected database of Russian sign language  ...  The use of multimodal interfaces, namely the speech and gesture modalities, makes human-robot interaction natural and intuitive, while sign language recognition allows hearing-impaired people to use  ... 
doi:10.3390/electronics9122093 fatcat:mbczgqcvjrftpesvnkd23cl2ee
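
Ryumin et al. describe an interface in which speech and single-handed gestures both issue commands to the robotic shopping cart. Below is a minimal sketch of one way to arbitrate between two such channels; the command vocabulary, confidence fields, and tie-breaking rule are illustrative assumptions, not the interface described in the paper.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ModalityInput:
    command: Optional[str]  # e.g. "follow_me", "stop", "go_to_dairy" (illustrative)
    confidence: float       # recognizer confidence in [0, 1]

def fuse_commands(speech: ModalityInput, gesture: ModalityInput,
                  min_conf: float = 0.6) -> Optional[str]:
    """Late-fusion arbitration between speech and gesture commands.

    If both channels agree, accept the command. If they disagree,
    take the more confident one, provided it clears the threshold.
    """
    candidates = [m for m in (speech, gesture)
                  if m.command is not None and m.confidence >= min_conf]
    if not candidates:
        return None                      # nothing reliable recognized
    if len(candidates) == 2 and candidates[0].command == candidates[1].command:
        return candidates[0].command     # channels agree
    return max(candidates, key=lambda m: m.confidence).command

if __name__ == "__main__":
    print(fuse_commands(ModalityInput("follow_me", 0.9),
                        ModalityInput("stop", 0.7)))   # -> "follow_me"
```
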
Showing results 1 — 15 out of 1,397 results