Projects:

Current Projects

Interactive Robot Control and Programming
Learning by Observation

Past Projects

Gesture Based Programming
Sensorimotor Primitives for Robot Skills
Iconic Robot Programming
Tactile/Haptic Interface


Interactive Robot Control and Programming

Interactive Multi-Modal Robot Programming

Video Sequence
 1. gesture recognition
 2. programming
 3. execution

Overview: 
As robots enter the human environment and come into contact with inexperienced users, they need to be able to interact with users in a multi-modal fashion; a keyboard and mouse are no longer acceptable as the only input modalities.
 
We are currently investigating a multi-modal interaction scenario in which robots are controlled through two-handed gestures and speech. Multi-modality comes in handy when the user needs to teach a new skill or compose programs out of basic skills known as primitives. We are using HTK (the Hidden Markov Model Toolkit), an off-the-shelf software component, for gesture and speech recognition. The mobile vacuum-cleaning robot Cye serves as a test-bed, but the framework is not restricted to any particular platform.
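
To make the composition idea concrete, here is a minimal sketch (in Python) of how recognized speech and gesture events might be fused into primitive invocations. The class and primitive names are illustrative, not part of HTK or of our actual framework:

  # Minimal sketch (not the actual system): pair a spoken keyword with the
  # most recent gesture to invoke a primitive on a mobile robot such as Cye.
  from dataclasses import dataclass
  from typing import Callable, Dict, Optional

  @dataclass
  class GestureEvent:
      label: str    # e.g. "point", as output by the HMM recognizer
      params: dict  # geometric payload, e.g. {"x": 1.2, "y": 0.4}

  class CommandFuser:
      def __init__(self) -> None:
          self.primitives: Dict[str, Callable[[GestureEvent], None]] = {}
          self.last_gesture: Optional[GestureEvent] = None

      def register(self, keyword: str, primitive: Callable[[GestureEvent], None]) -> None:
          self.primitives[keyword] = primitive

      def on_gesture(self, g: GestureEvent) -> None:
          self.last_gesture = g

      def on_speech(self, keyword: str) -> None:
          # Fuse the speech token with the latest gesture, then fire the primitive.
          if keyword in self.primitives and self.last_gesture is not None:
              self.primitives[keyword](self.last_gesture)
              self.last_gesture = None

  fuser = CommandFuser()
  fuser.register("go", lambda g: print(f"drive to ({g.params['x']}, {g.params['y']})"))
  fuser.on_gesture(GestureEvent("point", {"x": 1.2, "y": 0.4}))
  fuser.on_speech("go")  # -> drive to (1.2, 0.4)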
 
Representative Publication:
S. Iba, C. J. J. Paredis, and P. K. Khosla, "Interactive Multi-Modal Robot Programming," International Conference on Robotics and Automation (ICRA) 2002, Washington, D.C., pp. 161-168, May 11 - 15, 2002. (Finalist for the Anton Philips Best Student Paper Award)
 

Interactive Multi-Modal Programming for Robotic Arc Welding / Learning by Observation


Video [1,2,3]


Video [1]

Overview:
The aim of this experiment is to use multi-modal commands (hand gestures and voice) to perform an arc welding task, and to compare the result against the conventional programming method using a teach pendant. We investigated three different scenarios (a toy sketch of scenario 3 follows the list):
  1. Reduced sensor mode: only the acceleration and velocity of the hand are available (no absolute hand position).
  2. Full sensor mode: the absolute position of the hand is available.
  3. Learning by observation: a teach pendant is used to store points, but the system can infer new positions from similar, previously observed patterns.
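
As a rough illustration of scenario 3, the toy Python sketch below predicts the next stored point by matching the most recent displacement pattern against earlier ones. The real Predictive Robot Programming system uses probabilistic user models; this nearest-neighbor version is only a stand-in:

  # Toy sketch: continue a demonstrated pattern of teach-pendant points.
  import numpy as np

  def predict_next_point(points: np.ndarray, window: int = 2) -> np.ndarray:
      """points: (N, 3) stored positions; returns a predicted next position."""
      deltas = np.diff(points, axis=0)   # successive displacements
      recent = deltas[-window:].ravel()  # the pattern just demonstrated
      best, best_dist = None, np.inf
      # Find the earlier window whose displacement pattern is most similar.
      for i in range(len(deltas) - window):
          d = np.linalg.norm(deltas[i:i + window].ravel() - recent)
          if d < best_dist:
              best, best_dist = i, d
      if best is None:
          return points[-1]  # not enough history to predict
      # Reuse the displacement that followed the best-matching pattern.
      return points[-1] + deltas[best + window]

  # A repeated zig-zag weld pattern; the prediction continues it.
  pts = np.array([[0, 0, 0], [1, 1, 0], [2, 0, 0], [3, 1, 0], [4, 0, 0]], float)
  print(predict_next_point(pts))  # -> [5. 1. 0.]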
 
Representative Publications:
K.R. Dixon, M. Strand, and P.K. Khosla, "Predictive Robot Programming," In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2002), pp. 876-881, October, 2002.

K.R. Dixon and P.K. Khosla, "Unsupervised Model-Based Prediction of User Actions," Technical report, Carnegie Mellon University, Institute for Complex Engineered Systems, 2002.
 
Gesture Based Control of a Mobile Robot


Video [1,2,3]

Overview:
Compared to a mouse and keyboard, hand gestures can convey geometric and temporal data with a high degree of redundancy. They are rich in vocabulary while remaining intuitive to users. These features make hand gestures an attractive way to interact with robots. For this particular system, we experimented with single-handed gestures to control a single mobile robot, and developed a gesture spotting and recognition algorithm based on a Hidden Markov Model (HMM).
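
To give a flavor of HMM-based gesture spotting, here is a minimal Python sketch using discrete observations; the actual system works on continuous hand-state features with trained models, so every model below is a toy stand-in. A gesture is spotted only when its model outscores a garbage (threshold) model:

  import numpy as np

  def forward_loglik(obs, pi, A, B):
      """Scaled forward algorithm: log P(obs | HMM).
      pi: (S,) initial probs, A: (S,S) transitions, B: (S,V) emissions."""
      alpha = pi * B[:, obs[0]]
      c = alpha.sum()
      log_p = np.log(c)
      alpha = alpha / c
      for o in obs[1:]:
          alpha = (alpha @ A) * B[:, o]
          c = alpha.sum()
          log_p += np.log(c)
          alpha = alpha / c
      return log_p

  def spot(obs, models, garbage, margin=0.0):
      """Best-scoring gesture label, or None if the garbage model wins."""
      scores = {name: forward_loglik(obs, *m) for name, m in models.items()}
      best = max(scores, key=scores.get)
      return best if scores[best] > forward_loglik(obs, *garbage) + margin else None

  # Toy 2-state models over a 3-symbol alphabet.
  pi = np.array([1.0, 0.0])
  A = np.array([[0.7, 0.3], [0.0, 1.0]])
  B_wave = np.array([[0.8, 0.1, 0.1], [0.1, 0.8, 0.1]])
  B_noise = np.full((2, 3), 1.0 / 3.0)
  print(spot([0, 0, 1, 1], {"wave": (pi, A, B_wave)}, (pi, A, B_noise)))  # -> wave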
 
Representative Publication:
S. Iba, J. M. Vande Weghe, C. Paredis, and P. Khosla, "An Architecture for Gesture Based Control of Mobile Robots," Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS'99), pp. 851-857, October, 1999. (Slides)
 

Interactive Two-Handed Gesture Based Control of a Manipulator


Video [1]

Overview:
Another experiment was conducted to demonstrate the usefulness of hand gestures as an interaction mode for a robotic system. A manipulator with pre-defined geometric primitives performs a drawing task, while the user adjusts the primitives' parameters through hand gestures. The system will be extended into a fully interactive programming system by adding further modalities (such as speech and tactile feedback) for composing and modifying primitives.
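
The sketch below illustrates the idea of a parameterized drawing primitive whose parameters the user can adjust through recognized gestures; the gesture labels and the primitive interface are hypothetical:

  import math
  from dataclasses import dataclass

  @dataclass
  class CirclePrimitive:
      cx: float = 0.0
      cy: float = 0.0
      radius: float = 0.1

      def adjust(self, gesture: str, amount: float) -> None:
          # Map recognized gestures onto parameter changes.
          if gesture == "expand":
              self.radius *= 1.0 + amount
          elif gesture == "shrink":
              self.radius *= max(0.0, 1.0 - amount)

      def waypoints(self, n: int = 36):
          """End-effector path the manipulator would trace."""
          return [(self.cx + self.radius * math.cos(2 * math.pi * k / n),
                   self.cy + self.radius * math.sin(2 * math.pi * k / n))
                  for k in range(n)]

  c = CirclePrimitive()
  c.adjust("expand", 0.5)              # the user gestures to enlarge the circle
  print(len(c.waypoints()), c.radius)  # 36 waypoints; radius grows to 0.15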

Gesture Based Programming

Gesture Based Programming

Overview:
Gesture-Based Programming is a new form of programming by human demonstration that views the demonstration as a series of inexact gestures that convey the intention of the task strategy, not the details of the strategy itself. This is analogous to the type of programming that occurs between human teacher and student and is more intuitive for both. However, it requires a shared ontology between teacher and student -- in the form of a common skill database -- to abstract the observed gestures to meaningful intentions that can be mapped onto previous experiences and previously-acquired skills.
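
The following Python fragment sketches the skill-database idea: an observed gesture, represented as a feature trajectory, is abstracted to an intention by finding the closest entry in a common skill database. The entries and the distance measure are illustrative stand-ins for the actual agent-based interpretation:

  import numpy as np

  def resample(traj: np.ndarray, n: int = 16) -> np.ndarray:
      """Linearly resample a (T, d) trajectory to n samples."""
      t = np.linspace(0, len(traj) - 1, n)
      lo = np.floor(t).astype(int)
      hi = np.minimum(lo + 1, len(traj) - 1)
      w = (t - lo)[:, None]
      return traj[lo] * (1 - w) + traj[hi] * w

  def abstract_gesture(observed: np.ndarray, skill_db: dict) -> str:
      """Map an observed gesture onto the most similar known skill."""
      return min(skill_db,
                 key=lambda name: np.linalg.norm(resample(observed) -
                                                 resample(skill_db[name])))

  db = {"insert": np.array([[0, 0], [0, -1], [0, -2]], float),
        "twist": np.array([[0, 0], [1, 0], [1, 1]], float)}
  obs = np.array([[0, 0], [0, -0.9], [0, -1.8], [0, -2.1]], float)
  print(abstract_gesture(obs, db))  # -> insert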
 
Link:
Image Sequence of the demonstration (Dr. Voyles, UMN)
 
Representative Publications:
R.M. Voyles, J.D. Morrow, and P.K. Khosla, "Towards Gesture-Based Programming: Shape from Motion Primordial Learning of Sensorimotor Primitives," Robotics and Autonomous Systems, vol. 22, no. 3-4, pp. 361-375, Dec. 1997.

R. Voyles, Toward Gesture-Based Programming: Agent-Based Haptic Skill Acquisition and Interpretation, doctoral dissertation, tech. report CMU-RI-TR-97-36, Robotics Institute, Carnegie Mellon University, August, 1997.

Sensorimotor Primitives for Robot Skills

Sensorimotor Primitives

Overview:
The project involves developing a complete system for skill synthesis based on the cognitive and associative phases of skill acquisition. The system has three components: 
  1. sensorimotor primitives (SMPs)
  2. skill programming interface (SPI)
  3. skill tuning

Grounding the approach are sensorimotor primitives, which are sensor-integrated, task-relevant commands. The idea is to bridge the task space and the robot/sensor space so that translating a task strategy onto the robot/sensor system is more direct.
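
As a hedged illustration of what a sensorimotor primitive could look like at the programming interface, the sketch below couples incremental motion with a sensor-based termination predicate (a guarded move). The names are ours for illustration, not those of the actual SPI:

  from typing import Callable

  def guarded_move(step: Callable[[], None],
                   sensor: Callable[[], float],
                   threshold: float,
                   max_steps: int = 1000) -> bool:
      """Move incrementally until the sensed force crosses a threshold."""
      for _ in range(max_steps):
          if sensor() >= threshold:
              return True  # contact detected: the primitive succeeded
          step()
      return False         # guard never triggered

  # Toy usage: a simulated approach that makes contact after five steps.
  state = {"z": 5}
  ok = guarded_move(step=lambda: state.__setitem__("z", state["z"] - 1),
                    sensor=lambda: 10.0 if state["z"] <= 0 else 0.0,
                    threshold=1.0)
  print(ok, state["z"])  # -> True 0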

Representative Publications:
J. Morrow and P. Khosla, "Manipulation Task Primitives for Composing Robot Skills," IEEE International Conference on Robotics and Automation, pp. 3354-3359, April, 1997.

J.D. Morrow, "Sensorimotor Primitives for Programming Robotic Assembly Skills", doctoral dissertation, tech. report CMU-RI-TR-97-23, Robotics Institute, Carnegie Mellon University, May, 1997.

Iconic Robot Programming

Onika: A Multilevel Human-Machine Interface for Reconfigurable Real-Time Control Systems


Onika User's Manual

Overview:
The Advanced Mechatronics Laboratory has developed Onika, an iconically programmed human-machine interface, to interact with the Chimera Real-Time Operating System in the context of a reconfigurable software framework for creating reusable code. Onika presents appropriate work environments for both application engineers and end-users: code can be generated simply by assembling icons with a mouse. Onika uses hyperlinks to retrieve code from remote software libraries and can automatically integrate it into a Chimera executable. Onika is copyright (c) 1993, 1994 by Carnegie Mellon University.
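
The fragment below sketches the iconic-composition idea in Python: icons are typed components, and an application is a chain whose adjacent ports must be compatible (Onika encodes such compatibility in icon shape and color). The names are hypothetical; the real system generated Chimera configurations rather than Python objects:

  from dataclasses import dataclass
  from typing import List

  @dataclass
  class Icon:
      name: str
      in_type: str   # type consumed (empty string for a source icon)
      out_type: str  # type produced (empty string for a sink icon)

  def compose(chain: List[Icon]) -> None:
      """Validate a left-to-right chain of icons, Onika-style."""
      for left, right in zip(chain, chain[1:]):
          if left.out_type != right.in_type:
              raise ValueError(f"{left.name} -> {right.name}: port mismatch")
      print("runnable application:", " -> ".join(i.name for i in chain))

  compose([Icon("camera", "", "image"),
           Icon("tracker", "image", "pose"),
           Icon("servo", "pose", "")])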
 
Link:
Onika's Project Web Site
 
Representative Publications:
M. Gertz and P. Khosla, "Onika: A Multilevel Human-Machine Interface for Real-Time Sensor-Based Robotics Systems," ASCE: SPACE 94: The 4th International Conference and Exposition on Engineering, Construction, and Operations in Space, February, 1994.

Tactile/Haptic Interface

Vibrotactile Feedback Device for Human-Computer Interfaces


Video [1]

Overview:
The goal of this work is to convey finger touch and force information (i.e., haptic feedback) to a human operator so that he or she can "feel" what the remote or virtual hand is grasping. Why is haptic feedback so important? Numerous human factors studies have shown that our ability to manipulate objects relies heavily on the contact (touch and force) information we gather. Consequently, we are in the process of demonstrating that haptic feedback, even in crude forms, helps a person manipulate remote or virtual objects better than visual feedback alone.

Building a user-transparent tactile feedback system is a difficult research problem, since current and near-term actuator technologies do not provide the fidelity needed to produce realistic sensations. Moreover, these technologies are not sufficiently small and lightweight to be worn in a glove. We have therefore opted for vibrotactile feedback (using vibration to convey information), which allows a wearable system. The picture on the bottom left shows the vibrotactile glove we have developed, which uses miniature voice coils (essentially small audio speakers) to produce vibrations on the wearer's fingertips and palm.
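
As a toy illustration of the rendering idea, the sketch below maps a sensed fingertip force to a drive waveform for one voice coil. The constants are made up, although the carrier is placed near 250 Hz, where the skin's Pacinian corpuscles are most sensitive:

  import math

  def vibration_sample(force_n: float, t: float,
                       base_hz: float = 250.0, max_force_n: float = 5.0) -> float:
      """Drive-signal sample: amplitude grows with contact force."""
      amplitude = min(force_n / max_force_n, 1.0)
      return amplitude * math.sin(2 * math.pi * base_hz * t)

  # One millisecond of drive signal for a 2 N contact, at 44.1 kHz.
  samples = [vibration_sample(2.0, k / 44100.0) for k in range(44)]
  print(round(max(samples), 3))  # peak amplitude scales with the 2 N force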
 
Representative Publications:
K. Shimoga, A. Murray, and P. Khosla, "A Touch Reflection System for Interaction with Remote and Virtual Environments," IEEE/RSJ International Conf. on Intelligent Robots and Systems, August, 1995.
 

Haptic Interface (sensing and actuation) based on Electrorheological Gel

Overview:
The system consists of three interchangeable parts: an intrinsic tactile sensor for measuring net force/torque, an extrinsic tactile sensor for measuring contact distributions, and a tactile actuator for displaying tactile distributions. The novel components are the extrinsic sensor and the tactile actuator, which are "inside-out symmetric" to each other and employ an electrorheological gel for actuation.

Representative Publications:
R. Voyles, G. Fedder, and P. Khosla, "A Modular Tactile Sensor and Actuator Based on an Electrorheological Gel," Proceedings of the 1996 IEEE International Conference on Robotics and Automation, April, 1996.

Soshi Iba <iba@ri.cmu.edu>
Last modified: Mon Feb 2 16:03:27 EST 2004