HMI Classification

This page explores the different types of human-machine interfaces (HMIs) we may want to use for ASIBOT (and potentially other robots in the RoboticsLab). By classifying the different types of HMIs, it is hoped that common interface standards can be defined. For each type of HMI, a known data type and port can then be used, making the data from these devices easily available to all modules in the software architecture.

The first distinction made is between low-level and high-level devices. Low-level devices are those intended for teleoperation (direct control) of the robot; their data typically consist of a fixed number of variables (for example, the desired x, y, z, roll, pitch, yaw). High-level devices are those intended for high-level commands that do not fit the standard formats used for teleoperation (for example, a sentence from a speech recognizer).

Low-level devices

  • Position-based
    • Relative position
      • Discrete
        • Example: P300 BCI.
      • Continuous
        • Example: No obvious commercial products, but equivalent to using a free-flying input device to command velocities.
      • Data:
        • xDelta,yDelta,zDelta,rollDelta,pitchDelta,yawDelta (doubles).
        • Flag denoting discrete or continuous?
    • Absolute position
      • Discrete
        • Example: Speech recognition for teleoperation (is this useful?).
      • Continuous
        • Example: Free-flying input devices such as the Flock of Birds.
      • Data:
        • x,y,z,roll,pitch,yaw (doubles).
        • Flag denoting discrete or continuous?
  • Velocity/rate-based
    • Discrete
      • Example: ERD/ERS (event-related desynchronization/synchronization) BCI for wheelchairs.
    • Continuous
      • Example: Joysticks, the 3Dconnexion SpaceNavigator, etc.
    • Data:
      • xDot,yDot,zDot,rollDot,pitchDot,yawDot (doubles); see the port sketch after this list.
      • Flag denoting discrete or continuous?
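
To make the formats above concrete, here is a minimal sketch of publishing one velocity/rate-based sample, assuming the data travels over a YARP port (YARP fits the "port" terminology used on this page, but it is not a fixed standard here; the port name /hmi/rate:o, the trailing integer flag, and the units are illustrative assumptions):

 // Sketch: publish one velocity/rate-based HMI sample as a YARP Bottle.
 // Assumed layout: six doubles (xDot..yawDot) plus an integer flag
 // answering the "discrete or continuous?" question from the list above.
 #include <yarp/os/Network.h>
 #include <yarp/os/BufferedPort.h>
 #include <yarp/os/Bottle.h>
 
 int main()
 {
     yarp::os::Network yarp;                         // connect to the YARP name server
     yarp::os::BufferedPort<yarp::os::Bottle> port;
     port.open("/hmi/rate:o");                       // illustrative port name
 
     yarp::os::Bottle& b = port.prepare();
     b.clear();
     b.addDouble(0.05); // xDot [m/s]
     b.addDouble(0.0);  // yDot [m/s]
     b.addDouble(0.0);  // zDot [m/s]
     b.addDouble(0.0);  // rollDot [rad/s]
     b.addDouble(0.0);  // pitchDot [rad/s]
     b.addDouble(0.1);  // yawDot [rad/s]
     b.addInt(1);       // flag: 1 = continuous, 0 = discrete (assumption)
     port.write();
 
     port.close();
     return 0;
 }

The relative-position (xDelta ... yawDelta) and absolute-position (x ... yaw) formats would use the same six-doubles-plus-flag layout on their own ports; only the interpretation of the values changes, which is what would make a common standard across low-level devices possible.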

High-level devices

  • Some current developments:

We are working on integrating AR tools, speech recognition, and voice synthesis to augment interaction capabilities with assistive robots and the environment.
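
Unlike the low-level formats, a recognized sentence has no fixed numeric layout, so a string message is the natural fit. Below is a minimal sketch under the same YARP assumption (the port name /hmi/speech:o and the example sentence are made up for illustration):

 // Sketch: forward a recognized sentence as a free-form high-level command.
 // Port name and sentence are illustrative assumptions.
 #include <yarp/os/Network.h>
 #include <yarp/os/BufferedPort.h>
 #include <yarp/os/Bottle.h>
 
 int main()
 {
     yarp::os::Network yarp;                         // connect to the YARP name server
     yarp::os::BufferedPort<yarp::os::Bottle> port;
     port.open("/hmi/speech:o");                     // illustrative port name
 
     yarp::os::Bottle& b = port.prepare();
     b.clear();
     b.addString("bring me the glass of water");     // output of the speech recognizer
     port.write();
 
     port.close();
     return 0;
 }

Whatever module consumes this port (for example, a task planner) would then be responsible for parsing the free-form command.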

A recent video is linked here: Media:Video AR demo.ogg.