Commit 2d04437 (1 parent: 8ce1bcb)

Incorporate HMI into this story

File tree

1 file changed: +5 −1 lines


_posts/2018-09-25-from-ui-to-motors.markdown

@@ -17,7 +17,11 @@ On Amigo, we use [dragonfly_speech_recognition](https://github.com/tue-robotics/
 This also uses the grammar (as described below) to aid in speech recognition.
 The grammar restricts what the STT can hear and thus the result is always something the robot can at least parse.
 
-The text gets published to a ROS topic and then read by the
+Because we want to be able to both talk to the robot and text with it, each taking turns or even falling back from one modality to another, we use the [HMI](https://github.com/tue-robotics/hmi).
+The Human Machine Interface provides a client that the GPSR's conversation_engine uses. This client is connected to several servers that each implement some way for the robot to ask the user a question.
+This can thus be voice (via STT) or text (via Telegram or Slack) or some mock interface for testing.
+
+The text is eventually read by the
 [conversation_engine](https://github.com/tue-robotics/conversation_engine).
 This interprets the command using the [grammar_parser](https://github.com/tue-robotics/grammar_parser).
 The result of the parsing is an action description that gets sent to the [action_server](https://github.com/tue-robotics/action_server).
