The RIVA (Realistic Interactive Virtual Agent) Project is being developed at the University of California, Irvine (Calit2). The goal of this system is to create a virtual assistant that provides feedback to patients undergoing physical rehabilitation with the Music Glove.
The project utilizes USC's SmartBody system (see the SmartBody documentation) and is controlled using the Python scripting language through the VirtualAssistant.py script. This script relies on output from our Music Glove script.
- File Structure
- How to Run
- Using with MusicGlove
- Gazing Coordinates
- Facial Expression BMLs
- In Progress
## File Structure

- Files in `scripts` should be placed in `__SmartBodySVN\data\scripts`
- Files in `MusicGlove` should be placed in `__SmartBodySVN\data\MusicGlove`
- The main program is `VirtualAssistant.py`
## How to Run

1. Run the Music Glove script
2. Run `sbgui.exe`
3. Select File > Run Script > VirtualAssistant.py
4. Run MusicGlove
## Using with MusicGlove

VirtualAssistant.py reads from the MusicGlove log file. If the file has been updated, RIVA will execute the appropriate actions. A log entry looks like:

```
Iteration:13;Expression:1;TTS:OVERALL_SUMMARY_2
```
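The read-and-react loop can be sketched in plain Python as below. This is an illustrative sketch, not the actual VirtualAssistant.py code: the function names are made up, and it assumes each semicolon-separated log entry uses `Key:Value` fields as in the example above.

```python
import os

def parse_log_line(line):
    """Parse a MusicGlove log entry such as
    'Iteration:13;Expression:1;TTS:OVERALL_SUMMARY_2'
    into a dict of field name -> value."""
    fields = {}
    for part in line.strip().split(";"):
        key, _, value = part.partition(":")
        fields[key] = value
    return fields

def poll_log(path, last_mtime):
    """Return (new_mtime, latest_line) if the log file changed since
    last_mtime, else (last_mtime, None). The caller would then map the
    parsed Expression/TTS fields onto BML commands for RIVA."""
    mtime = os.path.getmtime(path)
    if mtime == last_mtime:
        return last_mtime, None
    with open(path) as f:
        lines = f.read().splitlines()
    return mtime, (lines[-1] if lines else None)
```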
## Gazing Coordinates

- z-coordinate: 1.0 when the gaze target is onscreen, 1.5 when offscreen
- Gaze Forward: `(0.03, 1.58, z)`
- Gaze Left: `(-4, 1.58, z)`
- Gaze Right: `(4, 1.58, z)`
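The table above can be captured in a small helper. This is a sketch (the constant and function names are illustrative, not from the script); in SmartBody the returned point would typically be given to a pawn that the character then gazes at with a `<gaze>` BML.

```python
# x-coordinates for the three gaze directions listed above; y is fixed.
GAZE_X = {"forward": 0.03, "left": -4.0, "right": 4.0}
GAZE_Y = 1.58

def gaze_target(direction, onscreen=True):
    """Return the (x, y, z) gaze point for 'forward', 'left', or 'right'.

    z is 1.0 for an onscreen target and 1.5 for an offscreen one,
    matching the coordinate table above.
    """
    z = 1.0 if onscreen else 1.5
    return (GAZE_X[direction], GAZE_Y, z)
```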
## Facial Expression BMLs

- Happy:

  ```python
  bml.execBML('ChrRachel', '<face type="facs" au="6" amount="1"/><face type="facs" au="12" amount="1"/>')
  ```

- Sad:

  ```python
  bml.execBML('ChrRachel', '<face type="facs" au="1_left" amount="1"/><face type="facs" au="1_right" amount="1"/>'
              '<face type="facs" au="4_left" amount="1"/><face type="facs" au="4_right" amount="1"/>'
              '<face type="facs" au="6" amount="0.58"/>')
  ```
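Since each expression is just a sequence of FACS action units, a small builder keeps the BML strings readable. The helper below is an illustrative sketch, not part of the current script:

```python
def facs_bml(aus):
    """Build a BML face string from (action_unit, amount) pairs,
    e.g. [("6", "1"), ("12", "1")] for the Happy expression above."""
    return "".join('<face type="facs" au="%s" amount="%s"/>' % (au, amount)
                   for au, amount in aus)

happy = facs_bml([("6", "1"), ("12", "1")])
# Inside the SmartBody Python environment this would be sent with:
# bml.execBML('ChrRachel', happy)
```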
## In Progress

- Improve character lighting
- Include 3D objects for the environment
- Concatenate audio files
- Differentiate between positive and negative feedback
- Improve character behavior (gestures, expressions)
- Character movement and speech timing