The user interfaces that we design should be as easy and seamless as possible. The student will often have to step to different parts of the technique he or she is studying, vary the speed, change some options, or what have you, and this will need to be accomplished as quickly as possible. As before, interfaces that we will test are in red; those that we would like to get to are in black.
Virtual Buttons
Display a set of buttons to the student in virtual reality, which he or she then "presses" in some way.
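As a rough sketch of how the "press" could be detected, assuming the system can track a fingertip position each frame, each button can be tested as an axis-aligned box around its centre. The labels and dimensions below are made up for illustration.

```python
from dataclasses import dataclass

@dataclass
class VirtualButton:
    label: str
    center: tuple        # (x, y, z) of the button centre in world coordinates
    half_extent: float   # half the side length of the button's bounding cube

    def contains(self, point):
        """True if the tracked fingertip lies inside this button's bounding cube."""
        return all(abs(p - c) <= self.half_extent
                   for p, c in zip(point, self.center))

def pressed_button(buttons, fingertip):
    """Return the first button the fingertip is currently inside, or None."""
    for button in buttons:
        if button.contains(fingertip):
            return button
    return None

# Example: two option buttons floating at chest height in front of the student.
buttons = [VirtualButton("slower", (-0.2, 1.2, 0.5), 0.05),
           VirtualButton("faster", ( 0.2, 1.2, 0.5), 0.05)]
choice = pressed_button(buttons, fingertip=(0.19, 1.22, 0.51))
print(choice.label if choice else "no press")   # prints "faster"
```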
Haptic Buttons
Take the set of buttons from above, but give them a haptic component. This can be done at three levels of fidelity:
- Static haptics - display the buttons where a real-life static object is. The student feels something there when he or she presses the button.
- Limited dynamic haptics - give the student some kind of haptic feedback for his or her choice. This could be something like mounting foam pads on a board so that the student presses into the foam. This need not be reflected in the visuals.
- Tracked haptics - create a set of buttons that respond the same way in the physical and virtual worlds. When the student presses the physical button, the virtual button also moves (see the sketch after this list).
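For the tracked-haptics option, the core of the software is just a per-frame loop that mirrors the measured travel of the physical button into its virtual counterpart. The sensor and rendering calls below are placeholders for whatever hardware and display layer we end up with.

```python
MAX_TRAVEL = 0.01         # metres of physical button travel (assumed)
PRESS_THRESHOLD = 0.8     # fraction of full travel that counts as a press

def update_tracked_button(read_button_depth, set_virtual_button_depth, on_press):
    """Run once per frame.

    read_button_depth: placeholder callable returning the physical button's
        measured depression in metres (0 = resting).
    set_virtual_button_depth: placeholder callable that moves the rendered
        button by the given fraction of its travel.
    on_press: callable fired while the button is held past the threshold
        (a real version would debounce this to once per press).
    """
    depth = read_button_depth()
    fraction = min(max(depth / MAX_TRAVEL, 0.0), 1.0)
    set_virtual_button_depth(fraction)   # virtual button moves with the real one
    if fraction >= PRESS_THRESHOLD:
        on_press()
```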
Posture Matching
We create a variety of special postures which the student can perform and which have meaning to the system. A very nice example of this is that if the student wants to start at "White Crane", he or she can move into the White Crane position and the form will start there.
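As a first pass, and assuming the tracker reports a handful of joint positions, the system could compare the student's joints against stored template postures and accept the closest one within a tolerance. The joint names and the "White Crane" coordinates here are invented placeholders, not measured values.

```python
import math

# Hypothetical posture templates: joint name -> (x, y, z) relative to the pelvis.
POSTURES = {
    "white_crane": {"left_wrist":  (-0.30, 1.40, 0.20),
                    "right_wrist": ( 0.40, 0.90, 0.30),
                    "left_ankle":  (-0.15, 0.00, 0.00),
                    "right_ankle": ( 0.25, 0.30, 0.10)},
    # ... one entry per posture the system should react to
}

def match_posture(joints, tolerance=0.15):
    """Return the template posture whose mean joint error (in metres) is
    smallest and under the tolerance, or None if nothing matches."""
    best_name, best_error = None, float("inf")
    for name, template in POSTURES.items():
        errors = [math.dist(joints[j], template[j]) for j in template]
        mean_error = sum(errors) / len(errors)
        if mean_error < best_error:
            best_name, best_error = name, mean_error
    return best_name if best_error <= tolerance else None
```

If match_posture returned "white_crane", the system would cue the form up at that point.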

Speech Recognition
We would initially test with man-behind-the-curtain style speech recognition: the student would shout out a voice command and the operator would press the appropriate button.
Floorpie Menu
We could display something like a pie menu on the ground, or coming
up in space, which would let the student choose an option by walking
into the appropriate choice or by simply pointing at it.
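Picking a slice reduces to an angle test: take the student's foot position (or the point where a pointing ray hits the floor) relative to the menu centre and map its bearing onto one of N equal sectors. The layout below, flat on the floor with the first slice starting at the +x axis, is just an assumption.

```python
import math

def selected_slice(foot_xz, center_xz, labels, dead_zone=0.2):
    """Map a position on the floor plane onto a pie-menu slice.

    foot_xz, center_xz: (x, z) floor coordinates in metres.
    labels: slice labels laid out counter-clockwise, first slice starting at +x.
    dead_zone: radius around the centre where nothing is selected.
    """
    dx = foot_xz[0] - center_xz[0]
    dz = foot_xz[1] - center_xz[1]
    if math.hypot(dx, dz) < dead_zone:
        return None                               # standing in the neutral middle
    angle = math.atan2(dz, dx) % (2 * math.pi)    # bearing in [0, 2*pi)
    slice_width = 2 * math.pi / len(labels)
    return labels[int(angle // slice_width)]

# Example: four options arranged around the student.
print(selected_slice((1.0, 0.1), (0.0, 0.0),
                     ["restart", "slower", "faster", "quit"]))   # prints "restart"
```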

Gesture Recognition
The dynamic counterpart to Posture Matching. This could be technologically complex because most work on gesture recognition has been done in 2D, and extending it to 3D may not be trivial.
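One concrete way to attempt the 3D extension is template matching in the spirit of the 2D stroke recognizers: resample the recorded wrist trajectory to a fixed number of points, translate it to its centroid, and score it against stored 3D templates by mean point-to-point distance. This is only a sketch under those assumptions; a usable version would also need rotation and scale normalisation.

```python
import math

def path_length(points):
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))

def resample(points, n=32):
    """Resample a 3D trajectory (list of (x, y, z)) to n evenly spaced points."""
    interval = path_length(points) / (n - 1)
    pts = list(points)
    new_points = [pts[0]]
    accumulated = 0.0
    i = 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if d > 0 and accumulated + d >= interval:
            t = (interval - accumulated) / d
            q = tuple(pts[i - 1][k] + t * (pts[i][k] - pts[i - 1][k]) for k in range(3))
            new_points.append(q)
            pts.insert(i, q)          # continue measuring from the new point
            accumulated = 0.0
        else:
            accumulated += d
        i += 1
    while len(new_points) < n:        # guard against floating-point shortfall
        new_points.append(pts[-1])
    return new_points

def centre(points):
    """Translate a trajectory so its centroid sits at the origin."""
    c = tuple(sum(p[k] for p in points) / len(points) for k in range(3))
    return [tuple(p[k] - c[k] for k in range(3)) for p in points]

def gesture_distance(a, b, n=32):
    """Mean distance between two trajectories after resampling and centring."""
    ra, rb = centre(resample(a, n)), centre(resample(b, n))
    return sum(math.dist(p, q) for p, q in zip(ra, rb)) / n

def recognise(trajectory, templates, threshold=0.25):
    """Return the name of the closest stored template gesture, or None."""
    best = min(templates, key=lambda name: gesture_distance(trajectory, templates[name]))
    return best if gesture_distance(trajectory, templates[best]) <= threshold else None
```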
Physical Input Devices
We could create some wearable input devices that are unobtrusive and always available to the student. There are some serious technology issues here, for example building the devices and transmitting their signals wirelessly.