The Pennsylvania State University Mars Society
Integrated Astronaut Control System for EVA
Submitted by
The Pennsylvania State University Mars Society
Pursuant to the 2003 RASC-AL Forum
Kevin Sloan 1,2, Student Team Leader
Megan DeCesar 3,4, Brendan Knowles 1, Steve McGuire 5,
Nima Moshtagh 1, and Jonathan Trump 3,6
Dr. Lyle Long 7, Advisor
1 The Department of Electrical Engineering at the Pennsylvania State University
2 The Department of Mechanical Engineering at the Pennsylvania State University
3 The Department of Astronomy and Astrophysics at the Pennsylvania State University
4 The School of Music at the Pennsylvania State University
5 The Department of Computer Science and Engineering at the Pennsylvania State University
6 The Department of Physics at the Pennsylvania State University
7 The Department of Aerospace Engineering at the Pennsylvania State University
Abstract
Manned missions to Mars will require a new type of exploration involving almost complete independence from Earth. These new challenges in space exploration will require a new rover control system for on-site astronauts on extra-vehicular activities (EVAs) far from home base. The solution proposed in this paper is a gesture control system that integrates a virtual reality (VR) glove into the spacesuit glove. A simple system is created for using a pair of 5DT five-sensor data gloves to steer a rover and control a mounted pan/tilt camera on an ActivMedia Pioneer 2-AT class rover. Comparisons with the historical solution for rover control, the joystick, revealed that while the joystick has a slight advantage in speed of rover response, the glove is far more transportable and much more versatile.
1
Introduction
As the space program continues to grow, sending a
manned mission to Mars becomes less of a dream and
more of a technological reality. When we are finally
able to land such a mission on Mars, our astronauts
will need to explore much of the planet in order to
adequately study its resources and discover whether
it holds reservoirs of life in its surface. These
exploratory missions will require astronauts to
operate almost completely independently from Earth
and to perform extra-vehicular activities (EVAs) far
from their habitat base. Rather than sending these
Martian and lunar explorers out completely alone,
however, it makes more sense to have them
accompanied by the tried-and-true robotic explorer: a
rover. These rovers will function as a sort of toolkit
and mobile laboratory for the astronaut, especially
since a rover can be much more maneuverable than
any spacesuit-encumbered astronaut. This re-defined
purpose of the rover, along with the possibility of
ranges far away from the home base, requires an
individual control system by an on-site astronaut.
While this is not a completely new topic, future long-term
manned exploratory missions require it more
than any past missions. Current manual control
methods (joystick, flight stick, etc.) are difficult to
use with a bulky spacesuit glove, and a new approach
will be necessary for the new long-term on-surface
explorers of the future.
We chose to explore a gesture control system using a
virtual reality (VR) glove. Gesture control uses
natural motions of one or both hands and translates
them into commands. Because gestures are such a
common part of human interaction, a control system
based on them ought to be natural and easy to learn.
Also, hand gestures allow for fine-tuned control, even
when the hand is constrained within a spacesuit
glove. Use of gesture control in space exploration is
a relatively unexplored topic. This paper presents the
Penn State Mars Society's advances in VR glove
control since its last presentation at the
RASC-AL Forum.
Our hypothesis was that the VR glove would prove to
be a natural, efficient method of fine-tuned gesture
control, outperforming other historical control
systems. We acquired a pair of 5DT five-sensor data
gloves on loan from the Computer Science and
Engineering Department at Penn State and used these
gloves for our own tests of the effectiveness of a
gesture control system. Eventually, the modular
rover we are building will provide an excellent
testbed for gesture control by the glove, but for now,
we have been testing the glove on the commercially
available ActivMedia Pioneer 2-AT class rover. For
accurate assessment of the VR glove efficiency, we
prepared a comparison of the 5DT data glove with a
joystick and trackball while wearing a hockey glove
to simulate the constraints placed on hand motion by
a spacesuit glove.
1.1
Background
As previously mentioned, the idea of on-site rover
control by astronauts is not a new topic, though it will
become especially important with future manned
missions to Mars. NASA has already developed
several control systems suited to the bulky
spacesuits of astronauts.
Here we present the limiting factors on control
systems: current and future spacesuit glove
technologies and their effects on an astronaut’s
movement. We also outline the control requirements
of an on-site astronaut, along with NASA's past
solutions to the issue.
1.2
The Spacesuit Glove
Spacesuits in use today contain thirteen layers to
protect astronauts from radiation, extreme
temperatures, micrometeoroids, and the zero-pressure
environment of outer space [1]. Materials like
neoprene-coated nylon and Gore-tex make up seven
of those layers; between such materials are six layers
of vacuum [2]. The suit is pressurized at 4.3 psi with
pure oxygen gas.
The spacesuit glove acts as an interface between the
astronaut and his surroundings. The glove must
therefore be durable and flexible enough that it
allows the astronaut to perform his necessary
functions while protecting the astronaut from the
harsh conditions of outer space [3]. The gloves have
bearings on the wrist to assist the astronaut’s wrist
motion, as well as rubberized fingertips for better
grip. Astronauts wear fine-fabric gloves inside the
layered outer glove for comfort purposes [1].
Current spacesuits make free motion difficult for an
astronaut. They are heavy and lack the flexibility
required for moving one’s body quickly. Even the
gloves inhibit movement due to the pressure of the
oxygen-filled suit. The astronaut’s hands tire quickly
during activities such as repairing the outside of a
spacecraft or satellite. This is a major concern, as the
astronaut needs his or her hands to perform these and
various other tasks while in a low-pressure
environment.
An option for spacesuit design that may reduce such
difficulties is mechanical counter-pressure
technology. The astronaut's head would be
contained in an oxygen-filled helmet as before, but
the rest of the body would be covered in an elastic
fabric that would apply the same pressure to the body
as the oxygen does to the head [4]. The mechanical
pressure suit uses the elastic fabric to apply pressure
to the body in the same way the current spacesuit
does with air pressure. The thinner elastic fabric
allows greater mobility and flexibility. Mechanical
counter pressure glove research shows that the user
experiences no noticeable physiological effects [5].
If the pressure were controlled properly, mechanical
counter pressure could become a safer and more
mobile alternative to the traditional oxygen-pressurized
space glove and suit.
Figure 1- Spacesuit Layers

Another possible glove design is the Power Glove,
developed by the University of Maryland Space
Systems Laboratory and ILC Dover, Inc. In both the
current spacesuit glove and the mechanical counter
pressure glove, even small hand movements tire
astronauts quickly because they must work against
torques that push to restore the gloves' neutral
position. The Power Glove is a power-assist
actuation system that provides torques to counter
those caused by the pressure in the glove, allowing
the astronaut to move with little resistance and less
hand and arm fatigue [6].

1.3
Applications for Control Systems

When the first manned Mars mission lands on Mars,
the astronauts will have some basic tasks to begin
immediately upon their arrival. They will need to
construct and maintain facilities and conduct
scientific research on the surface of Mars. Their
research will include geologic fieldwork, collection
of soil and rock samples, the deployment, operation,
and maintenance of scientific instruments, and
telerobotic exploration.

As rovers have already been used effectively to probe
a small extent of the Martian landscape, it is
reasonable to expect that they will be used again to
assist the astronauts in their exploration during
EVAs. The astronaut will have to steer the rover
while he or she is standing outside, far away from the
base. Therefore, it will be essential that the astronaut
not need to carry any extra parts or be bothered
with dozens of electrical wires, as he may be if a
joystick is used as a control system.

The other essential aspect of an investigative rover is
the visual information from the environment. The
obvious use of the visual feedback is to provide the
operator with information about the rover's
surroundings, which makes steering the rover
possible. The transmitted images can also be used to
look for interesting objects in the gathered samples.

1.4
Previous Control Methods

In the past, rovers have been controlled from Earth
by a Silicon Graphics Onyx2 graphics
supercomputer. This provides the "driver" of the
rover with a graphical interface. He can choose
commands to send to the rover using window screens
that are accessed by the click of a mouse.

The user first selects or types commands and inputs a
command sequence for the rover to execute. He can
watch a model rover's movement on the computer
screen, using 3D goggles to enhance his view of the
field. He uses a joystick called Spaceball to move the
model of the rover as if it were moving on Mars. The
program constantly updates the model rover's
coordinates, telling the Mars rover where to go on the
planet's surface. The lander rover's camera sends
stereo images back to the Rover Control Workstation.
These images are processed and turned into a 3D
model of the planet's surface. The driver can zoom
in on a feature of the terrain from any angle and
avoid any hazards the rover may encounter [7].

Earth control of a rover will always introduce a delay
between command and action because it takes a finite
amount of time for a signal to be transmitted. This is
not much of an issue for lunar missions, since the
time delay is on the order of seconds. Control of
Martian rovers, however, incurs a time delay of up to
forty-five minutes [8]. In the Pathfinder mission, the
rover had to be extremely slow to account for this
time delay. Future manned exploratory missions to
Mars must avoid this problem of time delays: manual
control by the on-site astronaut is the obvious
solution.
2
Approach
The Penn State Mars Society is an organization
dedicated to providing opportunities in undergraduate
research with relevance to the space community, and
to serving and educating the community in all matters
of space science. While much of the organization's
research is focused on Martian exploration, its
members are interested in developing advances along
any paths of space exploration.
The current projects of the Penn State Mars Society
are the development of a VR glove control system
based on hand gestures and the construction of a
modular rover. These projects were chosen not only
because of their value in manned, long-term
on-surface exploration, but also because
they provide hands-on projects useful in
outreach demonstrations that excite the local public
about the space program in general. While both of
these projects would be especially useful in Mars
exploration, their applications are much more general
and their development would be important for use in
any future space exploration mission. The Gantt
chart in Figure 2 presents the timeline of the two
projects. Richter et al. [8] describe the rover project
in more detail. This paper focuses only on the
research on the VR glove gesture control system.
Figure 2- Project Timeline
From analysis of historical requirements on control
systems and from predictions of necessary
applications for future manned missions to Mars or
the moon, we devised the following general criteria
for our gesture control system.
•	Fine-tuned control
•	No overlap between commands
•	Efficient response to commands
•	Simplicity and ease of training
•	Transmission efficiency (range and power)
•	Multitasking
We designed two applications for our gesture control
system: steering and camera control. Both of these
applications will probably be used by Martian or
lunar explorers. We worked to satisfy our general
criteria through these applications.
3
The VR Glove
Virtual reality is defined by Jonathan Steuer to be “a
real or simulated environment in which a perceiver
experiences telepresence,” where telepresence is the
sensation, created by a communication medium, of
being within an environment [9]. It is interactive in
nature, and has been applied both to entertainment
and to more practical purposes, such as flight
simulations for training airplane pilots and astronauts
[10]. We will focus on the use of virtual reality
gloves, which are sensitive to slight changes in the
user's hand position, as communication media.
Virtual reality gloves have, to this point, been used to
control the action of video or computer games and
3D simulations for training. More recently, they have
become applicable to other areas of research. Dr.
Steven Skiena of Stony Brook University, Francine
Evans of Schlumberger Corp. in Houston, and
Amitabh Varshney of University of Maryland,
College Park, developed Vtype, a software tool that
allows the user to type text without need of a
keyboard by wearing VR gloves and simulating
pressing on a keyboard [11]. Research is also
progressing in the use of VR gloves to control
machines performing medical routines like surgery.
A VR glove gesture control system for space
exploration requires the VR glove to act as an
interface between man and rover. Márcio S. Pinho
and colleagues used a 5DT glove as well as a virtual
reality arm, a similar type of man/robot interaction.
The glove was used both with the robot and in a
virtual reality simulator. Pinho found that the use of
virtual reality to control a robot’s actions improves
the “integration between man and machine” while
decreasing the “risk of accidents in the work place”
[12].
3.1
Capabilities
The VR glove is made of tight-fitting, stretchy fabric
containing fiber-optic sensors that detect slight
motion in the user’s hand. Modern VR gloves give
the user six degrees of motion: translation along the
x, y, and z axes; and yaw, pitch, and roll. We used
the 5DT Data Glove 5 to perform our experiments.
This glove has one sensor per finger. It has 8-bit
resolution and is able to detect up to 256 different
finger positions [13]. Generally, there is no need for
256 positions: our research requires far fewer, as do
most other applications. The user calibrates the glove
by opening and closing the hand and rolling the arm
left and right. The 5DT glove is available for the left
and right hands, both of which we used in our
experimentation.
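The open-and-close calibration routine can be understood as recording each sensor's minimum and maximum raw readings and normalizing against them. The sketch below illustrates this idea only; the class and method names are our own assumptions, not the 5DT SDK's actual interface.

```python
class GloveCalibration:
    """Per-sensor min/max calibration for raw 8-bit glove readings (sketch)."""

    def __init__(self, n_sensors=5):
        self.lo = [255] * n_sensors  # lowest raw value seen per sensor
        self.hi = [0] * n_sensors    # highest raw value seen per sensor

    def observe(self, raw):
        """Update bounds while the user opens and closes the hand."""
        self.lo = [min(l, r) for l, r in zip(self.lo, raw)]
        self.hi = [max(h, r) for h, r in zip(self.hi, raw)]

    def normalize(self, raw):
        """Scale raw readings to [0, 1] using the calibrated range."""
        return [(r - l) / (h - l) if h > l else 0.0
                for r, l, h in zip(raw, self.lo, self.hi)]
```

Normalizing in this way makes later gesture thresholds independent of any one user's hand geometry.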
Figure 3- Virtual Reality Glove

Other available VR gloves use slightly different
technologies to achieve similar purposes in reality
simulations. The Pinch Glove has fingertip sensors
that detect contact between the digits of the user's
hand. The detection is independent of individual
hand geometry and therefore requires no calibration
routine. The Cyberglove is made of stretchy fabric
and breathable mesh palms with flexible sensors that
measure very small changes in hand and finger
positions and curvature. It uses the latest
high-precision joint-sensing technology, has a tracking
system on the wristband, and also has a software-programmable
switch. It is said to be ideal for
telerobotics, VR, task training, video games, and
medicine [13].

Advances are being made in the development of VR
gloves for space-related applications. When
information from the environment is fed back to the
VR glove, better control over the system can be
obtained. For instance, in a project under the
direction of NASA engineer Chris Lovchik, NASA is
developing a system that allows astronauts to
actually "feel" what they are touching or lifting with
the rover through the compression and
decompression of air pockets in the glove [14]. The
glove is still being perfected, but it may prove useful
to astronauts by increasing their awareness of the
rover's environment through such feedback from the
robot.

The VR glove is especially suited for controlling a
rover designed specifically for Martian exploration.
The versatility of the VR glove enables an astronaut
to control more than just basic steering. If the
software supports multiple modes of control, the
same basic commands can control different tasks
(e.g., making a fist might halt the rover in steering
mode and select a target in camera mode).

3.2
Integration into the Spacesuit

The Martian environment is not much different from
interplanetary space. The astronaut will still
require protection from UV radiation and near-vacuum
conditions, so a spacesuit similar to those
currently in use will be necessary. These Mars
spacesuits will still have some of the flaws in
flexibility present in the current spacesuit.
Since it would be impossible for an astronaut to
switch from a spacesuit glove to a special VR control
glove in the field, it makes the most sense to integrate
the two systems into one fully functioning VR rover
control Mars spacesuit glove. Discomfort and loss of
productivity are the major concerns in integrating VR
rover control into the Mars spacesuit glove.
Therefore, the following characteristics are desired
for the glove:
•	The glove must not inhibit hand and finger mobility and dexterity during operation.
•	The system must be simple, yet highly reliable.
•	The fiber optics must operate in the temperature range 0-100°F inside the glove layers.
By integrating the VR glove optical fibers into the
actual Martian spacesuit glove, the flexibility of the
astronaut’s fingers should not decrease. The 5DT
glove, for example, has fibers only 3 mm thick and
weighs only about 5 oz. Other gloves have even
thinner fibers and weigh even less. These dimensions
and masses impose negligible effects on the
flexibility of a spacesuit glove.
Similar work has been done by NASA. For the space
shuttle Discovery mission NASA tested temperature
loggers called HOBO to monitor the inside
temperature of the EMUs. The temperature sensors
were installed between the first and the second layers
of the glove to keep the electronic parts away from
the 100% oxygen environment. A similar approach
could be used to implement the virtual reality glove
in the Mars suit. The fiber optics, however, must run
through the inner layers of the spacesuit glove, so
that they have maximum sensitivity to the astronaut's
hand motions.
4
Implementation

4.1
Gesture Control System

On a fundamental level, the developed system has
four principal components: data input and filtering,
gesture recognition, state selection, and device-specific
output. Each component runs separately in its
own thread of execution, with the multiple threads
communicating via buffer queues. This multi-threaded
design ensures that any individual component can
operate independently of the others' complexities.

4.1.1
Data Input and Filtering

The virtual reality gloves send raw data, collected
from the bend of the fingers, as output to the
implementation program. This raw glove data is
interpreted with a vendor-supplied library function,
which returns the actual values measured by the
glove hardware. As the sampled data is returned,
each datum is attached to a glove label which
differentiates between the left and right gloves. This
allows for operation with both hands, as will be
discussed in accordance with state selection.

The glove, as previously mentioned, has 8-bit
reading accuracy. Natural frequencies associated
with muscular twinges and cardiovascular pulses
present in the human hand, coupled with a lack of
perfect control over hand movements and finger
flexions, create noise around the signal. To
counteract this effect, an exponential filter is applied
to the measurements. After filtering, the data is much
smoother and appropriate for use in the gesture
recognition process.

4.1.2
Gesture Recognition

After the filtering process, the data can be interpreted
as specific gestures. Gestures are defined as specific
states of the hand, where each finger has a certain
(although not necessarily equal) amount of flexion,
and the entire hand is held with a certain amount of
pitch and roll. Gestures may also be defined by
specific changes in the state of the hand, where
certain hand states are achieved in sequential order
through the duration of a fixed time span. The
specific definition of a gesture varies between these
two depending upon the requirements of the
application in question.

With 8-bit resolution (2^8 = 256 possible values) for
each finger flexion, as well as pitch and roll, the total
number of possible states for the hand is given by:

(2^bit resolution)^number of channels = (2^8)^7 ≈ 7.2 × 10^16 possible states
While this seems to provide an inexhaustible range of
control options, the fact that the hand is not a
perfectly controllable device must be considered.
Realistically, for producing specific states, the hand
can reliably and consistently produce four states of
finger flexion and seven levels each of pitch and roll.
With the assumption that each finger could be moved
completely independently, there would be
4^5 × 7 × 7 = 50,176 possible states. Finger
dependencies (such as the ring finger, which in most
cases cannot be moved without affecting surrounding
fingers), and a consideration for the accuracy of
states which is realistic for continual field use, limit
the system to less than 100
specific, reproducible states. Again, the number of
available states depends completely on the type of
application, and whether continuous or discrete
control (defined as follows) is desired.
Continuous control is ideally suited for real-time
control applications, such as controlling motor speed
or camera position. If the hand’s position is taken to
represent either the speed of a rover or the position of
a camera, the user will need only general control
options. A rover operator will not necessarily know
the exact speed of the rover, or even its desired
speed, at all times while in the field. For this reason,
the glove's control of rover speed should be limited
to defining relative changes in speed (a little or a lot,
more or less), at varying degrees specified by hand
motion. This
can be easily realized through use of pitch, roll, or
flexion of the fingers, excluding the thumb. The
more the fingers are flexed, the faster the rover would
travel. Continuous control would rely on relative
commands that would make significant use of the
glove’s 8-bit resolution.
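The continuous flexion-to-speed mapping just described, together with the exponential filtering applied at the data-input stage, can be sketched as follows. This is a minimal illustration under our own assumptions: the smoothing constant, speed limits, function names, and the placement of the thumb as the last channel are not taken from the paper's implementation or the 5DT SDK.

```python
# Hypothetical sketch: exponential smoothing of raw 8-bit finger data,
# then mapping mean flexion of the non-thumb fingers to a speed command.

ALPHA = 0.3        # filter weight: higher values track the hand faster but pass more noise
MAX_SPEED = 1.0    # normalized top speed (illustrative assumption)

def exponential_filter(raw, smoothed, alpha=ALPHA):
    """One step of the exponential filter, applied to each sensor channel."""
    return [alpha * r + (1.0 - alpha) * s for r, s in zip(raw, smoothed)]

def flexion_to_speed(fingers):
    """Map mean flexion (0-255 per finger) to a normalized speed."""
    mean_flex = sum(fingers) / len(fingers)
    return (mean_flex / 255.0) * MAX_SPEED

# Feed two simulated samples through the filter, then derive a speed.
smoothed = [0.0] * 5
for raw in ([10, 12, 11, 9, 200], [240, 250, 245, 238, 205]):
    smoothed = exponential_filter(raw, smoothed)
speed = flexion_to_speed(smoothed[:4])  # first four channels; thumb placement is an assumption
```

The filter trades responsiveness against noise rejection through the single constant, which matches the paper's observation that the filtered data is smoother and better suited to gesture recognition.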
Discrete gestures and control methods are vital in
selecting specific, unique commands. This includes
switching between states and selecting particular
modes of device control. As opposed to continuous
control, which utilizes the full range of motion of the
hand, discrete control relies upon more easily
definable and obtainable states within this range. In a
more basic case, this is realized with an open or
closed fist. As mentioned previously, one method of
gesture commands responds to the transition between
such discrete steps in a specified time range. Closing
and opening a fist in quick succession would easily
accomplish the task of selecting a target in a camera
targeting program.
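One way to detect such a close-then-open transition gesture is to timestamp the closed state and check that the hand reopens within a fixed window. The thresholds and timing below are illustrative assumptions, not values from the paper's implementation.

```python
import time

FIST_THRESHOLD = 200   # mean flexion above this counts as a closed fist (assumption)
OPEN_THRESHOLD = 60    # mean flexion below this counts as an open hand (assumption)
WINDOW = 0.5           # seconds allowed for the close-open sequence (assumption)

class FistPulseDetector:
    """Fires once when the hand closes and reopens within WINDOW seconds."""

    def __init__(self):
        self.closed_at = None

    def update(self, fingers, now=None):
        now = time.monotonic() if now is None else now
        mean_flex = sum(fingers) / len(fingers)
        if mean_flex > FIST_THRESHOLD:
            self.closed_at = now          # remember when the fist closed
            return False
        if mean_flex < OPEN_THRESHOLD and self.closed_at is not None:
            fired = (now - self.closed_at) <= WINDOW
            self.closed_at = None         # consume the gesture either way
            return fired
        return False
```

Requiring the full close-open sequence inside a short window is what keeps ordinary slow hand motion from registering as a command.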
Both continuous and discrete control methods are
extremely important in the overall functionality of the
gesture-based control system. When both types of
control are used together, an even broader range of
flexibility can be obtained in the field. In one hand
operation where the objective is to control the linear
speed of a rover, the thumb can easily be used
discretely to control the direction (forward/reverse)
of motion. If the four fingers (excluding the thumb)
are monitored, with degree of flexion representing
motor speed, this leaves the thumb available for
discrete steering control. An open thumb would
cause the rover to travel forward, while closing it
would cause the rover to switch into reverse. This is
just one example of many that would allow the two
primary methods of control to be coupled to increase
flexibility.
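The one-hand scheme described above, a discrete thumb for direction coupled with continuous finger flexion for speed, might look like the following sketch. The threshold and sign convention are our own assumptions.

```python
THUMB_OPEN = 80  # thumb flexion below this counts as "open" (assumption)

def drive_command(thumb, fingers):
    """Combine discrete thumb direction with continuous finger-flexion speed.

    thumb: 0-255 flexion of the thumb; an open thumb selects forward.
    fingers: 0-255 flexion of the four remaining fingers (continuous speed).
    Returns a signed normalized speed: positive forward, negative reverse.
    """
    speed = (sum(fingers) / len(fingers)) / 255.0
    direction = 1.0 if thumb < THUMB_OPEN else -1.0
    return direction * speed
```

A single hand thus carries both a continuous channel (speed) and a discrete channel (direction) at the same time, which is exactly the coupling the text describes.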
4.1.3
State Selection
The very reason that the gesture control system is so
attractive – it uses readily available hand gestures as
input commands – also causes a very significant
problem of command confusion; however, this
problem can be easily solved. The entire system
functions by monitoring every single movement of
one of the user’s primary world interaction
mechanisms: the hands. An astronaut on an EVA is
dependent on his hands for everything from picking
up rock samples to waving to a fellow astronaut. An
extreme, but potential, consequence would be an
astronaut waving to someone else, and inadvertently
driving the rover off the side of a cliff. This example
shows the absolute necessity for the definition and
implementation of various states of control.
The state transition diagram is shown in Figure 4.
The system begins in a root state, where there is no
system output. This state allows the operator to use
his hands naturally without them serving as an input
device. The second state is a direct drive state for
real-time operation, such as navigating a rover or
rotating a camera into position. A final state, labeled
“TARGETING” in Figure 4, is actually a menu-based state where specific commands, or objects, can
be selected. Figure 4 refers to a camera targeting
program where specific targets can be scrolled
through. The state transitions have been designed so
that when one glove is in a specific state, the other
glove may not enter that same state. This prevents
the obvious problems that will arise if both gloves are
controlling the same rover task.
Figure 4- Control State Diagram
The state transition diagram described above
illustrates only a basic implementation. More
advanced control methods could implement
variations on the above, wherein many types of
control and operations would be child states of more
general task selection states. In certain situations it
would also be desirable to have one hand in a parent
state of the opposite hand, such as the right hand
selecting devices that the left hand is controlling in a
direct-drive state.
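A minimal sketch of the described state logic, including the rule that the two gloves may not hold the same non-root state, follows. The state names come from Figure 4; the class interface is our own placeholder, not the paper's code.

```python
# States from Figure 4; ROOT produces no output, so natural hand
# motion is ignored until the operator deliberately changes state.
ROOT, DRIVE, TARGETING = "ROOT", "DRIVE", "TARGETING"

class GloveStateMachine:
    """Tracks the control state of both gloves with mutual exclusion."""

    def __init__(self):
        self.state = {"left": ROOT, "right": ROOT}

    def request(self, glove, new_state):
        """Attempt a transition; refuse if the other glove holds that state."""
        other = "right" if glove == "left" else "left"
        if new_state != ROOT and self.state[other] == new_state:
            return False
        self.state[glove] = new_state
        return True
```

Starting both gloves in the root state is what prevents the waving-drives-the-rover-off-a-cliff scenario: no gesture produces output until a deliberate state transition is made.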
4.1.4
Device-Specific Output
Depending on the current state and gesture inputs, if
any device output is required, the gesture thread will
pass a request along to the appropriate device output
thread. The output threads vary between devices and
their required protocol and location.
Output can
also be directed to external devices, such as
microcontrollers connected via serial port, or internal
devices, such as separate software that controls the
motion of a pan/tilt camera.
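The buffer-queue hand-off between the gesture thread and a device output thread could be sketched as below. The command tuples and worker structure are assumptions for illustration; a real worker would speak the device's actual protocol over a serial port or socket.

```python
import queue
import threading

def device_output_worker(commands, log):
    """Consume gesture-thread requests and forward them to one device."""
    while True:
        cmd = commands.get()
        if cmd is None:          # sentinel: shut the thread down cleanly
            break
        # A real worker would transmit via the device's protocol; here we
        # simply record what would have been sent.
        log.append(cmd)

commands = queue.Queue()
log = []
worker = threading.Thread(target=device_output_worker, args=(commands, log))
worker.start()
commands.put(("set_speed", 0.5))    # hypothetical command tuple
commands.put(("pan_tilt", 10, -5))  # hypothetical command tuple
commands.put(None)
worker.join()
```

Because each device owns its queue and thread, a slow serial device cannot stall gesture recognition, which is the isolation property the threading design is meant to provide.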
4.2
Software Implementation
The majority of the control system software was
developed independently, with the exception of the
libraries that were included with the glove. In order
to focus efforts on development of the control
system, a network robot hardware platform,
“Player/Stage,” was used. This package simplifies
the output stage of the process and provides
simulation software. Player uses a TCP socket-based
client/server model to allow a broad range of input
and output devices that can be connected regardless
of compatibility [15]. This model allowed us to
readily apply hardware implementations to the
gesture control system.
4.2.1
Rover Navigation
One of the most practical applications for such a
control system is to control the movement of a rover
in the field. Using an ActivMedia Pioneer 2-AT
(graciously lent for use by The Pennsylvania State
University Applied Research Laboratory), the gloves
were implemented for full navigational purposes.
Hand gestures controlled forward and reverse
movements, in addition to turns and pivots to both
the right and left. For this case, the hand’s state
represented the velocity of the rover.
4.2.2
Camera Control

Figure 5- Rover Viewing Scene for Targeting
An additional control was implemented with the
SmileCam, a pan/tilt USB camera controlled via
serial port. In this scenario, as opposed to that of the
rover, the position of the hand represented a specific
pan/tilt position of the camera, as opposed to the
velocity of each individual servo motor. Although
this differed from the rover implementation, it made
clear the fact that different devices have different
control methods best suited to their purposes.
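Position control as used for the camera, where the hand state maps to an absolute pan/tilt angle rather than a servo velocity, might be sketched as follows. The angle ranges are assumptions for illustration, not SmileCam specifications.

```python
PAN_RANGE = (-90.0, 90.0)    # degrees; assumed mechanical limits
TILT_RANGE = (-30.0, 30.0)   # degrees; assumed mechanical limits

def hand_to_pan_tilt(roll, pitch):
    """Map normalized hand roll/pitch in [0, 1] to absolute camera angles."""
    pan = PAN_RANGE[0] + roll * (PAN_RANGE[1] - PAN_RANGE[0])
    tilt = TILT_RANGE[0] + pitch * (TILT_RANGE[1] - TILT_RANGE[0])
    return pan, tilt
```

Contrast this with the rover case: there the hand state set a velocity, so releasing the hand let the rover coast to a new speed, whereas here a relaxed hand simply parks the camera at the corresponding fixed position.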
The control of the camera was further extended to
include targeting and target selection. In this process,
an independent computer vision program analyzes
the scene to find points of interest (for example,
various rocks on the surface of Mars). These points
of interest are relayed back to a display that shows a
representation of each point (termed “blob”). Using
discrete input commands from the gloves, a target
selector scrolls through the blobs, highlighting the one of
interest. This system has many applications, ranging
from high-resolution imaging to semi-autonomous
navigation. A simulation of the target selection was
done in Stage (of the aforementioned Player/Stage).
The scene is shown in Figure 5 below. The rover is
looking at the scene of various blue and red objects,
and groups them together according to color. Figure
6 is the camera view from the rover, showing these
groupings. The blob on the far right has a bold border,
meaning that it is the target that is currently selected.
Figure 6- Blob Target Selection
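The target-selection scrolling can be as simple as cycling an index over the blob list on each discrete glove gesture. This sketch assumes a plain list of blob records; the actual blob representation from the vision program is not specified in the paper.

```python
class TargetSelector:
    """Cycle a highlight through blobs reported by the vision program."""

    def __init__(self, blobs):
        self.blobs = list(blobs)   # e.g., color-grouped regions from blob finding
        self.index = 0             # currently highlighted blob

    def scroll(self):
        """Advance the highlight; called on each discrete glove gesture."""
        if self.blobs:
            self.index = (self.index + 1) % len(self.blobs)

    def selected(self):
        return self.blobs[self.index] if self.blobs else None
```

Pairing this discrete scroll with a second discrete gesture for confirmation (such as the quick fist pulse described earlier) would complete the select-a-target interaction shown in Figure 6.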
5
Testing and Results
While the virtual reality glove has been the primary
input device used for this specific control system, it
certainly is not the only one available. In order to
determine the relative effectiveness of the glove, it
was compared with a joystick and a trackball, both
feasible alternatives. They were compared on the
basis of time to complete a given task, overall ease of
use, versatility, size and weight, and suit integration.
For each category, each input device was rated on a
scale of 1-10, with 10 being the highest obtainable
score, and placed into a selection matrix, shown in
Table 2.
5.1
Time to Complete a Given Task
To test how quickly tasks can be performed, each
device was used to navigate a rover through the
course shown in Figure 7.
To simulate the
inflexibility of a spacesuit glove, a modified hockey
glove was worn on the operating hand. While the
hockey glove simulated the bulk and stiffness of an
actual spacesuit glove, it allowed more of a sense of
touch than can be expected from a spacesuit glove.
From the starting point on the course, where the rover
is drawn, it must follow the illustrated path around
points 1 and 2. After passing point 3, the rover must travel in reverse until it is in line with the final obstacle, which entails driving directly between the points labeled 4.
Two users were timed for the course, using the three
input devices, and their times are given below in
Table 1. It should be noted that User B had more
experience using the glove, which accounted for the
significantly faster course time with that device.
5.1.2 Versatility
Versatility is based on how broad-ranging the control applications and abilities are. As previously
discussed, the gloves can be used together to provide
significantly increased flexibility, especially when
coupled with their inherently large range of input
options. Due to its size, the joystick requires two
hands to operate, limiting usage to only one device.
With spacesuit gloves on, an astronaut will not be
able to manipulate many buttons, leaving few input
options outside of the stalk itself. A trackball is
somewhat smaller, and assuming a surface to set it
on, two could be used. This would, however, be
awkward, and in all likelihood an astronaut would
have to hold a single trackball in his hand. Again,
very few buttons could be implemented outside of the ball itself, allowing for minimal expandability.
Figure 7: Timed Rover Course
Device      User A   User B
Glove       2:15     1:32
Joystick    1:30     1:12
Trackball   4:27     4:32
Table 1: Time to Complete Course
In each case the joystick was clearly the fastest, and the trackball was by far the slowest. While the glove fell behind the joystick, with sufficient training the difference between the two could be minimized.
5.1.1 Overall Ease of Utilization
The ease of use of each input device is a very subjective category. After using each device, the user was asked to rate how comfortable he felt with it. As with most questions of comfort, there is no ideal test for this category; instead, it is left to the opinion of the users.
5.1.3 Size and Weight
The virtual reality gloves are slim, lightweight devices: they weigh less than a pound and consist of only a thin glove and two very small boxes of electronics. On the opposite end of the spectrum, a joystick is a rather large device that would have to be carried separately in a backpack. This raises a major consideration that is not captured by the timing category discussed previously: while the glove is immediately accessible (as it is being worn), the joystick takes time to remove from the backpack and activate. The burden of this extra equipment is one that would best be avoided if at all possible. A trackball itself is a rather small device, and if integrated into a spacesuit (on the opposite forearm, for example) could prove to be almost as small and inconspicuous as the gloves.
5.1.4 Suit Integration
The gloves, as previously described, are ideal for
integrating into a spacesuit. They would become a
part of the spacesuit gloves and virtually disappear.
Only a minimal amount of hardware would need to
be added to the spacesuit itself. A joystick is
obviously designed as a stand-alone device which
cannot be integrated in its current form into a
spacesuit. While it is conceivable that such a joystick could be designed, it would be generally unsafe, as the stalk would be very easy to snap off. A trackball, as discussed in the previous category, could very easily be integrated into the opposite forearm of the spacesuit. One difficulty, however, is that the fine dust suspended in the Martian atmosphere would get into the trackball’s well and interfere with the sensors there.
Device      Time to Complete Task   Ease of Use   Versatility   Size and Weight   Suit Integration   Total Score
Glove       7                       8             8             10                10                 43
Joystick    9                       8             3             3                 2                  25
Trackball   1                       1             3             9                 8                  22
Table 2: Input Device Selection Matrix
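The totals in Table 2 can be reproduced, and re-weighted if a mission planner considers some categories more important than others, with a few lines of Python (a sketch; the matrix in the paper uses equal weights):

```python
# Scores from Table 2, in the table's category order:
# time, ease of use, versatility, size and weight, suit integration.
scores = {
    "Glove":     [7, 8, 8, 10, 10],
    "Joystick":  [9, 8, 3,  3,  2],
    "Trackball": [1, 1, 3,  9,  8],
}

# Equal weighting reproduces the Total Score column; unequal weights
# would let planners emphasize, say, suit integration over raw speed.
totals = {device: sum(vals) for device, vals in scores.items()}
best = max(totals, key=totals.get)
print(totals)   # {'Glove': 43, 'Joystick': 25, 'Trackball': 22}
print(best)     # Glove
```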
6 Conclusions
The device selection matrix in Table 2 shows that the
glove is the only device of the three that is
consistently strong in each of the five categories,
while the joystick and trackball are both lacking in
three of the areas. As discussed throughout this paper, the glove design is a flexible and efficient control method for an astronaut on an EVA, operating independently of mission control and a ground base. It is unobtrusive, making itself present only when called upon, yet provides extraordinary flexibility for a field control system.
7 Future Development
The Martian astronaut controlling the rover from a distance must make decisions based merely on the live images received from a camera onboard the exploratory rover. To direct the rover toward a specific target, the astronaut should be able to put the rover into a tracking mode, in which the rover follows a straight path to reach the target. Targets can be easily detected by software on an onboard computer; however, target selection must be done by the operator. The proposed
solution is to incorporate the VR glove with the
technology of the Pinch Glove, so that the current
design can accommodate more features such as target
selection. In other words, the VR glove will switch
from the steering mode to the target selection mode,
and the astronaut can simply select the desired target
by touching the corresponding fingertip sensor. The
rover “eye” will be locked on the target until the
rover reaches its desired distance from the target.
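A minimal kinematic sketch of such a tracking mode, assuming a straight-line approach and a hypothetical standoff distance (the function and parameter names are illustrative, not the actual rover control loop):

```python
import math

def track_target(rover_x, rover_y, target_x, target_y,
                 stop_dist=1.0, step=0.5):
    """Advance in straight-line steps toward (target_x, target_y),
    returning the rover position once within stop_dist of the target."""
    while True:
        dx, dy = target_x - rover_x, target_y - rover_y
        dist = math.hypot(dx, dy)
        if dist <= stop_dist:
            return rover_x, rover_y   # desired standoff reached
        rover_x += step * dx / dist   # unit direction vector times step
        rover_y += step * dy / dist

x, y = track_target(0.0, 0.0, 10.0, 0.0)
print(x)   # 9.0 -- one standoff distance short of the target
```

A real implementation would re-check the camera's target lock every cycle rather than assume perfect dead reckoning.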
Another improvement to the current design could be
the integration of our data glove with a force
feedback system that allows the astronaut to “feel”
any unexpected situation that the rover might face.
This feedback information could save the out-of-reach rover in the field from possible damage to its wheels and mechanical equipment. An example of force feedback in virtual reality devices can be found in commercially available products such as CyberTouch, designed and developed by Virtual Technologies, Inc.
Because time and resources are scarce in an
interplanetary mission, the VR glove should not limit
the astronaut to interacting with only one rover in the
field. In an actual mission to Mars, more than one
rover will be working at the same time. Rovers can
be given different tasks and thus require individual
attention from a trained operator. As explained in
section 4.1.3, a menu-based operating system is best
suited for such a multitasking operation. The
astronaut could use a menu to switch the control to
another rover or to change the operation modes. The
menu system will
•
•
•
save valuable time during EVAs,
give the astronaut the ability of supervising
multiple rovers, and
lessens the number of gestures that would
otherwise be necessary.
The only requirement of this improvement is to equip
the astronaut with a wearable computer during the
mission. These computers are already being used by
U.S. military forces.
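One way such a gesture-driven menu might be structured (the mode names and pinch events, "pinch_next" and "pinch_select", are assumptions for illustration, not the proposed design):

```python
# Hypothetical menu driven by discrete glove events, running on the
# astronaut's wearable computer.
class RoverMenu:
    def __init__(self, rovers):
        self.rovers = rovers                 # rovers available in the field
        self.modes = ["steering", "target selection", "camera pan/tilt"]
        self.active_rover = rovers[0]
        self.active_mode = self.modes[0]
        self._cursor = 0                     # highlighted menu entry

    def handle(self, event):
        """Map one discrete glove event onto a menu action."""
        if event == "pinch_next":            # cycle the highlighted entry
            self._cursor = (self._cursor + 1) % len(self.modes)
        elif event == "pinch_select":        # commit the highlighted mode
            self.active_mode = self.modes[self._cursor]

    def switch_rover(self, name):
        """Hand control to a different rover in the field."""
        if name in self.rovers:
            self.active_rover = name

menu = RoverMenu(["rover-1", "rover-2"])
menu.handle("pinch_next")
menu.handle("pinch_select")
print(menu.active_mode)    # target selection
menu.switch_rover("rover-2")
print(menu.active_rover)   # rover-2
```

Because every action is a discrete event, the same two pinch gestures suffice to reach any mode or rover, keeping the gesture vocabulary small.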
8 Outreach
The International Mars Society is dedicated to instilling the vision of pioneering Mars through broad
public outreach [16]. The Penn State chapter of the
Mars Society is active in achieving this goal by
organizing and assisting with various educational
programs in both school and community settings.
Our chapter ran the Mars Society information booth
during Penn State’s Spring 2003 “Lectures on the
Frontiers of Science” series, consisting of five
lectures given by experts in the field of space science
and exploration.
We also participated in the
university’s fourth annual Space Day on April 12,
2003. This day was a chance for campus professors
and researchers in space-related departments to share
their work with the general public. The program was
attended by 1400 people from the community. The
Mars Society displayed several posters about Mars
and exploration of the planet. We demonstrated our
pan/tilt camera, controlled by the VR gloves, for
passersby.
On April 28, 2003, the Mars Society presented
arguments for settling Mars before settling the Moon
in a Mars First vs. Moon First debate. The debate
was open to the public and was attended by many
students and some community members who were
particularly interested in space exploration and
colonization.
Our most recent outreach activity was a visit to Penns
Valley Elementary School on May 12, 2003, to teach
the fifth grade class about the Solar System and the
possibility of life existing on Mars. The largest
public outreach program our society has been
planning for the future is this summer’s Mars Week,
running from June 2-7, 2003. During this week we
will display posters on Mars exploration, have
speakers giving presentations on space exploration,
and give “rover workshops” during which we will
teach participants about building and operating the
Mars rover our chapter has designed, as well as
demonstrate its abilities. The virtual reality glove
will be key in these demonstrations, as we will be
able to show the public the difference between using
a joystick and using a sensitive glove to control the
rover’s movement.
9 References
[1] Freudenrich, Craig C., Ph. D. “How Spacesuits
Work.” How Stuff Works Media Network, 2003.
URL: http://www.howstuffworks.com/spacesuit.htm
[2] “In Space With a Tough Little Data Logger.”
Onset Computer Corporation, April 22, 2003.
URL: http://www.onsetcomp.com/
Applications/Discovery/3290_space.html
[3] “Space Suits: Glove.” Hamilton Sundstrand
Space Systems International, 2003. URL:
http://www.hsssi.com/Applications/SpaceSuits/Gloves.html
[4] Gorguinpour, Camron et al. “Advanced Two-System Spacesuit.” University of California, Berkeley, May 7, 2003. URL:
http://www.lpi.usra.edu/publications/reports/CB1106/ucb01.pdf
[5] Tourbier, D. et al. “Physiological Effects of a
Mechanical Counter Pressure Suit.” 2000.
URL: http://www.dsls.usra.edu/dsls/
meetings/bio2001/pdf/140p.pdf
[6] “Power-Assisted Space Suit Glove.” Space
Systems Laboratory, February 21, 2003. URL:
http://www.ssl.umd.edu/projects/
PowerGlove/powerglove.html
[7] Cooper, Brian K. “Rover Control Workstation.”
MFEX: Microrover Flight Experiment, Jet
Propulsion Laboratory, California Institute of
Technology and the National Aeronautics and
Space Administration, 1997. URL:
http://mars.jpl.nasa.gov/MPF/roverctrlnav/rcw.html
[8] Richter, Joel et al. “Modular Research Rover and Gesture Control System for EVA.” RASC-AL 2002 Advanced Concept Design Presentation proceedings, Nov. 6-8, 2002.
[9] Steuer, Jonathan. “Defining Virtual Reality:
Dimensions Determining Telepresence.”
Journal of Communication, 42(4) (Autumn,
1992), 73-93. URL:
http://cyborganic.com/People/jonathan/Academia/Papers/Web/defining-vr1.html, 1995.
[10] Tate, Scott. “Virtual Reality: A Historical
Perspective.” September 28, 1996. URL:
http://ei.cs.vt.edu/~history/Tate.VR.html
[11] Kocijan, Iva. “Stony Brook Scientists Awarded
Patents for Virtual Reality Software, Oral
Bacteria Control, and Computer-Based Focusing
and Assembly Apparatus.” Stony Brook
University, Engineering/Science Press Release,
2002. URL:
http://commcgi.cc.stonybrook.edu/artman/publish/article_36.shtml
[12] Pinho, Marcio S. et al. “Robot Programming
and Simulation Using Virtual Reality
Techniques.” Virtual Reality Group, PUCRS
School of Informatics, 1999. URL:
http://grv.inf.pucrs.br/Pagina/Publicacoes/
Robo/Ingles/RoboRVIng.htm
[13] “Data Gloves.” Virtual Realities: Global
Distributor of Quality Virtual Reality Products,
2003. URL:
http://www.vrealities.com/glove.html
[14] Cook, Stephanie. “High School Whiz Improves
on Virtual Reality Glove.” The Nando Times,
Nando Media, 2001. URL:
http://archive.nandotimes.com/technology/story/
0,1643,500299565-500478367-5032309580,00.html
[15] “Player/Stage FAQ.” Player/Stage,
Sourceforge.net, 2003. URL:
http://playerstage.sourceforge.net/faq.html
[16] “The Purpose of the Mars Society.” The Mars
Society, 2001. URL:
http://www.marssociety.org/about/purpose.asp