
Virtual Reality & Interaction

• Virtual Reality
• Input Devices
• Output Devices
• Augmented Reality
• Applications

What is Virtual Reality?

• narrow: immersive environment with head tracking, head-mounted display, glove or wand
• broad: interactive computer graphics
• our definition: an immersive interactive system

Fooling the Mind

"The mind has a strong desire to believe that the world it perceives is real." - Jaron Lanier

• Illusion of depth:
  – Stereo parallax
  – Head motion parallax
  – Object motion parallax
  – Texture scale
• Interaction: grab and move an object
• Proprioceptive cues: when you reach out and see a hand where you believe your hand to be, you accept the hand as your own
• Often you will accept what you see as "real" even if graphics are poor

Interactive Cycle

Display must be continuously redrawn (usually in stereo):
1. User is constantly moving. Positions are tracked (head, hands, or whole body).
2. Position of objects in the environment is updated.
3. Display is redrawn with new view position, new user body configuration (if tracking head, hands, or whole body), new object locations.
4. And back to step one.

[Cycle diagram: Tracking → Recalc geometry → Redisplay]
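As a minimal sketch, the cycle maps directly onto a render loop like the following (Python pseudocode; the tracker, scene, and display objects are hypothetical placeholders, not a real API):

    # One pass through the interactive cycle, repeated forever.
    # `tracker`, `scene`, and `display` are hypothetical placeholders.
    def interactive_cycle(tracker, scene, display):
        while display.is_open():
            # 1. Sample the trackers: head pose drives the view,
            #    hand poses drive interaction.
            head_pose = tracker.read_head_pose()
            hand_poses = tracker.read_hand_poses()

            # 2. Update object positions in the environment
            #    (e.g. grabbed objects follow the hand).
            scene.update(hand_poses)

            # 3. Redraw with the new view position, once per eye for stereo.
            for eye in ("left", "right"):
                display.render(scene, head_pose.view_for(eye), eye)
            display.swap_buffers()
            # 4. And back to step one.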

Low Latency is Key

• latency: time lag between sensing a change and updating the picture
• 1 msec of latency leads to 1 mm of error
  – at common head/hand speeds (about 1 m/s)
• 50 msec (1/20 sec) is common and generally seen as acceptable
• Otherwise the user feels nausea
  – if the inner ear says you've moved but your eyes say otherwise
  – effect is strongest for peripheral vision
  – nausea is a serious problem for motion platforms (simulator sickness)
  – filmmakers know to pan slowly
• Our system for full body tracking has 100 ms latency, which is not so good.
  – Measured with a record player…
  – Blame assignment is hard, and the path from user action -> display is complicated.
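The 1 msec / 1 mm rule is just error = speed × latency at a hand speed of about 1 m/s. A quick check (the speed is illustrative):

    # Tracking error grows linearly with latency: error = speed * latency.
    hand_speed = 1.0                        # m/s, a typical quick hand motion
    for latency_ms in (1, 50, 100):
        error_mm = hand_speed * latency_ms  # (m/s * ms) comes out in mm
        print(f"{latency_ms:3d} ms latency -> {error_mm:.0f} mm error")
    # 1 ms -> 1 mm, 50 ms -> 50 mm, 100 ms -> 100 mm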

Input: Tracking Head/Hand

• Magnetic
  – Transmitters stationary, receiver in hand / on hat
  – Oldest, most common
  – Fast (4 ms latency, 120 Hz for Polhemus Fastrak)
  – Metal objects and magnetic fields cause interference (e.g. CRTs)
• Acoustic
  – Works well over small areas
  – Background noise interferes
• Optical (1): Camera on head looks at LEDs on ceiling (UNC HiBall)
  – Very accurate (.2 mm position), fast (1 ms latency, 1500 Hz)
  – Recently became commercially available, and not terribly expensive
• Optical (2): Camera on head looks at markers in environment
  – Vision system calculates camera position
  – Very simple, quite inexpensive
  – Slow (may fall a whole frame behind - 30 ms)

Input: Tracking Head/Hand 2

• Optical (3): Cameras in world look at markers on user
  – Expensive
  – 120 Hz
  – Can do whole body with some IK; disambiguation problems
• Inertial
  – Tiny accelerometers
  – Subject to drift (add gyros)
• Hybrids
  – InterSense combines inertial for speed, ultrasound to prevent drift
  – 150 Hz updates, extremely low latency
  – http://www.isense.com

UNC HiBall Tracker

• Camera looks through six lenses at pulsed LEDs in ceiling
• Very accurate (.2 mm position error)
• Fast (1 ms latency, 1500 Hz)
• http://www.3rdTech.com/HiBall.htm (commercial version)
• http://www.cs.unc.edu/~tracker/

Input: Sensing the Hand

• Primitive technologies:
  – mouse
    » OK for 2-D positioning, poor for drawing/orienting
  – joystick, trackball
    » good for small/slow movement
  – pressure-sensitive stylus
    » good for drawing
• Wand
  – tracker with buttons attached
  – may also include a joystick/joybutton or trackball
  – a simple way of grasping virtual objects
  – rotating an object in your "hand" provides some sense of reality but no force feedback
• Data glove
  – measures joint angles of each knuckle in each finger
  – more degrees of freedom than needed
  – low accuracy

Input: Whole Body Tracking

• Realtime whole body tracking with Vicon System

Input: Whole Body Tracking

• Getting good data in realtime is hard (raw data, no filtering)

Input: Whole Body Tracking

• Low pass filter
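A low pass filter trades jitter for lag. A minimal sketch (simple exponential moving average; the smoothing constant is illustrative):

    # Exponential moving average: y[t] = alpha*x[t] + (1-alpha)*y[t-1].
    # Smaller alpha means smoother output but more lag.
    def low_pass(samples, alpha=0.2):
        y = samples[0]
        out = []
        for x in samples:
            y = alpha * x + (1 - alpha) * y
            out.append(y)
        return out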

Input: Whole Body Tracking

• Kalman filter
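A Kalman filter can smooth with less lag than a plain low pass filter because it carries a motion model and weights each measurement by its expected noise. A minimal 1-D sketch (constant-position model; the noise parameters are illustrative):

    # 1-D Kalman filter with a constant-position model.
    # q: process noise (how much true motion we expect per step)
    # r: measurement noise (tracker jitter)
    def kalman_1d(measurements, q=1e-3, r=1e-2):
        x, p = measurements[0], 1.0    # state estimate and its variance
        out = []
        for z in measurements:
            p = p + q                  # predict: uncertainty grows
            k = p / (p + r)            # Kalman gain: trust in measurement
            x = x + k * (z - x)        # correct toward measurement z
            p = (1 - k) * p
            out.append(x)
        return out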

Example Application: Tai Chi Training

• How best to present feedback to the user?
  – Visually or otherwise?
  – Orientation, overlay, number of copies?

Input and Output: Haptics

• Haptic means relating to the sense of touch
• input: sense hand/finger position/orientation
• output: force-feedback

examples:
• mechanical force-feedback joystick: 2 or 3 degrees of freedom (DOF): x, y, (twist)
• robot arm, e.g. Phantom

Input and Output: Haptics

Another example:
• magnetic levitation 6 DOF haptic device
  – Ralph Hollis at CMU
  – http://www.cs.cmu.edu/afs/cs/project/msl/www/haptic/haptic_desc.html

UNC NanoManipulator
http://www.cs.unc.edu/Research/nano/

feeling carbon nanotubes with an Atomic Force Microscope

Input: Presence Measure

• Sense user's immersion:
  – Heart rate
  – Palm sweat
• Can then vary frame rate, latency, etc. and see how it affects immersion
• Use of passive haptics (UNC)

Input: Affective Computing

• Sense user's attention and emotions:
  – gesture
  – posture
  – voice
  – eye gaze
  – breathing
  – pulse & blood pressure
  – electrical activity of muscles
  – skin conductance
  – http://www.media.mit.edu/affect/
• Alter system behavior accordingly (how, exactly?)

Output: Rendering Pictures

• Historically, big SGIs
• Now PCs are in the range, except:
  – Some issues with stereo
  – Internal bandwidth
• System demands
  – At least 30 frames/sec; 60 is better
  – times 2 for stereo
  – at as much resolution as you can get
  – 1K to 40K displayed polygons per frame (more would be nice)
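Multiplying out the demands above gives the polygon throughput the hardware must sustain (counts taken from this slide):

    # Worst-case polygon throughput implied by the numbers above.
    frames_per_sec = 60          # target frame rate
    eyes = 2                     # stereo doubles the rendering work
    polys_per_frame = 40_000     # upper end of the range above
    print(frames_per_sec * eyes * polys_per_frame)   # 4,800,000 polys/sec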

Output: Display Technologies

• Projection displays
  – CAVE-type
  – IDesk/IScreen
  – Fishbowl VR
• Head mounts
  – Immersive
  – Non-immersive (augmented reality)
• To do stereo, you must get a different image to each eye
  – trivial for head mounts
  – shutter glasses
    » left & right images temporally interleaved
  – polarized glasses or red/blue glasses
    » left & right images optically superimposed

CAVEs
• A room with walls and/or floor formed by rear projection screens.

CAVE Details

• Typical size: 10' x 10' x 10' room
• 2 or 3 walls are rear projection screens
• Floor is projected from above
• One user is tracked (usually magnetically)
• He/she also wears stereo shutter goggles…
• And carries a wand to manipulate or move through the scene
• Computer projects 3D scenes for that viewer's point of view on the walls (see the sketch below)
• Presto! Walls vanish, user perceives a full 3D scene
  – Turning head doesn't necessitate redraw, so latency problems are reduced
• But, view is only correct for that viewer!
• Cost is fairly high
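The "walls vanish" trick is off-axis projection: each wall gets an asymmetric view frustum computed from the tracked head position. A minimal sketch, assuming one wall lying in the z = 0 plane of tracker coordinates (the helper function and numbers are illustrative):

    # Off-axis frustum for one CAVE wall in the z = 0 plane, viewed by a
    # tracked eye at (ex, ey, ez) with ez > 0. Returns glFrustum-style
    # (left, right, bottom, top) at the near plane; the view matrix must
    # also translate the eye to the origin.
    def off_axis_frustum(wall_l, wall_r, wall_b, wall_t, eye, near):
        ex, ey, ez = eye
        s = near / ez                  # scale wall edges onto near plane
        return ((wall_l - ex) * s, (wall_r - ex) * s,
                (wall_b - ey) * s, (wall_t - ey) * s)

    # 3 m wide wall; viewer 1.5 m back and a little off-center:
    print(off_axis_frustum(-1.5, 1.5, 0.0, 3.0, (0.5, 1.7, 1.5), 0.1))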

CAVE Painting
http://www.cs.brown.edu/~dfk/cavepainting/index.html


Video Walls

• IDesks and their relatives
  – (This is the Pittsburgh Supercomputing Center's IScreen)
• Fishbowl VR is also in this category

[Photo callouts: acoustic emitter for head tracker, emitters for stereo glasses, rear projection screen, SGI Onyx with "Infinite Reality" graphics & 4 processors]

Video Walls
• Princeton video wall
• Behind the curtain are n PCs and n projectors
• Calibration is a (nearly solved) research issue

Office of the Future

Classic Immersive Headmounts

• Typical: small LCDs, one per eye
• Higher resolution: tiny little CRTs
• Flat panel displays are pushing this technology
• Can get 1Kx1K or more, but heavy and expensive (>$10K)
  – Good for the military
• Serious problems with latency and tracking errors
  – Leads to nausea
• Field of view is pretty limited, maybe 35°
  – Serious problem for some applications
  – Prevents seeing your body in a natural way even with full body tracking
• Can now be wireless

[Photos: military head-mounted display, Bell Helicopter, 1967; IO Systems I-glasses, 640x480 resolution stereo, ~$4K, 1999]

Virtual Retinal Display

• Eric Seibel, U. Washington Human Interface Technology Lab
  – http://www.hitl.washington.edu/research/vrd/
  – www.mvis.com (commercial version)
• Simple enough: shine a laser in your eye and modulate it real fast.
• Potential for wearable very high resolution virtual reality

[Block diagram: Video Source → Drive Electronics → Photon Generator → Intensity Modulator → Beam Scanning → Optical Projection]

Virtual Retinal Display In Use

[Photos: Tom Furness of HITL uses a prototype; Microvision's "Nomad" product]

Augmented Reality Headmount Systems

• Augmented Reality means augmenting the image of the real environment with a virtual one, rather than replacing it
  – "heads-up display"
• One approach is to look through prisms or semi-transparent LCDs
• Alternatively, video see-through
  – Cameras are cheap and fast
  – Image-based tracking
  – Allows virtual objects to hide real objects
• Augmented VR is very sensitive to latency!
• But the user is comfortable and stays oriented, and can still see the office / lab

http://www.cs.unc.edu/~azuma/azuma_AR.html
note: many AR devices are small & lightweight!

Augmented Reality Headmount Systems
• http://www.cs.columbia.edu/graphics/
• Applications in assembly and maintenance
• Also in navigation

A Nice Little Augmented Reality System

• This project is from HITL
• Video see-through
  – Inexpensive but low-res
• Video-based tracking
  – Tracker recognizes the glyph on the card
  – Inexpensive but high latency
• Multiple cards with different characters
• Characters interact when you get them close to each other
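Glyph tracking of this kind lives on in modern libraries. A minimal stand-in sketch using OpenCV's classic cv2.aruco API (not the tracker this project used; the camera intrinsics are illustrative):

    # Detect marker cards in a camera image and recover each card's pose,
    # so a virtual character can be drawn standing on the card.
    import cv2
    import numpy as np

    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    camera_matrix = np.array([[800.0, 0.0, 320.0],   # illustrative intrinsics
                              [0.0, 800.0, 240.0],
                              [0.0, 0.0, 1.0]])
    dist_coeffs = np.zeros(5)                        # assume no distortion

    cap = cv2.VideoCapture(0)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        corners, ids, _ = cv2.aruco.detectMarkers(frame, dictionary)
        if ids is not None:
            # Pose of each 5 cm card relative to the camera.
            rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
                corners, 0.05, camera_matrix, dist_coeffs)
            cv2.aruco.drawDetectedMarkers(frame, corners, ids)
        cv2.imshow("AR", frame)
        if cv2.waitKey(1) == 27:                     # Esc quits
            break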

Output: Audio

• Audio is important!
• Synthesis techniques
  – library of canned samples
    » one at a time
    » mixed (compositing)
    » MP3 digital audio compression format
  – parametric model
    » engine sound as a function of speed, incline, gear, throttle (www.staccatosys.com)
    » human voice driven by phonemes, inflection, emphasis, etc.
• Spatialized sound (a minimal sketch follows this list)
  – make sound seem to come from any point in space (not the loudspeaker)
  – need several loudspeakers, carefully phased
  – might need model of listener's head shape
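The cheapest spatialization cues are per-ear delay and attenuation. A minimal two-channel sketch (pure geometry, no head-shape model; positions and rates are illustrative):

    # Crude spatialized sound: delay and attenuate a mono signal per ear
    # based on the distance from the source to each ear.
    import numpy as np

    SPEED_OF_SOUND = 343.0   # m/s
    SAMPLE_RATE = 44100      # samples/s

    def spatialize(mono, source, left_ear, right_ear):
        """Positions are (x, y) in meters; returns an Nx2 stereo array."""
        channels = []
        for ear in (left_ear, right_ear):
            d = np.hypot(source[0] - ear[0], source[1] - ear[1])
            delay = int(d / SPEED_OF_SOUND * SAMPLE_RATE)  # in samples
            gain = 1.0 / max(d, 0.1)                       # 1/r falloff
            channels.append(np.pad(mono, (delay, 0))[:len(mono)] * gain)
        return np.stack(channels, axis=1)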

Moving Through the Environment

• Best way is to walk: a study at UNC comparing flying, walking in place, and walking showed that walking gave a greater sense of presence
• With a wand, you can grab the environment and pull it past yourself… (see the sketch below)
  – This feels surprisingly natural
• Or you can fly through the environment.
  – Sounds like fun...
  – But your vision says you are moving while your inner ear says you are standing still
  – Surprise! Nausea is common
  – Less severe if the image doesn't cover your peripheral vision
• More clever:
  – move a little doll replica of yourself through a little dollhouse replica of the environment.
  – You then shrink down into the dollhouse, and a new dollhouse appears.
  – (All this pushing context bothers programmers, but not lay people.)
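Grabbing the environment amounts to applying the hand's displacement to the whole scene while the wand button is held. A minimal translation-only sketch (the class and its inputs are hypothetical):

    # "Grab the world": while the button is held, the scene follows the
    # hand, so pulling your hand back pulls the environment past you.
    import numpy as np

    class WorldGrab:
        def __init__(self):
            self.last_hand = None
            self.world_offset = np.zeros(3)

        def update(self, hand_pos, button_down):
            if button_down and self.last_hand is not None:
                self.world_offset += hand_pos - self.last_hand
            self.last_hand = hand_pos.copy() if button_down else None
            return self.world_offset   # add to every object before drawing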

Perceptual Issues Really Matter
• Re-directed walking – UNC

[movies]

Shared Virtual Environments

• Simple idea: two or more people look at the same geometry
• They can be widely separated; just draw avatars for those that aren't present locally.
• Have to avoid getting network latency into the loop
• What do you do if one person throws a virtual ball to the other? (One common answer is sketched below.)
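One common answer (a standard networked-VE technique, not something specific to this course) is dead reckoning: transmit the ball's state once at release and let every site extrapolate the flight locally, so network latency only shifts the throw rather than stalling every frame of it. A minimal sketch:

    # Dead reckoning a thrown ball: simulate locally from one
    # (position, velocity, timestamp) message instead of streaming
    # positions over the network every frame.
    GRAVITY = (0.0, -9.8, 0.0)

    def extrapolate(p0, v0, t0, now):
        """Ballistic position at time `now`, given state at time t0."""
        dt = now - t0
        return tuple(p + v * dt + 0.5 * g * dt * dt
                     for p, v, g in zip(p0, v0, GRAVITY))

    # Thrown at t=0 from shoulder height; any site can render t=0.5 s later:
    print(extrapolate((0.0, 1.5, 0.0), (3.0, 4.0, 0.0), 0.0, 0.5))
    # -> (1.5, 2.275, 0.0)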

Applications
• Flight simulators
• Architectural walk-throughs
• Design - interference testing (e.g. engine assembly)
• Teleoperation of robots in dangerous (Chernobyl) or
distant (Mars) locations
• Medical X-ray vision (e.g. ultrasound)
• Remote surgery
• Psychotherapy (e.g. fear of heights)
• Interactive microscopy

More Applications

• Video Games
• Location-Based Entertainment
  – DisneyQuest
  – Sony Metreon
  – www.xulu.com
• Entertainment Technology (CMU)
  – http://www.etc.cmu.edu/
• Virtualized Reality (CMU)
  – http://www.ri.cmu.edu/projects/project_144.html
• Office of the Future (UNC)
  – use walls / desktops as displays
  – http://www.cs.unc.edu/Research/stc/office/
• Ubiquitous computing and wearable computers
  – information superimposed on the environment

Other Graphics Courses

• Fall 2004
  – 15-463 Advanced Rendering and Image Processing (Efros)
  – 15-869 Physically Based Character Animation (Pollard)
• Spring 2005
  – 15-493 Computer Game Programming (Kuffner)
  – 15-505 / 60-414 Animation Art and Technology (Hodgins / Duesing)
  – 15-864 Advanced Computer Graphics (James)
  – Grad seminar (James)
  – Grad seminar (Efros, tentative)

Announcements

• Grades for prog. project #3 and HW #3 out tonight
• Office hour 2-3 Friday to pick up homeworks, other questions
  – NSH 4207
• No class Tuesday, April 27
• Thursday, April 29 (last class) – course review
• Course surveys
