WO1997019398A1 - A method for presenting virtual reality enhanced with tactile stimuli, and a system for executing the method - Google Patents
A method for presenting virtual reality enhanced with tactile stimuli, and a system for executing the method
- Publication number
- WO1997019398A1 (PCT/IB1996/001231)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- tactile
- user
- control device
- virtual reality
- tactile stimulus
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims abstract description 24
- 230000000007 visual effect Effects 0.000 claims abstract description 34
- 238000012545 processing Methods 0.000 claims abstract description 7
- 238000013507 mapping Methods 0.000 claims abstract description 6
- 230000004044 response Effects 0.000 claims abstract description 5
- 230000008713 feedback mechanism Effects 0.000 claims abstract description 4
- 230000008859 change Effects 0.000 claims description 4
- 230000002596 correlated effect Effects 0.000 claims 1
- 230000000694 effects Effects 0.000 description 8
- 241001417955 Agonidae Species 0.000 description 4
- 238000006243 chemical reaction Methods 0.000 description 4
- 238000010586 diagram Methods 0.000 description 4
- 239000000725 suspension Substances 0.000 description 4
- 238000005094 computer simulation Methods 0.000 description 3
- 239000004576 sand Substances 0.000 description 3
- 244000025254 Cannabis sativa Species 0.000 description 2
- 241000227653 Lycopersicon Species 0.000 description 2
- 235000007688 Lycopersicon esculentum Nutrition 0.000 description 2
- 239000013256 coordination polymer Substances 0.000 description 2
- 230000001419 dependent effect Effects 0.000 description 2
- 230000010354 integration Effects 0.000 description 2
- 230000035807 sensation Effects 0.000 description 2
- 230000035939 shock Effects 0.000 description 2
- 230000002123 temporal effect Effects 0.000 description 2
- 230000009466 transformation Effects 0.000 description 2
- XLYOFNOQVPJJNP-UHFFFAOYSA-N water Substances O XLYOFNOQVPJJNP-UHFFFAOYSA-N 0.000 description 2
- 241001494479 Pecora Species 0.000 description 1
- 238000013459 approach Methods 0.000 description 1
- 239000010426 asphalt Substances 0.000 description 1
- 230000008901 benefit Effects 0.000 description 1
- 239000012141 concentrate Substances 0.000 description 1
- 238000011161 development Methods 0.000 description 1
- 238000006073 displacement reaction Methods 0.000 description 1
- 230000002708 enhancing effect Effects 0.000 description 1
- 230000000763 evoking effect Effects 0.000 description 1
- 244000144992 flock Species 0.000 description 1
- 230000006870 function Effects 0.000 description 1
- 230000005484 gravity Effects 0.000 description 1
- 230000001788 irregular Effects 0.000 description 1
- 238000012423 maintenance Methods 0.000 description 1
- 230000009467 reduction Effects 0.000 description 1
- 230000021317 sensory perception Effects 0.000 description 1
- 239000004575 stone Substances 0.000 description 1
- 230000002195 synergetic effect Effects 0.000 description 1
- 238000012360 testing method Methods 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/25—Output arrangements for video game devices
- A63F13/28—Output arrangements for video game devices responding to control signals received from the game device for affecting ambient conditions, e.g. for vibrating players' seats, activating scent dispensers or affecting temperature or light
- A63F13/285—Generating tactile feedback signals via the game input device, e.g. force feedback
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/55—Controlling game characters or game objects based on the game progress
- A63F13/57—Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/80—Special adaptations for executing a specific game genre or game mode
- A63F13/803—Driving vehicles or craft, e.g. cars, airplanes, ships, robots or tanks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/25—Output arrangements for video game devices
- A63F13/28—Output arrangements for video game devices responding to control signals received from the game device for affecting ambient conditions, e.g. for vibrating players' seats, activating scent dispensers or affecting temperature or light
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/10—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
- A63F2300/1037—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted for converting control signals received from the game device into a haptic signal, e.g. using force feedback
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/64—Methods for processing data by generating or executing the game program for computing dynamical parameters of game objects, e.g. motion determination or computation of frictional forces for a virtual car
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/80—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
- A63F2300/8017—Driving on land or water; Flying
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/80—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
- A63F2300/8082—Virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/013—Force feedback applied to a game
Definitions
- a method for presenting virtual reality enhanced with tactile stimuli and a system for executing the method.
- the invention relates to a method for enabling a user of a data processing system to navigate through a virtual reality environment, wherein the method comprises generating a sequence of visual representations of the virtual reality environment on a visual display in response to the user manipulating a control device.
- the invention also relates to a data processing system for executing the method. Further advantageous aspects are recited in dependent Claims.
- US Patent 5,459,382 to Jacobus et al describes a method for providing tactile feedback in a virtual reality environment with six degrees of freedom.
- Dependent on the actual motion control effected by the user on an interface device, first a virtual reality force field generator is activated, which in turn causes a force signal to be generated back to the interface device.
- Through the definition of a conservative force field, various types of force feedback can be emulated for the user, so allowing a user to exercise operations that will later have to be executed in the real world.
- the reference does not allow exercising with unexpected and, in particular, dynamic disturbances that would make real-life operation so much more complicated.
- the reference relates principally to simulating the real world, rather than a fully virtual reality such as is proper to entertainment systems.
- the method according to the invention is characterized by providing, via a feedback mechanism appertaining to the manipulated control device, a preprogrammed dynamic tactile stimulus from an available database of tactile stimuli to the user, under selective and combined control by an actual interval of the sequence of generated visual representations.
- tactile stimuli, in particular dynamic stimuli, when added to visual, and possibly auditory, stimuli, can contribute significantly to the user's impression of being actually involved in a virtual reality scenery such as in video games.
- Video and/or graphics are used to create visual representations of a specific scenery from a variety of viewpoints.
- the visual representation is altered through the control device as if the user were moving through the scenery.
- the visual representation shown on the display is a mapping of the virtual scenery onto a plane selected through the control device.
- a tactile texture is joined to the video information.
- tactile representations are mappings of attributes, pre-assigned to events in the virtual reality scenery, onto the control device.
- the tactile stimulus in the present invention is generated under control of the occurrence of at least a specific one of the visual representations of the virtual scenery, in combination with an actual pointing direction of the control device on a predetermined image subfield.
- the occurrence of a specific pixel pattern or texture, a particular color, or a particular level of greyness is accompanied by one or more particular tactile stimuli.
- This integration of visual and tactile stimuli is utilized to let the user associate a tactile stimulus with an object, an area or an event in the virtual reality scenery.
- the tactile stimulus in the invention is generated depending on a rate at which certain visual representations succeed each other in the sequence. This combination is employed to let the user associate the tactile stimulus with a speed at which the user moves through the virtual reality scenery.
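A minimal sketch of how such a mapping could be organized, assuming a hypothetical stimulus database, attribute labels and a nominal 25 Hz frame rate, none of which are specified by the patent: the stimulus is selected from the visual attribute under the control device's pointing direction and its dynamic part is scaled with the rate at which the visual representations succeed one another.

```python
from dataclasses import dataclass

# Hypothetical tactile-stimulus database keyed on visual attributes
# (texture / colour classes of image subfields); not the patent's data format.
TACTILE_DB = {
    "grass": {"drag": 0.2, "vibration_hz": 0.0},
    "sand":  {"drag": 0.6, "vibration_hz": 2.0},
    "water": {"drag": 0.4, "vibration_hz": 0.5},
    "tree":  {"drag": 9.9, "vibration_hz": 0.0},   # felt as impenetrable
}

@dataclass
class Frame:
    """One visual representation plus the attributes of its image subfields."""
    attribute_at: dict   # (x, y) subfield -> visual attribute label

def select_stimulus(frame: Frame, pointing_xy: tuple, frame_rate_hz: float) -> dict:
    """Pick the stimulus for the subfield the control device points at and scale
    its dynamic part with the rate at which the visual representations succeed."""
    attribute = frame.attribute_at.get(pointing_xy, "grass")
    stimulus = dict(TACTILE_DB[attribute])
    # Faster image succession (higher apparent speed) -> more rapid stimulus.
    stimulus["vibration_hz"] *= frame_rate_hz / 25.0   # 25 Hz assumed nominal
    return stimulus

if __name__ == "__main__":
    frame = Frame(attribute_at={(10, 4): "sand"})
    print(select_stimulus(frame, (10, 4), frame_rate_hz=50.0))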
- a video recording is made of a real scenery from a plurality of positions and from a plurality of viewing angles.
- the video data are used to provide the user with particular ones of a variety of visual representations of a virtual reality on a display when moving through the scenery. That is, the displayed image as a whole changes when the user moves, about as if perceived through a moving car's windscreen.
- the user is enabled to navigate through this virtual reality by controlling, for example, a mouse, trackball or joystick provided with features that provide force feedback. Tactile textures are associated with the visual representation.
- the visual representation of grass is combined with a slight amount of drag felt through the mouse or trackball when moving across a lawn.
- a representation of a tree is felt as an impenetrable object.
- a lake is represented by a pit and a dynamic wave resistance pattern when moving.
- a resistance with directional characteristics gives the impression of going uphill or going in the teeth of a gale, etcetera.
- the landscape has different types of surface regarding texture (hardness, flatness) and extent, such as mud, asphalt, grass, loose sand, cobblestones, brushwood, frozen underground, and all kinds of obstacles such as trees, stone walls, hills, rivers, ditches, flock of sheep, turnpikes, etcetera.
- the relevant area of the landscape being displayed at a particular moment depends on the bike's actual direction and location with respect to the scenery.
- the surface of the area is tactilely represented by stimuli experienced by the user through the control device.
- the above tactile stimuli correspond to spatial characteristics, such as locations or orientations in the virtual scenery.
- a conditional tactile stimulus could be the occurrence of irregular and tiny shocks, such as simulating tomatoes thrown by a (virtual) angry farmer when he is passed too closely.
- tactile stimuli give a new dimension to virtual reality and considerably increase the user's sense of actually being involved.
- the synergetic combination of video, audio and tactile stimuli in a software application considerably extends the concept of multimedia applications. Not only video games but also (semi-)professional applications may benefit from tactile enhancement.
- a computer model of a novel suspension for a motorcycle or a computer model of a novel steering geometry for a car may use the invention to provide a first impression of the behaviour of the suspension or steering in all kinds of terrain and at all kinds of speed.
- Figure 1 is a block diagram of a system in the invention;
- Figures 2-4 illustrate various events in an example of a software application that combines video and tactile stimuli;
- Figure 5 is a first geometry diagram of the invention; and
- Figure 6 is a second geometry diagram of the invention. Throughout the drawing, the same reference numerals indicate similar or corresponding features.
- FIG. 1 is a diagram of a data processing system 100 according to the invention.
- System 100 comprises a display 102, a control device 104 and a data processing apparatus 106.
- Control device 104 comprises, for example, one or more of the devices disclosed in published European patent application 0 489 469.
- Manipulation of device 104 controls, via apparatus 106, an image or a sequence of images that are shown on display 102.
- Apparatus 106 supplies to device 104 control signals that generate stimuli for being tactilely sensed by the user. The signals have a predetermined relationship with the images actually shown on display 102.
- Apparatus 106 supplies to display 102 video and/or graphics data that represents a virtual reality scenery.
- the visual representation of the scenery changes in response to manipulation of control device 104.
- Manipulation of device 104 gives the visual impression of travelling through the scenery. For example, successive images are interrelated through a perspective transformation or through a displacement transformation.
- Apparatus 106 also supplies the control signals to evoke tactile feedback to the user through control device 104.
- the control signals are generated in synchronism with the movement through the virtual reality scenery or with the occurrence of some event in the virtual reality scenery, i.e., the control signals are related to a succession of images.
- Apparatus 106 therefore is operative to combine video or graphics data with tactile data, the latter determining the generation of the control signals.
- the parallel supply of tactile data and video data is pre-determined in the sense that the generation of a particular image or of a sequence of particular images is accompanied by a particular tactile feedback. This is realized, for example, much in the same way as video or graphics data are accompanied by audio data in conventional video games.
- the user of the system in the invention is enabled to select the video data interactively and the corresponding tactile data are then generated automatically. For example, a more rapid sequence of images then also brings about a more rapid sequence of the associated tactile stimuli.
- system 100 comprises a first storage means 108 to store video data for generation of the visual representations under control of control device 104, and second storage means 110 to store tactile data for control of the generation of the tactile stimulus in a predetermined relationship with the video data.
- First and second storage means 108 and 110 are preferably physically integrated with one another, and the tactile data is preferably logically combined with the video data.
- since the visual representations succeed one another rapidly, the video data throughput must be correspondingly high, while the throughput of tactile data is lower than that of video data, as tactile stimuli are typically low-frequency signals.
- the integration allows reading of the video data and the tactile data, which is logically combined with the video data, without substantially hampering the flow of video data by undesired, time-consuming context switching.
- One way to achieve this is, for example, merging tactile data with some video data records, identifying the tactile data as such upon reading, and supplying the tactile data to a look-up table 112.
- the tactile data are then just signals to control the generation of the tactile stimuli at control device 104.
- look-up table 112 is user-programmable to enable selection or tuning of the tactile feedback.
- storage means 108 and 110 are integrated with each other in a single information carrier 114 or cartridge such as a diskette or a CD. System 100 can then be used with different software applications by simply exchanging the carrier 114.
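To make the record-merging and look-up-table mechanism concrete, here is a minimal sketch in Python; the record layout, tag names, force levels and the stub display/device classes are all assumptions, not the patent's format. Tactile data read from the same carrier as the video data is recognized by its tag and routed through a user-programmable look-up table before it drives the control device, so the high-throughput video path is not interrupted by context switching.

```python
# User-programmable look-up table (cf. look-up table 112): maps a stored tactile
# code to an actual stimulus level, so the feedback can be selected or tuned.
LOOKUP_TABLE = {0: 0.0, 1: 0.3, 2: 0.7, 3: 1.0}

def read_records(carrier):
    """Yield (kind, payload) pairs from the information carrier (here: any iterable)."""
    for record in carrier:
        yield record["kind"], record["payload"]

def play(carrier, display, control_device):
    """Route video records to the display and tactile records to the control device."""
    for kind, payload in read_records(carrier):
        if kind == "video":
            display.show(payload)                        # high-throughput video path
        elif kind == "tactile":
            control_device.apply(LOOKUP_TABLE[payload])  # low-rate tactile path

class _Display:                      # stand-in for display 102
    def show(self, frame): print("frame:", frame)

class _Device:                       # stand-in for control device 104
    def apply(self, level): print("force level:", level)

if __name__ == "__main__":
    carrier = [{"kind": "video", "payload": "img-001"},
               {"kind": "tactile", "payload": 2},
               {"kind": "video", "payload": "img-002"}]
    play(carrier, _Display(), _Device())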
- Figures 2-4 give some examples of snap-shots of a video game of the kind mentioned above, wherein the user is supposed to play the role of a police officer on a motorcycle 200.
- Control device 104 controls speed and direction of motorcycle 200 through the virtual reality scenery.
- Control device comprises, for example, a first trackball or joystick for directional control and a second trackball or joystick for speed control, both the speed controller and the directional controller being provided with force-feedback means.
- the officer is to chase a vehicle 202 driven by poachers 204, 206, 208 and 210. Poachers 204-210 are strategically trying to stay beyond reach of the strong arm of the law and drive their truck 202 wildly through the scenery. The scenery changes as the hunt continues. The game ends when, for example, either the poachers escape or the user has come so close that the truck's license plate 212 can be read, i.e., plate 212 is being displayed in legible characters.
- the video representation of motorcycle 200 is made to lean with respect to the vertical and horizontal axes in the scenery in synchronism with a change of direction brought about via control device 104 to intensify the impression of the user riding motorcycle 200 himself.
- motorcycle 200 is trying to catch truck 202 speeding along a bumpy path 204 of loose sand. It is hard to ride a motorcycle along a predetermined line through loose sand as the rear wheel tends to start wobbling, trying to throw the motorcycle off course, and as steering requires a considerable torque to be applied to the handlebars when changing directions.
- This is made to be felt at control device 104 by system 100 exerting alternating transversal forces on the user's hand, depending on, e.g., speed and leaning angle.
- the user therefore has to concentrate very hard in order not to lose control over motorcycle 200 and, at the same time, not to lose sight of truck 202.
- motorcycle 200 is ridden on a grey autumn day, in a drizzle at dusk (dinner time), along a road 300 with bumps and potholes that are made to be felt as sudden shocks at control device 104, and fallen leaves that tend to gather where the road makes a bend, and that are made to be felt as a sudden loss of resistance at control device 104.
- motorcycle 200 is chasing truck 202 along a street.
- Poacher 210 is trying to hit the officer by throwing objects at him, such as, say, tomatoes, spare wheels and other things that are carried in the trunk of the truck for this purpose. So, in addition to keeping motorcycle 200 on course, the user has to avoid projectiles 402, 404 and 406 that otherwise may hit the windscreen of the motorcycle and block the user's vision, or, worse, throw motorcycle 200 off the track altogether.
- the tactile stimuli introduced under Figures 2 and 3 relate to a spatial characteristic: the virtual texture of some areas in the scenery.
- the tactile stimulus in Figure 4 is made conditional in that it depends on the occurrence of the situation wherein certain events coincide: both the projectile 400 and motorcycle 200 are to occupy the same space in the virtual reality scenery at the same time in order to generate the tactile stimuli.
- temporal tactile stimuli may be introduced.
- the front tyre of motorcycle 200 may sustain damage, as a result of which it loses pressure.
- Steering motorcycle 200 then becomes increasingly difficult, which is felt at control device 104 as wobbling reaction forces of increasing amplitude.
- the pavement of the route through the virtual reality scenery can be made to feel increasingly slippery with time, e.g., through reduction of the reaction forces on the user manipulating control device 104.
- tactile stimuli can be used in a video game dealing with how to negotiate obstacles on different types of terrain in a cross-country run, steeplechase or voyage of discovery; tactile stimuli can be used to simulate the water waves in a video game relating to canoeing in wild water or to the mooring of a vessel to an oil-rig in a storm; in a video game concerning a low-level flight of a helicopter over broken ground and among obstacles, tactile cues may be used to transmit the vibration that in reality is inherent in this type of aircraft.
- the tactile data to accompany the video data may be calculated in real-time when moving through the virtual reality scenery.
- This approach can be used, for example in a professional or semi-professional environment, when a computer model for a suspension system or a steering geometry of a vehicle is tested.
- Using the invention provides a first impression of the behaviour of the suspension in all kinds of terrain and at all kinds of speed. The developer is thus enabled to make a purposive selection of parameter values describing the best performing computer trials for real testing later on.
- tactual fields can be used in combination with auditory and visual images. For example, one could spatially navigate through linked video information, similar to moving through a virtual reality environment, and encounter various heard, seen and felt objects. Tactual representations for specific objects may extend beyond a representation of the object's physical surface plane. Dynamic fields can be created using tactual information to suggest a certain feeling or experience. They can also be used to guide the user in a suggested direction of movement. The fields may be displayed using force feedforward (active hand-independent device movement) and feedback through a device such as a 3-D trackball with force feedback. Furthermore, two or more dynamic tactual fields can be combined to create a new experience. Sound effects (e.g., from a video source) may also accompany tactual effects. Several examples of dynamic tactual fields are described below.
- the conveyor belt acts like a walking platform that moves.
- the input device (e.g., a trackball) begins to move and leads the user to a predesignated position.
- This movement can be based on using a constant force, which causes the input device to accelerate.
- the latter motion can be time-based.
- the time in which the user should be moved from point a to point b is specified.
- the amount of force on the device is adjusted in real-time to maintain the average velocity at a required value, given time and distance.
- the bumpy road consists of two combined tactual fields, being a "road field” and a "texture field”, respectively.
- the road guides the user along a predesignated course by providing force-feedback walls. The user can also feel the width of the path. Unlike the conveyor belt, the device does not move on its own; rather, the user can passively follow a felt path.
- the texture can be changed at any point along the path by superposing a texture field on the road field.
- the texture field can be adjusted to provide "bump" sensations at a given intensity.
- the control-pixel distance between "bumps” can also be specified in the x and y directions. By using more than one texture field on a path, the texture can change as the user moves along the road. Sounds associated with moving over bumps have also been used in synchronization with the felt bumps to enhance the experience.
- a cyclone or swirling effect can be created to enhance the experience of falling into a "hole".
- the user can see and hear effects and feel the device (e.g., ball) enter a circling-like mode.
- the effect can also be used to create vibrations using very tight swirls.
- ConvF.F is the force vector, composed of two elements x and y, applied on the conveyor belt (force based) object, f (user-variable) is the force magnitude, and alpha (user-variable) is the angle over which the conveyor is rotated.
- kp, kd and ki are constants with typical values of 48, 16 and 0.01 respectively, and
- errorP is the difference between the desired cursor position DP at a particular moment and the current cursor position CP
- a is the cursor position at the moment the conveyor belt started
- b is the position the cursor should be moved towards
- total_time (user variable) is the time in which the user should be moved from point a to point b
- delta errorP = errorP - (previous value of errorP)
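The equations that these variables belong to are not reproduced in this text, so the following Python sketch is a reconstruction under assumptions: a conventional PID combination of kp, kd and ki with errorP and delta errorP, plus the constant-force variant using f and alpha. The function names and the linear interpolation of the desired position DP between a and b are illustrative choices, not the patent's verbatim formulas.

```python
import math

KP, KD, KI = 48.0, 16.0, 0.01   # typical values given above for kp, kd, ki

def conveyor_force_constant(f, alpha):
    """Constant-force variant: magnitude f along the conveyor angle alpha (radians)."""
    return (f * math.cos(alpha), f * math.sin(alpha))

def conveyor_force(a, b, total_time, t, cp, state):
    """Time-based variant: force that moves the cursor from a towards b in total_time.

    a, b   -- start position and target position, as (x, y)
    t      -- time elapsed since the conveyor belt started
    cp     -- current cursor position CP, as (x, y)
    state  -- dict carrying the previous errorP and the running integral per axis
    """
    frac = min(t / total_time, 1.0)
    dp = (a[0] + frac * (b[0] - a[0]),       # desired position DP moves linearly
          a[1] + frac * (b[1] - a[1]))       # from a to b over total_time

    force = []
    for axis in (0, 1):
        error_p = dp[axis] - cp[axis]                 # errorP
        delta = error_p - state["prev"][axis]         # delta errorP
        state["integral"][axis] += error_p
        force.append(KP * error_p + KD * delta + KI * state["integral"][axis])
        state["prev"][axis] = error_p
    return tuple(force)

if __name__ == "__main__":
    state = {"prev": [0.0, 0.0], "integral": [0.0, 0.0]}
    print(conveyor_force(a=(0, 0), b=(100, 0), total_time=2.0,
                         t=0.5, cp=(10.0, 0.0), state=state))
    print(conveyor_force_constant(f=1.0, alpha=math.pi / 4))
```

Adjusting the force each cycle in this way keeps the average velocity near the value implied by the specified time and distance, as described above.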
- road.F is the force vector, composed of two elements x and y, applied to the road object, F is the force magnitude, and alpha is the angle over which the road is rotated, and
- texture.F is the force vector applied on the texture object, composed of two elements x and y, and F is the force magnitude
- Gx is the granularity of the texture in the x direction and Gy is the granularity of the texture in the y direction.
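As with the conveyor belt, the road and texture equations themselves are not reproduced here, so the sketch below is only one plausible shaping of the two fields; the wall model, the half-width value and the sinusoidal bump profile are assumptions. It shows the superposition described above: a "road field" that pushes the cursor back between force-feedback walls, plus a "texture field" whose bump spacing is set by the granularities Gx and Gy.

```python
import math

def road_force(F, alpha, offset, half_width):
    """'Road field': wall force of magnitude F once the cursor leaves the path.

    offset is the cursor's signed lateral distance from the path centre, measured
    along the perpendicular (-sin(alpha), cos(alpha)) of a road rotated by alpha.
    """
    if abs(offset) <= half_width:
        return (0.0, 0.0)                       # inside the felt path: no wall force
    direction = -1.0 if offset > 0 else 1.0     # push back towards the centre
    return (direction * F * -math.sin(alpha), direction * F * math.cos(alpha))

def texture_force(F, Gx, Gy, x, y):
    """'Texture field': periodic bumps of intensity F, spaced Gx and Gy apart."""
    return (F * math.sin(2 * math.pi * x / Gx),
            F * math.sin(2 * math.pi * y / Gy))

def bumpy_road_force(x, y, offset, alpha):
    """Superpose the two fields; swapping the texture field changes the felt surface."""
    rx, ry = road_force(F=1.0, alpha=alpha, offset=offset, half_width=20.0)
    tx, ty = texture_force(F=0.3, Gx=8.0, Gy=8.0, x=x, y=y)
    return (rx + tx, ry + ty)

if __name__ == "__main__":
    print(bumpy_road_force(x=12.0, y=3.0, offset=25.0, alpha=0.0))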
- F is the force vector, composed of two elements x and y, applied during the Swirl effect
- C1 and C2 are constants representing the force magnitude in the x and y directions, respectively
- f is the frequency of the swirl
- t (0 ≤ t ≤ d) is the current time
- d is the duration of the effect
- m1 and m2 are constants (0 ≤ m1 ≤ 1, 0 ≤ m2 ≤ 1) representing the start and end points, respectively, of the slope function affecting the amplitude.
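Again, the swirl formula is defined here only by its variable list, so the following Python sketch is an assumed reconstruction: a circling force of frequency f whose amplitude follows a linear slope from m1 to m2 over the duration d. Very tight swirls (high f with small C1, C2) then degenerate into the vibration mentioned earlier.

```python
import math

def swirl_force(t, d, f, C1, C2, m1, m2):
    """Swirl/cyclone force at time t (0 <= t <= d).

    f       -- swirl frequency in Hz
    C1, C2  -- force magnitudes in the x and y directions
    m1, m2  -- start and end points (0..1) of the slope function on the amplitude
    """
    slope = m1 + (m2 - m1) * (t / d)            # assumed linear amplitude envelope
    phase = 2.0 * math.pi * f * t
    return (C1 * slope * math.cos(phase),
            C2 * slope * math.sin(phase))

if __name__ == "__main__":
    # A two-second swirl whose amplitude shrinks from full to 20 %.
    for step in range(5):
        t = step * 0.5
        print(round(t, 1), swirl_force(t, d=2.0, f=3.0, C1=1.0, C2=1.0, m1=1.0, m2=0.2))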
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Processing Or Creating Images (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
Claims
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP9519547A JPH10513593A (en) | 1995-11-24 | 1996-11-15 | Method for presenting virtual reality enhanced by tactile stimuli and system for performing the method |
EP96935264A EP0806002A1 (en) | 1995-11-24 | 1996-11-15 | A method for presenting virtual reality enhanced with tactile stimuli, and a system for executing the method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP95203234 | 1995-11-24 | ||
EP95203234.0 | 1995-11-24 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO1997019398A1 true WO1997019398A1 (en) | 1997-05-29 |
Family
ID=8220862
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB1996/001231 WO1997019398A1 (en) | 1995-11-24 | 1996-11-15 | A method for presenting virtual reality enhanced with tactile stimuli, and a system for executing the method |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP0806002A1 (en) |
JP (1) | JPH10513593A (en) |
WO (1) | WO1997019398A1 (en) |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO1999038064A2 (en) * | 1998-01-23 | 1999-07-29 | Koninklijke Philips Electronics N.V. | Multiperson tactual virtual environment |
GB2337828A (en) * | 1998-05-25 | 1999-12-01 | Daewoo Electronics Co Ltd | Virtual reality simulation apparatus |
WO2000071217A1 (en) * | 1999-05-21 | 2000-11-30 | Michael Charles Cooke | A feedback assembly for computer games |
WO2007117649A2 (en) * | 2006-04-06 | 2007-10-18 | Immersion Corporation | Systems and methods for enhanced haptic effects |
CN105094798A (en) * | 2014-05-20 | 2015-11-25 | 意美森公司 | Haptic design authoring tool |
US9468846B2 (en) | 2009-01-30 | 2016-10-18 | Performance Designed Products Llc | Tactile feedback apparatus and method |
CN106128323A (en) * | 2016-09-06 | 2016-11-16 | 卓汎有限公司 | Car window virtual reality display system |
US9836117B2 (en) | 2015-05-28 | 2017-12-05 | Microsoft Technology Licensing, Llc | Autonomous drones for tactile feedback in immersive virtual reality |
EP3267290A1 (en) * | 2016-07-08 | 2018-01-10 | Thomson Licensing | Method, apparatus and system for rendering haptic effects |
WO2018010823A1 (en) * | 2016-07-15 | 2018-01-18 | Irdeto B.V. | Obtaining a user input |
US9898864B2 (en) | 2015-05-28 | 2018-02-20 | Microsoft Technology Licensing, Llc | Shared tactile interaction and user safety in shared space multi-person immersive virtual reality |
US9911232B2 (en) | 2015-02-27 | 2018-03-06 | Microsoft Technology Licensing, Llc | Molding and anchoring physically constrained virtual environments to real-world environments |
CN110045826A (en) * | 2019-04-01 | 2019-07-23 | 北京小马智行科技有限公司 | Virtual reality experience methods, devices and systems applied to vehicle |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JO1406B1 (en) * | 1984-11-02 | 1986-11-30 | Smith Kline and French Laboratories Limited | Chemical compounds |
US7623114B2 (en) | 2001-10-09 | 2009-11-24 | Immersion Corporation | Haptic feedback sensations based on audio output from computer devices |
US6703550B2 (en) * | 2001-10-10 | 2004-03-09 | Immersion Corporation | Sound data output and manipulation using haptic feedback |
WO2007029811A1 (en) * | 2005-09-08 | 2007-03-15 | Sega Corporation | Game machine program, game machine, and recording medium storing game machine program |
KR101244442B1 (en) * | 2011-04-13 | 2013-03-18 | 한국기술교육대학교 산학협력단 | Haptic Controller and Device Controlling System Using Thereof |
WO2015121970A1 (en) * | 2014-02-14 | 2015-08-20 | 富士通株式会社 | Educational tactile device and system |
CN110456973B (en) * | 2019-07-16 | 2022-08-19 | 江苏铁锚玻璃股份有限公司 | Touch-based transparent and opaque one-key switching method and device for intelligent vehicle window |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO1992018925A1 (en) * | 1991-04-20 | 1992-10-29 | W. Industries Limited | Haptic computer output device |
US5255211A (en) * | 1990-02-22 | 1993-10-19 | Redmond Productions, Inc. | Methods and apparatus for generating and processing synthetic and absolute real time environments |
US5405152A (en) * | 1993-06-08 | 1995-04-11 | The Walt Disney Company | Method and apparatus for an interactive video game with physical feedback |
-
1996
- 1996-11-15 WO PCT/IB1996/001231 patent/WO1997019398A1/en not_active Application Discontinuation
- 1996-11-15 EP EP96935264A patent/EP0806002A1/en not_active Withdrawn
- 1996-11-15 JP JP9519547A patent/JPH10513593A/en active Pending
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5255211A (en) * | 1990-02-22 | 1993-10-19 | Redmond Productions, Inc. | Methods and apparatus for generating and processing synthetic and absolute real time environments |
WO1992018925A1 (en) * | 1991-04-20 | 1992-10-29 | W. Industries Limited | Haptic computer output device |
US5405152A (en) * | 1993-06-08 | 1995-04-11 | The Walt Disney Company | Method and apparatus for an interactive video game with physical feedback |
Cited By (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO1999038064A3 (en) * | 1998-01-23 | 1999-09-30 | Koninkl Philips Electronics Nv | Multiperson tactual virtual environment |
WO1999038064A2 (en) * | 1998-01-23 | 1999-07-29 | Koninklijke Philips Electronics N.V. | Multiperson tactual virtual environment |
GB2337828A (en) * | 1998-05-25 | 1999-12-01 | Daewoo Electronics Co Ltd | Virtual reality simulation apparatus |
WO2000071217A1 (en) * | 1999-05-21 | 2000-11-30 | Michael Charles Cooke | A feedback assembly for computer games |
WO2007117649A2 (en) * | 2006-04-06 | 2007-10-18 | Immersion Corporation | Systems and methods for enhanced haptic effects |
WO2007117649A3 (en) * | 2006-04-06 | 2008-09-12 | Immersion Corp | Systems and methods for enhanced haptic effects |
US10152124B2 (en) | 2006-04-06 | 2018-12-11 | Immersion Corporation | Systems and methods for enhanced haptic effects |
EP3287874A1 (en) * | 2006-04-06 | 2018-02-28 | Immersion Corporation | Systems and methods for enhanced haptic effects |
US9468846B2 (en) | 2009-01-30 | 2016-10-18 | Performance Designed Products Llc | Tactile feedback apparatus and method |
CN105094798A (en) * | 2014-05-20 | 2015-11-25 | 意美森公司 | Haptic design authoring tool |
US10191552B2 (en) | 2014-05-20 | 2019-01-29 | Immersion Corporation | Haptic authoring tool using a haptification model |
EP2947547A1 (en) * | 2014-05-20 | 2015-11-25 | Immersion Corporation | Haptic design authoring tool |
US9921653B2 (en) | 2014-05-20 | 2018-03-20 | Immersion Corporation | Haptic authoring tool using a haptification model |
US9330547B2 (en) | 2014-05-20 | 2016-05-03 | Immersion Corporation | Haptic effect authoring tool based on a haptification model |
US9911232B2 (en) | 2015-02-27 | 2018-03-06 | Microsoft Technology Licensing, Llc | Molding and anchoring physically constrained virtual environments to real-world environments |
US9898864B2 (en) | 2015-05-28 | 2018-02-20 | Microsoft Technology Licensing, Llc | Shared tactile interaction and user safety in shared space multi-person immersive virtual reality |
US9836117B2 (en) | 2015-05-28 | 2017-12-05 | Microsoft Technology Licensing, Llc | Autonomous drones for tactile feedback in immersive virtual reality |
EP3267290A1 (en) * | 2016-07-08 | 2018-01-10 | Thomson Licensing | Method, apparatus and system for rendering haptic effects |
EP3267288A1 (en) * | 2016-07-08 | 2018-01-10 | Thomson Licensing | Method, apparatus and system for rendering haptic effects |
US20180013640A1 (en) * | 2016-07-08 | 2018-01-11 | Thomson Licensing | Method, apparatus and system for rendering haptic effects |
CN107589830A (en) * | 2016-07-08 | 2018-01-16 | 汤姆逊许可公司 | For the methods, devices and systems of haptic effect to be presented |
WO2018010823A1 (en) * | 2016-07-15 | 2018-01-18 | Irdeto B.V. | Obtaining a user input |
CN109416613A (en) * | 2016-07-15 | 2019-03-01 | 爱迪德技术有限公司 | Obtain user's input |
US11113380B2 (en) | 2016-07-15 | 2021-09-07 | Irdeto B.V. | Secure graphics |
US11727102B2 (en) | 2016-07-15 | 2023-08-15 | Irdeto B.V. | Obtaining a user input |
CN106128323A (en) * | 2016-09-06 | 2016-11-16 | 卓汎有限公司 | Car window virtual reality display system |
CN110045826A (en) * | 2019-04-01 | 2019-07-23 | 北京小马智行科技有限公司 | Virtual reality experience methods, devices and systems applied to vehicle |
Also Published As
Publication number | Publication date |
---|---|
EP0806002A1 (en) | 1997-11-12 |
JPH10513593A (en) | 1998-12-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP0806002A1 (en) | A method for presenting virtual reality enhanced with tactile stimuli, and a system for executing the method | |
Hock et al. | CarVR: Enabling in-car virtual reality entertainment | |
EP3082122B1 (en) | Applied layout in virtual motion-acceleration spherical simulator | |
KR102161646B1 (en) | System and method for interworking virtual reality and indoor exercise machine | |
US6426752B1 (en) | Game device, method of processing and recording medium for the same | |
CN109478341B (en) | Simulation system, processing method, and information storage medium | |
KR101094858B1 (en) | Real time virtual reality sports platform device | |
US20040110565A1 (en) | Mobile electronic video game | |
JPH07507402A (en) | Driver training system with performance data feedback | |
KR20000064948A (en) | Image processing apparatus and image processing method | |
EP0970414B1 (en) | Multiperson tactual virtual environment | |
WO1996006410A1 (en) | Three-dimensional simulator and image synthesizing method | |
JP3273038B2 (en) | Virtual experience type game device | |
CN110728878A (en) | Somatosensory interactive VR driving simulation device | |
JP4114822B2 (en) | Image generating apparatus and information storage medium | |
US7246103B2 (en) | Probabilistic model of distraction for a virtual reality environment | |
JPH06277362A (en) | Three-dimensional game device | |
RU2149667C1 (en) | Apparatus for exercising and competing, particularly, for performing sportive locomotions and games | |
JP3273017B2 (en) | Image synthesis device and virtual experience device using the same | |
JP3770290B2 (en) | Image processing device, amusement facility and vehicle for amusement facility | |
JPH07121096A (en) | Spatial perception ataxia training device | |
JPH113437A (en) | Image synthesizer and image synthesizing method | |
JPH11134515A (en) | Game device and game screen compositing method | |
JP3638669B2 (en) | Image composition method and game device | |
Lawlor | Virtual reality vehicle simulator phase 1 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): JP |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): AT BE CH DE DK ES FI FR GB GR IE IT LU MC NL PT SE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1996935264 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref country code: JP Ref document number: 1997 519547 Kind code of ref document: A Format of ref document f/p: F |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWP | Wipo information: published in national office |
Ref document number: 1996935264 Country of ref document: EP |
|
WWW | Wipo information: withdrawn in national office |
Ref document number: 1996935264 Country of ref document: EP |