WO2018198777A1 - Virtual reality image providing device and program - Google Patents
Virtual reality image providing device and program
- Publication number
- WO2018198777A1 (PCT/JP2018/015260)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- virtual reality
- avatar
- image
- data
- hmd
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
Definitions
- The present invention relates to a virtual reality image providing apparatus and a virtual reality image providing program, and is particularly suitable for use in a virtual reality image providing apparatus configured to display a viewer's avatar image in a virtual reality space.
- VR (virtual reality)
- Patent Document 1 discloses a configuration in which, when an avatar image is displayed in a three-dimensional space, the avatar is positioned based on position information of the eyeballs of a user wearing an HMD. Specifically, it describes that a person's head is positioned and the avatar is moved based on the positional information of two eyeballs, a first eyeball and a second eyeball.
- Patent Document 2 discloses that the direction in which a user is looking in the virtual reality space can be easily determined by changing the orientation of the head of the user's avatar.
- Specifically, it describes that a three-dimensional output server sequentially transmits screen data to be displayed to an HMD while receiving data on the rotation of the HMD in the yaw direction and the pitch direction from the HMD, and changes the avatar data of the user corresponding to that HMD based on the received rotation data.
- The three-dimensional output server sequentially transmits the screen data to a plurality of HMDs.
- The screen data to be transmitted includes avatar data, which is data related to an avatar serving as a substitute for each user wearing each HMD, and spatial data, which is data related to the virtual three-dimensional space.
- Each user's HMD displays an all-sky virtual three-dimensional space screen centered on the user, and also displays a plurality of avatars in that virtual three-dimensional space.
- In this configuration, the same virtual three-dimensional space screen is displayed on every HMD, and the plurality of avatars are displayed in the same manner on every HMD.
- The image therefore does not reflect the real space in which the plurality of users wearing the HMDs exist, and there is a problem that the display lacks a sense of reality.
- The present invention was made to solve such a problem, and an object of the invention is to provide a more realistic virtual reality space image when avatar images corresponding to a plurality of viewers wearing HMDs in the real space are displayed in the virtual reality space image of an HMD.
- Based on spatial data related to the virtual reality space, avatar data related to avatars that are at least the other viewers among a plurality of viewers existing in the real space, and relative position data representing the actual relative positional relationship of the plurality of viewers, a virtual reality space image is displayed in which the viewer's own position is set to a predetermined position in the screen and the avatar images exist at positions that reflect, with reference to that predetermined position, the actual relative positional relationship of the plurality of viewers.
- With this configuration, an avatar image is displayed on each HMD at a position reflecting the relative positional relationship of the plurality of viewers, with each viewer's own position as the reference.
- Accordingly, the virtual reality space image displayed on each HMD shows the other viewers' avatar images at the same relative positions as in reality for each viewer, so a more realistic virtual reality space image can be provided.
- FIG. 1 is a diagram illustrating an example of the overall configuration of a VR image display system to which the virtual reality image providing apparatus according to the present embodiment is applied.
- The VR image display system includes a plurality of HMDs 100-1, 100-2, ..., 100-n (hereinafter collectively referred to as HMDs 100) worn by a plurality of viewers, and an external computer 200.
- the HMD 100 corresponds to the virtual reality image providing apparatus of the present embodiment.
- the plurality of HMDs 100 and the external computer 200 are connected by a wired or wireless communication network 300.
- the plurality of HMDs 100 each have a built-in computer for reproducing three-dimensional spatial data related to the virtual reality space.
- the external computer 200 transmits the avatar data regarding the avatar to be displayed in the three-dimensional space to each HMD 100.
- Each HMD 100 generates and displays a virtual reality space image in which each viewer's avatar image is superimposed in a three-dimensional space image.
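The data flow described above can be pictured with a short, illustrative sketch. The class names, fields, and string placeholders below are assumptions made for illustration and are not taken from the patent; the sketch only shows each HMD holding its own spatial data, receiving avatar data pushed by an external computer, and composing both into one displayed frame.

```python
# A minimal sketch (not the patent's actual implementation) of the data flow:
# each HMD holds its own 3D spatial data locally, receives avatar data from an
# external computer, and composes a virtual reality space image from both.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class AvatarData:
    viewer_id: int
    mesh: str            # stand-in for 3D avatar image data


@dataclass
class HmdClient:
    hmd_id: int
    spatial_data: str                          # stand-in for all-sky 3D spatial data
    avatars: Dict[int, AvatarData] = field(default_factory=dict)

    def receive_avatar_data(self, data: List[AvatarData]) -> None:
        # Corresponds to avatar data transmitted by the external computer 200.
        for a in data:
            self.avatars[a.viewer_id] = a

    def render_frame(self) -> str:
        # Superimpose the avatar images on the 3D space image (here just a string).
        names = ", ".join(f"avatar{vid}" for vid in sorted(self.avatars))
        return f"{self.spatial_data} + [{names}]"


# Example: an external computer pushes avatar data to one HMD.
hmd = HmdClient(hmd_id=1, spatial_data="all-sky space")
hmd.receive_avatar_data([AvatarData(2, "mesh2"), AvatarData(3, "mesh3")])
print(hmd.render_frame())
```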
- the above functional blocks 11 to 16 can be configured by any of hardware, DSP (Digital Signal Processor), and software.
- For example, when configured by software, each of the functional blocks 11 to 16 is actually configured with the CPU, RAM, ROM, etc. of a computer built into the HMD 100, and is realized by the operation of a program stored in a recording medium such as RAM, ROM, a hard disk, or semiconductor memory.
- the spatial data storage unit 101 stores spatial data related to the virtual reality space.
- The spatial data stored in the spatial data storage unit 101 is all-sky three-dimensional spatial data, so that the display can be changed according to the viewing direction of the viewer.
- This three-dimensional spatial data can be generated in advance using a known technique related to virtual reality.
- the three-dimensional spatial data is generated by, for example, a personal computer installed with a dedicated editor and stored in the spatial data storage unit 101 of the HMD 100.
- the spatial data acquisition unit 11 acquires 3D spatial data from the spatial data storage unit 101 and supplies the acquired 3D spatial data to the image generation unit 14.
- the acquisition of the three-dimensional space data is executed when the virtual reality space image is displayed on the HMD 100.
- The avatar data acquisition unit 12 acquires, from the external computer 200 via the communication network 300, avatar data related to avatars that are substitutes for the viewers other than oneself among the plurality of viewers existing in the real space, and stores the acquired avatar data in the avatar data storage unit 102.
- The avatar data acquired by the avatar data acquisition unit 12 is three-dimensional image data whose display can be changed so that the avatar is seen from different directions as the viewpoint changes.
- The relative position data acquisition unit 13 acquires, from the external computer 200 via the communication network 300, relative position data representing the actual relative positional relationship of the plurality of viewers existing in the real space, and stores the acquired relative position data in the relative position data storage unit 103.
- In this embodiment, the positions at which the plurality of viewers wearing the HMDs 100 exist are arbitrary, but the positional relationship between the viewers is assumed to be fixed rather than dynamically changing. This is the case, for example, when the seats in which the plurality of viewers wearing the HMDs 100 are seated are determined in advance.
- That is, the positions of the plurality of viewers existing in the real space are determined in advance, and relative position data representing the actual relative positional relationship of the plurality of viewers is set in the external computer 200 in advance.
- the relative position data acquisition unit 13 of each HMD 100 acquires the relative position data stored in the external computer 200.
- FIG. 3 is a diagram for explaining an example of the relative position data.
- FIG. 3(a) shows a state in which the relative positions of the other five HMDs 100-2 to 100-6 are expressed with reference to the position of the HMD 100-1 located at the left end of the arc.
- A line connecting two HMDs 100 indicates a relative positional relationship.
- The relative position data corresponding to this line may be constituted by vector data relative to the position of the HMD 100-1, or may be constituted by coordinate data.
- FIG. 3(a) shows the relative position data acquired by the HMD 100-1, expressed with reference to the position of the HMD 100-1.
- The relative position data acquired by the HMD 100-2 expresses, with reference to the position of the HMD 100-2, the relative positional relationships with each of the other five HMDs 100-1 and 100-3 to 100-6.
- The relative position data acquired by the HMD 100-3 expresses, with reference to the position of the HMD 100-3, the relative positional relationships with each of the other five HMDs 100-1 to 100-2 and 100-4 to 100-6.
- Likewise, the relative position data acquired by the other HMDs 100-4 to 100-6 express the relative positional relationships with the other five HMDs 100 with reference to their own positions.
- Each of the HMDs 100-2 to 100-6 acquires the relative position data corresponding to itself from the external computer 200.
- FIG. 3(b) shows another example of relative position data.
- In this example, the relative position data represents the relative positional relationships between HMDs 100 adjacent to each other: the relative positional relationship between the HMD 100-1 and the HMD 100-2, between the HMD 100-2 and the HMD 100-3, between the HMD 100-3 and the HMD 100-4, between the HMD 100-4 and the HMD 100-5, and between the HMD 100-5 and the HMD 100-6.
- In this case, the relative position data acquired by the six HMDs 100-1 to 100-6 are all the same. However, data indicating which of the six HMDs 100-1 to 100-6 included in the relative position data is one's own position is also necessary. Each of the six HMDs 100-1 to 100-6 acquires the data indicating its own position from the external computer 200.
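As a rough illustration of the two data layouts described for FIG. 3, the sketch below models (a) per-HMD offsets relative to one reference HMD and (b) a shared list of adjacent-pair offsets combined with an own-position index. All names, types, and the 2D-offset convention are assumptions for illustration, not the patent's actual data format.

```python
# Illustrative sketch of the two relative-position-data layouts described for FIG. 3.
from dataclasses import dataclass
from typing import Dict, List, Tuple

Vec2 = Tuple[float, float]  # 2D offsets in metres, for illustration only


@dataclass
class RelativePositionsA:
    """FIG. 3(a) style: offsets of the other HMDs seen from one HMD."""
    reference_hmd: int
    offsets: Dict[int, Vec2]      # other hmd_id -> offset from the reference HMD


@dataclass
class RelativePositionsB:
    """FIG. 3(b) style: the same data for every HMD, plus its own index."""
    adjacent_offsets: List[Vec2]  # offset from HMD k to HMD k+1 along the arc
    own_index: int                # which of the seats is this HMD's own

    def offset_to(self, other_index: int) -> Vec2:
        # Accumulate the adjacent offsets between one's own seat and the other seat.
        lo, hi = sorted((self.own_index, other_index))
        dx = sum(v[0] for v in self.adjacent_offsets[lo:hi])
        dy = sum(v[1] for v in self.adjacent_offsets[lo:hi])
        sign = 1.0 if other_index > self.own_index else -1.0
        return (sign * dx, sign * dy)


# Example: six seats on an arc, each roughly 0.6 m apart.
shared = RelativePositionsB(adjacent_offsets=[(0.6, 0.1)] * 5, own_index=0)
print(shared.offset_to(3))   # offset from the first seat to the fourth seat
```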
- It is not essential to synchronize the acquisition of the three-dimensional spatial data by the spatial data acquisition unit 11, the acquisition of the avatar data by the avatar data acquisition unit 12, and the acquisition of the relative position data by the relative position data acquisition unit 13. For example, the avatar data and the relative position data may be acquired from the external computer 200 in advance and stored in the avatar data storage unit 102 and the relative position data storage unit 103, and the spatial data acquisition unit 11 may then acquire the three-dimensional spatial data from the spatial data storage unit 101 when the virtual reality space image is displayed on the HMD 100.
- The direction detection unit 16 detects the direction in which the head of the viewer wearing the HMD 100 is facing. That is, the HMD 100 is equipped with a gyro sensor and an acceleration sensor, and the direction detection unit 16 can detect the movement of the viewer's head based on the detection signals from these sensors.
- The spatial data acquisition unit 11 changes the three-dimensional spatial data it acquires from the spatial data storage unit 101 so that the three-dimensional space rendered on the display of the HMD 100 changes dynamically according to the movement of the viewer's head detected by the direction detection unit 16.
- The spatial data reproduction unit 14A of the image generation unit 14 reproduces for display the three-dimensional spatial data acquired by the spatial data acquisition unit 11 in accordance with the movement of the viewer's head. As a result, when the viewer faces the front, a three-dimensional space image in which the front three-dimensional space spreads out is reproduced; when the viewer turns to the right, a three-dimensional space image in which the right three-dimensional space spreads out is reproduced; and when the viewer turns to the left, a three-dimensional space image in which the left three-dimensional space spreads out is reproduced.
- The avatar image generation unit 14B changes the avatar images so that the orientation of the avatar images rendered on the display of the HMD 100 changes dynamically according to the movement of the viewer's head detected by the direction detection unit 16. For example, when the direction detection unit 16 detects that the viewer has turned to a direction in which another viewer is seen straight ahead, the avatar image generation unit 14B generates an avatar image in a state in which that other viewer is seen from the front. When the direction detection unit 16 detects that the viewer has turned to a direction in which another viewer is seen obliquely, the avatar image generation unit 14B generates an avatar image in a state in which that other viewer is seen from an oblique direction.
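The head-direction handling above can be approximated with a small sketch. The functions, the yaw convention, and the fixed field of view below are illustrative assumptions; they only show how a detected head direction could pick the displayed slice of the all-sky space and how obliquely another viewer's avatar would be seen.

```python
# Illustrative sketch: selecting the visible slice of the all-sky space and the
# viewing angle of another viewer's avatar from the detected head direction.
import math
from typing import Tuple


def visible_range(head_yaw_deg: float, fov_deg: float = 90.0) -> Tuple[float, float]:
    # Horizontal slice of the all-sky 3D space reproduced for the current head
    # direction (yaw measured clockwise from "front").
    half = fov_deg / 2.0
    return ((head_yaw_deg - half) % 360.0, (head_yaw_deg + half) % 360.0)


def avatar_view_angle(viewer_pos: Tuple[float, float],
                      other_pos: Tuple[float, float],
                      head_yaw_deg: float) -> float:
    # Angle between where the viewer's head points and the direction of the
    # other viewer; 0 means the avatar is seen straight on, larger values
    # mean it is seen obliquely.
    dx = other_pos[0] - viewer_pos[0]
    dy = other_pos[1] - viewer_pos[1]
    bearing = math.degrees(math.atan2(dx, dy)) % 360.0
    diff = (bearing - head_yaw_deg + 180.0) % 360.0 - 180.0
    return abs(diff)


# Example: the viewer turns 90 degrees to the left and looks at a neighbour
# sitting 0.6 m to the left.
print(visible_range(-90.0))
print(avatar_view_angle((0.0, 0.0), (-0.6, 0.0), -90.0))
```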
- FIG. 4 is a diagram showing the relationship between the three-dimensional space expressed by the three-dimensional spatial data, the relative positional relationship between the avatars (viewers) expressed by the relative position data, the orientation of the viewer's head detected by the direction detection unit 16, and the range of the virtual reality space image generated by the image generation unit 14.
- FIGS. 4(a) and 4(b) show an example in which the viewer 111-1 (the viewer wearing the HMD 100-1) located at the left end of the arc is used as the reference.
- FIG. 4(c) shows an example in which the viewer 111-3 (the viewer wearing the HMD 100-3) located at the center of the arc is used as the reference.
- FIG. 5 is a diagram showing examples of virtual reality space images generated by the image generation unit 14 corresponding to the states of FIGS. 4(a) to 4(c). That is, FIGS. 5(a) and 5(b) show the virtual reality space images displayed on the HMD 100-1 worn by the viewer 111-1 located at the left end of the arc, and FIG. 5(c) shows the virtual reality space image displayed on the HMD 100-3 worn by the viewer 111-3 located at the center of the arc. In FIG. 5, for convenience of explanation, only the avatar images are shown and the three-dimensional space image is omitted.
- The image generation unit 14 of the HMD 100-1, worn by the viewer 111-1 located at the left end of the arc, grasps that the viewer 111-1 exists at the center position of the three-dimensional space 41 represented by the three-dimensional spatial data.
- It also grasps that the other viewers 111-2 to 111-6 (the viewers wearing the HMDs 100-2 to 100-6, respectively) each exist at positions reflecting the actual relative positional relationship represented by the relative position data.
- In the state of FIG. 4(a), the image generation unit 14 sets the position of the viewer 111-1 as the center position at the lower end of the screen and generates a virtual reality space image so that the three-dimensional space in the direction indicated by the range 42 spreads out. Therefore, as shown in FIG. 5(a), no avatar images of the other viewers 111-2 to 111-6 exist in the virtual reality space image generated by the image generation unit 14.
- Assume next that the direction detection unit 16 detects that the viewer 111-1 is facing the direction of arrow B (the direction 90 degrees to the left of arrow A).
- In this case, the image generation unit 14 sets the position of the viewer 111-1 as the center position at the lower end of the screen and generates a virtual reality space image so that the three-dimensional space in the direction indicated by the range 43 spreads out. Therefore, as shown in FIG. 5(b), avatar images of the other viewers 111-2 to 111-6 exist in the virtual reality space image generated by the image generation unit 14.
- At this time, the avatar images of the viewers 111-2 to 111-6 are displayed at positions reflecting the actual relative positional relationship, with the center position at the lower end of the screen where the viewer 111-1 exists as the reference, and each in a size reflecting the perspective.
- In the case of FIG. 5(c), the virtual reality space image generated by the image generation unit 14 includes avatar images of the other viewers 111-1 to 111-2 and 111-5 to 111-6.
- The avatar images of the viewers 111-1 to 111-2 and 111-5 to 111-6 are displayed at positions reflecting the actual relative positional relationship, with the center position at the lower end of the screen where the viewer 111-3 exists as the reference, and each in a size reflecting the perspective.
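A minimal sketch of this placement logic, under assumed names and a crude projection, is given below: the wearer is pinned to a reference point at the bottom centre of the screen, the other avatars are positioned from the relative position data, and their sizes shrink with distance to suggest perspective. Nothing here is the patent's actual rendering code.

```python
# Illustrative sketch: place the other viewers' avatar images relative to a
# fixed reference point at the bottom centre of the screen, with sizes scaled
# down for farther avatars to suggest perspective.
import math
from typing import Dict, List, Tuple

Vec2 = Tuple[float, float]


def place_avatars(offsets: Dict[int, Vec2],
                  screen_w: int = 1920,
                  screen_h: int = 1080,
                  pixels_per_metre: float = 400.0,
                  base_size: float = 300.0) -> List[Tuple[int, int, int, float]]:
    """Return (viewer_id, x, y, size) with the wearer at the bottom centre."""
    ref_x, ref_y = screen_w / 2, screen_h  # the predetermined reference position
    placed = []
    for viewer_id, (dx, dy) in offsets.items():
        dist = math.hypot(dx, dy)
        x = int(ref_x + dx * pixels_per_metre)
        y = int(ref_y - dy * pixels_per_metre)
        size = base_size / (1.0 + dist)     # perspective: farther -> smaller
        placed.append((viewer_id, x, y, size))
    return placed


# Example: two other viewers, one 0.6 m to the right, one farther ahead-right.
for entry in place_avatars({2: (0.6, 0.1), 4: (1.2, 1.5)}):
    print(entry)
```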
- As described above, in the present embodiment, based on the spatial data related to the virtual reality space, the avatar data related to the avatars that are substitutes for the viewers other than oneself among the plurality of viewers existing in the real space, and the relative position data representing the actual relative positional relationship of the plurality of viewers, a virtual reality space image is displayed in which the viewer's own position is set to a predetermined position in the screen and avatar images exist at positions reflecting, with reference to that predetermined position, the actual relative positional relationship of the plurality of viewers.
- the virtual reality image providing apparatus may be mounted on the external computer 200.
- the HMD 100 includes a display control unit 15 and a direction detection unit 16.
- In this case, the external computer 200 includes the spatial data storage unit 101, the avatar data storage unit 102, the relative position data storage unit 103, the spatial data acquisition unit 11, the avatar data acquisition unit 12, the relative position data acquisition unit 13, and the image generation unit 14.
- In the above embodiment, an example has been described in which the avatar data acquisition unit 12 acquires the avatar data of the viewers other than oneself among the plurality of viewers existing in the real space, but the present invention is not limited to this.
- The avatar data acquisition unit 12 may acquire the avatar data of all viewers existing in the real space, including oneself, and the image generation unit 14 may generate a virtual reality space image that also includes the viewer's own avatar image.
- The avatar images may also be generated so as to reflect the direction in which the actual viewers' heads are facing.
- For example, when the direction detection unit 16 detects that the viewer 111-6 at the right end of the arc is facing to the right, where the viewer 111-1 is, the avatar image corresponding to the viewer 111-6 in the virtual reality space image displayed on the HMD 100-1 of the viewer 111-1 is generated as an image facing the viewer 111-1.
- each HMD 100 notifies the external computer 200 of the direction of the viewer's head detected by the direction detection unit 16 via the communication network 300.
- The external computer 200 then notifies, via the communication network 300, the HMDs 100 other than the notification source of the head direction of the viewer notified from each HMD 100.
- the image generation unit 14 generates an avatar image of the other viewer in consideration of the head direction of the other viewer notified from the external computer 200.
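The notification mechanism described above can be sketched as follows. The class names and the simple in-process `notify` call are assumptions standing in for communication over the network 300; the sketch only shows the external computer forwarding one HMD's reported head direction to all of the others.

```python
# Illustrative sketch: each HMD reports its wearer's head direction, and the
# external computer forwards it to every other HMD so they can turn that
# wearer's avatar image accordingly.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class Hmd:
    hmd_id: int
    other_head_yaw: Dict[int, float] = field(default_factory=dict)

    def on_head_direction(self, source_id: int, yaw_deg: float) -> None:
        # Store the reported direction; an avatar image generator would use it
        # to rotate the avatar of viewer `source_id`.
        self.other_head_yaw[source_id] = yaw_deg


@dataclass
class ExternalComputer:
    hmds: List[Hmd] = field(default_factory=list)

    def notify(self, source_id: int, yaw_deg: float) -> None:
        # Forward the head direction to every HMD other than the source.
        for hmd in self.hmds:
            if hmd.hmd_id != source_id:
                hmd.on_head_direction(source_id, yaw_deg)


# Example: HMD 6 reports that its wearer turned 45 degrees; HMDs 1-5 learn it.
hmds = [Hmd(i) for i in range(1, 7)]
server = ExternalComputer(hmds)
server.notify(source_id=6, yaw_deg=45.0)
print(hmds[0].other_head_yaw)   # {6: 45.0}
```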
- FIG. 6 is a view showing a modification of the VR image display system to which the virtual reality image providing apparatus according to the present embodiment is applied.
- FIG. 6 shows a configuration related to audio output in addition to the display of the virtual reality space image.
- the VR image display system shown in FIG. 6 includes an external speaker 400 shared by a plurality of viewers and a speaker mounted on the HMD 100 (hereinafter referred to as a mounted speaker).
- The mounted speaker included in the HMD 100 may be a headphone-type speaker configured to be positioned near both ears when a viewer wears the HMD 100, or it may be a small speaker configured to be positioned somewhere other than near both ears.
- the HMD 100 may include a headset having a microphone in addition to a headphone type speaker.
- the external computer 200 reproduces sound in synchronization with the display of the virtual reality space image and outputs it from the external speaker 400.
- the individual HMDs 100 worn by a plurality of viewers also reproduce sound in synchronization with the display of the virtual reality space image and output from the mounted speaker.
- the main sound is output from the external speaker 400 and the auxiliary sound is output from the speaker mounted on the HMD 100.
- a high volume main sound is output from the external speaker 400 while a low volume sub sound is output from the speaker mounted on the HMD 100.
- the audio data related to the sub-audio output from the speaker mounted on the HMD 100 may be stored in advance in the spatial data storage unit 101 of the HMD 100, or the HMD 100 may acquire it from the external computer 200 during reproduction.
- The audio data related to the auxiliary audio output from the speaker mounted on the HMD 100 may be data obtained by transmitting speech input from the microphone of one HMD 100 to the other HMDs 100 via the external computer 200. In this way, a viewer can hear the main sound output from the external speaker 400 and can also hear the speech of other viewers as auxiliary sound from the speaker mounted on the HMD 100.
- In this case, the speaker voice may be output from either the left speaker or the right speaker of the mounted speaker, depending on whether the HMD 100 that is the transmission source of the voice data is positioned relatively to the left side or to the right side as viewed from the other HMD 100 that outputs the voice.
- For example, consider the case where the speaker voice of the viewer 111-1 is output, based on the received voice data, from the speaker mounted on the HMD 100-3 worn by the viewer 111-3 at the center of the arc shown in FIG. 4, the voice data having been transmitted from the HMD 100-1 worn by the viewer 111-1 located at the left end of the arc. Since the HMD 100-1 of the viewer 111-1 is positioned on the right side as viewed from the central HMD 100-3, the speaker voice is output only from the right speaker of the HMD 100-3.
- Furthermore, the volume of the speaker voice to be output may be adjusted according to the relative distance between the HMD 100 that is the transmission source of the voice data and the HMD 100 that is the transmission destination. In this way, the speaker voice can be heard from the direction in which the speaker actually exists and at a volume corresponding to the actual relative distance, so the sense of reality can be increased.
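As an illustration of this audio behaviour, the sketch below hard-pans the relayed speaker voice to the left or right channel depending on which side the speaking viewer sits relative to the listener's head direction, and attenuates it with distance. The coordinate convention and the attenuation law are assumptions chosen for the example, not values from the patent.

```python
# Illustrative sketch: choose the output channel and volume of a relayed
# speaker voice from the relative position of the speaking viewer.
import math
from typing import Tuple


def pan_and_gain(listener_pos: Tuple[float, float],
                 speaker_pos: Tuple[float, float],
                 listener_yaw_deg: float) -> Tuple[float, float]:
    """Return (left_gain, right_gain) for the relayed speaker voice."""
    dx = speaker_pos[0] - listener_pos[0]
    dy = speaker_pos[1] - listener_pos[1]
    # Rotate the offset into the listener's head frame to decide left/right.
    yaw = math.radians(listener_yaw_deg)
    lateral = dx * math.cos(yaw) - dy * math.sin(yaw)
    distance = math.hypot(dx, dy)
    gain = 1.0 / (1.0 + distance)            # quieter when the speaker is farther
    if lateral > 0:                          # speaker is to the listener's right
        return (0.0, gain)
    if lateral < 0:                          # speaker is to the listener's left
        return (gain, 0.0)
    return (gain / 2.0, gain / 2.0)          # straight ahead: both channels


# Example: a speaker 1.2 m to the listener's right produces sound only in the
# right channel, at reduced volume.
print(pan_and_gain((0.0, 0.0), (1.2, 0.0), listener_yaw_deg=0.0))
```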
Landscapes
- Engineering & Computer Science (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- General Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Processing Or Creating Images (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The object of the present invention is to provide more realistic virtual reality space images. This is achieved as follows. A virtual reality space image is displayed on the basis of three-dimensional spatial data relating to a virtual reality space, avatar data relating to avatars that serve as digital alter egos for a plurality of participants existing in the real world, and relative position data representing the real-world positions of the plurality of participants relative to one another. A given participant's own position is set as a prescribed viewing position. By displaying, with the prescribed viewing position as a reference, a virtual reality space image in which avatar images are present at positions reflecting the real-world relative positions of the plurality of participants, the virtual reality space image displayed on each head-mounted display is such that each participant perceives the avatar images of the other participants as occupying the same relative positions as those participants occupy in the real world.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2018563937A JP6506486B2 (ja) | 2017-04-28 | 2018-04-11 | 仮想現実画像提供装置および仮想現実画像提供用プログラム |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017-090112 | 2017-04-28 | ||
JP2017090112 | 2017-04-28 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018198777A1 true WO2018198777A1 (fr) | 2018-11-01 |
Family
ID=63918228
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2018/015260 WO2018198777A1 (fr) | 2017-04-28 | 2018-04-11 | Dispositif et programme de fourniture d'image de réalité virtuelle |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP6506486B2 (fr) |
WO (1) | WO2018198777A1 (fr) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2022065363A (ja) * | 2020-10-15 | 2022-04-27 | スペースラボ株式会社 | 仮想空間におけるコミュニケーションシステム用サーバ装置及び仮想空間におけるコミュニケーションシステム用クライアント装置 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1055257A (ja) * | 1996-08-09 | 1998-02-24 | Nippon Telegr & Teleph Corp <Ntt> | 3次元仮想空間表示方法 |
JPH11252523A (ja) * | 1998-03-05 | 1999-09-17 | Nippon Telegr & Teleph Corp <Ntt> | 仮想空間画像の生成装置および仮想空間システム |
JP2006025281A (ja) * | 2004-07-09 | 2006-01-26 | Hitachi Ltd | 情報源選択システム、および方法 |
JP2017062720A (ja) * | 2015-09-25 | 2017-03-30 | キヤノンマーケティングジャパン株式会社 | 情報処理装置、情報処理システム、その制御方法及びプログラム |
-
2018
- 2018-04-11 JP JP2018563937A patent/JP6506486B2/ja active Active
- 2018-04-11 WO PCT/JP2018/015260 patent/WO2018198777A1/fr active Application Filing
Also Published As
Publication number | Publication date |
---|---|
JP6506486B2 (ja) | 2019-04-24 |
JPWO2018198777A1 (ja) | 2019-06-27 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2018563937 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18789874 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 18789874 Country of ref document: EP Kind code of ref document: A1 |