US11995789B2 - System and method of creating, hosting, and accessing virtual reality projects
- Publication number
- US11995789B2 (application US 18/210,239)
- Authority
- US
- United States
- Prior art keywords
- media files
- project
- text file
- file
- objects
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/10—Office automation; Time management
- G06Q10/103—Workflow collaboration or project management
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/02—Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
- G11B27/031—Electronic editing of digitised analogue information signals, e.g. audio or video signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/234—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
- H04N21/2343—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
- H04N21/23439—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements for generating different versions
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/25—Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
- H04N21/266—Channel or content management, e.g. generation and management of keys and entitlement messages in a conditional access system, merging a VOD unicast channel into a multicast channel
- H04N21/2662—Controlling the complexity of the video stream, e.g. by scaling the resolution or bitrate of the video stream based on the client capabilities
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
- H04N21/65—Transmission of management data between client and server
- H04N21/658—Transmission by the client directed to the server
- H04N21/6582—Data stored in the client, e.g. viewing habits, hardware capabilities, credit card number
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/85—Assembly of content; Generation of multimedia applications
- H04N21/854—Content authoring
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/024—Multi-user, collaborative environment
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/20—Indexing scheme for editing of 3D models
- G06T2219/2004—Aligning objects, relative positioning of parts
Definitions
- the present invention is directed to Virtual Reality (VR), and more particularly to the distributed management of VR projects.
- VR data is generally developed to be played on supporting VR devices, i.e., VR devices whose specifications are compatible with and sufficient to process the VR data.
- the VR data may be made available in different formats/versions compatible with different VR devices. In such cases, the VR devices do not possess the capability to identify the optimal VR data from among the several formats/versions of the VR data.
- conventionally, programmers are required to write extensive program code defining the instructions for the processing of media files. Based on these instructions, the media files are processed to generate VR data. Development of such program code requires vast knowledge of 3D programming and of the processing of media files.
- the present disclosure provides a technical feature in which a system and a method of creating, hosting, and accessing a VR project is described.
- Media files may be imported into a software application running on a user device, by a VR project creator.
- the VR project creator may also provide creative inputs such as interaction between the media files. Details of interaction between the media files may be provided to the software application.
- the software application may process the media files based on the details of the interaction, for creation of a VR project.
- a user device may place a request to a server hosting multiple versions of media files related to the VR project, for accessing the VR project.
- the server may identify an optimal version of media files of the corresponding VR project from the multiple versions of media files, based on the user device's configuration, such as its hardware capabilities (processing, display, storage, and communication capability) and firmware type and version.
- the user device may play the optimal version of the media files of the VR project.
- a system for creation and management of VR projects may comprise a memory and a processor operatively coupled to the memory.
- the memory may be configured to store programmed instructions.
- the processor may be configured to execute the programmed instructions to allow a VR project creator to select a plurality of media files and define interactions between the plurality of media files.
- the processor may be further configured to create a VR project including a text file and the plurality of media files.
- the text file includes references to storage locations of the plurality of media files and details of the interactions between the plurality of media files.
- the processor may be further configured to manage storage of the text file and provide access of the text file to a user device in response to a user request.
- the text file is a VRXF file.
- the plurality of media files is stored in the memory or is available over the Internet.
- the processor executes one or more applications for editing the plurality of media files.
- the processor executes generative data models for generating a VRXF file and one or more media files to be used for creating the VR project.
- the text file is executed by a VR playing component running on the user device.
- the storage of the text file is performed, or the access of the text file is provided based on rules set by a third party.
- the processor processes the plurality of media files to generate multiple versions of the plurality of media files suitable for different configurations of different user devices, wherein the processing includes modifying resolution, modifying orientation, and compression of the plurality of media files.
- the processor provides access of an optimal version of the plurality of media files to the user device, and wherein the processor determines the optimal version of the plurality of media files based on configuration details of the user device, received through the user request.
- the configuration details include software type and version, hardware capability, and Degrees of Freedom (DoF) available for interaction.
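The configuration details above can be modeled as a simple record on the server side. The following is a minimal sketch; the class and field names are illustrative assumptions, not the patent's implementation:

```python
from dataclasses import dataclass

# Hypothetical record of the configuration details a user request might carry.
@dataclass
class DeviceConfig:
    software_type: str      # e.g. an OS or browser name
    software_version: str
    max_resolution: int     # vertical lines the display can render, e.g. 2160
    dof: int                # Degrees of Freedom available for interaction: 3 or 6

    def supports_full_motion(self) -> bool:
        # A 6-DoF device tracks rotation and translation; 3-DoF tracks rotation only.
        return self.dof >= 6

headset = DeviceConfig("Android", "12", 2160, 6)
print(headset.supports_full_motion())  # True
```

This is why a higher DoF yields a better immersive experience: a 6-DoF device tracks both the rotation and the translation of the user, while a 3-DoF device tracks rotation only.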
- the creation of the VR project is performed by a first processing device and the storage and access of the text file is managed by a second processing device.
- a method of creation and management of VR projects comprises allowing a VR project creator to select a plurality of media files and define interactions between the plurality of media files.
- the method further comprises creating a VR project including a text file and the plurality of media files.
- the text file includes references to storage locations of the plurality of media files and details of the interactions between the plurality of media files.
- the method further comprises managing storage of the text file and providing access of the text file to a user device in response to a user request.
- FIG. 1 illustrates an example environment for distributed management of Virtual Reality (VR) projects, in accordance with one embodiment of present disclosure
- FIG. 2 illustrates an example User Interface of a software application used for generation of a VR project, in accordance with one embodiment of present disclosure
- FIG. 3 illustrates a first network implementation for creation and management of VR projects, in accordance with an embodiment of present invention
- FIG. 4 illustrates a second network implementation for creation and management of VR projects, in accordance with an embodiment of present invention
- FIG. 5 illustrates a third network implementation for creation and management of VR projects, in accordance with an embodiment of present invention
- FIG. 6 illustrates a flow chart of a method for generating a VR project, in accordance with one embodiment of present disclosure.
- FIG. 7 illustrates a flow chart of a method of identifying and executing an optimal version of media files of the VR projects, in accordance with one embodiment of present disclosure.
- FIG. 1 illustrates an example environment for distributed management of VR projects.
- a system 102 operable by a VR project creator 104 may be connected with a server 106 .
- the system 102 may be connected with the server 106 through a communication network 108 .
- the VR project creator 104 may access a software application running on the system 102 or the server 106 .
- the VR project creator 104 may access the software application by operating one of several input means 110 including a touch-controlled device, a mouse, or a handheld controller.
- the software application may receive one or more media files or one or more media assets from the VR project creator 104 .
- the one or more media files may be present in a local storage 112 or a remote storage 114 .
- the local storage 112 may be a memory of the system 102 or a memory of a data storage/processing device connected with the system 102 through a wired or wireless connection.
- the remote storage 114 may correspond to a web location present over the Internet or over a private network, or to the memory 120 of the server 106.
- while the one or more media files may include 360° and 2D image files, it may be possible to utilize other forms of media files, such as 360° and 2D video files, audio files, 3D model files, and livestream media files, in different implementations.
- the VR project creator 104 may define interaction between the one or more media files, for example sequence, position, orientation, distance, and behavior of the one or more media files.
- the VR project creator 104 may define the interactions between the one or more media files by operating one of the several input means 110 .
- the VR project creator 104 may define the interactions by performing simple actions, such as dragging and dropping the one or more media files, and zooming in and out of a scene of a media file through pinching action.
- the VR project creator 104 may establish a connection between one or more scenes. Further, the VR project creator 104 may add an image or an audio file to the scene. The VR project creator 104 may add actions such as show or hide objects, activate or deactivate objects, start, pause or stop animation, loop or unloop video sound, start or pause video sound, open Uniform Resource Locator (URL), and send webhook.
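The scene connections and actions described above can be pictured with a small in-memory model. This is a purely illustrative sketch; the class names, action verbs, and file paths are assumptions rather than the patent's actual data structures:

```python
# Illustrative model of scenes, their connections, and attached actions.
class Scene:
    def __init__(self, name, media_ref):
        self.name = name
        self.media_ref = media_ref   # reference to a 360° image/video, not its content
        self.links = []              # connections to other scenes
        self.actions = []            # e.g. show/hide objects, open a URL, send a webhook

    def connect(self, other):
        # Establish a navigable connection between two scenes.
        self.links.append(other.name)

    def add_action(self, verb, target):
        self.actions.append((verb, target))

lobby = Scene("lobby", "media/lobby_360.jpg")   # hypothetical media path
hall = Scene("hall", "media/hall_360.jpg")
lobby.connect(hall)                              # connection between two scenes
lobby.add_action("show", "info_panel")           # show/hide objects
lobby.add_action("open_url", "https://example.com")
print(lobby.links)  # ['hall']
```

A creator performing drag-and-drop operations in the software application would, in effect, be building a structure of this kind.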
- the server 106 may include a processor 118 and the memory 120 .
- the memory 120 may be configured to store programmed instructions.
- the processor 118 may be configured to execute the programmed instructions to implement different functions described henceforth with reference to FIG. 3 through FIG. 5 .
- nodes 204, 206, and 208 indicate three media files, for example 360° images.
- connecting lines 210, 212, and 214 joining the three media files indicate the relation between the three media files.
- media files could be added or deleted, and the relations between the media files could be defined and changed by simple user actions, including drag and drop operations. It must be understood that three media files and linear relations between them have been illustrated for ease of explanation; any number of media files could be added, and any form of relation could be defined between them, as per the requirement.
- the software application running on the system 102 may generate a VR project.
- the VR project may include the one or more media files and a VRXF file including details of the interactions between the one or more media files.
- the VRXF file may be a text file and may be hosted over the server 106 .
- the VRXF file may be accessed by programmers using processing devices, such as desktops, laptops, and tablets.
- the VRXF file may be inspected in detail to make any changes by accessing a Command User Interface (CUI).
- a snippet of the VRXF file defining the position, rotation, and scale of a scene of a media file is provided below.
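As a hypothetical illustration (the actual VRXF syntax is not reproduced here, and the key names below are assumptions), the position, rotation, and scale of a scene might be encoded in a text format such as:

```
{
  "scene": "scene_1",
  "transform": {
    "position": { "x": 0.0, "y": 1.6, "z": -2.0 },
    "rotation": { "x": 0, "y": 90, "z": 0 },
    "scale":    { "x": 1, "y": 1, "z": 1 }
  }
}
```

Because such a file carries only references and transform data rather than media content, it remains small and human-editable.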
- the VRXF file may include all details of a VR project.
- a group of user devices 116 may connect to the server 106 , through the communication network 108 , for accessing an optimal version of media files of the VR project.
- the group of user devices 116 may include applications and devices capable of playing VR projects, such as desktops 116-1, smartphones or tablets 116-2, VR headsets 116-3, and browsers 116-4.
- the VR project may be optimal for some user devices but not all, because user devices manufactured by the same or different vendors may have different configurations.
- Configuration of each user device of the group of user devices 116 may include hardware capabilities, firmware/software type and version, and Degrees of Freedom (DoF) available for interaction.
- the hardware capabilities may correspond to one or more of processing power, display type and resolution, data compression-decompression capability, storage volume, and communication capability.
- an increase in the DoF available for interaction increases the quality of the immersive experience obtained by a user. For example, a user device supporting 6-DoF would provide a better VR experience than a user device supporting 3-DoF.
- the configurations may need to be considered for the generation of different versions of media files of the VR project, optimal for each user device.
- the configurations of the group of user devices 116 may be available with the server 106 .
- the server 106 may create several versions of media files of the VR project by processing the media files based on the different configurations/specifications of the group of user devices 116 . Processing of the media files based on the configuration of each user device may include modifying resolutions of the media files, modifying orientations of the media files, and compression of the media files.
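The version-generation step described above can be sketched as follows; the resolution ladder and function name are illustrative assumptions rather than the patent's implementation:

```python
# Vertical resolutions of the renditions the server might prepare:
# 720p (HD), 1080p (Full HD), 2160p (4K), 4320p (8K).
TARGETS = [720, 1080, 2160, 4320]

def make_versions(source_height):
    """Return the renditions to generate from a source media file,
    never upscaling beyond the source resolution."""
    return [h for h in TARGETS if h <= source_height]

print(make_versions(2160))  # [720, 1080, 2160]
```

Each rendition produced this way would be stored alongside the VRXF file in the VR project package, ready to be matched to a requesting device's configuration.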
- Different versions of media files of the VR project described to be generated above may be stored alongside a VRXF file in a VR project package.
- the VRXF file may be a text file including details of the interaction/relation between the media files.
- the VR project package may be hosted over the server 106 .
- the package of a VR project may be accessed using the group of user devices 116 .
- the server 106 may determine if an access request of the user device fulfills the privacy requirements of a publisher of the VR project package.
- the server 106 checks privacy settings associated with the VR project. If the VR project should no longer be accessible, the server 106 prevents its display. This ensures that a publisher of the VR project maintains control over the accessibility of his project and manages privacy requirements effectively.
- the server 106 may identify the optimal version of media file of the VR project for the user device. Post identification, the server 106 may transmit the optimal version of the media files of the VR project to the user device, through the communication network 108 .
- the VR project package may include a media files version 1 having a 720p (HD) resolution, a media files version 2 having a Full HD (1080p) resolution, a media files version 3 having a 4K (Ultra HD) resolution, and a media files version 4 having an 8K (Full Ultra HD) resolution.
- a user device 116-1 may be compatible to play 720p media files, a user device 116-2 may be compatible to play a Full HD media file, a user device 116-3 may be compatible to play a 4K media file, and a user device 116-4 may be compatible to play an 8K media file.
- when the user device 116-2 tries to access an optimal version of the media file of the VR project, the user device 116-2 may be provided access to the media files version 2 having the Full HD (1080p) resolution.
- when the user device 116-4 tries to access an optimal media file, the user device 116-4 may be provided access to the media files version 4 having the 8K resolution.
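The selection logic in this four-version example can be sketched as follows; the mapping and function name are illustrative assumptions:

```python
# Version number -> vertical resolution of that rendition
# (720p, Full HD, 4K, 8K, as in the example above).
VERSIONS = {1: 720, 2: 1080, 3: 2160, 4: 4320}

def optimal_version(device_max_height):
    """Pick the highest-resolution version the requesting device can play."""
    playable = [v for v, h in VERSIONS.items() if h <= device_max_height]
    if not playable:
        raise ValueError("no compatible version for this device")
    return max(playable, key=VERSIONS.get)

print(optimal_version(1080))  # 2 (the Full HD rendition, as for device 116-2)
print(optimal_version(4320))  # 4 (the 8K rendition, as for device 116-4)
```

In a real deployment the decision would also weigh the other configuration details the request carries, such as software version and DoF, rather than resolution alone.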
- FIG. 3 illustrates a first network implementation for creation and management of VR projects, in accordance with an embodiment of present invention.
- the first network implementation includes a first processing device 102 , a second processing device 106 , and a user device 116 .
- the first processing device 102 refers to a device accessed by a VR project creator
- the second processing device 106 refers to a cloud based data processing device such as the server 106
- the user device 116 refers to a device using which a user i.e. a viewer gains a VR experience.
- the first processing device 102 may execute a VR project creation component 302 .
- the VR project creation component 302 may be a software application installed on the first processing device 102 for creation of VR projects.
- the VR project creation component 302 may be a software application hosted over a network cloud, such as the second processing device 106 .
- the VR project creator may access the VR project creation component 302 to select media files stored in the memory 112 of the first processing device 102 .
- the VR project creator may further define interactions between the media files for generation of a VR project. By processing the media files based on the interactions defined between them, the first processing device 102 may create the VR project.
- the VR project denotes the media files and a VRXF file including references, i.e., network locations, of the media files along with the interactions specified between the media files. By including references to the media files instead of their content, the size of the VRXF file is reduced significantly, and it can be transmitted quickly.
- the VRXF file may be transferred to the second processing device 106 .
- the second processing device 106 may include a project management component 306 , a content storage and processing component 308 , and a content distribution component 310 .
- the media files may be shared along with the VRXF file to the content storage and processing component 308 .
- the project management component 306 may manage receipt of the VRXF file by the content storage and processing component 308 and distribution of the VRXF file by the content distribution component 310 .
- the content storage and processing component 308 may process the VRXF file or the media file associated with the VRXF file to perform one or more operations, such as upscaling or downscaling of the media files.
- the content storage and processing component 308 may process the media files to generate multiple versions of the media files which may be suitable for different configurations of different user devices. For example, the content storage and processing component 308 may produce different versions of the media files having different resolutions. The different versions of the media files may be generated before or after a user request to access the VRXF file is received by the second processing device 106 .
- the content distribution component 310 may provide the VRXF file to a VR playing component 312 of the user device 116 .
- the VR playing component 312 may be a software package having configuration information required to understand and execute the VRXF file. Such configuration information may be similar to the configuration information of the VR project creation component used for creation of the VRXF file.
- the VR playing component 312 may be developed for use by all types of the user device 116 or may be developed specifically for use on a particular type or brand of the user device 116 , such as VR headset developed by HTC®.
- the VR playing component 312 may also be configured to collect analytical playback data related to the VR projects, for gaining access and playback information to further improve creation and distribution of the VR projects. Further, security policies adhering to corporate, institutional, and/or government agency requirements could also be predefined within the VR playing component 312.
- the VR playing component 312 may encrypt media files downloaded on the user device 116 to ensure that the media files are not plagiarized. Further, the VR playing component 312 may remotely remove unpublished VR projects from all user devices.
- the content distribution component 310 may provide access of an optimal version of media files to the user device 116 .
- the optimal version of media files may be determined based on the configuration details of the user device 116 .
- 4K media files may be provided to the VR headset 116 - 3 along with the VRXF file.
- the VR playing component 312 may execute the VRXF file to provide a VR experience to the user.
- the configuration information required to understand and execute the VRXF file may be provided within firmware, for example as a part of an Operating System (OS) or the browser 116-4.
- the user device 116 running such an OS and/or the browser 116-4, such as the desktop 116-1, may be able to execute the VRXF file without needing to separately download and install the VR playing component 312.
- the above-described first network implementation allows usage of media files stored locally, i.e., within the first processing device 102 itself, for creation of VR projects. In this manner, more control is available over the media files used in the VR projects. Further, the first network implementation provides a turnkey solution, i.e., the VR project creation facility can be easily integrated with the existing system(s) of an organization.
- FIG. 4 illustrates a second network implementation for creation and management of VR projects, in accordance with an embodiment of present invention.
- the media files used for creation of the VRXF file are present over the internet 402, instead of in the memory 112 of the first processing device 102.
- References such as URLs of the media files may be used by the VR project creation component 302 for creation of the VRXF file. In this manner, any content publicly available over the internet 402 could be referenced for creation of the VRXF file.
- the VRXF file may be transferred to the user device 116 through the second processing device 106 .
- the VR playing component 312 running on the user device 116 may execute the VRXF file to provide the VR experience to the user.
- an author of the media files manages access to content through enforcement of desired security policies.
- the author corresponds to a person owning the media files, for example a copyright holder of the media files.
- the media files can be distributed on a network satisfying requirements of the author, e.g., a networked node nearest to intended users/target consumers. Also, access to content can be granted or restricted based on geolocation and security and/or government policies.
- FIG. 5 illustrates a third network implementation for creation and management of VR projects, in accordance with an embodiment of present invention.
- Some of the media files used for creation of the VRXF file may be present over the internet 402 and some of the media files may be stored in the memory 112 of the first processing device 102 .
- the VR project creation component 302 may utilize references of the media files required for generation of the VR project.
- the first processing device 102 may also include a text editing component 502 .
- the text editing component 502 may allow for manual creation or editing of a VRXF file according to the VRXF specification and reference the associated media files.
- the first processing device 102 may also include a third party component 504 having plugins for receiving VR projects created on other processing devices/systems and/or itself creating VR projects.
- the first processing device 102 may also include generative AI models 506 trained to produce a VRXF file and media files or any other content required for generation of the VR projects.
- a VRXF file generated or edited using the VR project creation component 302, the text editing component 502, the third party component 504, or the generative AI models 506 may be transferred to the project management component 306 through Application Programming Interfaces (APIs).
- the author of the media files may be able to manage access to content through enforcement of desired security policies.
- the media files can be distributed on a network satisfying requirements of the author, e.g., a networked node nearest to intended users/target consumers. Access to content can be granted or restricted based on geolocation and security and/or government policies.
- the VR projects could be created by any person on their own system and shared with the third party component 504. Further, data available for creation of the VR projects does not remain limited to existing media files; instead, required media files could be created using the generative AI models.
- the first processing device 102 and the second processing device 106 have been shown separately and their functionalities have been described separately to clearly explain, in a sequential manner, different steps that occur from creation of VR projects till distribution of VR projects.
- the functionalities of the first processing device 102 and the second processing device 106 could be configured on a single processing device, such as the second processing device 106 itself.
- the project management component 306 , the content storage and processing component 308 , and the content distribution component 310 have been shown to be configured within a single processing device i.e. the second processing device 106 . In different implementations, such components may be configured on different systems and may be managed by a single party or different parties.
- storage, management, and distribution of VR projects could be decentralized and tailored to the requirements of an author of the media files, a creator of the VR projects, and a user of the VR projects.
- a Data Rights Management (DRM) service could be integrated within any of the project management component 306, the content storage and processing component 308, or the content distribution component 310. Integration of the DRM service would allow an author to control access to the media files and a creator to control access to the VR projects.
- the first processing device 102 may allow optimization of the media files, modification of hue/saturation of the media files, scaling of image content and quality, application of filters, and greenscreen keying for creation of VR projects. Further, the first processing device 102 may offer Artificial Intelligence (AI) assisted optimization of complex source 3D models to be consumable in an immersive environment. For example, complex Computer Aided Design (CAD) models include several data points which are neither relevant for immersive viewing nor appropriate for fast data transfer across networked devices. Such CAD models could be optimized for creation of VR projects. The first processing device 102 may also allow trimming and audio level mixing of audio and video content to only include the parts needed for the VR projects.
- the first processing device 102 may further allow patching of 360° image and video content to extract the required visual parts. Such processing allows better layering of content in complex 360° environments and reduces the overall size of the 360° image and video content for transfer and playback of 360° content.
- the first processing device 102 may offer audio-assisted authoring for execution of actions based on audio commands. Audio-assisted authoring can be much quicker than using the input means 110 , and it makes discovery of authoring features more natural to the inquisitive nature of humans.
- the generative AI models 506 may be used for generation of project templates from provided prompts and media files. AI may also be used for generation of navigation prompts throughout a VR project based on the content of a 360° video or 360° image. For example, a door detected in one image may lead to another 360° image having a similar door, but from the other side. AI would generate an interaction between the two data points, i.e., the doors, to allow a user to navigate between them.
- the VR project creator 104 may allow preview of a VR project based on configurations of the group of user devices 116 . In this manner, the VR project creator 104 may show how someone using a specific user device would experience the VR project.
- Referring to FIG. 6 , a flow chart of a method of creating a VR project is explained, in accordance with one exemplary embodiment of the present disclosure.
- the order in which the flow diagram for creating the VR project is described should not be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the flow diagram or alternate methods. Additionally, individual blocks may be deleted from the flow diagram without departing from the spirit and scope of the subject matter described herein.
- the method can be implemented in any suitable hardware, software, firmware, or combination thereof.
- media files may be received.
- the media files may be received by a system operated by a user. Specifically, the media files may be received on a software application running on the system.
- the media files may be provided from a local storage or a remote storage. Further, the media files may be any of a 360° or 2D video file, a 360° or 2D image file, an audio file, and a 3D model file.
- details of interaction between the media files may be received from the user.
- the details of interaction may include, for example, sequence, position, orientation, distance, and behavior of the media files.
- a VR project may be created.
- the VR project may include different versions of the media files.
- the different versions of the media files may be generated through processing of the media files based on details of the interaction between the media files and predefined configurations of different user devices. Processing of the media files based on the configurations of the user devices may include modifying the resolution of the media files, modifying the orientation of the media files, and compression of the media files.
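The per-device processing described in this step can be sketched as follows. This is a minimal illustration only: the device profiles, field names, and helper function below are assumptions for the sake of example, not part of the disclosed system.

```python
# Illustrative sketch: generate device-specific versions of a media file
# by scaling resolution to each device profile. Profiles are assumed.

DEVICE_PROFILES = {
    "standalone-hmd": {"max_width": 2880, "compression": "h265"},
    "mobile":         {"max_width": 1920, "compression": "h264"},
    "desktop":        {"max_width": 3840, "compression": "h264"},
}

def create_versions(media_file):
    """Return one processed version of the media file per device profile."""
    versions = {}
    for device, profile in DEVICE_PROFILES.items():
        width = min(media_file["width"], profile["max_width"])
        scale = width / media_file["width"]
        versions[device] = {
            "width": width,
            "height": round(media_file["height"] * scale),
            "compression": profile["compression"],
        }
    return versions

versions = create_versions({"name": "lobby_360.mp4", "width": 5760, "height": 2880})
print(versions["mobile"])  # scaled down to fit the mobile profile
```

A real pipeline would additionally re-encode and compress the media; this sketch only captures the per-device resolution mapping.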
- Referring to FIG. 7 , a flow chart of a method of identifying and executing optimal versions of media files of a VR project is explained, in accordance with one exemplary embodiment of the present disclosure.
- the order in which the flow diagram for identifying and executing optimal versions of media files of a VR project is described should not be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the flow diagram or alternate methods. Additionally, individual blocks may be deleted from the flow diagram without departing from the spirit and scope of the subject matter described herein.
- the method can be implemented in any suitable hardware, software, firmware, or combination thereof.
- a user device may place a request to access a VR project.
- the request may be shared with a server hosting multiple versions of media files of the VR project.
- the multiple versions of the media files of the VR project may be produced by modifying the resolution and/or orientation of the media files, and by compressing them, based on predefined configurations of the different user devices that may access the VR project.
- the server may identify an optimal version of the media files associated with the VR project, for the user device.
- the server may identify the optimal version of the media files based on the configuration of the user device, such as hardware capabilities (processing, display, storage, and communication capability) and firmware type and version of the user device.
- the server may transmit the optimal version of the media files and a corresponding VRXF file of the VR project to the user device.
- the VRXF file may include details of interactions between the media files.
- the user device may play the optimal version of the media files based on the details present in the VRXF file, to deliver the best VR experience of the VR project to the user.
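The server-side selection in this method can be sketched as a simple filter over the stored versions: keep the versions the device can display, then pick the highest resolution among them. The version list, field names, and fallback behavior are illustrative assumptions, not the patented implementation.

```python
# Hypothetical sketch of optimal-version selection: choose the
# highest-resolution stored version that does not exceed the
# requesting device's display capability.

VERSIONS = [
    {"width": 1920, "url": "lobby_1920.mp4"},
    {"width": 2880, "url": "lobby_2880.mp4"},
    {"width": 3840, "url": "lobby_3840.mp4"},
]

def select_optimal_version(device_max_width, versions=VERSIONS):
    """Return the best version playable on the device."""
    playable = [v for v in versions if v["width"] <= device_max_width]
    if not playable:
        # Fall back to the smallest version for very constrained devices.
        return min(versions, key=lambda v: v["width"])
    return max(playable, key=lambda v: v["width"])

print(select_optimal_version(2880)["url"])  # lobby_2880.mp4
```

In practice the selection would also weigh processing, storage, and network capability of the device, as described above; resolution alone keeps the sketch short.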
- the terms “component,” “system,” “platform,” “station,” “node,” and “interface” are intended to refer to a computer-related entity or an entity related to, or that is part of, an operational apparatus with one or more specific functionalities, wherein such entities can be either hardware, a combination of hardware and software, software, or software in execution.
- a component can be, but is not limited to being, a process running on a processor, a processor, a hard disk drive, multiple storage drives (of optical or magnetic storage medium) including affixed (e.g., screwed or bolted) or removably affixed solid-state storage drives; an object; an executable; a thread of execution; a computer-executable program, and/or a computer.
- the components may communicate via local and/or remote processes such as in accordance with a signal having one or more data packets (e.g., data from one component interacting with another component in a local system, distributed system, and/or across a network such as the Internet with other systems via the signal).
- a component can be an apparatus with specific functionality provided by mechanical parts operated by electric or electronic circuitry that is operated by a software or firmware application executed by a processor, wherein the processor can be internal or external to the apparatus and executes at least a part of the software or firmware application.
- a component can be an apparatus that provides specific functionality through electronic components without mechanical parts, the electronic components can include a processor therein to execute software or firmware that provides at least in part the functionality of the electronic components.
- interface(s) can include input/output (I/O) components as well as associated processor, application, or Application Programming Interface (API) components. While the foregoing examples are directed to aspects of a component, the exemplified aspects or features also apply to a system, platform, interface, layer, controller, terminal, and the like.
- the user device refers to devices or applications (mobile applications or desktop applications such as browsers or dedicated applications) capable of playing VR files to provide an immersive experience to users.
- the server may be implemented in a variety of computing systems, such as a laptop computer, a desktop computer, a notebook, a workstation, a virtual environment, a mainframe computer, a network server, or a cloud-based computing environment.
- the communication network providing connection between the user device and the server may be a wireless communication network using a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), and the like.
- the communication network may include a variety of network devices including routers, switches, bridges, gateways, and the like.
Abstract
Description
```json
"transform": {
    "isHidden": false,
    "position": {
        "type": "cartesian",
        "x": -0.572,
        "y": 0.012,
        "z": -11.986
    },
    "rotation": {
        "type": "look-at-position",
        "xCartesianPosition": 0,
        "yCartesianPosition": 0,
        "zCartesianPosition": 0
    },
    "scale": {
        "x": 4.81,
        "y": 4.81,
        "z": 0.06
```
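The transform excerpt above is truncated; once completed into valid JSON, a block like it can be consumed programmatically. This sketch adds the missing closing braces as an assumption and reads back the position coordinates:

```python
import json

# Sketch of parsing a "transform" block like the excerpt above.
# The closing braces are supplied here because the excerpt is truncated.
snippet = """
{
  "transform": {
    "isHidden": false,
    "position": {"type": "cartesian", "x": -0.572, "y": 0.012, "z": -11.986},
    "rotation": {"type": "look-at-position",
                 "xCartesianPosition": 0, "yCartesianPosition": 0,
                 "zCartesianPosition": 0},
    "scale": {"x": 4.81, "y": 4.81, "z": 0.06}
  }
}
"""
transform = json.loads(snippet)["transform"]
pos = transform["position"]
print(pos["x"], pos["y"], pos["z"])  # -0.572 0.012 -11.986
```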
Claims (20)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US18/210,239 US11995789B2 (en) | 2022-06-15 | 2023-06-15 | System and method of creating, hosting, and accessing virtual reality projects |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202263352396P | 2022-06-15 | 2022-06-15 | |
| US18/210,239 US11995789B2 (en) | 2022-06-15 | 2023-06-15 | System and method of creating, hosting, and accessing virtual reality projects |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20230410448A1 US20230410448A1 (en) | 2023-12-21 |
| US11995789B2 true US11995789B2 (en) | 2024-05-28 |
Family
ID=89169048
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US18/210,239 Active US11995789B2 (en) | 2022-06-15 | 2023-06-15 | System and method of creating, hosting, and accessing virtual reality projects |
Country Status (1)
| Country | Link |
|---|---|
| US (1) | US11995789B2 (en) |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN115733999A (en) * | 2022-10-31 | 2023-03-03 | 北京达佳互联信息技术有限公司 | Service switching method and device, electronic equipment and storage medium |
- 2023-06-15 US US18/210,239 patent/US11995789B2/en active Active
Patent Citations (62)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5689669A (en) * | 1994-04-29 | 1997-11-18 | General Magic | Graphical user interface for navigating between levels displaying hallway and room metaphors |
| US6573903B2 (en) * | 1995-05-08 | 2003-06-03 | Autodesk, Inc. | Determining and displaying geometric relationships between objects in a computer-implemented graphics system |
| US6570563B1 (en) * | 1995-07-12 | 2003-05-27 | Sony Corporation | Method and system for three-dimensional virtual reality space sharing and for information transmission |
| US6002853A (en) * | 1995-10-26 | 1999-12-14 | Wegener Internet Projects Bv | System for generating graphics in response to a database search |
| US6219045B1 (en) * | 1995-11-13 | 2001-04-17 | Worlds, Inc. | Scalable virtual world chat client-server system |
| US6179619B1 (en) * | 1997-05-13 | 2001-01-30 | Shigenobu Tanaka | Game machine for moving object |
| US6271843B1 (en) * | 1997-05-30 | 2001-08-07 | International Business Machines Corporation | Methods systems and computer program products for transporting users in three dimensional virtual reality worlds using transportation vehicles |
| US6243091B1 (en) * | 1997-11-21 | 2001-06-05 | International Business Machines Corporation | Global history view |
| US6079982A (en) * | 1997-12-31 | 2000-06-27 | Meader; Gregory M | Interactive simulator ride |
| US6362817B1 (en) * | 1998-05-18 | 2002-03-26 | In3D Corporation | System for creating and viewing 3D environments using symbolic descriptors |
| US6119147A (en) * | 1998-07-28 | 2000-09-12 | Fuji Xerox Co., Ltd. | Method and system for computer-mediated, multi-modal, asynchronous meetings in a virtual space |
| US6414679B1 (en) * | 1998-10-08 | 2002-07-02 | Cyberworld International Corporation | Architecture and methods for generating and displaying three dimensional representations |
| US6396522B1 (en) * | 1999-03-08 | 2002-05-28 | Dassault Systemes | Selection navigator |
| US7119819B1 (en) * | 1999-04-06 | 2006-10-10 | Microsoft Corporation | Method and apparatus for supporting two-dimensional windows in a three-dimensional environment |
| US6590593B1 (en) * | 1999-04-06 | 2003-07-08 | Microsoft Corporation | Method and apparatus for handling dismissed dialogue boxes |
| US6690393B2 (en) * | 1999-12-24 | 2004-02-10 | Koninklijke Philips Electronics N.V. | 3D environment labelling |
| US6621508B1 (en) * | 2000-01-18 | 2003-09-16 | Seiko Epson Corporation | Information processing system |
| US20100115428A1 (en) * | 2000-02-04 | 2010-05-06 | Browse3D Corporation | System and method for web browsing |
| US20010018667A1 (en) * | 2000-02-29 | 2001-08-30 | Kim Yang Shin | System for advertising on a network by displaying advertisement objects in a virtual three-dimensional area |
| US7653877B2 (en) * | 2000-04-28 | 2010-01-26 | Sony Corporation | Information processing apparatus and method, and storage medium |
| US20020095463A1 (en) * | 2000-04-28 | 2002-07-18 | Sony Corporation | Information processing apparatus and method, and storage medium |
| US6784901B1 (en) * | 2000-05-09 | 2004-08-31 | There | Method, system and computer program product for the delivery of a chat message in a 3D multi-user environment |
| US7788323B2 (en) * | 2000-09-21 | 2010-08-31 | International Business Machines Corporation | Method and apparatus for sharing information in a virtual environment |
| US20020113820A1 (en) * | 2000-10-10 | 2002-08-22 | Robinson Jack D. | System and method to configure and provide a network-enabled three-dimensional computing environment |
| US7663625B2 (en) * | 2001-03-23 | 2010-02-16 | Dassault Systemes | Collaborative design |
| US6961055B2 (en) * | 2001-05-09 | 2005-11-01 | Free Radical Design Limited | Methods and apparatus for constructing virtual environments |
| US7414629B2 (en) * | 2002-03-11 | 2008-08-19 | Microsoft Corporation | Automatic scenery object generation |
| US20040113887A1 (en) * | 2002-08-27 | 2004-06-17 | University Of Southern California | partially real and partially simulated modular interactive environment |
| US20040193441A1 (en) * | 2002-10-16 | 2004-09-30 | Altieri Frances Barbaro | Interactive software application platform |
| US20050128212A1 (en) * | 2003-03-06 | 2005-06-16 | Edecker Ada M. | System and method for minimizing the amount of data necessary to create a virtual three-dimensional environment |
| US7467356B2 (en) * | 2003-07-25 | 2008-12-16 | Three-B International Limited | Graphical user interface for 3d virtual display browser using virtual display windows |
| US20050093719A1 (en) * | 2003-09-26 | 2005-05-05 | Mazda Motor Corporation | On-vehicle information provision apparatus |
| US7382288B1 (en) * | 2004-06-30 | 2008-06-03 | Rockwell Collins, Inc. | Display of airport signs on head-up display |
| US7542040B2 (en) * | 2004-08-11 | 2009-06-02 | The United States Of America As Represented By The Secretary Of The Navy | Simulated locomotion method and apparatus |
| US7746343B1 (en) * | 2005-06-27 | 2010-06-29 | Google Inc. | Streaming and interactive visualization of filled polygon data in a geographic information system |
| US7817150B2 (en) * | 2005-09-30 | 2010-10-19 | Rockwell Automation Technologies, Inc. | Three-dimensional immersive system for representing an automation control environment |
| US7814429B2 (en) * | 2006-06-14 | 2010-10-12 | Dassault Systemes | Computerized collaborative work |
| US7804507B2 (en) * | 2006-07-27 | 2010-09-28 | Electronics And Telecommunications Research Institute | Face-mounted display apparatus for mixed reality environment |
| US20080246693A1 (en) * | 2006-08-07 | 2008-10-09 | International Business Machines Corporation | System and method of enhanced virtual reality |
| US20080030429A1 (en) * | 2006-08-07 | 2008-02-07 | International Business Machines Corporation | System and method of enhanced virtual reality |
| US20080235570A1 (en) * | 2006-09-15 | 2008-09-25 | Ntt Docomo, Inc. | System for communication through spatial bulletin board |
| US20080125218A1 (en) * | 2006-09-20 | 2008-05-29 | Kelly James Collins | Method of use for a commercially available portable virtual reality system |
| US20090300528A1 (en) * | 2006-09-29 | 2009-12-03 | Stambaugh Thomas M | Browser event tracking for distributed web-based processing, spatial organization and display of information |
| US20090076791A1 (en) * | 2007-09-18 | 2009-03-19 | Disney Enterprises, Inc. | Method and system for converting a computer virtual environment into a real-life simulation environment |
| US20090091583A1 (en) * | 2007-10-06 | 2009-04-09 | Mccoy Anthony | Apparatus and method for on-field virtual reality simulation of US football and other sports |
| US7844724B2 (en) * | 2007-10-24 | 2010-11-30 | Social Communications Company | Automated real-time data stream switching in a shared virtual area communication environment |
| US20110041083A1 (en) * | 2007-12-12 | 2011-02-17 | Oz Gabai | System and methodology for providing shared internet experience |
| US20090287728A1 (en) * | 2008-05-15 | 2009-11-19 | International Business Machines Corporation | Tag along shopping |
| US20100070378A1 (en) * | 2008-09-13 | 2010-03-18 | At&T Intellectual Property I, L.P. | System and method for an enhanced shopping experience |
| US20100205541A1 (en) * | 2009-02-11 | 2010-08-12 | Jeffrey A. Rapaport | social network driven indexing system for instantly clustering people with concurrent focus on same topic into on-topic chat rooms and/or for generating on-topic search results tailored to user preferences regarding topic |
| US20100214284A1 (en) * | 2009-02-24 | 2010-08-26 | Eleanor Rieffel | Model creation using visual markup languages |
| US20100274627A1 (en) * | 2009-04-22 | 2010-10-28 | Mark Carlson | Receiving an announcement triggered by location data |
| US20100274567A1 (en) * | 2009-04-22 | 2010-10-28 | Mark Carlson | Announcing information about payment transactions of any member of a consumer group |
| US20110010636A1 (en) * | 2009-07-13 | 2011-01-13 | International Business Machines Corporation | Specification of a characteristic of a virtual universe establishment |
| US9749367B1 (en) * | 2013-03-07 | 2017-08-29 | Cisco Technology, Inc. | Virtualization of physical spaces for online meetings |
| US9996797B1 (en) * | 2013-10-31 | 2018-06-12 | Leap Motion, Inc. | Interactions with virtual objects for machine control |
| US20200193163A1 (en) * | 2014-02-28 | 2020-06-18 | Second Spectrum, Inc. | Methods and systems of combining video content with one or more augmentations to produce augmented video |
| US9696795B2 (en) * | 2015-02-13 | 2017-07-04 | Leap Motion, Inc. | Systems and methods of creating a realistic grab experience in virtual reality/augmented reality environments |
| US20160292925A1 (en) * | 2015-04-06 | 2016-10-06 | Scope Technologies Us Inc. | Method and appartus for sharing augmented reality applications to multiple clients |
| US20170185261A1 (en) * | 2015-12-28 | 2017-06-29 | Htc Corporation | Virtual reality device, method for virtual reality |
| WO2019060985A1 (en) | 2017-09-29 | 2019-04-04 | Eyexpo Technology Corp. | A cloud-based system and method for creating a virtual tour |
| US11592896B2 (en) | 2018-11-07 | 2023-02-28 | Wild Technology, Inc. | Technological infrastructure for enabling multi-user collaboration in a virtual reality environment |
Also Published As
| Publication number | Publication date |
|---|---|
| US20230410448A1 (en) | 2023-12-21 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11989386B2 (en) | Real-time geospatial collaboration system | |
| US8832576B2 (en) | Methods, apparatus and systems for authenticating users and user devices to receive secure information via multiple authorized channels | |
| US6437786B1 (en) | Method of reproducing image data in network projector system, and network projector system | |
| AU2019216745B2 (en) | Collaborative editing of media in a mixed computing environment | |
| US9256898B2 (en) | Managing shared inventory in a virtual universe | |
| US8495078B2 (en) | System and method for abstraction of objects for cross virtual universe deployment | |
| CN102362269B (en) | real-time kernel | |
| US20120210205A1 (en) | System and method for using an application on a mobile device to transfer internet media content | |
| US10579240B2 (en) | Live-rendered and forkable graphic edit trails | |
| US20160117159A1 (en) | Embeddable Video Capturing, Processing And Conversion Application | |
| BR112021009629A2 (en) | method of processing user interface content, system, and non-transient computer readable media | |
| US20110119587A1 (en) | Data model and player platform for rich interactive narratives | |
| JP6794345B2 (en) | A system, method, and computer program product for directly entering commands and / or content created by a local desktop application on a computer device into a web browser and vice versa. | |
| CN110096370A (en) | Control inversion component service model for virtual environment | |
| CN115202729A (en) | Container service-based mirror image generation method, device, equipment and medium | |
| WO2024250491A1 (en) | Method for generating augmented-reality data, and device and medium | |
| US11995789B2 (en) | System and method of creating, hosting, and accessing virtual reality projects | |
| US20250391137A1 (en) | Virtual gallery space system | |
| US20230283834A1 (en) | Synchronous widget and a system and method therefor | |
| US20230215465A1 (en) | Visual effect design using multiple preview windows | |
| US11949730B2 (en) | Context-aware interface layer for remote applications | |
| US20130046820A1 (en) | Manipulaton of an Inventory of Content Items on a Mobile Device by a Network-Based Application | |
| Phillips | Livespace Technical Overview | |
| SANTOS | Uma extensão ao kubernetes para cargas de trabalho de telepresença em XR [An extension to Kubernetes for XR telepresence workloads] |
| WO2025218065A1 (en) | Method and device for managing augmented reality space, and medium |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AS | Assignment |
Owner name: VRDIRECT GMBH, GERMANY Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ILLENBERGER, ROLF;MAKOLA, THEO;REEL/FRAME:063961/0046 Effective date: 20230613 |
|
| FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: MICROENTITY |
|
| FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: SMAL); ENTITY STATUS OF PATENT OWNER: MICROENTITY Free format text: ENTITY STATUS SET TO MICRO (ORIGINAL EVENT CODE: MICR); ENTITY STATUS OF PATENT OWNER: MICROENTITY |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
| STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
| AS | Assignment |
Owner name: NETZTREND UG, GERMANY Free format text: ASSIGNMENT OF ASSIGNOR'S INTEREST;ASSIGNOR:VRDIRECT GMBH;REEL/FRAME:072804/0181 Effective date: 20250731 |