CN115981518A - VR (virtual reality) display user operation method and related equipment - Google Patents
- Publication number
- CN115981518A (application number CN202310281632.9A)
- Authority
- CN
- China
- Prior art keywords
- panoramic
- touch
- interface
- target
- information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D10/00—Energy efficient computing, e.g. low power processors, power management or thermal management
Landscapes
- Position Input By Displaying (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The embodiment of the application provides a VR (virtual reality) display user operation method and related equipment, which can solve the problems of poor interactivity and poor interaction matching of current VR panoramic touch interfaces. The method comprises the following steps: when a target VR panoramic touch interface receives a touch instruction of a user, determining, based on the touch instruction, the touch type of the touch instruction and control information in a preset area range corresponding to the touch instruction; acquiring user information when the touch type is a click type and the control information indicates that the preset area range corresponding to the touch instruction includes a walking control; and determining a theoretical walking duration based on the user information, and switching the target VR panoramic touch interface from the current panoramic interface to the next panoramic interface indicated by the walking control according to the theoretical walking duration.
Description
Technical Field
The application relates to the technical field of computers, in particular to a VR display user operation method and related equipment.
Background
With the surge in popularity of VR (virtual reality) technology, VR panorama functions such as house viewing, car viewing, exhibition viewing and shopping have been put into use on major platforms, quickly becoming a de facto industry standard and leading a new trend in various marketing fields. This mode is convenient: it saves travel costs for customers, improves operating efficiency for enterprises, improves the customer experience, and also saves labor costs.
However, when a user experiences a real scene through a VR panorama, the switching effect between different areas is monotonous, resulting in a poor experience.
Disclosure of Invention
The embodiment of the application provides a VR (virtual reality) display user operation method and related equipment, which can solve the problems of poor interactivity and poor interaction matching of the current VR panoramic touch interface.
A first aspect of an embodiment of the present application provides a VR display user operation method, including:
under the condition that a target VR panoramic touch interface receives a touch instruction of a user, determining the touch type of the touch instruction and control information in a preset area range corresponding to the touch instruction based on the touch instruction;
acquiring user information under the condition that the touch type is a click type and the control information in the preset area range corresponding to the touch instruction indicates that the preset area range corresponding to the touch instruction comprises a walking control;
and determining theoretical walking duration based on the user information, and switching the target VR panoramic touch interface from the current panoramic interface to a next panoramic interface indicated by the walking control according to the theoretical walking duration.
Optionally, the determining the theoretical walking duration based on the user information includes:
determining a theoretical stride of the user based on the user information;
and determining the theoretical walking time length according to the theoretical stride of the user.
Optionally, the acquiring user information under the condition that the touch type is a click type and the control information in the preset area range corresponding to the touch instruction indicates that the preset area range corresponding to the touch instruction comprises a walking control includes:
under the condition that the touch type is a click type and the control information in the preset area range corresponding to the touch instruction indicates that the preset area range corresponding to the touch instruction comprises a walking control, acquiring image information based on a front-facing imaging device to which the target VR panoramic touch interface belongs, wherein the image information comprises at least two different images;
and under the condition that the image comprises a preset calibration object, acquiring the user information based on the preset calibration object, wherein the user information comprises at least one of height, age and gender of the user.
Optionally, the method further comprises:
determining an actual real-scene walking distance based on the current panoramic interface of the target VR panoramic touch interface and the next panoramic interface indicated by the walking control;
the determining the theoretical walking duration based on the user information includes:
and determining theoretical walking duration based on the user information and the actual walking distance of the real scene.
Optionally, the method further comprises:
generating a virtual reality transition image based on the current panoramic interface of the target VR panoramic touch interface and the next panoramic interface indicated by the walking control;
and switching the target VR panoramic touch interface from the current panoramic interface to the next panoramic interface indicated by the walking control according to the theoretical walking duration and the virtual reality transition image.
Optionally, the method further includes:
under the condition that the target VR panoramic touch interface is a real house panoramic touch interface, acquiring position information of a corresponding real house in the target VR panoramic touch interface;
simulating the outdoor light brightness based on the position information of the real house and the current moment;
and generating the view foreground of the real lighting window on the target VR panoramic touch interface according to the light brightness.
Optionally, the method further includes:
under the condition that the target VR panoramic touch interface is a real house panoramic touch interface, acquiring position information of a corresponding real house in the target VR panoramic touch interface;
acquiring peripheral obstruction information based on the position information of the real house;
simulating the incident angle and intensity of outdoor light according to the position information, height information and peripheral obstruction information of the real house and the current time;
and generating a view foreground of the real lighting window and an indoor perspective view foreground in the target VR panoramic touch interface according to the incidence angle and the intensity.
A second aspect of the embodiments of the present application provides a VR display user operating apparatus, including:
the device comprises a receiving unit and a processing unit, wherein the receiving unit is used for acquiring real interactive object information of a preset area range corresponding to a touch instruction under the condition that a target VR panoramic touch interface receives the touch instruction of a first user, and the VR panoramic touch interface is obtained by panoramic shooting based on a real scene;
the acquisition unit is used for acquiring instruction content information based on a touch track corresponding to the touch instruction;
and the determining unit is used for determining target instruction content of the first user according to the identity information of the first user and the real interactive object information under the condition that the instruction content information is at least two, and executing the target instruction content in the VR panoramic touch interface.
A third aspect of the embodiments of the present application provides an electronic device, including a memory and a processor, where the processor is configured to implement the steps of the above VR display user operation method when executing a computer program stored in the memory.
A fourth aspect of the embodiments of the present application provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the steps of the VR display user operation method described above.
In summary, in the VR display user operation method provided by the embodiment of the application, when a target VR panoramic touch interface receives a touch instruction of a user, the touch type of the touch instruction and control information in a preset area range corresponding to the touch instruction are determined based on the touch instruction; user information is acquired when the touch type is a click type and the control information indicates that the preset area range corresponding to the touch instruction includes a walking control; and a theoretical walking duration is determined based on the user information, and the target VR panoramic touch interface is switched from the current panoramic interface to the next panoramic interface indicated by the walking control according to the theoretical walking duration. When viewing a panoramic interface or interacting with a real scene, a user can switch scene points through the touch screen. However, current scene-point switching does not convey a walking effect well: typically the previous scene picture zooms out, the next scene picture expands, and the whole switch completes in a very short time. For example, switching from the center of one room to the center of the next room occurs at the same speed regardless of the distance between the two scene points and the type of scene point.
By contrast, with the method above, the walking duration theoretically required by the user to move from the current panoramic interface to the point corresponding to the next panoramic interface indicated by the walking control can be predicted from the acquired user information, and the walking effect can be adaptively adjusted for different types of users and different scene distances, solving the problems of poor interactivity and poor interaction matching of current VR panoramic touch interfaces.
Accordingly, the VR display user operation device, the electronic device and the computer-readable storage medium provided by the embodiments of the application also have the above technical effects.
Drawings
Fig. 1 is a schematic flowchart of a possible VR display user operation method provided in an embodiment of the present application;
Fig. 2 is a schematic structural block diagram of a possible VR display user operation device provided in an embodiment of the present application;
Fig. 3 is a schematic diagram of a hardware structure of a possible VR display user operation device provided in an embodiment of the present application;
Fig. 4 is a schematic structural block diagram of a possible electronic device provided in an embodiment of the present application;
Fig. 5 is a schematic structural block diagram of a possible computer-readable storage medium provided in an embodiment of the present application.
Detailed Description
The embodiment of the application provides a VR (virtual reality) display user operation method and related equipment, which can solve the problems of poor interactivity and poor interaction matching of current VR panoramic touch interfaces.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims of the present application and in the drawings described above, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It will be appreciated that the data so used may be interchanged under appropriate circumstances such that the embodiments described herein may be practiced otherwise than as specifically illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus. The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments.
Referring to fig. 1, a flowchart of the VR display user operation method provided in an embodiment of the present application may specifically include steps S110 to S130.
S110, under the condition that a target VR panoramic touch interface receives a touch instruction of a user, determining the touch type of the touch instruction and control information in a preset area range corresponding to the touch instruction based on the touch instruction.
S120, acquiring user information under the condition that the touch type is a click type and the control information in the preset area range corresponding to the touch instruction indicates that the preset area range corresponding to the touch instruction includes a walking control.
For example, when the user wants to switch the target VR panoramic touch interface from the current panoramic interface to the next panoramic interface indicated by the walking control, the user may click the walking control pointing in the direction of the next panoramic interface.
S130, determining theoretical walking duration based on the user information, and switching the target VR panoramic touch interface from the current panoramic interface to the next panoramic interface indicated by the walking control according to the theoretical walking duration.
According to the VR display user operation method provided by this embodiment, when a target VR panoramic touch interface receives a touch instruction of a user, the touch type of the touch instruction and control information in a preset area range corresponding to the touch instruction are determined based on the touch instruction; user information is acquired when the touch type is a click type and the control information indicates that the preset area range corresponding to the touch instruction includes a walking control; and a theoretical walking duration is determined based on the user information, and the target VR panoramic touch interface is switched from the current panoramic interface to the next panoramic interface indicated by the walking control according to the theoretical walking duration. When viewing a panoramic interface or interacting with a real scene, a user can switch scene points through the touch screen. However, current scene-point switching does not convey a walking effect well: typically the previous scene picture zooms out, the next scene picture expands, and the whole switch completes in a very short time. For example, switching from the center of one room to the center of the next room occurs at the same speed regardless of the distance between the two scene points and the type of scene point.
By contrast, with the method above, the walking duration theoretically required by the user to move from the current panoramic interface to the point corresponding to the next panoramic interface indicated by the walking control can be predicted from the acquired user information, and the walking effect can be adaptively adjusted for different types of users and different scene distances, solving the problems of poor interactivity and poor interaction matching of current VR panoramic touch interfaces.
According to some embodiments, the determining a theoretical walking duration based on the user information comprises:
determining a theoretical stride of the user based on the user information;
and determining the theoretical walking time length according to the theoretical stride of the user.
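The patent does not fix the formulas relating user information, stride and duration. As an illustrative sketch only (the stride-to-height ratio and the cadence below are hypothetical assumptions, not values from the patent), the computation might look like:

```python
# Illustrative sketch only: estimate a theoretical walking duration from
# user height. The 0.43 stride-to-height ratio and the default cadence
# are hypothetical constants, not values specified by the patent.

def theoretical_stride_m(height_m: float) -> float:
    """Estimate stride length as a fixed fraction of body height (rule of thumb)."""
    return 0.43 * height_m

def theoretical_walk_duration_s(height_m: float, distance_m: float,
                                cadence_steps_per_s: float = 1.8) -> float:
    """Duration = number of steps needed to cover the distance / stepping rate."""
    steps = distance_m / theoretical_stride_m(height_m)
    return steps / cadence_steps_per_s
```

Under these assumptions, a taller user gets a shorter switching animation for the same real-scene distance, which matches the adaptive effect the method aims for.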
According to some embodiments, the acquiring user information when the touch type is a click type and the control information in the preset area range corresponding to the touch instruction indicates that the preset area range corresponding to the touch instruction includes a walking control includes:
under the condition that the touch type is a click type and the control information in the preset area range corresponding to the touch instruction indicates that the preset area range corresponding to the touch instruction comprises a walking control, acquiring image information based on a front-facing imaging device to which the target VR panoramic touch interface belongs, wherein the image information comprises at least two different images;
and under the condition that the images comprise a preset calibration object, acquiring the user information based on the preset calibration object, wherein the user information comprises at least one of the height, age and gender of the user, and the calibration object is used to calculate the height of the user. The difference in depth of field between the preset calibration object and the user can be obtained through the at least two different images, so that the height of the user can be estimated from the preset calibration object. In this way, user information can still be estimated when it cannot be obtained from information pre-stored by the user. In addition, when the registered user differs from the current user, the current user's information can be predicted, so that the theoretical walking duration better matches the current user.
Illustratively, the age and gender of the user may be predicted by image recognition, or derived from the at least two captured images.
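How the calibration object yields a height is not spelled out in the patent; one simple possibility (a hypothetical sketch, assuming the user and the calibration object are recognized at roughly the same depth of field) is proportional scaling of pixel heights:

```python
# Hypothetical sketch: infer user height from a calibration object of
# known real size. The pixel heights would come from an image-recognition
# step that is not shown here.

def estimate_height_m(user_pixel_h: float, calib_pixel_h: float,
                      calib_real_h_m: float) -> float:
    """At a similar depth of field, real height scales with pixel height."""
    return calib_real_h_m * (user_pixel_h / calib_pixel_h)
```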
According to some embodiments, further comprising:
determining an actual real-scene walking distance based on the current panoramic interface of the target VR panoramic touch interface and the next panoramic interface indicated by the walking control;
the determining the theoretical walking duration based on the user information includes:
and determining theoretical walking duration based on the user information and the actual walking distance of the real scene.
For example, the actual real-scene walking distance may be determined by querying pre-stored scene information of the target VR panoramic touch interface for the current panoramic interface and the next panoramic interface indicated by the walking control. Alternatively, it may be predicted from the scene pictures of the current panoramic interface and the next panoramic interface indicated by the walking control. Incorporating the actual real-scene walking distance in this way makes the theoretical walking duration more accurate and realistic, giving the user a more authentic experience.
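For the pre-stored variant, a lookup of scene-point coordinates suffices. The point names and coordinates below are invented purely for illustration:

```python
# Hypothetical sketch: real-scene walking distance between two panorama
# points, looked up from pre-stored scene-point coordinates (in metres).
import math

SCENE_POINTS = {
    "living_room": (0.0, 0.0),   # illustrative pre-stored positions
    "kitchen": (4.0, 3.0),
}

def actual_walk_distance_m(current: str, target: str) -> float:
    """Euclidean distance between the two pre-stored scene points."""
    (x1, y1), (x2, y2) = SCENE_POINTS[current], SCENE_POINTS[target]
    return math.hypot(x2 - x1, y2 - y1)
```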
According to some embodiments, further comprising:
generating a virtual reality transition image based on the current panoramic interface of the target VR panoramic touch interface and the next panoramic interface indicated by the walking control;
and switching the target VR panoramic touch interface from the current panoramic interface to the next panoramic interface indicated by the walking control according to the theoretical walking duration and the virtual reality transition image.
It should be noted that switching the target VR panoramic touch interface from the current panoramic interface to the next panoramic interface indicated by the walking control according to the theoretical walking duration and the virtual reality transition image makes the transition between the two interfaces feel more realistic to the user and avoids an abrupt, disjointed sensation.
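The patent leaves the transition rendering itself open. One plausible sketch (the linear cross-fade and the frame rate are assumptions, not disclosed details) is to schedule blend weights between the two panoramas across the theoretical walking duration:

```python
# Hypothetical sketch: blend weights for a cross-fade from the current
# panorama (alpha 0.0) to the next panorama (alpha 1.0), paced over the
# theoretical walking duration instead of an instant jump.

def transition_alphas(duration_s: float, fps: int = 30) -> list:
    """One blend weight per rendered transition frame."""
    n = max(2, round(duration_s * fps))   # always at least two frames
    return [i / (n - 1) for i in range(n)]
```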
According to some embodiments, further comprising:
under the condition that the target VR panoramic touch interface is a real house panoramic touch interface, acquiring position information of a corresponding real house in the target VR panoramic touch interface;
simulating the outdoor light brightness based on the position information of the real house and the current moment;
and generating a view foreground of the real lighting window on the target VR panoramic touch interface according to the light brightness.
For example, after the position information of the corresponding real house in the target VR panoramic touch interface is obtained, the outdoor illumination condition may be determined based on the position information of the real house and the current time. By simulating the outdoor light brightness and generating the view foreground of the real lighting window on the target VR panoramic touch interface according to that brightness, the user sees the house more realistically.
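The patent does not give a lighting model. A crude sketch (a real system would use a proper solar-position algorithm; the declination approximation and linear brightness stand-in below are hypothetical) might derive a relative brightness from the house's latitude, the day of year and the local solar hour:

```python
# Crude hypothetical sketch: solar elevation from latitude, day of year
# and local solar hour, then a linear stand-in for relative brightness.
import math

def sun_elevation_deg(latitude_deg: float, day_of_year: int,
                      hour: float) -> float:
    """Approximate solar elevation angle in degrees."""
    decl = 23.45 * math.sin(math.radians(360.0 * (284 + day_of_year) / 365.0))
    hour_angle = 15.0 * (hour - 12.0)   # degrees away from solar noon
    lat, dec, ha = map(math.radians, (latitude_deg, decl, hour_angle))
    return math.degrees(math.asin(
        math.sin(lat) * math.sin(dec)
        + math.cos(lat) * math.cos(dec) * math.cos(ha)))

def relative_brightness(elevation_deg: float) -> float:
    """0 at or below the horizon, approaching 1 with the sun overhead."""
    return max(0.0, math.sin(math.radians(elevation_deg)))
```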
According to some embodiments, further comprising:
under the condition that the target VR panoramic touch interface is a real house panoramic touch interface, acquiring position information of a corresponding real house in the target VR panoramic touch interface;
acquiring peripheral obstruction information based on the position information of the real house;
simulating the incident angle and intensity of outdoor light according to the position information, height information and peripheral obstruction information of the real house and the current time;
and generating a view foreground of the real lighting window and an indoor perspective view foreground in the target VR panoramic touch interface according to the incidence angle and the intensity.
Illustratively, the incident angles and intensities of outdoor light through the different lighting windows of the house can be simulated from the position information, height information and peripheral obstruction information of the real house and the current time, providing the user with a more realistic house-viewing experience. In addition, the incident angles and intensities at different times of day can be demonstrated within a short period according to a time-compression ratio.
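The time-compression demonstration can be sketched as a mapping from elapsed demo seconds to simulated hours of day. The 3600x ratio below (one simulated hour per demo second, so a day replays in 24 seconds) is an arbitrary example, not a value from the patent:

```python
# Hypothetical sketch: compress a full day of simulated lighting into a
# short demo. With ratio 3600, each demo second replays one simulated hour.

def compressed_demo_hours(demo_seconds: int, ratio: float = 3600.0) -> list:
    """Simulated hour of day for each elapsed second of the demo."""
    return [(t * ratio / 3600.0) % 24.0 for t in range(demo_seconds)]
```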
The VR display user operation method in the embodiment of the present application has been described above; the VR display user operation device in the embodiment of the present application is described below.
Referring to fig. 2, an embodiment of the VR display user operation device described in the embodiment of the present application may include:
the determining unit 201 is configured to determine, based on a touch instruction, a touch type of the touch instruction and control information in a preset area range corresponding to the touch instruction when a target VR panoramic touch interface receives the touch instruction of a user;
an obtaining unit 202, configured to obtain user information when the touch type is a click type and the control information in the preset area range corresponding to the touch instruction indicates that the preset area range corresponding to the touch instruction includes a walking control;
and the execution unit 203 is configured to determine a theoretical walking duration based on the user information, and switch the target VR panoramic touch interface from the current panoramic interface to a next panoramic interface indicated by the walking control according to the theoretical walking duration.
According to the VR display user operation device provided by this embodiment, when a target VR panoramic touch interface receives a touch instruction of a user, the touch type of the touch instruction and control information in a preset area range corresponding to the touch instruction are determined based on the touch instruction; user information is acquired when the touch type is a click type and the control information indicates that the preset area range corresponding to the touch instruction includes a walking control; and a theoretical walking duration is determined based on the user information, and the target VR panoramic touch interface is switched from the current panoramic interface to the next panoramic interface indicated by the walking control according to the theoretical walking duration. When viewing a panoramic interface or interacting with a real scene, a user can switch scene points through the touch screen. However, current scene-point switching does not convey a walking effect well: typically the previous scene picture zooms out, the next scene picture expands, and the whole switch completes in a very short time. For example, switching from the center of one room to the center of the next room occurs at the same speed regardless of the distance between the two scene points and the type of scene point.
By contrast, with the device above, the walking duration theoretically required by the user to move from the current panoramic interface to the point corresponding to the next panoramic interface indicated by the walking control can be predicted from the acquired user information, and the walking effect can be adaptively adjusted for different types of users and different scene distances, solving the problems of poor interactivity and poor interaction matching of current VR panoramic touch interfaces.
Fig. 2 above describes the VR display user operation device in the embodiment of the present application from the perspective of modular functional entities; the following describes it in detail from the perspective of hardware processing. Referring to fig. 3, an embodiment of a VR display user operation device 300 in the embodiment of the present application includes:
an input device 301, an output device 302, a processor 303 and a memory 304, where there may be one or more processors 303 (one processor 303 is taken as an example in fig. 3). In some embodiments of the present application, the input device 301, the output device 302, the processor 303 and the memory 304 may be connected by a bus or in other ways; connection by a bus is taken as an example in fig. 3.
Wherein, by calling the operation instruction stored in the memory 304, the processor 303 is configured to perform the following steps:
under the condition that a target VR panoramic touch interface receives a touch instruction of a user, determining the touch type of the touch instruction and control information in a preset area range corresponding to the touch instruction based on the touch instruction;
acquiring user information under the condition that the touch type is a click type and the control information in the preset area range corresponding to the touch instruction indicates that the preset area range corresponding to the touch instruction comprises a walking control;
and determining theoretical walking duration based on the user information, and switching the target VR panoramic touch interface from the current panoramic interface to a next panoramic interface indicated by the walking control according to the theoretical walking duration.
Optionally, the determining a theoretical walking duration based on the user information includes:
determining a theoretical stride of the user based on the user information;
and determining the theoretical walking time length according to the theoretical stride of the user.
Optionally, the acquiring user information under the condition that the touch type is a click type and the control information in the preset area range corresponding to the touch instruction indicates that the preset area range corresponding to the touch instruction comprises a walking control includes:
under the condition that the touch type is a click type and the control information in the preset area range corresponding to the touch instruction indicates that the preset area range corresponding to the touch instruction comprises a walking control, acquiring image information based on a front-facing imaging device to which the target VR panoramic touch interface belongs, wherein the image information comprises at least two different images;
and under the condition that the image comprises a preset calibration object, acquiring the user information based on the preset calibration object, wherein the user information comprises at least one of height, age and gender of the user.
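One plausible reading of the calibration step: with a preset calibration object of known real-world size visible in the images from the front-facing camera, the user's height can be estimated from the pixel-to-metre ratio and averaged over the (at least two) frames to reduce single-frame noise. The model and all names below are assumptions, not the patent's specification:

```python
def estimate_height_from_calibration(user_px, calib_px, calib_real_m):
    """Estimate the user's height from one image using a calibration
    object of known real-world size at a similar camera distance:
    height ≈ user pixel height × (real size / pixel size of object)."""
    return user_px * (calib_real_m / calib_px)

def estimate_height(observations):
    """Average the per-frame estimates over at least two different images."""
    estimates = [estimate_height_from_calibration(*o) for o in observations]
    return sum(estimates) / len(estimates)

# Two hypothetical frames: (user pixels, calibration pixels, calibration size in m)
frames = [(900, 150, 0.30), (880, 148, 0.30)]
print(round(estimate_height(frames), 3))  # → 1.792
```

Height obtained this way (possibly together with estimated age and gender) then feeds the theoretical-stride computation in the previous step.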
Optionally, the method further comprises:
determining a real-scene actual walking distance based on the current panoramic interface of the target VR panoramic touch interface and the next panoramic interface indicated by the walking control;
the determining the theoretical walking duration based on the user information includes:
and determining theoretical walking duration based on the user information and the actual walking distance of the real scene.
Optionally, the method further comprises:
generating a virtual reality transition image based on the current panoramic interface of the target VR panoramic touch interface and the next panoramic interface indicated by the walking control;
and switching the target VR panoramic touch interface from the current panoramic interface to the next panoramic interface indicated by the walking control according to the theoretical walking duration and the virtual reality transition image.
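The transition step could, for example, blend the current panorama into the next one over the theoretical walking duration. A linear crossfade is only one possible choice; the patent does not fix a particular transition:

```python
def crossfade_alphas(duration_s, fps=60):
    """Blend weights for fading the current panorama into the next one
    over the theoretical walking duration (linear fade, assumed model).

    Returns n+1 values from 0.0 (all current panorama) to 1.0
    (all next panorama), where n = duration × frame rate."""
    n = max(1, round(duration_s * fps))
    return [i / n for i in range(n + 1)]

alphas = crossfade_alphas(2.0, fps=4)
print(alphas)  # 9 values rising linearly from 0.0 to 1.0
```

Each frame of the transition would composite the two panoramas (or the generated virtual reality transition image) with the corresponding alpha, so a longer theoretical walking duration produces a slower, more walk-like switch.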
Optionally, the method further includes:
under the condition that the target VR panoramic touch interface is a real house panoramic touch interface, acquiring position information of a corresponding real house in the target VR panoramic touch interface;
simulating the outdoor light brightness based on the position information of the real house and the current time;
and generating a view foreground of the real lighting window on the target VR panoramic touch interface according to the light brightness.
Optionally, the method further includes:
under the condition that the target VR panoramic touch interface is a real house panoramic touch interface, acquiring position information of the corresponding real house in the target VR panoramic touch interface;
acquiring peripheral shelter information based on the position information of the real house;
simulating the incidence angle and the intensity of outdoor light according to the position information and height information of the real house, the peripheral shelter information, and the current time;
and generating a view foreground of the real lighting window and an indoor perspective view foreground in the target VR panoramic touch interface according to the incidence angle and the intensity.
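For the variant that also accounts for peripheral shelters, a flat-ground sketch: an obstacle blocks direct sunlight when its angular height, as seen from the window, exceeds the sun's elevation. The geometry, the ambient fallback, and every numeric value below are assumed for illustration:

```python
import math

def obstruction_blocks_sun(sun_elevation_deg, obstacle_height_m,
                           obstacle_distance_m, window_height_m):
    """True when a peripheral obstacle (e.g. a neighbouring building)
    rises above the sun's elevation as seen from the window."""
    rise = obstacle_height_m - window_height_m
    if rise <= 0:
        return False
    blocking_angle = math.degrees(math.atan2(rise, obstacle_distance_m))
    return blocking_angle >= sun_elevation_deg

def window_light_intensity(sun_elevation_deg, obstacles,
                           window_height_m, direct=1000.0):
    """Direct light is cut off by any blocking obstacle; a small
    ambient term (10% of direct, assumed) remains."""
    if sun_elevation_deg <= 0:
        return 0.0
    blocked = any(obstruction_blocks_sun(sun_elevation_deg, h, d, window_height_m)
                  for h, d in obstacles)
    ambient = 0.1 * direct
    return ambient if blocked else direct * math.sin(math.radians(sun_elevation_deg))

# A 30 m building 40 m away shades a window at 9 m when the sun is at 20°:
print(window_light_intensity(20.0, [(30.0, 40.0)], 9.0))  # → 100.0
```

The computed incidence angle and intensity would then drive the window view foreground and the indoor perspective foreground, reproducing how the real rooms are lit at the moment of viewing.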
The processor 303 is also configured to perform any of the methods in the corresponding embodiments of fig. 1 by calling the operation instructions stored in the memory 304.
Referring to fig. 4, fig. 4 is a schematic view of an embodiment of an electronic device according to an embodiment of the present disclosure.
As shown in fig. 4, an electronic device 400 according to an embodiment of the present application includes a memory 410, a processor 420, and a computer program 411 stored in the memory 410 and executable on the processor 420, where the processor 420 executes the computer program 411 to implement the following steps:
under the condition that a target VR panoramic touch interface receives a touch instruction of a user, determining the touch type of the touch instruction and control information in a preset area range corresponding to the touch instruction based on the touch instruction;
acquiring user information under the condition that the touch type is a click type and the control information in the preset area range corresponding to the touch instruction indicates that the preset area range corresponding to the touch instruction comprises a walking control;
and determining theoretical walking duration based on the user information, and switching the target VR panoramic touch interface from the current panoramic interface to a next panoramic interface indicated by the walking control according to the theoretical walking duration.
Optionally, the determining the theoretical walking duration based on the user information includes:
determining a theoretical stride of the user based on the user information;
and determining the theoretical walking time length according to the theoretical stride of the user.
Optionally, under the condition that the touch type is a click type and the control information in the preset area range corresponding to the touch instruction indicates that the preset area range corresponding to the touch instruction includes a walking control, the acquiring user information includes:
under the condition that the touch type is a click type and the control information in the preset area range corresponding to the touch instruction indicates that the preset area range corresponding to the touch instruction includes a walking control, acquiring image information based on a front-facing imaging device to which the target VR panoramic touch interface belongs, wherein the image information includes at least two different images;
and under the condition that the image comprises a preset calibration object, acquiring the user information based on the preset calibration object, wherein the user information comprises at least one of height, age and gender of the user.
Optionally, the method further comprises:
determining a real-scene actual walking distance based on the current panoramic interface of the target VR panoramic touch interface and the next panoramic interface indicated by the walking control;
the determining a theoretical walking duration based on the user information includes:
and determining theoretical walking duration based on the user information and the actual walking distance of the real scene.
Optionally, the method further comprises:
generating a virtual reality transition image based on the current panoramic interface of the target VR panoramic touch interface and the next panoramic interface indicated by the walking control;
and switching the target VR panoramic touch interface from the current panoramic interface to the next panoramic interface indicated by the walking control according to the theoretical walking duration and the virtual reality transition image.
Optionally, the method further includes:
under the condition that the target VR panoramic touch interface is a real house panoramic touch interface, acquiring position information of a corresponding real house in the target VR panoramic touch interface;
simulating the outdoor light brightness based on the position information of the real house and the current time;
and generating a view foreground of the real lighting window on the target VR panoramic touch interface according to the light brightness.
Optionally, the method further includes:
under the condition that the target VR panoramic touch interface is a real house panoramic touch interface, acquiring position information of a corresponding real house in the target VR panoramic touch interface;
acquiring peripheral shelter information based on the position information of the real house;
simulating the incident angle and the intensity of outdoor light according to the position information, the height information, the peripheral shelter information and the current time of the real house;
and generating a view foreground of the real lighting window and an indoor perspective view foreground in the target VR panoramic touch interface according to the incidence angle and the intensity.
In a specific implementation, when the processor 420 executes the computer program 411, any of the embodiments corresponding to fig. 1 may be implemented.
Since the electronic device described in this embodiment is the device used to implement the VR display user operation method of the embodiments of the present application, a person skilled in the art can, based on the method described herein, understand the specific implementation of this electronic device and its various variations. How the electronic device implements the method is therefore not described in detail here; any device that a person skilled in the art uses to implement the method of the embodiments of the present application falls within the intended scope of protection of the present application.
Referring to fig. 5, fig. 5 is a schematic diagram illustrating an embodiment of a computer-readable storage medium according to the present application.
As shown in fig. 5, the present embodiment provides a computer-readable storage medium 500 having a computer program 511 stored thereon, the computer program 511 implementing the following steps when executed by a processor:
under the condition that a target VR panoramic touch interface receives a touch instruction of a user, determining the touch type of the touch instruction and control information in a preset area range corresponding to the touch instruction based on the touch instruction;
acquiring user information under the condition that the touch type is a click type and the control information in the preset area range corresponding to the touch instruction indicates that the preset area range corresponding to the touch instruction comprises a walking control;
and determining theoretical walking duration based on the user information, and switching the target VR panoramic touch interface from the current panoramic interface to a next panoramic interface indicated by the walking control according to the theoretical walking duration.
Optionally, the determining a theoretical walking duration based on the user information includes:
determining a theoretical stride of the user based on the user information;
and determining the theoretical walking time length according to the theoretical stride of the user.
Optionally, under the condition that the touch type is a click type and the control information in the preset area range corresponding to the touch instruction indicates that the preset area range corresponding to the touch instruction includes a walking control, the acquiring user information includes:
under the condition that the touch type is a click type and the control information in the preset area range corresponding to the touch instruction indicates that the preset area range corresponding to the touch instruction includes a walking control, acquiring image information based on a front-facing imaging device to which the target VR panoramic touch interface belongs, wherein the image information includes at least two different images;
and under the condition that the image comprises a preset calibration object, acquiring the user information based on the preset calibration object, wherein the user information comprises at least one of height, age and gender of the user.
Optionally, the method further comprises:
determining a real-scene actual walking distance based on the current panoramic interface of the target VR panoramic touch interface and the next panoramic interface indicated by the walking control;
the determining a theoretical walking duration based on the user information includes:
and determining theoretical walking duration based on the user information and the actual walking distance of the real scene.
Optionally, the method further comprises:
generating a virtual reality transition image based on the current panoramic interface of the target VR panoramic touch interface and the next panoramic interface indicated by the walking control;
and switching the target VR panoramic touch interface from the current panoramic interface to the next panoramic interface indicated by the walking control according to the theoretical walking duration and the virtual reality transition image.
Optionally, the method further includes:
under the condition that the target VR panoramic touch interface is a real house panoramic touch interface, acquiring position information of a corresponding real house in the target VR panoramic touch interface;
simulating the outdoor light brightness based on the position information of the real house and the current time;
and generating the view foreground of the real lighting window on the target VR panoramic touch interface according to the light brightness.
Optionally, the method further includes:
under the condition that the target VR panoramic touch interface is a real house panoramic touch interface, acquiring position information of a corresponding real house in the target VR panoramic touch interface;
acquiring peripheral shelter information based on the position information of the real house;
simulating the incident angle and the intensity of outdoor light according to the position information, the height information, the peripheral shelter information and the current time of the real house;
and generating a view foreground of the real lighting window and an indoor perspective view foreground in the target VR panoramic touch interface according to the incidence angle and the intensity.
In a specific implementation, the computer program 511 may implement any of the embodiments corresponding to fig. 1 when being executed by a processor.
It should be noted that the description of each of the foregoing embodiments has its own emphasis; for a part not described in detail in one embodiment, reference may be made to the related descriptions of other embodiments.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
Embodiments of the present application further provide a computer program product, where the computer program product includes computer software instructions, and when the computer software instructions are run on a processing device, the processing device executes a flow in the VR display user operation method in the embodiment corresponding to fig. 1.
The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the present application are produced, in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or another programmable device. The computer instructions may be stored on a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example, from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium that a computer can access, or a data storage device such as a server or data center integrated with one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., Solid State Disk (SSD)), among others.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as a stand-alone product, it may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.
Claims (10)
1. A VR display user operation method, comprising:
under the condition that a target VR panoramic touch interface receives a touch instruction of a user, determining the touch type of the touch instruction and control information in a preset area range corresponding to the touch instruction based on the touch instruction;
acquiring user information under the condition that the touch type is a click type and the control information in the preset area range corresponding to the touch instruction indicates that the preset area range corresponding to the touch instruction comprises a walking control;
and determining theoretical walking duration based on the user information, and switching the target VR panoramic touch interface from the current panoramic interface to a next panoramic interface indicated by the walking control according to the theoretical walking duration.
2. The method of claim 1, wherein the determining theoretical walking duration based on the user information comprises:
determining a theoretical stride of the user based on the user information;
and determining the theoretical walking time length according to the theoretical stride of the user.
3. The method according to claim 1, wherein the obtaining user information when the touch type is a click type and the control information in the preset area range corresponding to the touch instruction indicates that the preset area range corresponding to the touch instruction includes a walking control comprises:
under the condition that the touch type is a click type and the control information in the preset area range corresponding to the touch instruction indicates that the preset area range corresponding to the touch instruction includes a walking control, acquiring image information based on a front-facing imaging device to which the target VR panoramic touch interface belongs, wherein the image information includes at least two different images;
and under the condition that the image comprises a preset calibration object, acquiring the user information based on the preset calibration object, wherein the user information comprises at least one of height, age and gender of the user.
4. The method of claim 1, further comprising:
determining a real-scene actual walking distance based on the current panoramic interface of the target VR panoramic touch interface and the next panoramic interface indicated by the walking control;
the determining a theoretical walking duration based on the user information includes:
and determining theoretical walking duration based on the user information and the actual walking distance of the real scene.
5. The method of claim 1, further comprising:
generating a virtual reality transition image based on the current panoramic interface of the target VR panoramic touch interface and the next panoramic interface indicated by the walking control;
and switching the target VR panoramic touch interface from the current panoramic interface to the next panoramic interface indicated by the walking control according to the theoretical walking duration and the virtual reality transition image.
6. The method of claim 1, further comprising:
under the condition that the target VR panoramic touch interface is a real house panoramic touch interface, acquiring position information of a corresponding real house in the target VR panoramic touch interface;
simulating the outdoor light brightness based on the position information of the real house and the current time;
and generating a view foreground of the real lighting window on the target VR panoramic touch interface according to the light brightness.
7. The method of claim 1, further comprising:
under the condition that the target VR panoramic touch interface is a real house panoramic touch interface, acquiring position information of a corresponding real house in the target VR panoramic touch interface;
acquiring peripheral shelter information based on the position information of the real house;
simulating the incident angle and the intensity of outdoor light according to the position information, the height information, the peripheral shelter information and the current time of the real house;
and generating a view foreground of the real lighting window and an indoor perspective view foreground in the target VR panoramic touch interface according to the incidence angle and the intensity.
8. A VR display user operation apparatus, comprising:
the device comprises a determining unit, a processing unit and a processing unit, wherein the determining unit is used for determining the touch type of a touch instruction and control information in a preset area range corresponding to the touch instruction based on the touch instruction under the condition that a target VR panoramic touch interface receives the touch instruction of a user;
the obtaining unit is used for obtaining user information under the condition that the touch type is a click type and the control information in the preset area range corresponding to the touch instruction indicates that a walking control is included in the preset area range corresponding to the touch instruction;
and the execution unit is used for determining theoretical walking duration based on the user information and switching the target VR panoramic touch interface from the current panoramic interface to the next panoramic interface indicated by the walking control according to the theoretical walking duration.
9. An electronic device comprising a memory and a processor, wherein the processor is configured to implement the steps of the VR display user operation method of any of claims 1 to 7 when executing a computer program stored in the memory.
10. A computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the steps of the VR display user operation method of any of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310281632.9A CN115981518B (en) | 2023-03-22 | 2023-03-22 | VR demonstration user operation method and related equipment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310281632.9A CN115981518B (en) | 2023-03-22 | 2023-03-22 | VR demonstration user operation method and related equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115981518A true CN115981518A (en) | 2023-04-18 |
CN115981518B CN115981518B (en) | 2023-06-02 |
Family
ID=85972575
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310281632.9A Active CN115981518B (en) | 2023-03-22 | 2023-03-22 | VR demonstration user operation method and related equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115981518B (en) |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018120708A1 (en) * | 2016-12-29 | 2018-07-05 | 深圳前海弘稼科技有限公司 | Method and apparatus for virtual tour of scenic area |
CN108310768A (en) * | 2018-01-16 | 2018-07-24 | 腾讯科技(深圳)有限公司 | The display methods and device of virtual scene, storage medium, electronic device |
CN108830692A (en) * | 2018-06-20 | 2018-11-16 | 厦门市超游网络科技股份有限公司 | Long-range panorama sees room method, apparatus, user terminal, server and storage medium |
CN109745699A (en) * | 2018-12-29 | 2019-05-14 | 维沃移动通信有限公司 | A method and terminal device for responding to touch operation |
CN109814713A (en) * | 2019-01-10 | 2019-05-28 | 重庆爱奇艺智能科技有限公司 | A kind of method and apparatus for the switching of VR user perspective |
US20190344167A1 (en) * | 2016-11-30 | 2019-11-14 | Interdigital Ce Patent Holdings | 3d immersive method and device for a user in a virtual 3d scene |
US20200289933A1 (en) * | 2019-01-22 | 2020-09-17 | Electronic Arts Inc. | Controlling character movement in a video-game |
CN111840989A (en) * | 2020-08-05 | 2020-10-30 | 网易(杭州)网络有限公司 | Method and device for processing moving route of virtual object and electronic equipment |
CN114371800A (en) * | 2021-12-15 | 2022-04-19 | 北京城市网邻信息技术有限公司 | Space display method, device, terminal and medium based on VR panorama roaming |
US20220266139A1 (en) * | 2020-11-20 | 2022-08-25 | Tencent Technology (Shenzhen) Company Limited | Information processing method and apparatus in virtual scene, device, medium, and program product |
WO2023036168A1 (en) * | 2021-09-09 | 2023-03-16 | 北京字跳网络技术有限公司 | Page switching method and apparatus, and device and storage medium |
- 2023-03-22 CN CN202310281632.9A patent/CN115981518B/en active Active
Also Published As
Publication number | Publication date |
---|---|
CN115981518B (en) | 2023-06-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110399064B (en) | Display interface switching method and device, storage medium and electronic device | |
CN108594999B (en) | Control method and device for panoramic image display system | |
CN112232900A (en) | Information display method and device | |
CN108255291B (en) | Virtual scene data transmission method and device, storage medium and electronic device | |
TW202304212A (en) | Live broadcast method, system, computer equipment and computer readable storage medium | |
CN107168616B (en) | Game interaction interface display method and device, electronic equipment and storage medium | |
CN114615556B (en) | Virtual live broadcast enhanced interaction method and device, electronic equipment and storage medium | |
CN108776544B (en) | Interaction method and device in augmented reality, storage medium and electronic equipment | |
EP2998845A1 (en) | User interface based interaction method and related apparatus | |
CN108829468B (en) | Three-dimensional space model skipping processing method and device | |
CN113407289A (en) | Wallpaper switching method, wallpaper generation method, device and storage medium | |
CN115981517B (en) | VR multi-terminal cooperative interaction method and related equipment | |
CN111973984B (en) | Coordinate control method and device for virtual scene, electronic equipment and storage medium | |
CN115981518B (en) | VR demonstration user operation method and related equipment | |
CN111617475B (en) | Interactive object construction method, device, equipment and storage medium | |
CN115167742A (en) | Barrage display method, device, equipment and storage medium | |
CN113989427B (en) | Lighting simulation method, device, electronic device and storage medium | |
CN113813607B (en) | Game view angle switching method and device, storage medium and electronic equipment | |
CN112399265B (en) | Method and system for adding content to image based on negative space recognition | |
CN115120979B (en) | Virtual object display control method, device, storage medium and electronic device | |
CN109509162B (en) | Image acquisition method, terminal, storage medium and processor | |
CN115591232A (en) | Object information display control method, device, equipment and storage medium | |
CN113810624A (en) | Video generation method and device and electronic equipment | |
CN112887695A (en) | Panorama sharing processing method, system and terminal | |
CN113190110A (en) | Interface element control method and device of head-mounted display equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||