CN110888529B - Virtual reality scene control method, virtual reality device and control device thereof
- Publication number
- CN110888529B CN110888529B CN201911128776.0A CN201911128776A CN110888529B CN 110888529 B CN110888529 B CN 110888529B CN 201911128776 A CN201911128776 A CN 201911128776A CN 110888529 B CN110888529 B CN 110888529B
- Authority
- CN
- China
- Prior art keywords
- virtual reality
- control device
- signal
- finger
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Position Input By Displaying (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The invention discloses a virtual reality scene control method, a virtual reality device and a control device thereof. The method includes receiving a form signal and a finger position signal sent by a virtual reality control device, and displaying an image of the virtual reality control device together with an image of a virtual finger on a display screen according to the form signal and the finger position signal; and receiving a control signal sent by the virtual reality control device and executing a preset operation corresponding to the control signal. The virtual reality device of the invention has a processor and a memory, and the processor implements the above virtual reality scene control method when executing a computer program. The invention enables the user to intuitively know the form of the control device of the virtual reality device and the position of the fingers on it, and prevents the user from pressing the wrong key or operating the touch pad by mistake.
Description
Technical Field
The present invention relates to the field of virtual reality technologies, and in particular, to a virtual reality scene control method, a virtual reality device for implementing the method, and a control device used by the virtual reality device.
Background
Virtual reality (VR) devices display virtual images on a display screen and, in conjunction with sensor technology, capture the movements of the human body so that the displayed images change with those movements. Existing virtual reality equipment brings users a good visual experience and has been widely accepted. However, many current virtual reality devices are not provided with a control device, for example a control handle, so the user experience is poor, and this lagging interaction cannot meet the requirements of increasingly complex interaction scenarios.
At present, the most common virtual reality equipment is head-mounted equipment such as VR glasses or VR helmets. Compared with smart devices such as smartphones and tablet computers, such equipment has no touch screen and cannot respond quickly to clicks; that is, the user cannot provide input by tapping a touch screen during use. In addition, while using a virtual reality device the eyes generally cannot see objects in the real world, and existing virtual reality devices have no function for detecting eye movement, so the eyes cannot participate in the interaction at all, which degrades the user experience.
For this reason, some virtual reality devices are now provided with an interactive handle. For example, Chinese patent application No. CN201610266980 discloses a handle for use with a virtual reality device, in which a sensor and keys are provided on the handle and the handle can communicate wirelessly with the virtual reality device. When using such a control device, the handle is operated by hand and sends control signals to the virtual reality device in order to control its operation.
However, when using the virtual reality device the user usually wears VR glasses or a VR helmet and watches the display screen of the virtual reality device, so the shape of the handle and the positions of the keys on it are often difficult to observe, which makes the handle inconvenient to operate. Because the user cannot see the positions of the keys, it is often unclear which key a finger is pressing; the wrong key is easily pressed, the operation performed by the virtual reality device is not the one the user intended, and use of the device is affected.
Disclosure of Invention
The main object of the invention is to provide a virtual reality scene control method that makes it convenient for the user to view the control device.
Another object of the present invention is to provide a virtual reality apparatus capable of implementing the above virtual reality scene control method.
It is still another object of the present invention to provide a control device for use with the above-described virtual reality apparatus.
In order to achieve the above main object, the present invention provides a virtual reality scene control method which includes: receiving a form signal and a finger position signal sent by a virtual reality control device, and displaying an image of the virtual reality control device together with an image of a virtual finger on a display screen according to the form signal and the finger position signal; and receiving a control signal sent by the virtual reality control device and executing a preset operation corresponding to the control signal.
According to this scheme, when a user operates the control device of the virtual reality device, such as the handle, an image of the control device and an image of the virtual finger are displayed on the display screen of the virtual reality device, so the user knows the positional relationship between the finger and the control device and can operate the control device easily; for example, the intended key can be pressed more accurately, and the virtual reality device performs the expected operation.
Preferably, receiving the finger position signal includes: receiving an operation preparation signal sent by the virtual reality control device, and displaying the area to be operated in the image of the control device in a first display manner.
Thus, when the user moves a finger close to, or rests it on, the key to be operated, that key is displayed on the display screen of the virtual reality device in a preset display manner, so the user knows which key the finger is about to press.
Further, receiving the finger position signal includes: receiving an operation signal sent by the virtual reality control device, and displaying the operated area in the image of the control device in a second display manner.
It can be seen that when the user presses a key or touches the touch pad of the control device and thereby issues a control instruction, the pressed key or touched area is displayed on the display screen of the virtual reality device, so the user knows which control instruction has been issued.
In a further aspect, displaying the area to be operated in the image of the control device in the first display manner includes displaying that area in a first color, and displaying the operated area in the second display manner includes displaying that area in a second color.
Therefore, when the control device sends an operation preparation signal or an operation signal, the area to be operated or the operated area is displayed in a different color, and the user can more intuitively see the position of the finger on the control device.
In a further aspect, the virtual reality control device includes at least one key; displaying the area to be operated in the first display manner includes displaying the key corresponding to the image of the virtual finger in a first color, and displaying the operated area in the second display manner includes displaying the key corresponding to the image of the virtual finger in a second color.
Therefore, the key the user is about to operate and the key the user has pressed are displayed in different colors, and the user more intuitively knows which key is currently being operated and hence which operation the virtual reality device will perform.
In order to achieve the second object above, the present invention further provides a virtual reality device including a processor, a memory and a display screen, where the memory stores a computer program which, when executed by the processor, implements the steps of the above virtual reality scene control method.
In order to achieve still another object above, the present invention further provides a control device for a virtual reality device, including a housing with an operation area disposed on it and a wireless signal transceiver disposed in it; the wireless signal transceiver receives the form signal sent by a form sensor and the finger position signal sent by a finger position sensor, and sends the form signal and the finger position signal to the virtual reality device.
According to this scheme, the control device can send the form signal and the finger position signal to the virtual reality device, and after receiving these signals the virtual reality device can display the image of the control device and the finger position on the display screen, so the user conveniently knows the relative position between the finger and the control device, which facilitates operation.
Further, when the finger position sensor detects that the finger is close to the operation area, the wireless signal transceiver transmits an operation preparation signal to the virtual reality device, and when the finger position sensor detects that the finger presses or touches the operation area, the wireless signal transceiver transmits an operation signal to the virtual reality device.
Therefore, after the user's finger approaches or touches the operation area, the control device sends an operation preparation signal or an operation signal, respectively, to the virtual reality device, and the virtual reality device can display the operation area in the corresponding display manner, so the user intuitively knows which key is about to be operated.
Drawings
Fig. 1 is a block diagram of an embodiment of the virtual reality device of the present invention.
Fig. 2 is a schematic structural diagram of an embodiment of the control device of the virtual reality device of the present invention.
Fig. 3 is a schematic structural diagram, from another view, of the embodiment of the control device of the virtual reality device of the present invention.
Fig. 4 is a flowchart of an embodiment of the virtual reality scene control method of the present invention.
Fig. 5 is a flowchart of displaying the virtual finger image in an embodiment of the virtual reality scene control method of the present invention.
The invention is further described below with reference to the drawings and examples.
Detailed Description
The virtual reality scene control method is applied to a virtual reality device, which may be VR glasses, a VR helmet or similar equipment; the virtual reality device can receive signals from a control device, for example a handle, and display an image of the virtual control device on its display screen according to those signals.
Referring to Fig. 1, the virtual reality device of the present invention has a processor 10, a memory 11, a display 12 and a wireless signal transceiver 13. Preferably, the memory 11 stores a computer program, and after the processor 10 reads and executes the computer program, the virtual reality scene control method of the present invention can be implemented; the specific steps of the method are described in detail below.
The display 12 of the virtual reality device may be an LED or LCD display. It receives signals sent by the processor 10 and displays the corresponding image, for example an image of a preset scene; after putting on the VR glasses or VR helmet, the user watches the corresponding image through the display 12 and has an immersive feeling.
Because the virtual reality device of the present invention needs to communicate with the control device, such as the control handle, it is provided with the wireless signal transceiver 13. In this embodiment the wireless signal transceiver 13 is a Bluetooth module and, correspondingly, the wireless signal transceiver on the control handle is also a Bluetooth module. During operation, the wireless signal transceiver 13 communicates with the Bluetooth module on the control handle, and wireless signal transmission is realized through Bluetooth.
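To make the exchange concrete, the sketch below lists the kinds of signals that travel between the handle and the headset over this Bluetooth link; it is a minimal illustration only, and the enum, class and function names are assumptions that do not come from the patent.

```python
from dataclasses import dataclass
from enum import Enum, auto

class SignalType(Enum):
    FORM = auto()                   # posture/orientation of the handle
    FINGER_POSITION = auto()        # where the fingers rest on the handle
    OPERATION_PREPARATION = auto()  # finger approaches or touches, but no press yet
    OPERATION = auto()              # key pressed or touch pad touched
    CONTROL = auto()                # instruction the headset should execute

@dataclass
class HandleMessage:
    signal_type: SignalType
    payload: dict   # e.g. {"tilt_deg": 15.0}, {"key": "A"} or {"track": [(0, 0), (10, 4)]}

def send_to_headset(message: HandleMessage) -> None:
    """Stand-in for the Bluetooth transmission between the handle's module
    and the wireless signal transceiver 13 of the virtual reality device."""
    print(f"sending {message.signal_type.name}: {message.payload}")
```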
Referring to Figs. 2 and 3, in the present embodiment the control device used with the virtual reality device is a control handle 20. The control handle 20 has a housing 25; a touch pad 21 is disposed at one end of the housing 25, a key area 22 is disposed on the housing 25, and four keys 23 are disposed in the key area 22, for example keys A, B, C and D. When a user needs to control the virtual reality device, the control handle 20 can be held in the hand, and the virtual reality device is controlled by pressing the keys on the control handle 20.
In order to detect the form of the control handle 20, a number of sensors are provided in it, including a form sensor. The form sensor of the present embodiment detects the form of the control handle 20: for example, a gravity sensor, a multi-axis acceleration sensor, an electronic gyroscope or the like is provided in the housing 25 to detect the rotational posture of the control handle 20, and these form sensors may constitute the inertial sensor system 27 of the present embodiment. Preferably, the inertial sensor system 27 is enclosed within the housing 25, may be integrated in a single module, and is arranged in a waterproof, sealed manner.
The control handle 20 needs to detect not only its own form but also the movement of the user's hand, for example whether the user is holding the control handle 20 and whether a finger is pressing a key. Therefore, the control handle 20 is further provided with a finger position sensor, which includes a temperature sensor 28 provided on the side wall of the housing 25. When the user holds the control handle 20, a finger rests on the side of the housing 25 and the temperature sensor 28 detects the change in temperature, from which it is determined that the user's finger is placed on the side wall of the handle 20. Of course, an infrared sensor may also be provided on the side wall of the housing 25, either together with the temperature sensor 28 or on its own, to detect whether a user's finger is placed on the side wall of the housing 25.
Of course, the temperature sensor and the infrared sensor are not limited to the side wall of the control handle 20; they may be provided at other positions on the housing 25, and the positions of several fingers can be determined from the signals detected by a plurality of temperature sensors and infrared sensors.
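As a rough illustration of how readings from several such sensors could be combined into a finger position signal, consider the following sketch; the 1.5 degree threshold and all names are assumptions made only for the example.

```python
def detect_finger_positions(temp_deltas, infrared_hits, temp_threshold=1.5):
    """Return the sensor positions on the housing 25 that appear to be covered
    by a finger. `temp_deltas` maps a position to the temperature rise measured
    there; `infrared_hits` maps a position to a boolean from the infrared sensor."""
    covered = set()
    for position, delta in temp_deltas.items():
        if delta > temp_threshold:      # local warming suggests skin contact
            covered.add(position)
    for position, detected in infrared_hits.items():
        if detected:                    # an infrared detection also counts
            covered.add(position)
    return sorted(covered)

# Example: fingers detected at the left side wall and near key B
print(detect_finger_positions({"side_left": 2.1, "side_right": 0.2},
                              {"key_B": True, "key_A": False}))
```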
In addition, since the user must press the keys 23 with a finger when operating the control handle 20, a rectangular touch pad may be provided over the key area 22. When the user's finger touches the key area 22, this touch sensor receives the touch signal of the finger, and the position of the finger on the key area 22 is determined. For example, a capacitive touch pad may be used, which detects the user's finger once the distance between the finger and the key area 22 is less than a preset distance, for example 2 cm.
Further, a touch pad, such as a capacitive touch pad, is provided on the surface of each key 23 and detects the finger when the user's finger approaches or rests on the key 23. As long as the finger has not pressed the key, the user has not actually issued an operation signal; but because the finger has approached or touched the key 23, indicating that the user may intend to press it, the touch pad issues a signal, for example that a finger is approaching a certain key 23. The processor in the control handle 20 receives the signal from the touch pad and transmits it to the virtual reality device through the wireless signal transceiver, such as the Bluetooth module.
Preferably, when the user's finger approaches a certain key 23 but does not press it, the signal transmitted from the control handle 20 to the virtual reality device is an operation preparation signal, which includes the information that the user's finger is approaching that key 23.
If the user presses a certain key 23, an actual operation signal has been issued; in this case, after receiving the signal that the key 23 has been pressed, the processor in the control handle 20 sends an operation signal indicating that the key has been pressed to the virtual reality device through the wireless signal transceiver.
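The distinction between the two signals for a single key can be summarised in a few lines; this is only a sketch and the function and field names are hypothetical.

```python
def key_signal(key_name, finger_touching, pressed):
    """Decide which signal the handle would emit for one key 23 (illustrative).
    Touching or approaching without pressing yields an operation preparation
    signal; actually pressing yields an operation signal."""
    if pressed:
        return {"type": "operation", "key": key_name}
    if finger_touching:
        return {"type": "operation_preparation", "key": key_name}
    return None   # nothing to report for this key

print(key_signal("A", finger_touching=True, pressed=False))   # operation preparation
print(key_signal("A", finger_touching=True, pressed=True))    # operation
```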
Further, the circular touch pad 21 provided on the top of the control handle 20 can receive the touch signal of the user's finger; for example, when the finger approaches a certain area of the touch pad 21, the touch pad 21 transmits a signal that the finger is approaching that area to the virtual reality device. Preferably, when the touch pad 21 detects that the distance of the finger from the touch pad 21 is smaller than a first threshold and larger than a second threshold, it is judged that the user is preparing to operate on the touch pad 21, and an operation preparation signal may be issued. For example, when the distance between the user's finger and the touch pad 21 is between 2 cm and 5 cm, this indicates that the user intends to perform a touch operation on the touch pad 21 but has not actually issued a control signal; at this time, the control handle 20 may issue an operation preparation signal to the virtual reality device.
If the user's finger actually touches the touch pad 21, for example sliding to form a track or double-clicking the touch pad 21, an instruction for an actual operation has been issued. The touch pad 21 then detects the specific positions touched, forms the data of the touch track or of the double-click position, and an operation signal is sent to the virtual reality device.
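The touch pad logic described in the two preceding paragraphs can be sketched in the same illustrative style, taking the 5 cm and 2 cm figures given above as the first and second thresholds; the function and field names are assumptions.

```python
FIRST_THRESHOLD_CM = 5.0    # hover closer than this ...
SECOND_THRESHOLD_CM = 2.0   # ... but farther than this => operation preparation

def touchpad_signal(distance_cm, touching=False, track=None):
    """Classify an event on the touch pad 21 (illustrative sketch).
    An actual touch (tap, double click or slide) produces an operation signal
    carrying the touch data; a finger hovering between the two thresholds
    produces an operation preparation signal."""
    if touching:
        return {"type": "operation", "track": track or []}
    if SECOND_THRESHOLD_CM < distance_cm < FIRST_THRESHOLD_CM:
        return {"type": "operation_preparation"}
    return None

print(touchpad_signal(3.0))                                           # hovering: preparation
print(touchpad_signal(0.0, touching=True, track=[(0, 0), (10, 4)]))   # slide: operation
```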
It will be understood that in the present embodiment the touch pad 21 and the key area 22 together constitute the operation area of this embodiment; the user's finger presses, touches, slides or clicks in the operation area and thereby issues control signals.
In this way, after receiving the form signal transmitted by the control handle 20, the virtual reality device can display a virtual image of the control handle 20 on the display 12, and after receiving the finger position signal of the user it can display an image of a virtual finger on the display 12; through the image displayed on the display 12, the user understands the relative positional relationship between the finger and the control handle 20.
Moreover, after receiving the operation preparation signal transmitted by the control handle 20, the virtual reality device may display the area to be operated in a first display manner, and after receiving the operation signal it may display the operated area in a second display manner, so that the user knows where the finger is placed on the control handle 20.
The virtual reality scene control method of the present invention is now described with reference to Figs. 4 and 5. First, step S1 is executed: the virtual reality device receives the form signal and the finger position signal of the control device, which in this embodiment is the control handle 20 used by the user. Since the inertial sensor system 27 is provided in the control handle 20, the form of the control handle 20 can be detected, for example whether it is lying horizontally, standing vertically or inclined; if the control handle 20 is inclined, the angle between it and the horizontal or vertical direction can also be detected. In addition, the control handle 20 can detect whether its head, i.e. the end where the touch pad 21 is located, faces upward or downward, and whether the side where the key area 22 is located faces upward or downward; the inertial sensor system 27 transmits the detection signals to the virtual reality device.
The control handle 20 also detects the positions of the user's fingers and forms a finger position signal, for example which area of the control handle 20 each finger touches and how many fingers are on the control handle 20. Since a number of temperature sensors, infrared sensors or touch pads are provided on the control handle 20, the positions of several of the user's fingers on the control handle 20 can be detected accurately.
After receiving the form signal and the finger position signal sent by the control handle 20, the virtual reality device executes step S2 and displays an image of the control handle 20 and an image of the virtual finger on the display 12 according to these signals. Preferably, the images of the control handle 20 and the virtual finger are displayed semi-transparently in a predetermined area of the display 12, for example in one corner, and the display area of the control handle 20 image does not exceed 20% of the area of the whole display 12, so as to reduce interference with the other images the user is watching.
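A small sketch of how such a corner overlay could be sized is given below; the 20% limit comes from the paragraph above, while the bottom-right placement, the square shape and the alpha value are assumptions made for illustration.

```python
def handle_overlay(screen_w, screen_h, max_fraction=0.20, alpha=0.5):
    """Return (x, y, side, alpha) for a roughly square, semi-transparent
    overlay in the bottom-right corner whose area stays within max_fraction
    of the whole display 12 (illustrative only)."""
    side = int((screen_w * screen_h * max_fraction) ** 0.5)
    side = min(side, screen_w, screen_h)
    return screen_w - side, screen_h - side, side, alpha

print(handle_overlay(1920, 1080))   # e.g. (1277, 437, 643, 0.5)
```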
Since the finger position signal sent by the control handle 20 to the virtual reality device further includes the operation preparation signal and the operation signal, the relevant area of the control handle 20 can be displayed in different display manners in order to make it easier for the user to see which key is pressed or which area is touched. Referring to Fig. 5, the virtual reality device first performs step S11 and determines whether an operation preparation signal has been received; if so, step S12 is performed.
For example, the user rests a finger on the surface of key A without pressing it, which indicates that the user intends to press key A and issue the corresponding control instruction. The touch sensor on the control handle 20 detects the signal that the finger is resting on key A, which is an operation preparation signal, and transmits it to the virtual reality device.
After receiving the signal that the user's finger is touching key A, the virtual reality device executes step S12 and displays the area to be operated in the first display manner. In this embodiment, the area to be operated is the area corresponding to the operation preparation signal; for example, if the operation preparation signal indicates that a finger is touching key A, the area to be operated is the area where key A is located. Therefore, in step S12 the area of key A may be displayed in a first preset color, such as green, to prompt the user that the key 23 touched by the finger is key A.
Then step S13 is executed: the virtual reality device determines whether an operation signal has been received; if so, step S14 is executed, otherwise step S11 is executed again. In the present embodiment, the operation signal is an actual control signal issued by the user through pressing or touching with a finger, for example pressing key A or double-clicking or sliding on the touch pad 21.
If the virtual reality device receives the operation signal, step S14 is performed and the operated area is displayed in the second display manner. In this embodiment, the operated area is the area corresponding to the operation signal; for example, after the user presses key A, the operated area is the area of key A. Therefore, in step S14 the operated area may be displayed in another preset color, for example the area of key A may be displayed in red. It should be noted that the color used in the second display manner must be different from the color used in the first display manner; preferably the two colors have a large color difference, being chosen for example from red, green, blue, gray and the like.
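Steps S11 to S14 thus reduce to a simple mapping from the received signal to a highlight style; the sketch below uses green and red as in the examples above, but both colours and all names are illustrative assumptions rather than part of the patent.

```python
FIRST_DISPLAY_COLOR = "green"   # first display manner: area to be operated
SECOND_DISPLAY_COLOR = "red"    # second display manner: operated area

def highlight_for(signal_type):
    """Choose the highlight colour for an area of the virtual handle image
    according to the kind of signal received (illustrative sketch)."""
    if signal_type == "operation":                # step S14
        return SECOND_DISPLAY_COLOR
    if signal_type == "operation_preparation":    # step S12
        return FIRST_DISPLAY_COLOR
    return None                                   # no highlight

print(highlight_for("operation_preparation"))   # green
print(highlight_for("operation"))               # red
```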
Of course, the first and second display manners do not have to differ in display color; other display manners may be used. For example, the first display manner may add a green border around the outline of the area to be operated and the second display manner may add a red border around the outline of the operated area, or the first display manner may make the area flash while the second does not.
In addition, the area to be operated and the operated area are not necessarily the area where a key is located; they may also be any area on the touch pad 21, such as the area a finger is approaching or the area being touched. If a finger slides on the touch pad 21 and forms a sliding track, the area the sliding track passes through is taken as the operated area and is displayed in the second display manner.
Referring to Fig. 4, after the virtual reality device has generated and displayed the virtual image of the control handle 20 and the virtual image of the finger, step S3 is performed to determine whether a control signal has been received from the control handle 20, for example the user pressing a key, double-clicking or sliding a track on the touch pad 21, or shaking the control handle 20. After receiving the control signal, the virtual reality device executes step S4 and performs the preset operation corresponding to that signal. For example, when the user presses key A, the operation corresponding to key A is performed, such as displaying the next virtual scene, fast-forwarding the displayed video by 5 seconds, or increasing the volume.
Finally, step S5 is executed to determine whether control should end, for example because the user sends an end signal through the control handle 20 or takes off the VR glasses or VR helmet. If the user issues an instruction to end control, control of the virtual reality device ends and the control handle 20 no longer sends signals to the virtual reality device.
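The overall flow of steps S1 to S5 on the headset side can be outlined as follows; `receive`, `render`, `execute` and `control_ended` are hypothetical callables standing in for the device's actual signal handling and are not named in the patent.

```python
def scene_control_loop(receive, render, execute, control_ended):
    """Illustrative outline of the method of Fig. 4 (steps S1 to S5)."""
    while not control_ended():                       # S5 checked each cycle
        form = receive("form")                       # S1: form signal
        fingers = receive("finger_position")         # S1: finger position signal
        render(form, fingers)                        # S2: draw handle + virtual fingers
        control = receive("control")                 # S3: control signal received?
        if control is not None:
            execute(control)                         # S4: preset operation (e.g. the key A action)
    # after S5 the handle no longer sends signals to the headset
```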
It can be seen that the scheme of the present invention detects the form of the control handle 20 and the positions of the user's fingers through the sensors on the control handle 20 and sends the detected signals to the virtual reality device, which, according to the received signals, displays on the display 12 the form of the control handle 20, the positions of the fingers, the area to be operated and the operated area; the user can therefore see through the display 12 where the fingers are placed on the control handle 20, and operating errors are avoided.
In addition, the invention displays the area to be operated and the operated area in different display manners, so the user intuitively knows which area the finger is currently touching and which key is being pressed or what track has been traced. Once the user finds that a finger is placed on the wrong key, the finger can be moved away in time, which effectively prevents wrong control instructions from being issued and greatly improves the user experience.
Of course, the above embodiments are only preferred embodiments of the present invention, and many more variations are possible in practical application; for example, the shape of the control handle may be designed according to actual needs, or the form sensor and the finger position sensor provided on the control handle may be adjusted as required. Such variations do not affect the implementation of the present invention and should also be included within its scope of protection.
Claims (6)
1. A virtual reality scene control method, characterized by comprising the following steps:
receiving a form signal and a finger position signal sent by a virtual reality control device, displaying an image of the virtual reality control device on a display screen in a semi-transparent manner according to the form signal and the finger position signal, and displaying an image of a virtual finger in a semi-transparent manner;
receiving a control signal sent by the virtual reality control device, and executing a preset operation corresponding to the control signal;
wherein receiving the finger position signal comprises: receiving an operation preparation signal sent by the virtual reality control device, and displaying an area to be operated in the image of the virtual reality control device in a first display manner;
wherein receiving the finger position signal further comprises: receiving an operation signal sent by the virtual reality control device, and displaying an operated area in the image of the virtual reality control device in a second display manner.
2. The virtual reality scene control method of claim 1, characterized by:
displaying the area to be operated in the image of the virtual reality control device in the first display manner comprises: displaying the area to be operated in a first color;
displaying the operated area in the image of the virtual reality control device in the second display manner comprises: displaying the operated area in a second color.
3. The virtual reality scene control method according to claim 1 or 2, characterized by:
the virtual reality control device comprises at least one key;
displaying the area to be operated in the image of the virtual reality control device in the first display manner comprises: displaying the key corresponding to the image of the virtual finger in a first color;
displaying the operated area in the image of the virtual reality control device in the second display manner comprises: displaying the key corresponding to the image of the virtual finger in a second color.
4. A virtual reality device comprising a processor, a memory and a display screen, the memory storing a computer program which, when executed by the processor, performs the steps of the virtual reality scene control method of any of claims 1 to 3.
5. A control device for a virtual reality device, comprising
a housing with an operation area arranged on it, and a wireless signal transceiver arranged in the housing;
characterized in that:
the wireless signal transceiver is used for receiving the form signal sent by the form sensor, receiving the finger position signal sent by the finger position sensor and sending the form signal and the finger position signal to the virtual reality equipment;
when the finger position sensor detects that a finger is close to the operation area, the wireless signal transceiver device sends an operation preparation signal to the virtual reality equipment, so that the virtual reality control device displays an area to be operated in a control device image of the virtual reality in a first display mode;
when the finger position sensor detects that the finger presses or touches the operation area, the wireless signal transceiver device sends an operation signal to the virtual reality equipment, so that the virtual reality control device displays the operation area in the control device image of the virtual reality equipment in a second display mode.
6. The control device of a virtual reality apparatus according to claim 5, characterized in that:
the operation area is a key area, and the finger position sensor comprises a touch sensor arranged near the key area.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911128776.0A CN110888529B (en) | 2019-11-18 | 2019-11-18 | Virtual reality scene control method, virtual reality device and control device thereof |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110888529A CN110888529A (en) | 2020-03-17 |
CN110888529B (en) | 2023-11-21
Family
ID=69747868
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911128776.0A Active CN110888529B (en) | 2019-11-18 | 2019-11-18 | Virtual reality scene control method, virtual reality device and control device thereof |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110888529B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113778525B (en) * | 2021-09-16 | 2024-04-26 | 中国南方电网有限责任公司超高压输电公司昆明局 | Air-break control monitoring method and device based on lora communication and computer equipment |
CN114840083A (en) * | 2022-05-05 | 2022-08-02 | 维沃移动通信有限公司 | VR handle, electronic device and control method and control device thereof |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN204291252U (en) * | 2014-11-14 | 2015-04-22 | 西安中科微光医疗技术有限公司 | A kind of panoramic map display system based on virtual implementing helmet |
CN105975061A (en) * | 2016-04-26 | 2016-09-28 | 乐视控股(北京)有限公司 | Control method and apparatus for virtual reality scene as well as handle |
CN106354412A (en) * | 2016-08-30 | 2017-01-25 | 乐视控股(北京)有限公司 | Input method and device based on virtual reality equipment |
CN106445166A (en) * | 2016-10-19 | 2017-02-22 | 歌尔科技有限公司 | Virtual reality helmet and method of switching display information of virtual reality helmet |
CN106484119A (en) * | 2016-10-24 | 2017-03-08 | 网易(杭州)网络有限公司 | Virtual reality system and virtual reality system input method |
CN106873785A (en) * | 2017-03-31 | 2017-06-20 | 网易(杭州)网络有限公司 | For the safety custody method and device of virtual reality device |
CN107291359A (en) * | 2017-06-06 | 2017-10-24 | 歌尔股份有限公司 | A kind of input method, device and system |
CN109407935A (en) * | 2018-09-14 | 2019-03-01 | 歌尔科技有限公司 | A kind of virtual reality display control method, device and system |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170293351A1 (en) * | 2016-04-07 | 2017-10-12 | Ariadne's Thread (Usa), Inc. (Dba Immerex) | Head mounted display linked to a touch sensitive input device |
WO2017201162A1 (en) * | 2016-05-17 | 2017-11-23 | Google Llc | Virtual/augmented reality input device |
US11054895B2 (en) * | 2017-07-27 | 2021-07-06 | Htc Corporation | Method of display user movement in virtual reality system and related device |
- 2019-11-18: CN application CN201911128776.0A filed (patent CN110888529B, status Active)
Also Published As
Publication number | Publication date |
---|---|
CN110888529A (en) | 2020-03-17 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||