
CN103108126B - Video interaction system and method, interactive glasses and terminal - Google Patents

Video interaction system and method, interactive glasses and terminal

Info

Publication number
CN103108126B
CN103108126B · CN201310022617.9A
Authority
CN
China
Prior art keywords
terminal
infrared
user
camera
video
Prior art date
Legal status
Expired - Fee Related
Application number
CN201310022617.9A
Other languages
Chinese (zh)
Other versions
CN103108126A (en)
Inventor
钟增梁
Current Assignee
TCL Corp
Original Assignee
TCL Corp
Priority date
Filing date
Publication date
Application filed by TCL Corp filed Critical TCL Corp
Priority to CN201310022617.9A priority Critical patent/CN103108126B/en
Publication of CN103108126A publication Critical patent/CN103108126A/en
Application granted granted Critical
Publication of CN103108126B publication Critical patent/CN103108126B/en

Landscapes

  • User Interface Of Digital Computer (AREA)
  • Details Of Television Systems (AREA)
  • Selective Calling Equipment (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The invention provides a video interaction system and method, interactive glasses, and a terminal. The interaction system includes a first terminal and a second terminal connected through an internet video link. The first terminal is provided with one or more cameras for acquiring scene images of the first terminal and a controlled device through which the rotation of those cameras can be remotely controlled. The second terminal is provided with a display device for showing the scene images and a control device that interacts with the controlled device of the first terminal so as to remotely control the rotation of the first terminal's camera. The invention allows a user to obtain multi-angle images of the first terminal's scene through control operations at the second terminal, reaching a level of intelligent control and considerably improving the user's video interaction experience.

Description

Video interaction system and method, interactive glasses and terminal
Technical Field
The invention belongs to the technical field of video interaction, and particularly relates to a video interaction system and method, interactive glasses and a terminal.
Background
Televisions are typically placed in living rooms. Compared with ordinary computer video chat, a living room offers a more open space, imposes lower demands on user privacy, contains more scene content, allows more people to participate, and therefore places higher requirements on the video viewing range.
A television differs from a personal computer (PC): a smart television is generally heavy or wall-mounted and cannot be moved, so a camera fixed to it can hold only a single, rigid shooting angle. This makes video interaction inconvenient and degrades the user's video experience.
On the other hand, current 3D televisions focus on playing 3D video: a user wearing 3D glasses can watch 3D movies or programs. Many 3D televisions are now equipped with cameras, and existing smart televisions can run video-chat applications, but there is as yet no complete 3D video interaction system, so such systems remain inconvenient to use and offer a limited experience.
Disclosure of Invention
The invention aims to provide a video interaction system that gives two user terminals a larger video range during video interaction, together with a corresponding video interaction method, video interaction glasses, and a terminal, so that the user can experience more realistic video interaction.
The video interaction system provided by the invention is realized as follows. A video interaction system comprises a first terminal and a second terminal connected through an internet video link. The first terminal is provided with one or more cameras for acquiring scene images of the first terminal and a controlled device for remotely controlled rotation of those cameras; the second terminal is provided with a display device for displaying the scene images and a control device that, through interaction with the controlled device of the first terminal, remotely controls the rotation of the first terminal's camera.
Specifically, the control device of the second terminal comprises a remote command processing module. According to the control command sent by the user through the remote control device, the remote command processing module packages the command and transmits it to the controlled device of the first terminal, which then rotates the first terminal's camera.
Specifically, the remote control device of the second terminal comprises interactive glasses provided with an infrared emitting device that emits infrared signals at a specific emission frequency.
The control device of the second terminal includes:
a user viewpoint acquisition module for acquiring viewpoint information of the second terminal user, and a remote command processing module for packaging the viewpoint information into a control instruction for the first terminal's camera and sending it. The packaged control instruction is transmitted to the controlled device of the first terminal, which controls the rotation of the first terminal's camera.
Compared with the prior art, the video interaction system provided by the invention obtains scene images around the first terminal from the first terminal's camera, transmits them to the second terminal over the internet, and displays them on the second terminal's display device. So that the second terminal can acquire multi-angle scene images around the first terminal as the user's face moves, the first terminal is equipped with a controlled device that can rotate its camera, while the second terminal is equipped with a user viewpoint acquisition device and a control device. The viewpoint acquisition device obtains the user's viewpoint information, the control device transmits the viewpoint rotation direction to the controlled device of the first terminal, and the controlled device rotates the first terminal's camera, thereby adjusting the scene image remotely. In this way, scene images around the first terminal are automatically acquired from multiple angles according to viewpoint changes of the user at the second terminal and transmitted to the second terminal for playback.
The video interaction method provided by the invention comprises the following steps:
the first terminal and the second terminal establish a video connection through the Internet;
the second terminal obtains an interaction control instruction of the user through the control device, packages the control instruction, and transmits it to the first terminal;
the first terminal receives the control instruction and rotates the first terminal's camera according to it.
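The three steps above can be sketched in code. This is a minimal, hypothetical illustration: the patent does not specify a packet format, so the JSON encoding and the function and field names (`package_instruction`, `apply_instruction`, `pan`, `tilt`) are assumptions made for clarity.

```python
# Hypothetical sketch of the interaction method: the second terminal packages
# the user's control input into an instruction packet; the first terminal
# unpacks it and rotates its camera accordingly. Format is illustrative only.
import json

def package_instruction(pan_deg: float, tilt_deg: float) -> bytes:
    """Second terminal: wrap the user's control input into an instruction packet."""
    return json.dumps({"cmd": "rotate", "pan": pan_deg, "tilt": tilt_deg}).encode()

def apply_instruction(packet: bytes, camera_state: dict) -> dict:
    """First terminal: unpack the instruction and update the camera orientation."""
    cmd = json.loads(packet.decode())
    new_state = dict(camera_state)
    if cmd["cmd"] == "rotate":
        new_state["pan"] += cmd["pan"]
        new_state["tilt"] += cmd["tilt"]
    return new_state
```

In use, a packet produced at the second terminal and applied at the first terminal would shift the camera by the requested angles.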
Specifically, the second terminal user transmits, through a remote control device, a remote control signal for rotating the camera; the second terminal receives the signal and packages it into an instruction packet that controls the rotation of the first terminal's camera.
Specifically, the second terminal user wears video interaction glasses with a built-in gyroscope; the gyroscope obtains the user's viewpoint position information, which is converted into a control instruction for remotely rotating the camera.
Specifically, the second terminal user wears video interaction glasses with a built-in infrared emitting device, which emits infrared signals toward the display screen at a certain frequency. The second terminal is provided with at least three infrared receivers that are not on the same straight line; each receiver receives the infrared signal, and the current user's viewpoint position is calculated from the intensities of the received signals. A control instruction for deflecting the first terminal's camera is then generated from the viewpoint position information.
Correspondingly, the invention also provides video interaction glasses provided with an infrared emitting device that emits infrared signals toward the terminal at a specific emission frequency, enabling the terminal to calculate the user's viewpoint on its display screen. Furthermore, the video interaction glasses are also provided with a wireless headset.
Correspondingly, the invention also provides a video interaction first terminal provided with a camera for acquiring scene images and a controlled device. Through the controlled device, the terminal receives the control instruction of the second terminal and rotates the camera according to the instruction, thereby adjusting its shooting direction.
Correspondingly, the invention also provides a second terminal provided with a control device that interacts with the controlled device of the first terminal so as to control the rotation of the first terminal's camera.
Compared with the prior art, the video interaction method provided by the invention enters a remote virtual scene through the video interaction system described herein, obtains the current user's viewpoint position or video focus information, and uses it to steer the first terminal's camera to a shooting angle consistent with that viewpoint or focus, thereby meeting the user's demand for freer control and a wider viewing range during video interaction.
Meanwhile, with the video interaction method, the video interaction glasses, the video interaction first terminal, and the second terminal, the shooting direction of the first terminal's camera can be remotely controlled or can follow changes in the user's viewpoint position. The method can acquire, in real time, the image content of the remote scene corresponding to the deflection of the user's head angle, reaching a level of intelligent control and remarkably improving the user's video interaction experience.
Drawings
Fig. 1 is a schematic structural diagram of a video interaction system according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of a video interaction system according to a second embodiment of the present invention;
fig. 3 is a schematic structural diagram of a video interaction system according to a third embodiment of the present invention;
FIG. 4 is a schematic diagram of a detailed structure of the control device in FIG. 3;
fig. 5 is a flowchart of an implementation of a video interaction method according to a fourth embodiment of the present invention;
fig. 6 is a flowchart of an implementation of a video interaction method according to a fifth embodiment of the present invention;
fig. 7 is a flowchart of an implementation of a video interaction method according to a sixth embodiment of the present invention;
fig. 8 is a model diagram of the relationship between the amplitude of the signal received by an infrared receiver and its receiving radius, according to a seventh embodiment of the present invention;
fig. 9 is a schematic view of viewpoint calculation provided by the seventh embodiment of the present invention;
fig. 10 is a schematic structural diagram of video interactive glasses according to an eighth embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Example one
This embodiment is a preferred implementation of the structural schematic diagram of the video interaction system provided by the present invention.
Referring to fig. 1, fig. 1 is a schematic structural diagram of a video interaction system according to a first embodiment of the present invention.
In the figure, the first terminal 102 and the second terminal 105 are connected through an internet video link, where the video connection refers to the network connection of a network video chat. The first terminal 102 is provided with one or more cameras 101 for acquiring scene images of the first terminal; the shooting direction of the camera 101 can be deflected under control, and a controlled device for remotely rotating the camera 101 of the first terminal 102 receives remote instructions and rotates the camera 101 accordingly.
The second terminal 105 is provided with a display device for displaying the scene images, used to play back the remote video scene; it is also provided with a control device for remotely controlling the rotation of the camera 101 of the first terminal 102 through control interaction with the controlled device of the first terminal 102. The control device receives a user control instruction, encapsulates it, and sends it to the controlled device of the first terminal 102 through a network transmission protocol, so that the controlled device rotates the camera 101 of the first terminal 102. Control interaction here refers to the mutual control coordination between the controlled device of the first terminal 102 and the control device of the second terminal 105.
In the present invention, the first terminal 102 and the second terminal 105 may establish a unidirectional video connection, in which only the second terminal 105 controls the rotation of the camera 101 of the first terminal 102. Since not every terminal is equipped with a camera whose shooting direction can be rotated as required by the invention, the unidirectional connection offers better compatibility. Even if the second terminal 105 lacks such a camera, it can still remotely rotate the camera 101 of the first terminal 102 for shooting as long as a corresponding control device is arranged on it or corresponding control software is installed, so a video connection with the first terminal 102 can be established across devices and platforms.
When both devices are equipped, i.e. the first terminal 102 has the camera 101 required by the invention and the second terminal 105 also has a camera 104 capable of deflected shooting, the two terminals can establish a bidirectional video connection and perform bidirectional video interaction, each controlling the rotation of the other's camera: the second terminal 105 controls the rotation of the camera 101 of the first terminal 102, while the first terminal 102 simultaneously controls the rotation of the camera 104 of the second terminal 105.
In fig. 1, 106 is a remote control device. The user of the second terminal 105 sends instructions through the remote control device 106 to rotate the camera 101 of the first terminal 102 for shooting, thereby obtaining moving images shot from multiple angles in the scene of the first terminal 102. In bidirectional video interaction, the user of the first terminal 102 can likewise send instructions through the remote control device 103 to rotate the camera 104 of the second terminal 105.
For convenience of description, the invention is described below in the unidirectional connection mode, in which the user of the second terminal 105 sends instructions to the second terminal 105 through the remote control device 106, and the second terminal 105 controls the rotational shooting of the camera 101 of the first terminal 102.
Taking the unidirectional video connection as an example, to enhance video acquisition, several automatic rotation shooting modes can be configured in the first terminal 102. Each mode corresponds to an automatic rotation trajectory control program, whose algorithm rotates the camera along a predefined trajectory. The user of the second terminal 105, or the local user of the first terminal 102, can switch shooting modes to make the camera 101 of the first terminal 102 rotate automatically along a specific trajectory: for example, left-right scanning at a given rotation rate; uniform 360-degree rotation, i.e. ring-scan shooting of the camera 101 at a fixed angular rate; or lock-follow shooting of a moving target (e.g. the user).
In lock-follow shooting, a moving target is automatically captured from the scene image by an image processing algorithm, and the camera is rotated to follow the target's movement, so that the locked moving target (e.g. the user) is kept in the acquired scene images.
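Two of the automatic shooting modes described above can be sketched as angle-trajectory generators. This is an illustrative simplification, not the patent's implementation: the 10-degree step rate, the rate-limited follow behavior, and the function names are assumptions.

```python
# Illustrative sketches of two predefined rotation-trajectory modes:
# a constant-rate 360-degree ring scan and a lock-follow mode that steers
# the pan angle toward a tracked target, limited to rate_deg per step.
def ring_scan(steps: int, rate_deg: float = 10.0) -> list:
    """Ring scan: the pan angle advances by a fixed rate each step, wrapping at 360."""
    angle, out = 0.0, []
    for _ in range(steps):
        angle = (angle + rate_deg) % 360.0
        out.append(angle)
    return out

def follow_target(target_xs: list, rate_deg: float = 10.0) -> list:
    """Lock-follow: each step, move toward the target's angular position,
    but by no more than rate_deg per step (a rate-limited servo)."""
    angle, out = 0.0, []
    for x in target_xs:
        angle += max(-rate_deg, min(rate_deg, x - angle))
        out.append(angle)
    return out
```

A target sitting at 30 degrees would be reached in three rate-limited steps, after which the camera holds position.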
It should be noted that the rotating mechanism of the camera can be one from the prior art, for example a motor driving a gear that in turn rotates the camera. The applicant will disclose a better camera rotation control mechanism in a separate application. Alternatively, one or more wide-angle cameras can shoot statically, with image processing used to rotate the field of view of a virtual image.
The remote control device 106 may be a general remote controller, an intelligent terminal, or interactive glasses.
The video interaction system of this embodiment gives the user control over remote image acquisition during video interaction, so that the video-receiving user can watch the remote moving images of interest according to his own intention. It also offers multiple shooting modes, greatly improving the interest and convenience of user interaction.
Example two
This embodiment is a preferred embodiment of the video interaction system provided by the present invention, and is described by taking an embodiment of a unidirectional video connection as an example.
Referring to fig. 2, a video interaction system includes a first terminal 10 and a second terminal 20 connected through the internet. The first terminal 10 is provided with at least one camera 12 for acquiring scene images of the first terminal and a controlled device 11 for remotely controlled rotation of the camera 12. The second terminal 20 comprises a control device 23 that remotely controls the rotation of the first terminal camera 12 through control interaction with the controlled device 11 of the first terminal 10; the control device 23 comprises a remote command processing module 24. The system further comprises a remote control device 21 provided with a gyroscope 22. The remote control device 21 establishes a wireless connection, such as Bluetooth, infrared, or Wi-Fi, with the second terminal 20, through which the user of the second terminal 20 performs control interaction with it.
The remote command processing module 24 obtains the video focus control information of the user of the second terminal 20 through the remote control device 21 and converts it into a control instruction for remotely rotating the camera 12 of the first terminal 10; the instruction is packaged and transmitted to the controlled device 11 of the first terminal, thereby remotely controlling the rotation of the camera 12. Video focus here refers to the remote scene point that the user attends to during video interaction.
The remote control device can be a remote controller, an intelligent terminal, interactive glasses, or the like, with a built-in gyroscope for acquiring device position information; the gyroscope captures the movement and deflection of the remote control device to control the rotational offset of the remote terminal's camera. The term gyroscope here also covers motion positioning devices such as velocity and acceleration sensors.
In normal use, the remote control device 21 acquires the deflection parameters of its gyroscope 22 in real time and transmits them to the second terminal 20. The remote command processing module 24 of the second terminal 20 encapsulates the deflection parameters into an instruction data packet for rotating and deflecting the camera 12 of the first terminal 10 and sends the packet to the first terminal 10, so that the camera 12 of the first terminal 10 is deflected according to the attention of the user of the second terminal 20, who controls the second terminal 20 through the remote control device 21. The first terminal 10 and the second terminal 20 may exchange the command packet as XML over the HTTP protocol.
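Since the embodiment mentions exchanging command packets as XML over HTTP, the encapsulation of gyroscope deflection parameters could look roughly as follows. The element and attribute names (`cameraControl`, `rotate`, `yaw`, `pitch`) are invented for illustration; the patent does not define a schema.

```python
# Hedged sketch: encapsulate gyroscope deflection parameters as an XML
# instruction packet, and parse them back on the receiving side.
import xml.etree.ElementTree as ET

def encapsulate_deflection(yaw_deg: float, pitch_deg: float) -> str:
    """Build an XML command packet from the gyroscope's deflection parameters."""
    root = ET.Element("cameraControl")
    ET.SubElement(root, "rotate", yaw=f"{yaw_deg:.1f}", pitch=f"{pitch_deg:.1f}")
    return ET.tostring(root, encoding="unicode")

def parse_deflection(xml_text: str):
    """Recover (yaw, pitch) from a received XML command packet."""
    rot = ET.fromstring(xml_text).find("rotate")
    return float(rot.get("yaw")), float(rot.get("pitch"))
```

The packet round-trips losslessly at one-decimal precision, which is plausible for coarse camera steering.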
Further, in this embodiment, the gyroscope's motion sensing function may be disabled according to actual use requirements. For example, when the user controls the camera rotation angle with the direction keys of the remote control device 21, the gyroscope 22 is put into a shielded state in which it has no effect, and the rotation of the first terminal camera is controlled only by the control keys of the remote control device 21. A remote control device without a built-in gyroscope may likewise be operated directly through its direction keys. The user can therefore flexibly choose how to control the camera in actual use, more conveniently meeting the experience requirements of video interaction.
In this embodiment, the first terminal 10 further includes a scene video encoding module 13 and a video data transmission module 14. The second terminal 20 further includes an audio-video data receiving module 25, an audio-video decoding module 26, and an audio-video playing module 27, where the audio-video playing module 27 is electrically connected to the display screen of the second terminal 20.
The scene video encoding module 13 is electrically connected to the camera 12. The camera 12 acquires image information around the first terminal 10 and transmits it to the scene video encoding module 13, which encodes and converts it into a video format suitable for network transmission. The voice data of the microphone may be encoded and transmitted together with the video data or independently; this is not limited here.
The scene video encoding module 13 further includes a network detection unit, and the video codec may automatically adjust the compression ratio of the encoding according to the network speed detected by the network detection unit (for example, by sending a "ping" command to the second terminal).
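The network-adaptive encoding described above could be realized by mapping a measured round-trip time to a compression level. A minimal sketch, where the 50 ms and 200 ms thresholds are assumptions for illustration; the patent only states that compression adapts to detected network speed:

```python
# Map a measured round-trip time (e.g. from a "ping" to the second terminal)
# to a video compression level. Thresholds are illustrative assumptions.
def compression_level(rtt_ms: float) -> str:
    if rtt_ms < 50:
        return "low"      # fast network: light compression, higher quality
    if rtt_ms < 200:
        return "medium"
    return "high"         # slow network: compress aggressively
```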
The video data transmission module 14 is in signal connection with the video data receiving module 25 of the second terminal 20; a video transmission buffer can be established between them, and the image data acquired by the first terminal 10 is transmitted to the video data receiving module 25 over the network using the RTP/RTCP protocols. The video decoding module 26 decodes the video stream acquired by the video data receiving module 25 so that it suits the video playing module 27 and can be played on the display screen of the second terminal 20.
Preferably, when the devices are suitably equipped, for example when the first terminal 10 comprises at least two of the cameras 12 and has a 3D encoding engine while the second terminal 20 has a 3D playback engine, the system provided by the invention supports 3D scene video interaction. The video interaction system of this embodiment can greatly improve the experience of 3D scene video interaction.
Compared with the prior art, the video interaction system provided by the invention obtains scene images around the first terminal from the first terminal's camera, transmits them to the second terminal over the internet, and displays them on the second terminal's display device. At the second terminal, a remote control device with a built-in gyroscope is provided, through which the first terminal's camera can conveniently be rotated and offset according to the video focus of the second terminal's user, so that scene images meeting that user's attention requirements are obtained from the first terminal.
In this embodiment, a gyroscope is built in the remote control device, video focus point position information of a user is obtained through the gyroscope, the video focus point position is converted into a control instruction for remotely controlling the camera to rotate, and then the control signal is sent to the first terminal, so that the first terminal camera is controlled to deflect according to the focus intention of the local user. The embodiment improves the experience of user video interaction through the improvement of the remote control device, and the operation is more convenient.
EXAMPLE III
The embodiment provides a system for video interaction through video interaction glasses based on the technical content of the second embodiment.
In this system, the viewpoint position of the user on the screen is acquired by a viewpoint acquisition device, and a control instruction for rotating the first terminal camera is generated from the viewpoint position information. The viewpoint acquisition device comprises interactive glasses worn by the user; the interactive glasses can integrate the functions of 3D glasses and a wireless headset, and contain a capture device for the user's viewpoint position. Other head-worn devices may be used instead, as long as a user viewpoint acquisition device is present. The viewpoint here is the intersection of the straight line along the forward direction of the user's face (i.e. the emission direction of the infrared emitting device) with the plane of the screen.
This embodiment mainly maps the viewpoint position of the second terminal user to the shooting direction of the first terminal camera. For example, when the viewpoint of the second terminal user falls at the middle of the left side of the screen, the first terminal's camera is rotated and offset toward the left/right side to acquire an image of the scene to the left of the first terminal; if the viewpoint falls in the upper right corner of the screen, the camera is rotated toward the upper right to acquire an image of the scene at the upper right/upper left of the first terminal.
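The mapping from an on-screen viewpoint to a camera steering command, as in the examples above, can be sketched as a simple linear scaling. The screen-pixel coordinate convention and the ±30-degree range are illustrative assumptions, not values from the patent.

```python
# Map the second-terminal user's viewpoint (x, y) on the screen to a pan/tilt
# command for the first terminal's camera. Origin is the top-left pixel;
# negative pan = left, positive tilt = up. The ±max_deg range is assumed.
def viewpoint_to_pan_tilt(x: float, y: float, width: float, height: float,
                          max_deg: float = 30.0):
    pan = ((x / width) - 0.5) * 2 * max_deg
    tilt = (0.5 - (y / height)) * 2 * max_deg
    return pan, tilt
```

A viewpoint at the middle of the left edge yields a full-left pan with no tilt; a viewpoint at the top center yields a full-up tilt with no pan.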
The invention exploits the principle that the received amplitude of an infrared signal bears a functional relation to its emission angle, and accordingly provides a user viewpoint acquisition device by which the user's viewpoint position is captured.
As shown in fig. 3, in this embodiment the remote control device of the second terminal 20 is replaced by video interaction glasses 29 used for 3D video interaction. The glasses 29 are provided with an infrared emitting device 30 at their central position; the device 30 emits an infrared signal straight ahead of the user's face at a specific emission frequency, preferably a fixed frequency. The video interaction glasses 29 integrate the functions of 3D glasses and a wireless headset.
The infrared emitting device 30 may be fitted with a uniform emission element, such as an optical glass housing, so that the infrared signal is emitted in a uniform gradient. The relation between the received signal intensity at an infrared receiver and the receiving radius is then a linear model, under which the amplitude of the received signal is a function of the receiver's receiving radius.
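Under the linear model just described, the receiving radius can be recovered by inverting the amplitude function. A minimal sketch, where the calibration constants `a0` (amplitude at zero radius) and `k` (amplitude decay per unit radius) are illustrative assumptions that would in practice come from device calibration:

```python
# Invert the assumed linear model: amplitude = a0 - k * radius.
# a0 and k are hypothetical calibration constants, not values from the patent.
def radius_from_amplitude(amplitude: float, a0: float = 100.0, k: float = 2.0) -> float:
    """Return the receiving radius implied by a measured signal amplitude."""
    return (a0 - amplitude) / k
```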
This embodiment further modifies the control device 23; please refer to fig. 3 and fig. 4. The control device 23 includes a user viewpoint acquisition device 28. The viewpoint acquisition device 28 includes:
an infrared receiving device 2802, which includes at least three infrared receivers that are not on the same straight line and are disposed on the second terminal 20; preferably, four infrared receivers are disposed at the four corners of the display screen of the second terminal 20. The infrared receivers receive the infrared signals transmitted by the infrared emitting device of the interaction glasses 29;
a user viewpoint position information calculation module 2801, configured to calculate the viewpoint information of the user from the intensity of the infrared signal received by each infrared receiver. Since the amplitude of the signal received by an infrared receiver is a function of its receiving radius, the receiving radius, i.e. the distance from the viewpoint to that infrared receiver, can be determined from the magnitude of the signal amplitude; here the viewpoint is the intersection of the straight line directly in front of the user's face with the plane of the screen. The position of the viewpoint can then be determined from the proportional relation among the distances from the several infrared receivers to the viewpoint, together with the geometry of the screen's length and width. For the detailed calculation, refer to the description of embodiment seven; it is not repeated here.
The remote command processing module includes:
the instruction encapsulation module 2402 is used for generating a control instruction for remotely controlling the rotation of the first terminal camera according to the viewpoint information;
the instruction sending module 2401 is configured to transmit the encapsulated control instruction to a controlled device of the first terminal, and control rotation of the first terminal camera through the controlled device.
In order to enable the first terminal 10 to obtain a scene image around the first terminal 10 at multiple angles according to the change of the facial motion of the user of the second terminal 20, the first terminal 10 is provided with a controlled device 11 capable of controlling the camera of the first terminal to rotate, the second terminal 20 is provided with a control device 23 and interactive glasses 29, the control device 23 is provided with a user viewpoint obtaining module 28 and a remote command processing module 24, viewpoint position information of the user is obtained through the viewpoint obtaining module 28, and then the viewpoint position information is encapsulated into a control instruction for controlling the camera of the first terminal to rotate through the remote command processing module 24, and the control instruction is transmitted to the controlled device 11 of the first terminal, so as to adjust the rotation angle of the camera 12 on the first terminal 10.
Compared with the prior art, in the video interaction system provided by the invention the user interacts through the interaction glasses, and the viewpoint position of the user is calculated and mapped to the offset direction of the camera of the remote terminal. Only one infrared emitting device and four infrared receivers are used: the infrared emitting device transmits an infrared signal towards the display terminal at a certain emission frequency, the four infrared receivers measure the strength of that signal, and the user's viewing direction and viewpoint relative to the display terminal are determined by mathematical calculation. The product is simple in structure and convenient to use, and further improves the user's interaction experience.
Example four
Fig. 5 shows an implementation flow of the video interaction method provided by the embodiment of the present invention, which is described in detail below. The video interaction method comprises the following steps:
in step S501, a first terminal and a second terminal establish a video connection through the internet to transmit audio and video data, where the video connection refers to a network connection of a network video chat;
in step S502, the second terminal obtains an interactive control instruction of the user through the control device, encapsulates the control instruction, and transmits the encapsulated control instruction to the first terminal; the remote control device can be a common remote controller, an intelligent terminal or interactive glasses.
In step S503, the first terminal receives the control command, and controls the first terminal camera to rotate according to the control command. The rotating member of the camera here can be a rotating member of the prior art, for example, a motor is used to drive the gear and thus the camera to rotate.
In step S502, the interaction control instruction includes not only an instruction for controlling the first terminal camera to rotate but also instructions for setting shooting parameters of the first terminal camera, such as its focal length, brightness and contrast; by controlling the focal length of the remote camera, the user can zoom in on local features of a shooting target.
It should be noted that the video connection may be a unidirectional connection or a bidirectional connection, that is, when the first terminal and the second terminal are provided with the camera components in the video interaction system provided by the present invention at the same time, the two terminals may control the respective cameras to rotate mutually.
In view of the convenience of description, the present invention will be described with reference to a unidirectional connection mode in which the second terminal user sends a command to the second terminal through the remote control device and controls the rotation of the camera of the first terminal through the second terminal.
Taking the unidirectional video connection as an example, in order to enhance the video interaction effect, the interaction control instruction further includes instructions for switching the camera among multiple automatic shooting modes; each shooting mode is provided with a corresponding camera control program that automatically moves the first terminal camera along a shooting trajectory predefined by the program. The automatic shooting mode instructions include scanning left and right at a specific rotation rate, rotating the field of view through 360 degrees at a constant speed, locking onto and following a moving target (such as a user), and so on. Because the first terminal is provided with these rotation trajectory control programs, the user of the second terminal, or the local user of the first terminal, can have the camera move and shoot automatically along a specific trajectory simply by switching the shooting mode.
For example, in a following moving object shooting mode, the first terminal automatically captures a moving object through an image processing algorithm according to the acquired scene image, and controls the following rotation of the camera according to the movement of the object, thereby acquiring the scene image of the following moving object (such as a user).
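Two of these automatic shooting trajectories can be sketched as simple functions of time; the rates, ranges and function names below are illustrative assumptions, not values specified by the patent:

```python
def scan_pan_angle(t, rate_deg_s=10.0, max_pan=45.0):
    """Left-right scan mode: pan angle at time t (seconds), sweeping
    back and forth between -max_pan and +max_pan at rate_deg_s."""
    period = 4.0 * max_pan / rate_deg_s      # one full back-and-forth sweep
    phase = (t % period) / period            # position within the cycle, 0..1
    return (4.0 * abs(phase - 0.5) - 1.0) * max_pan  # triangle wave

def rotate_360_angle(t, rate_deg_s=10.0):
    """Constant-speed 360-degree rotation mode: pan angle at time t."""
    return (t * rate_deg_s) % 360.0
```

The first terminal's controlled device would sample such a function periodically and drive the camera motor towards the returned angle.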
The method provided by this embodiment improves the user's control over remote image acquisition during video interaction, so that the receiving user can watch the remote moving images of interest according to his own intention; it also provides the user with multiple shooting modes, greatly improving the interest and convenience of interaction.
EXAMPLE five
On the basis of the fourth embodiment, fig. 6 shows an implementation flow of the video interaction method provided by the embodiment of the present invention, and similarly, the embodiment takes an implementation manner of unidirectional video connection as an example for description, which is detailed as follows. The video interaction method comprises the following steps:
in this embodiment, a gyroscope is built in the remote control device, video focus point position information of a user is obtained through the gyroscope, the video focus point position is converted into a control instruction for remotely controlling the camera to rotate, and then the control signal is sent to the first terminal, so that the first terminal camera is controlled to deflect according to the focus will of the local user.
The remote control device may be a remote controller, an intelligent terminal, interactive glasses and the like, with a built-in gyroscope for acquiring the device's position information. The gyroscope here may of course be complemented by speed and acceleration sensors.
In step S601, the second terminal obtains a reference original position of the gyroscope of the remote control device, the reference original position being the reference against which the deflection of the remote control device is measured. By default it is the position of the remote control device when it is started, or a reference position determined by sampling and averaging the device's position several times while the user is using it.
In step S602, in a state of normal use by the user, the second terminal acquires the deflection parameter of the gyroscope in real time.
In step S603, an instruction for controlling the first terminal camera to deflect is generated according to the deflection parameter of the gyroscope, and the instruction may be packaged into an XML data packet for transmission.
In step S604, the second terminal sends the packaged command packet to the first terminal, and controls the deflection of the camera of the first terminal, so that the user can control the camera of the first terminal to deflect according to the attention intention of the user of the first terminal through the remote control device. The first terminal and the second terminal can communicate and transmit the instruction packet through the HTTP.
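Step S603 mentions packaging the instruction as an XML data packet. A minimal sketch of such a packet follows; the element and attribute names and the pan/tilt fields are hypothetical, since the patent does not specify a packet format:

```python
import xml.etree.ElementTree as ET

def build_deflect_packet(pan_delta, tilt_delta):
    """Package a camera-deflection command, derived from the gyroscope's
    deflection parameters, as an XML string for transmission over HTTP.

    All element/attribute names here are illustrative only.
    """
    root = ET.Element("control")
    cmd = ET.SubElement(root, "command", {"type": "deflect"})
    ET.SubElement(cmd, "pan").text = f"{pan_delta:.2f}"    # degrees
    ET.SubElement(cmd, "tilt").text = f"{tilt_delta:.2f}"  # degrees
    return ET.tostring(root, encoding="unicode")
```

The first terminal's controlled device would parse the packet back into angles and drive the camera accordingly.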
In this embodiment, a gyroscope is built in the remote control device, video focus point position information of a user is obtained through the gyroscope, the video focus point position is converted into a control instruction for remotely controlling the camera to rotate, and then the control signal is sent to the first terminal, so that the first terminal camera is controlled to deflect according to the focus will of the local user. The embodiment mainly controls the rotation of the remote terminal camera through the built-in gyroscope of the remote control device, and has the effect of somatosensory control. The embodiment improves the experience of user video interaction through the improvement of the remote control device, and the operation is more convenient.
EXAMPLE six
On the basis of the fifth embodiment, fig. 7 shows an implementation flow of the video interaction method provided by the embodiment of the present invention, which is detailed as follows. The video interaction method comprises the following steps:
in step S701, a user wears the video interaction glasses and enters a remote virtual scene through the video interaction system; the interaction glasses are provided with an infrared emitting device that emits an infrared signal at a specific frequency in the direction directly in front of the user's face. In particular, the infrared emitting device may be provided with a device for dispersing the infrared signal uniformly, such as an optical glass housing, so that the infrared signal is emitted in a uniform gradient and the relation between the received signal intensity at an infrared receiver and the signal receiving radius is a linear model.
In step S702, the infrared signal is received by infrared receivers disposed on the second terminal and the amplitude of the acquired signal is sampled. At least three infrared receivers not on the same straight line are arranged on the frame of the second terminal; preferably, four infrared receivers are arranged at the four corners of the screen frame. The signal received by each infrared receiver is amplified and analog-to-digital converted to generate data representing its amplitude.
In step S703, the second terminal calculates viewpoint position information of the user on the screen according to the magnitude relationship of the signals received by the respective infrared receivers.
In step S704, the second terminal generates a control instruction for controlling the deflection of the first terminal camera according to the viewpoint position information. This step in effect translates the position of the second terminal user's viewpoint on the screen into the deflection angle of the first terminal camera. The implementation establishes a relation model between the viewpoint position of the second terminal user and the shooting angle of the first terminal camera, i.e. a mapping that associates viewpoint positions with shooting angles, and generates the control instruction from this mapping so as to control the rotation of the camera. The mapping may of course associate a plurality of viewpoint positions with one shooting angle.
In step S705, the second terminal transmits the control instruction to the first terminal through the network, and further controls the first terminal camera to adjust a shooting angle consistent with the viewpoint position, and the first terminal acquires and transmits an image of the first terminal scene at the shooting angle.
In step S703, the step of calculating the viewpoint position information of the user on the screen according to the magnitude relationship between the signals received by the infrared receivers by the second terminal specifically includes:
constructing a relation model between the intensity of a signal received by an infrared receiver and the receiving radius of the signal;
according to the relation model and the strength information of the signals received by the infrared receivers, obtaining the distance information between the infrared receivers and the viewpoint of the user on the screen;
and calculating the coordinate information of the user viewpoint position according to the distance information between the viewpoint of the user on the screen and each receiver.
The above calculation process will be described in detail in embodiment seven, and will not be described here for the moment.
In the embodiment, interaction is performed through the interactive glasses, and the interactive glasses can also integrate the functions of 3D glasses and wireless headsets, so that interactive living room virtual reality in a 3D scene is realized.
Schemes for acquiring, coding, transmitting and decoding 3D images are mature in the prior art, which provides coding schemes suitable for 3D interaction such as disparity-estimation-based coding and 3-dimensional mesh coding.
In addition, in consideration of the difference of the network transmission speed of the user, the video and audio data coding and decoding module can automatically adjust the video compression ratio of the coding according to the detected network speed (for example, sending a 'ping' instruction to the opposite side), so that the adaptability of the system in the 3D interaction in the network transmission process is improved.
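A toy version of such network-speed-based adaptation is sketched below; the RTT thresholds and the bitrate ladder are illustrative assumptions, and a real system would also take measured throughput and packet loss into account:

```python
def pick_video_bitrate_kbps(rtt_ms, bitrates=(256, 512, 1024, 2048)):
    """Choose a video coding bitrate from a measured round-trip time
    (e.g. obtained by pinging the peer). Higher latency -> heavier
    compression. Thresholds here are illustrative only."""
    if rtt_ms > 400:
        return bitrates[0]   # very slow link: strongest compression
    if rtt_ms > 200:
        return bitrates[1]
    if rtt_ms > 80:
        return bitrates[2]
    return bitrates[3]       # fast link: lightest compression
```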
Compared with the prior art, the video interaction method provided by the invention has the advantage that the user can enter the remote virtual scene by adopting the video interaction system provided by the invention after wearing the video interaction glasses. And the second terminal acquires the viewpoint position information of the current user, and further controls the first terminal camera to adjust the shooting angle consistent with the viewpoint position, so that the requirement of a higher watching range during video interaction of the user is met. Due to the adoption of the video interaction method, the first terminal camera can change along with the change of the viewpoint position of the user, the display content of the scene can be changed in real time according to the change of the sight line and the visual angle of the user, the intelligent control level is achieved, and the video experience of the user is obviously improved.
EXAMPLE seven
On the basis of the sixth embodiment, the present embodiment provides a process for calculating a viewpoint in the video interaction method provided by the embodiment of the present invention with reference to fig. 8 and fig. 9, which is described in detail below.
The method comprises the following steps:
2.1) constructing a relation model of the intensity of the infrared receiver receiving signal and the signal receiving radius.
The infrared receiver receives the infrared signal emitted by the infrared emitting device. In general, different signal receiving radiuses produce received signal amplitudes of different intensities, and at a fixed receiving distance the received signal amplitude is directly related to the receiving radius. Therefore the signal amplitude produced by the infrared emitting device at each signal receiving radius can be measured in advance, the measured amplitude values put in one-to-one correspondence with the receiving radius values, and a relation model between the received signal intensity of the infrared receiver and the signal receiving radius thereby established.
As shown in fig. 8, the infrared emitting device 1 transmits an infrared signal, which the infrared receiver 3 receives at a receiving radius r. When the distance L is fixed, the signal amplitude is a function of the deflection angle α of the transmission, where α = arctan(r/L); hence, at fixed L, the received signal amplitude is a function of the signal receiving radius r. The larger the emission angle α, the smaller the signal amplitude; the smaller α, the larger the amplitude. That is, the signal amplitude value is a function of the signal receiving radius value.
The functional relation between the infrared emission angle and the intensity of the signal received by the infrared receiver can thus be simplified into a functional correspondence P = F(r) between the signal receiving radius r and the received signal intensity, where P is the signal amplitude at the infrared receiver and r is the signal receiving radius. The correspondence F(r) between the receiving radius r and the received intensity P is measured in a laboratory.
To make the signal amplitude and the signal receiving radius easier to express and measure, the infrared emitting device is provided with a device for dispersing the infrared signal uniformly, such as an optical lens, so that the emitted infrared signal is dispersed in a uniform gradient and the relation model between the received signal intensity and the signal receiving radius is linear. In this way, for example when the angle α is small and the distance L is fixed, a functional relation between the signal amplitude F and the signal receiving radius r can be established, its coefficients m, n and k being intrinsic parameters of the infrared receiver and the infrared emitting device.
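Under such a linear model, recovering the receiving radius from a measured amplitude is a one-line inversion. The sketch below uses a simplified two-coefficient linear model; the coefficient values are purely illustrative stand-ins for the lab-measured intrinsic parameters, not values from the patent:

```python
def amplitude_from_radius(r, m=1.0, n=0.02):
    """Hypothetical linear model: received amplitude P = m - n*r.
    m and n stand in for the device-intrinsic coefficients measured
    in the laboratory; the values here are illustrative only."""
    return m - n * r

def radius_from_amplitude(p, m=1.0, n=0.02):
    """Invert the linear model to recover the receiving radius r,
    i.e. the distance from the viewpoint to this receiver."""
    return (m - p) / n
```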
2.2) obtaining the distance between each infrared receiver and the viewpoint of the user on the screen according to the relation model and the signal intensity information received by each infrared receiver.
For convenience of analysis, please refer to the left and right drawings of FIG. 9.
Four infrared receivers 3 are arranged at the four corners of the smart television: infrared receiver A, infrared receiver B, infrared receiver C and infrared receiver D (hereinafter abbreviated A, B, C and D). The viewpoint of the user is O, point S is the emission point of the infrared emitting device, and the viewpoint O is the intersection of the straight line directly in front of the user's face with the plane of the display screen of the display terminal. Under the relation model, the signal receiving radius of infrared receiver A is AO, that of infrared receiver B is BO, that of infrared receiver C is CO, and that of infrared receiver D is DO.
Assume the received signal amplitudes at A, B, C and D are 0.8F0, 1.6F0, 0.7F0 and 0.2F0 respectively, where F0 is a reference signal amplitude, and let Ra, Rb, Rc and Rd be the distances from the four infrared receivers to the viewpoint (i.e. their signal receiving radiuses r). This yields the system of equations F(Ra) = 0.8F0, F(Rb) = 1.6F0, F(Rc) = 0.7F0, F(Rd) = 0.2F0.
Since the coefficients in the above equations can be determined in advance by experiment, F(r) is known, and the distances from the four infrared receivers to the viewpoint (i.e. the signal receiving radiuses Ra, Rb, Rc and Rd) can be obtained from the functional correspondence between signal amplitude and signal receiving radius. In the quadrilateral ABCD the side lengths BC and CD can be measured in advance, and once AO, BO, CO and DO are known the position of O is easily calculated.
2.3) calculating the coordinate information of the viewpoint position of the user on the display terminal according to the distance information between the viewpoint of the user on the screen and each receiver.
This embodiment provides a calculation method for the viewpoint coordinates that does not need the actual functional form of F(r): in a model where the receiving radius r is linearly proportional to the received signal amplitude, with the viewpoint denoted O and lying in the plane of the quadrilateral formed by points A, B, C and D, the ratios among AO, BO, CO and DO can be computed quickly from the amplitude ratios. The side lengths AB, BC and CD can be measured directly from the display terminal. Thus in the quadrilateral ABCD every side is known, and from the proportional relation among Ra, Rb, Rc and Rd the position of the viewpoint O within ABCD, i.e. its coordinates on the display screen, can be calculated.
Referring to the right diagram of fig. 9, take C as the origin, CB as the x-axis and CD as the y-axis, so that B = (L, 0), D = (0, H) and A = (L, H); the coordinates (x, y) of O can then be calculated by geometric relations. The specific steps are as follows. In ΔOBC, with BC = L the distance between B and C,

OC² = x² + y², OB² = (L − x)² + y².

Writing u = OB/OC, then

(L − x)² + y² = u²·(x² + y²). (1)

Since the ratio of OB to OC is known, the value of u is known and is a constant.

Similarly, in ΔADO, with CD = H the distance between C and D,

OD² = x² + (H − y)², OA² = (L − x)² + (H − y)².

Writing w = OD/OA, then

x² + (H − y)² = w²·[(L − x)² + (H − y)²]. (2)

Since the ratio of OD to OA is known, the value of w is known and is a constant.

From the system formed by equations (1) and (2), the specific value of the coordinates (x, y) of point O can be calculated, i.e. the coordinates of the viewpoint O relative to the display terminal are known.
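The system formed by equations (1) and (2) can be solved numerically. The sketch below uses a coarse-to-fine grid search over the screen rectangle; this is an illustrative solver choice, not a method prescribed by the patent:

```python
def solve_viewpoint(L, H, u, w, steps=100, rounds=4):
    """Numerically solve for the viewpoint coordinates (x, y) given
    screen width L, screen height H and the known distance ratios
      u = OB/OC  ->  (L-x)^2 + y^2     = u^2 * (x^2 + y^2)
      w = OD/OA  ->  x^2 + (H-y)^2     = w^2 * ((L-x)^2 + (H-y)^2)
    via a coarse-to-fine grid search minimizing the squared residuals.
    An illustrative sketch, not production code.
    """
    def residual(x, y):
        r1 = (L - x) ** 2 + y ** 2 - u ** 2 * (x ** 2 + y ** 2)
        r2 = x ** 2 + (H - y) ** 2 - w ** 2 * ((L - x) ** 2 + (H - y) ** 2)
        return r1 * r1 + r2 * r2

    x_lo, x_hi, y_lo, y_hi = 0.0, L, 0.0, H
    bx, by = 0.0, 0.0
    for _ in range(rounds):
        dx = (x_hi - x_lo) / steps
        dy = (y_hi - y_lo) / steps
        bx, by = min(
            ((x_lo + i * dx, y_lo + j * dy)
             for i in range(steps + 1)
             for j in range(steps + 1)),
            key=lambda p: residual(*p),
        )
        # zoom in around the current best grid point
        x_lo, x_hi = bx - dx, bx + dx
        y_lo, y_hi = by - dy, by + dy
    return bx, by
```

For example, on a 100 x 60 screen with ratios computed from a true viewpoint of (30, 20), the search recovers that point to well within a pixel.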
Further, in order to improve the accuracy of the calculation, the invention may correct the result with a reference difference of the user's viewpoint. Record the viewpoint calculated by the system when the user looks straight ahead in the normal watching state as O1; when the user's viewpoint is offset, record the viewpoint calculated by the system as O2. The reference difference of the user's viewpoint is then P = O2 − O1; if the centre of the screen is denoted O, the viewpoint corrected by the system is Q = O + P, and Q is taken as the corrected viewpoint position.
Furthermore, in order to capture the viewpoint position of the user accurately, a miniature image processing device may also be arranged in the glasses; the rotation direction of the user's pupils is acquired by image processing to obtain the gaze direction of the user's eyes, so that the viewpoint position of the user is further corrected.
The embodiment provides a computing method for computing the viewpoint position of a user on a screen, which improves the accuracy of viewpoint computing of the user, can be applied to the video interaction method and the video interaction system, and further improves the experience of video interaction of the user by determining the viewpoint position of the user and mapping the viewpoint position to the deflection direction of a camera of a remote terminal.
Example eight
The present embodiment is a preferred embodiment of the present invention, which provides video interactive glasses for a video interactive system.
Referring to fig. 10, as shown in fig. 10, the interactive glasses include a general 3D glasses structure part 4, and further include an infrared emitting device 1 disposed in the center of the glasses, and configured to emit an infrared signal to a terminal direction at a specific transmission frequency, so that the terminal can calculate and obtain a viewpoint of a user on a screen of a display terminal.
Preferably, the infrared emitting device may be provided with a device for dispersing the infrared signal uniformly, for example an optical glass housing, so that the infrared signal is emitted in a uniform gradient and the relation model between the received signal intensity of the infrared receiver and the receiving radius is linear.
Preferably, the interactive glasses further comprise a wireless microphone and a micro speaker.
In the prior art, a user generally wears 3D video glasses when watching 3D video. The video interaction glasses provided by this embodiment add an infrared emitting device, a microphone and a micro loudspeaker to the original 3D video glasses so as to realize video interaction. The working principle of the infrared emitting device is as described in the foregoing embodiments and is not repeated here.
Compared with the prior art, the video interactive glasses provided by the invention can better cooperate with the video interactive system provided by the invention to capture the viewpoint of the user. Meanwhile, the system also has the voice sending and listening functions, and the experience effect of user video interaction can be further improved.
Example nine
The present embodiment relates to a preferred implementation of a video interaction terminal used in a video interaction system provided by the present invention.
Referring to fig. 2, the video interaction first terminal provided by the present invention is provided with a camera 12 for acquiring a scene image and a controlled device 11 for controlling the camera of the first terminal to rotate, and the first terminal receives a control instruction of a second terminal through the controlled device 11 and controls the camera 12 to rotate according to the control instruction, so as to adjust a shooting position thereof.
Preferably, the first terminal further includes a scene video encoding module 13, configured to encode the image data acquired by the camera 12 and the voice data acquired by the microphone; and the audio-video data transmission module 14 is used for transmitting the audio-video data to the second terminal for playing. The image data acquired by the camera 12 and the audio data acquired by the microphone may be encoded together into a specific audio/video format, or may be encoded, transmitted and played independently from each other, which is not limited herein.
The second video interaction terminal is provided with a control device 23 that interacts with the controlled device of the first terminal so as to control the rotation of the first terminal camera: the control device 23 obtains the user's control instruction for rotating the first terminal camera, encapsulates it and transmits it to the first terminal so as to remotely control the rotation of the first terminal camera. Control interaction here refers to the mutual control coordination between the controlled device of the first terminal and the control device of the second terminal.
Preferably, the second terminal is further configured with a remote control device 21, which is used for providing control interaction for a user, acquiring video focus point information of the user on a remote scene through the remote control device 21, and controlling the rotation of the first terminal camera according to the video focus point information.
In addition, the second terminal further comprises a video and audio data receiving module 25 for receiving the audio and video data of the remote video interaction, a video and audio decoding module 26 for decoding the data, and a video and audio playing module 27 for playing the video and audio.
For a specific working principle of the first terminal and the second terminal for video interaction provided in this embodiment, please refer to the foregoing embodiments, which are not described herein again.
In this embodiment, based on the interactive system provided by the present invention, two terminals are provided, and the terminals can be independently applied to the interactive system, and the interactive terminal provided by this embodiment improves the user experience of the terminals in video interaction.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents and improvements made within the spirit and principle of the present invention are intended to be included within the scope of the present invention.

Claims (13)

1. A video interactive system, comprising a first terminal and a second terminal connected through internet video, wherein the video connection refers to a network connection of network video chat, and the video interactive system is characterized in that: the first terminal is provided with one or more than one camera for acquiring a scene image of the first terminal and a controlled device for remotely controlling the rotation of the camera of the first terminal, and the second terminal is provided with a display device for displaying the scene image and a control device for controlling the rotation of the camera of the first terminal through control interaction with the controlled device of the first terminal;
the control device of the second terminal comprises a remote command processing module, the remote command processing module receives a control instruction sent by a user through a remote control device, the control instruction is packaged and then transmitted to the controlled device of the first terminal, and the controlled device controls the camera of the first terminal to rotate according to the control instruction; wherein,
the remote control device includes: interactive glasses provided with an infrared emission device for emitting infrared signals at a specific emission frequency,
the control device of the second terminal includes:
the user viewpoint acquisition module is used for acquiring viewpoint information of a second terminal user;
the remote command processing module is used for packaging the viewpoint information into a control instruction for controlling the first terminal camera and sending the control instruction;
the user viewpoint acquisition module comprises:
an infrared receiving device comprising at least three infrared receivers which are arranged on the second terminal and are not on the same straight line, the infrared receivers receiving the infrared signals of the infrared emission device;
a user viewpoint position information calculation module for calculating viewpoint information of the user on the screen according to the intensity information of the infrared signals received by each infrared receiver,
the remote command processing module includes:
the instruction packaging module is used for generating a control instruction for remotely controlling the rotation of the first terminal camera according to the viewpoint information;
and the instruction sending module is used for transmitting the encapsulated control instruction to a controlled device of the first terminal and controlling the rotation of the first terminal camera through the controlled device.
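The packaging performed by the instruction packaging module can be illustrated with a minimal sketch. The JSON wire format, field names, and function name below are hypothetical assumptions; the claim specifies only that the viewpoint information be packaged into a control instruction and transmitted to the controlled device:

```python
import json

def encapsulate_control_instruction(viewpoint_x, viewpoint_y):
    # Package the user's viewpoint information into a control instruction
    # for the first terminal camera. The "camera_rotate" message type and
    # the normalized viewpoint coordinates are illustrative choices only.
    instruction = {
        "type": "camera_rotate",
        "viewpoint": {"x": viewpoint_x, "y": viewpoint_y},
    }
    # Encode as UTF-8 bytes, ready to be sent over the network connection.
    return json.dumps(instruction).encode("utf-8")
```

The instruction sending module would then transmit these bytes to the controlled device of the first terminal, which decodes them and drives the camera accordingly.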
2. The interactive system as claimed in claim 1, wherein a plurality of groups of camera auto-rotation trajectory control programs are provided in the first terminal for automatically controlling the camera of the first terminal to move for shooting according to the shooting trajectory defined by the control program.
3. The interactive system of claim 1, wherein the first terminal comprises an audio-video encoding module and an audio-video data transmission module, and the second terminal comprises an audio-video decoding module, an audio-video data receiving module and an audio-video playing module.
4. The interactive system as claimed in claim 1, wherein the infrared emission device is provided with an infrared signal uniform dispersion device so that the infrared signals are emitted with a uniform gradient.
5. A video interaction method, characterized in that the method comprises the steps of:
the method comprises the steps that a first terminal and a second terminal establish video connection through the Internet; the video connection refers to network connection of network video chat;
the second terminal obtains an interactive control instruction of a user through the control device, packages the interactive control instruction and transmits the interactive control instruction to the first terminal;
the first terminal receives the control instruction and controls the camera of the first terminal to rotate according to the control instruction;
the steps that the second terminal obtains the interactive control instruction of the user through the control device, packages the control instruction and transmits the control instruction to the first terminal specifically comprise:
the second terminal user wears on the head video interaction glasses with a built-in infrared emission device, and the video interaction glasses emit infrared signals toward the screen through the infrared emission device at a certain frequency;
the second terminal is provided with at least three infrared receivers which are not on the same straight line; each infrared receiver receives the infrared signal, and the viewpoint position information of the current user is calculated according to the intensity information of the received infrared signals;
and generating a control instruction for controlling the deflection of the first terminal camera according to the viewpoint position information.
6. The interaction method as claimed in claim 5, wherein said step of calculating the viewpoint location information of the current user according to the intensity information of the received infrared signal specifically comprises:
constructing a relation model between the intensity of a signal received by an infrared receiver and the receiving radius of the signal;
according to the relation model and the intensity information of the signals received by the infrared receivers, obtaining the distance information between each infrared receiver and the viewpoint of the user on the screen;
and calculating the coordinate information of the user viewpoint position according to the distance information between each receiver and the viewpoint.
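The final step of this claim, computing viewpoint coordinates from the distances between each receiver and the viewpoint, is a planar trilateration. The claim does not specify the computation; the sketch below is one illustrative method, and the receiver coordinates and function name are hypothetical:

```python
def trilaterate(p1, p2, p3, r1, r2, r3):
    # Recover the viewpoint (x, y) on the screen plane from its distances
    # r1, r2, r3 to three non-collinear receivers at p1, p2, p3.
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Subtracting the circle equations pairwise eliminates the quadratic
    # terms and leaves two linear equations in x and y:
    #   2(x2-x1)x + 2(y2-y1)y = r1^2 - r2^2 + x2^2 - x1^2 + y2^2 - y1^2
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    # The determinant is non-zero because the receivers are not collinear.
    det = a1 * b2 - a2 * b1
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y
```

For example, with receivers at (0, 0), (80, 0) and (0, 60) (screen units) and all three distances equal to 50, the recovered viewpoint is (40, 30).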
7. The interaction method as claimed in claim 6, wherein the infrared emission device is provided with an infrared signal uniform dispersion device so that the infrared signals are emitted with a uniform gradient, and the relation model between the signal intensity received by an infrared receiver and the signal receiving radius is a linear relation model.
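Under the linear relation model of this claim, receiving radius can be read off directly from received intensity. The sketch below assumes two hypothetical calibration constants, the intensity at radius zero and the radius at which intensity falls to zero; neither the constants nor the function name come from the claim:

```python
def radius_from_intensity(intensity, i_max, r_max):
    # Linear model: intensity falls off linearly with receiving radius
    # for a uniformly dispersed infrared beam.
    #   i_max: received intensity at radius 0 (calibration constant)
    #   r_max: radius at which received intensity reaches 0
    # Solving i = i_max * (1 - r / r_max) for r gives:
    return r_max * (1.0 - intensity / i_max)
```

With i_max = 100 and r_max = 50, a received intensity of 50 corresponds to a receiving radius of 25, i.e. the midpoint of the linear fall-off.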
8. The interaction method according to claim 5, wherein the step of generating the control command for controlling the deflection of the first terminal camera according to the viewpoint position information specifically comprises:
and establishing a relation model between the viewpoint position of the second terminal user and the shooting angle of the first terminal camera, the relation model establishing an association mapping between the viewpoint position and the shooting angle.
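One simple instance of such an association mapping is a linear one: the screen centre maps to the camera's neutral position and the screen edges map to the extremes of its pan/tilt range. The claim leaves the model open; the ranges and function name below are hypothetical:

```python
def viewpoint_to_angles(x, y, screen_w, screen_h,
                        pan_range=60.0, tilt_range=40.0):
    # Map a viewpoint (x, y) on the second terminal's screen to pan/tilt
    # angles (degrees) for the first terminal camera. The screen centre
    # maps to (0, 0); the edges map to +/- half of each (assumed) range.
    pan = (x / screen_w - 0.5) * pan_range
    # Screen y grows downward, so invert it for the tilt angle.
    tilt = (0.5 - y / screen_h) * tilt_range
    return pan, tilt
```

For a 1920x1080 screen, looking at the centre yields (0.0, 0.0), while looking at the top-right corner yields (30.0, 20.0) with the default ranges.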
9. The interaction method of claim 5, wherein the method further comprises: the first terminal receiving the control instruction and adjusting the shooting parameters of the camera connected to the first terminal according to the control instruction.
10. The interactive method according to claim 5, wherein the step of the first terminal receiving the control command and controlling the first terminal camera to rotate according to the control command further comprises:
the control instruction comprises an instruction for switching a fixed track shooting mode and is used for automatically controlling the camera to move to shoot according to a specific track of the shooting mode.
11. The interactive method according to claim 10, wherein the fixed-track shooting mode comprises a target following shooting mode, and in the target following shooting mode, the camera automatically locks the moving target and rotates along with the moving target to acquire a moving image of the target.
12. Video interaction glasses, characterized in that the video interaction glasses are provided with an infrared emission device, the infrared emission device is provided with an infrared signal uniform dispersion device for emitting the infrared signals with a uniform gradient, and the infrared emission device emits the infrared signals to a terminal at a specific emission frequency; the infrared signals are used by the terminal to calculate the position of the user's viewpoint on the display screen of the terminal, and the viewpoint position information is used for generating a control instruction for controlling the rotation of a first terminal camera.
13. A video interaction second terminal, used for establishing a video connection with a video interaction first terminal through the Internet, wherein the video connection refers to a network connection for network video chat, characterized in that the second terminal is provided with a control device which performs control interaction with a controlled device of the first terminal so as to control the rotation of a camera of the first terminal; the control device of the second terminal comprises a remote command processing module, the remote command processing module receives a control instruction sent by a user through a remote control device, the control instruction is packaged and then transmitted to the controlled device of the first terminal, and the controlled device controls the camera of the first terminal to rotate according to the control instruction; wherein,
the remote control device includes: interactive glasses provided with an infrared emission device for emitting infrared signals at a specific emission frequency,
the control device of the second terminal includes:
the user viewpoint acquisition module is used for acquiring viewpoint information of a second terminal user;
the remote command processing module is used for packaging the viewpoint information into a control instruction for controlling the first terminal camera and sending the control instruction;
the user viewpoint acquisition module comprises:
an infrared receiving device comprising at least three infrared receivers which are arranged on the second terminal and are not on the same straight line, the infrared receivers receiving the infrared signals of the infrared emission device;
a user viewpoint position information calculation module for calculating viewpoint information of the user on the screen according to the intensity information of the infrared signals received by each infrared receiver,
the remote command processing module includes:
the instruction packaging module is used for generating a control instruction for remotely controlling the rotation of the first terminal camera according to the viewpoint information;
and the instruction sending module is used for transmitting the encapsulated control instruction to a controlled device of the first terminal and controlling the rotation of the first terminal camera through the controlled device.
CN201310022617.9A 2013-01-21 2013-01-21 A kind of video interactive system, method, interaction glasses and terminal Expired - Fee Related CN103108126B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310022617.9A CN103108126B (en) 2013-01-21 2013-01-21 A kind of video interactive system, method, interaction glasses and terminal

Publications (2)

Publication Number Publication Date
CN103108126A CN103108126A (en) 2013-05-15
CN103108126B true CN103108126B (en) 2017-08-25

Family

ID=48315666

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310022617.9A Expired - Fee Related CN103108126B (en) 2013-01-21 2013-01-21 A kind of video interactive system, method, interaction glasses and terminal

Country Status (1)

Country Link
CN (1) CN103108126B (en)

Families Citing this family (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103533247A (en) * 2013-10-22 2014-01-22 小米科技有限责任公司 Self-photographing method, device and terminal equipment
CN105516577B (en) * 2014-09-24 2018-11-16 深圳Tcl数字技术有限公司 Camera shooting control method and system
CN104378587B (en) * 2014-10-27 2019-02-19 小米科技有限责任公司 Imaging apparatus control method, device and equipment
CN108366206B (en) * 2015-06-11 2020-08-18 Oppo广东移动通信有限公司 Shooting method and system based on rotary camera and intelligent glasses
CN106341641A (en) * 2015-07-10 2017-01-18 小米科技有限责任公司 Video communication method and device
CN105376554B (en) * 2015-12-04 2017-07-18 深圳市第六星设计技术有限公司 3D cameras mechanism, the mobile device with the mechanism and control method
CN105786179A (en) * 2016-02-26 2016-07-20 广东欧珀移动通信有限公司 Terminal, imaging device, interactive system, control method and device thereof
CN105791675A (en) * 2016-02-26 2016-07-20 广东欧珀移动通信有限公司 Terminal, imaging and interactive control method and device, terminal and system thereof
CN105721820B (en) * 2016-03-29 2018-10-30 佛山市南海区广工大数控装备协同创新研究院 A kind of interaction long-distance video communication system
CN105721821A (en) * 2016-04-01 2016-06-29 宇龙计算机通信科技(深圳)有限公司 Video calling method and device
CN108270986A (en) * 2016-12-30 2018-07-10 中兴通讯股份有限公司 A kind of cradle head control and processing method, device, system, electronic equipment
US20190156792A1 (en) * 2017-01-10 2019-05-23 Shenzhen Royole Technologies Co., Ltd. Method and system for adjusting display content and head-mounted display
CN107124589A (en) * 2017-05-24 2017-09-01 武汉大学 360 degree of immersion Active Eyes and method based on Cardboard
CN108900770B (en) * 2018-07-17 2021-01-22 广东小天才科技有限公司 Method and device for controlling rotation of camera, smart watch and mobile terminal
CN110798609A (en) * 2018-08-02 2020-02-14 光宝电子(广州)有限公司 Image display method, image display system and virtual window
CN108834002A (en) * 2018-08-17 2018-11-16 袁森 A method for remotely adjusting the image screen of a mobile terminal
CN110177285A (en) * 2019-05-29 2019-08-27 王子君 Live broadcasting method, device, system and dollying head
CN113497910A (en) * 2020-04-01 2021-10-12 南宁富桂精密工业有限公司 Video system and picture generation method thereof
CN111953936A (en) * 2020-07-21 2020-11-17 福建升腾资讯有限公司 Adjustable camera system and method for self-service equipment
CN114005313B (en) * 2021-01-12 2024-01-16 深圳动魅科技有限公司 Motion interaction equipment and control method thereof
CN114040106B (en) * 2021-11-17 2024-07-23 维沃移动通信有限公司 Video call control method and device, electronic equipment and readable storage medium
CN114205669B (en) * 2021-12-27 2023-10-17 咪咕视讯科技有限公司 Free-angle video playback method, device and electronic equipment

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101790070A (en) * 2010-02-25 2010-07-28 电子科技大学 Design of novel wireless video communication system
CN101860732A (en) * 2010-06-04 2010-10-13 天津市亚安科技电子有限公司 Method of controlling holder camera to automatically track target
CN101866215A (en) * 2010-04-20 2010-10-20 复旦大学 Human-computer interaction device and method using gaze tracking in video surveillance

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR0150700B1 (en) * 1995-04-06 1998-11-02 구자홍 Point type wireless controller using infrared rays
CN101888534A (en) * 2009-05-12 2010-11-17 胡伟强 Flight image pick-up system
CN101707671A (en) * 2009-11-30 2010-05-12 杭州普维光电技术有限公司 Panoramic camera and PTZ camera combined control method and panoramic camera and PTZ camera combined control device
CN201629024U (en) * 2009-12-21 2010-11-10 马胜利 A wireless fatigue alarm
JP5187369B2 (en) * 2010-09-24 2013-04-24 株式会社デンソー Reverse parking assist device for vehicle and program for reverse parking assist device


Similar Documents

Publication Publication Date Title
CN103108126B (en) A kind of video interactive system, method, interaction glasses and terminal
US10757423B2 (en) Apparatus and methods for compressing video content using adaptive projection selection
Stankiewicz et al. A free-viewpoint television system for horizontal virtual navigation
WO2018077142A1 (en) Panoramic video processing method, device and system
US10827176B2 (en) Systems and methods for spatially adaptive video encoding
US20200322532A1 (en) Head-mountable display system
EP3481067A1 (en) Method, apparatus and stream for encoding/decoding volumetric video
US10681276B2 (en) Virtual reality video processing to compensate for movement of a camera during capture
WO2020259542A1 (en) Control method for display apparatus, and related device
WO2008120125A1 (en) A device for and a method of processing image data representative of an object
JP2014517569A (en) Panorama video imaging apparatus and method using portable computer device
CN106791699A (en) One kind remotely wears interactive video shared system
US20140092218A1 (en) Apparatus and method for stereoscopic video with motion sensors
US11521366B2 (en) Marker-based tracking apparatus and method
WO2022075767A1 (en) Method and device for rendering content in mobile communication system
KR20120074493A (en) Terminal for capturing image and method for capturing image
KR101784095B1 (en) Head-mounted display apparatus using a plurality of data and system for transmitting and receiving the plurality of data
US20240015264A1 (en) System for broadcasting volumetric videoconferences in 3d animated virtual environment with audio information, and procedure for operating said device
CN109479147B (en) Method and technical device for inter-temporal view prediction
JP2022102923A (en) Virtual studio system
CN208207368U (en) Head-mounted display apparatus, aircraft and image delivering system
JP2005303683A (en) Image transceiver
US20210065435A1 (en) Data processing
CN205017461U (en) Remote first visual angle video monitoring control system
JP2014096701A (en) Telecommunications equipment and method of telecommunications

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20170825