CN113139456A - Electronic equipment state tracking method and device, electronic equipment and control system
- Publication number: CN113139456A
- Application number: CN202110429852.2A
- Authority: CN (China)
- Prior art keywords: frame, current frame, key frame, image, IMU
- Legal status: Pending (assumed; not a legal conclusion)
Classifications
- G06V20/20 — Scenes; Scene-specific elements in augmented reality scenes
- G06V20/40 — Scenes; Scene-specific elements in video content
- H04N7/18 — Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
Abstract
Embodiments of the invention provide a method and an apparatus for tracking the state of an electronic device, an electronic device, and a control system. The method includes: determining, in a key frame set of a map generated by a simultaneous localization and mapping (SLAM) system for the environment in which the electronic device is located, a target key frame matched with a current frame for which state tracking has been lost; determining the lens position and the lens orientation of the electronic device according to the target key frame and the current frame; and performing visual state tracking of the electronic device based on the determined lens position and lens orientation. The method enables the tracking state to be recovered quickly after tracking is lost.
Description
This application is a divisional application of the Chinese invention patent application No. 201810114363.6, filed on February 5, 2018.
Technical Field
The present invention relates to computer technologies, and in particular to a method and an apparatus for tracking the state of an electronic device, an electronic device, and a control system.
Background
A Simultaneous Localization and Mapping (SLAM) system depends on tracking. Tracking mainly refers to following the user's field of view and viewpoint; it may also refer to tracking the user's motion. During operation of the SLAM system, anomalies may prevent tracking from proceeding normally, a condition known as tracking loss.
When tracking is lost, normal tracking needs to be resumed quickly. However, current SLAM systems require a long processing time to resume tracking after a loss.
Disclosure of Invention
The embodiment of the invention provides a technical scheme for tracking the state of electronic equipment and a technical scheme for controlling augmented reality.
A first aspect of an embodiment of the present invention provides a method for tracking a state of an electronic device, including:
determining, in a key frame set of a map generated by a simultaneous localization and mapping (SLAM) system for the environment in which the electronic device is located, a target key frame matched with a current frame for which state tracking has been lost;
determining the lens position and the lens orientation of the electronic equipment according to the target key frame and the current frame;
performing visual state tracking of the electronic device based on the determined lens position and the lens orientation.
With reference to any embodiment of the present invention, optionally, the determining, in a key frame set of a map generated by a simultaneous localization and mapping (SLAM) system for the environment in which the electronic device is located, a target key frame matched with a current frame for which state tracking has been lost includes:
performing similarity matching on the image of the current frame and the image of at least one key frame in the key frame set;
and determining a target key frame matched with the current frame which is lost by tracking in the key frame set according to the matching similarity.
With reference to any embodiment of the present invention, optionally, the determining, according to the matching similarity, a target key frame matched with the current frame that is lost for tracking in the key frame set includes:
taking the key frame with the highest similarity in the key frame set as the target key frame; or,
and taking the key frames with the similarity exceeding a set threshold value in the key frame set as the target key frames.
With reference to any embodiment of the present invention, optionally, the performing similarity matching between the image of the current frame and the image of at least one key frame in the key frame set includes:
determining thumbnails of a plurality of key frames in the key frame set of the map, wherein the difference between the thumbnail shooting parameters and the shooting parameters of the current frame meets the tolerance range;
and performing similarity matching according to the similarity between the pixel point on the thumbnail of the current frame and the pixel point on the thumbnail of the key frame.
With reference to any embodiment of the present invention, optionally, the performing similarity matching between the image of the current frame and the image of at least one key frame in the key frame set includes:
determining the original images of a plurality of key frames in the key frame set of the map, wherein the difference between the original image shooting parameters and the current frame shooting parameters meets the tolerance range;
and performing similarity matching according to the similarity between the pixel points on the original image of the current frame and the pixel points on the original image of the key frame.
With reference to any embodiment of the present invention, optionally, the performing similarity matching between the image of the current frame and the image of at least one key frame in the key frame set includes:
determining thumbnails of a plurality of key frames in the key frame set of the map, wherein the difference between the thumbnail shooting parameters and the shooting parameters of the current frame meets the tolerance range;
determining an original image of the plurality of key frames;
and performing similarity matching according to the similarity between the pixel points on the original image of the current frame and the pixel points on the original image of the key frame.
With reference to any embodiment of the present invention, optionally, before performing similarity matching between the image of the current frame and the image of at least one key frame in the key frame set, the method further includes: determining the at least one key frame in the key frame set according to shooting parameters, wherein the shooting parameters comprise at least one of the following: lens orientation, lens position, and shooting time; and/or,
the image of the at least one key frame includes: thumbnails and/or original images of the at least one key frame.
With reference to any embodiment of the present invention, optionally, the determining a lens position and a lens orientation of the electronic device according to the target key frame and the current frame includes:
determining an initial lens position and an initial lens orientation corresponding to the current frame according to the lens position and the lens orientation corresponding to the target key frame;
projecting the thumbnail of the target key frame to the thumbnail of the current frame according to the initial lens position and the initial lens orientation corresponding to the current frame;
and if the matching degree of the pixel point of the thumbnail of the target key frame after projection and the pixel point of the thumbnail of the current frame reaches a preset value, taking the lens position and the lens orientation during projection as the lens position and the lens orientation corresponding to the current frame.
With reference to any embodiment of the present invention, optionally, the determining a lens position and a lens orientation of the electronic device according to the target key frame and the current frame includes:
determining an initial lens position and an initial lens orientation corresponding to the current frame according to the lens position and the lens orientation corresponding to the target key frame;
according to the original image of the current frame and the original image of the target key frame, performing pixel point matching on the current frame and the target key frame to obtain a matched pixel point set;
projecting the target key frame to the current frame according to the initial lens position, the initial lens orientation and the matched pixel point set;
and determining the lens position and the lens orientation of the electronic equipment according to the projection result.
With reference to any embodiment of the present invention, optionally, the determining a lens position and a lens orientation of the electronic device according to the target key frame and the current frame includes:
determining an initial lens position and an initial lens orientation corresponding to the current frame according to the lens position and the lens orientation corresponding to the target key frame;
projecting the thumbnail of the target key frame to the thumbnail of the current frame according to the initial lens position and the initial lens orientation corresponding to the current frame;
if the matching degree of the pixel point of the thumbnail of the target key frame and the pixel point of the thumbnail of the current frame after projection reaches a preset value, taking the lens position and the lens orientation during projection as the lens position and the lens orientation corresponding to the current frame;
according to the original image of the current frame and the original image of the target key frame, performing pixel point matching on the current frame and the target key frame to obtain a matched pixel point set;
projecting the target key frame to the current frame according to the lens position and the lens orientation corresponding to the current frame and the matched pixel point set;
and determining the lens position and the lens orientation of the electronic equipment according to the projection result.
With reference to any embodiment of the present invention, optionally, the method further includes:
determining parameter information of an Inertial Measurement Unit (IMU) in the electronic equipment;
re-determining the state of the IMU of the electronic equipment according to the lens position and the lens orientation of the electronic equipment and the parameter information;
and tracking the IMU state of the electronic equipment according to the redetermined IMU state of the electronic equipment.
With reference to any embodiment of the present invention, optionally, the parameter information of the IMU includes at least one of: a velocity of the electronic device, an angular velocity bias of the IMU, and an acceleration bias of the IMU.
With reference to any embodiment of the present invention, optionally, determining the angular velocity deviation of the IMU includes:
determining the position and the orientation of an IMU corresponding to each frame of image in a first video sub-segment in a video stream acquired by the lens of the electronic equipment;
determining a first angle variation according to the orientation of the IMU corresponding to the first frame and the orientation of the IMU corresponding to the second frame;
integrating the angular velocity of the IMU within the time interval corresponding to the image of the first frame and the image of the second frame to obtain a second angle variation;
acquiring the angular velocity deviation of the IMU corresponding to the first frame according to the first angle variation and the second angle variation;
wherein the second frame is a frame preceding the first frame.
With reference to any embodiment of the present invention, optionally, determining the acceleration deviation of the IMU includes:
determining a first displacement variation according to the position of the IMU corresponding to the first frame and the position of the IMU corresponding to the second frame;
according to preset gravity information, integrating the acceleration of the IMU in a time interval corresponding to the first frame image and the second frame image to obtain a second displacement variation;
and acquiring the speed information corresponding to the first frame and the acceleration deviation of the IMU according to the first displacement variation, the second displacement variation, the first angle variation and the second angle variation.
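For illustration only, the following Python sketch shows one way the bias-recovery computations described above could be realized. The helper names, the constant sampling interval `dt`, and the small-angle vector integration of the gyroscope are all assumptions, not the patented implementation:

```python
import numpy as np

def estimate_gyro_bias(delta_angle_visual, gyro_samples, dt):
    # First angle variation: from the vision-derived orientations of the two frames.
    # Second angle variation: integral of the measured angular velocity over the
    # same interval. A constant bias b satisfies (integrated - visual) = b * T.
    delta_angle_imu = np.sum(np.asarray(gyro_samples) * dt, axis=0)
    total_time = dt * len(gyro_samples)
    return (delta_angle_imu - delta_angle_visual) / total_time

def estimate_accel_bias_and_velocity(p_first, p_second, accel_samples, dt, gravity):
    # First displacement variation: from the vision-derived IMU positions
    # (the second frame precedes the first frame).
    # Second displacement variation: double integral of gravity-compensated
    # acceleration. A constant bias b contributes 0.5 * b * T^2 of displacement.
    total_time = dt * len(accel_samples)
    v = np.zeros(3)
    disp = np.zeros(3)
    for a in accel_samples:
        v = v + (np.asarray(a) - gravity) * dt   # naive forward integration
        disp = disp + v * dt
    residual = (p_first - p_second) - disp
    accel_bias = -2.0 * residual / total_time**2
    velocity = (p_first - p_second) / total_time  # coarse velocity for the first frame
    return accel_bias, velocity
```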
With reference to any embodiment of the present invention, optionally, the method further includes:
verifying the parameter information of the IMU;
and if the IMU passes the verification, correspondingly adjusting the IMU according to the determined state of the IMU.
With reference to any embodiment of the present invention, optionally, the method further includes:
and if the verification fails, determining a second video sub-segment in the video stream acquired by the lens of the electronic equipment, and re-determining the parameter information of the IMU based on the second video sub-segment, wherein the second video sub-segment is partially overlapped with or completely different from the first video sub-segment.
With reference to any embodiment of the present invention, optionally, the verifying the parameter information of the IMU includes:
verifying whether the parameter information of the IMU meets at least one of the following conditions:
the acceleration deviation of the IMU is smaller than or equal to a first preset threshold, and the difference value between the acceleration deviation of the IMU and the historical offset of the accelerometer stored in the electronic equipment is smaller than a second preset threshold;
the difference between the speed of the electronic device and the speed determined at the time of the tracking loss is less than a third preset threshold.
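A minimal sketch of this verification check, assuming hypothetical threshold values and numpy vectors for the biases and velocities. The text above requires at least one condition to hold; this sketch checks both, the stricter variant:

```python
import numpy as np

def imu_parameters_pass_verification(accel_bias, stored_accel_bias,
                                     velocity, velocity_at_loss,
                                     thr1=0.5, thr2=0.2, thr3=1.0):
    # Condition 1: the acceleration bias is small and close to the stored
    # historical accelerometer offset.
    cond_bias = (np.linalg.norm(accel_bias) <= thr1 and
                 np.linalg.norm(accel_bias - stored_accel_bias) < thr2)
    # Condition 2: the recovered velocity is close to the velocity known
    # when tracking was lost.
    cond_velocity = np.linalg.norm(velocity - velocity_at_loss) < thr3
    return cond_bias and cond_velocity
```

If this check fails, the parameter information would be re-estimated on a second video sub-segment, as described above.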
A second aspect of the embodiments of the present invention provides an augmented reality AR control method, including:
the AR engine acquires an output result of the simultaneous localization and mapping SLAM system, wherein the output result comprises state tracking information of the electronic equipment acquired by the method of the first aspect;
and the AR engine draws a virtual object in the scene of the electronic equipment or the video stream shot by the scene according to the state tracking information.
A third aspect of the embodiments of the present invention provides an apparatus for tracking a state of an electronic device, including:
a first determining module, configured to determine, in a key frame set of a map generated by a simultaneous localization and mapping (SLAM) system for the environment in which the electronic device is located, a target key frame matched with a current frame for which state tracking has been lost;
the acquisition module is used for determining the lens position and the lens orientation of the electronic equipment according to the target key frame and the current frame;
a first tracking module to track a visual state of the electronic device based on the determined lens position and the lens orientation.
With reference to any embodiment of the present invention, optionally, the first determining module includes:
a matching unit, configured to perform similarity matching between the image of the current frame and an image of at least one key frame in the key frame set;
and the first determining unit is used for determining a target key frame matched with the current frame with the loss of tracking in the key frame set according to the matched similarity.
With reference to any embodiment of the present invention, optionally, the first determining unit is specifically configured to:
taking the key frame with the highest similarity in the key frame set as the target key frame; or,
and taking the key frames with the similarity exceeding a set threshold value in the key frame set as the target key frames.
With reference to any embodiment of the present invention, optionally, the matching unit is specifically configured to:
determining thumbnails of a plurality of key frames in the key frame set of the map, wherein the difference between the thumbnail shooting parameters and the shooting parameters of the current frame meets the tolerance range;
and performing similarity matching according to the similarity between the pixel point on the thumbnail of the current frame and the pixel point on the thumbnail of the key frame.
With reference to any embodiment of the present invention, optionally, the matching unit is further specifically configured to:
determining thumbnails of a plurality of key frames in the key frame set of the map, wherein the difference between the thumbnail shooting parameters and the shooting parameters of the current frame meets the tolerance range;
determining an original image of the plurality of key frames;
and performing similarity matching according to the similarity between the pixel points on the original image of the current frame and the pixel points on the original image of the key frame.
With reference to any embodiment of the present invention, optionally, the first determining module further includes:
a second determining unit, configured to determine the at least one key frame in the key frame set according to shooting parameters, where the shooting parameters include at least one of the following: lens orientation, lens position, and shooting time; and/or,
the image of the at least one key frame includes: thumbnails and/or original images of the at least one key frame.
With reference to any embodiment of the present invention, optionally, the obtaining module includes:
a first determining unit, configured to determine an initial lens position and an initial lens orientation corresponding to the current frame according to a lens position and a lens orientation corresponding to the target key frame;
the first projection unit is used for projecting the thumbnail of the target key frame to the thumbnail of the current frame according to the initial lens position and the initial lens orientation corresponding to the current frame;
and the first processing unit is used for taking the lens position and the lens orientation during projection as the lens position and the lens orientation corresponding to the current frame when the matching degree of the pixel point of the thumbnail of the target key frame and the pixel point of the thumbnail of the current frame after projection reaches a preset value.
With reference to any embodiment of the present invention, optionally, the obtaining module further includes:
a second determining unit, configured to determine an initial lens position and an initial lens orientation corresponding to the current frame according to the lens position and the lens orientation corresponding to the target key frame;
the first matching unit is used for matching pixel points of the current frame and the target key frame according to the original image of the current frame and the original image of the target key frame to obtain a matched pixel point set;
the second projection unit is used for projecting the target key frame to the current frame according to the initial lens position, the initial lens orientation and the matched pixel point set;
and the second processing unit is used for determining the lens position and the lens orientation of the electronic equipment according to the projection result.
With reference to any embodiment of the present invention, optionally, the obtaining module further includes:
a third determining unit, configured to determine an initial lens position and an initial lens orientation corresponding to the current frame according to the lens position and the lens orientation corresponding to the target key frame;
the third projection unit is used for projecting the thumbnail of the target key frame to the thumbnail of the current frame according to the initial lens position and the initial lens orientation corresponding to the current frame;
the third processing unit is used for taking the lens position and the lens orientation during projection as the lens position and the lens orientation corresponding to the current frame when the matching degree of the pixel point of the thumbnail of the target key frame and the pixel point of the thumbnail of the current frame after projection reaches a preset value;
the first matching unit is used for matching pixel points of the current frame and the target key frame according to the original image of the current frame and the original image of the target key frame to obtain a matched pixel point set;
the third projection unit is used for projecting the target key frame onto the current frame according to the lens position and the lens orientation corresponding to the current frame and the matched pixel point set;
and the fourth processing unit is used for determining the lens position and the lens orientation of the electronic equipment according to the projection result.
With reference to any embodiment of the present invention, optionally, the apparatus further includes:
the second determination module is used for determining parameter information of an Inertial Measurement Unit (IMU) in the electronic equipment;
a third determining module, configured to re-determine the state of the IMU according to the lens position and the lens orientation of the electronic device and the parameter information;
and the second tracking module is used for tracking the IMU state of the electronic equipment according to the redetermined IMU state of the electronic equipment.
With reference to any embodiment of the present invention, optionally, the parameter information of the IMU includes at least one of: a velocity of the electronic device, an angular velocity bias of the IMU, and an acceleration bias of the IMU.
With reference to any embodiment of the present invention, optionally, the second determining module includes:
the first determining unit is used for determining the position and the orientation of the IMU (Inertial Measurement Unit) corresponding to each frame of image in a first video sub-segment in the video stream acquired by the lens of the electronic device;
a second determining unit, configured to determine the first angle variation according to an orientation of the IMU corresponding to the first frame and an orientation of the IMU corresponding to the second frame;
a third determining unit, configured to perform integration according to the angular velocity of the IMU in a time interval corresponding to the image of the first frame and the image of the second frame to obtain a second angle variation;
a first obtaining unit, configured to obtain an angular velocity deviation of the IMU corresponding to the first frame according to the first angle variation and the second angle variation;
wherein the second frame is a frame preceding the first frame.
With reference to any embodiment of the present invention, optionally, the second determining module further includes:
a fourth determining unit, configured to determine the first displacement variation according to the position of the IMU corresponding to the first frame and the position of the IMU corresponding to the second frame;
a fifth determining unit, configured to integrate, according to preset gravity information, the acceleration of the IMU within a time interval corresponding to the first frame image and the second frame image to obtain a second displacement variation;
and the second obtaining unit is used for obtaining the speed information corresponding to the first frame and the acceleration deviation of the IMU according to the first displacement variation, the second displacement variation, the first angle variation and the second angle variation.
With reference to any embodiment of the present invention, optionally, the apparatus further includes:
the verification module is used for verifying the parameter information of the IMU;
and the adjusting module is used for correspondingly adjusting the IMU according to the determined state of the IMU when the verification is passed.
With reference to any embodiment of the present invention, optionally, the apparatus further includes:
and the fourth determining module is used for determining a second video sub-segment in the video stream acquired by the electronic equipment lens when the verification fails, and re-determining the parameter information of the IMU based on the second video sub-segment, wherein the second video sub-segment is partially overlapped with or completely different from the first video sub-segment.
With reference to any embodiment of the present invention, optionally, the verification module is specifically configured to:
verifying whether the parameter information of the IMU meets at least one of the following conditions:
the acceleration deviation of the IMU is smaller than or equal to a first preset threshold, and the difference value between the acceleration deviation of the IMU and the historical offset of the accelerometer stored in the electronic equipment is smaller than a second preset threshold;
the difference between the speed of the electronic device and the speed determined at the time of the tracking loss is less than a third preset threshold.
A fourth aspect of the embodiments of the present invention provides an augmented reality AR engine, including:
the acquisition module is used for acquiring an output result of the SLAM system, wherein the output result comprises the state tracking information of the electronic equipment acquired by adopting the state tracking method of the electronic equipment;
and the drawing module is used for drawing the virtual object in the scene of the electronic equipment or the video stream shot by the scene according to the state tracking information.
A fifth aspect of an embodiment of the present invention provides an electronic device, including:
a memory for storing program instructions;
a processor for calling and executing the program instructions in the memory to perform the method steps of the first aspect.
A sixth aspect of embodiments of the present invention provides a readable storage medium, where a computer program is stored, and when at least one processor of an electronic device state tracking apparatus executes the computer program, the electronic device state tracking apparatus executes the electronic device state tracking method according to the first aspect.
A seventh aspect of the embodiments of the present invention provides an augmented reality AR control system, including: an electronic device, an augmented reality (AR) engine, and a simultaneous localization and mapping (SLAM) system, wherein the electronic device is in communication connection with the AR engine, the AR engine is the AR engine of the fourth aspect, and the SLAM system includes the electronic device state tracking apparatus of the third aspect.
According to the electronic device state tracking method and apparatus, the electronic device, and the control system provided above, the electronic device determines a target key frame matched with the current frame in the key frame set of the map generated by the SLAM system for the environment in which the electronic device is located, and obtains the lens position and lens orientation of the electronic device according to the current frame and the target key frame. Visual state tracking of the electronic device can then be performed based on the obtained lens position and lens orientation, so that the tracking state is recovered quickly after tracking is lost. The embodiments of the invention can recover tracking without the user perceiving the interruption, which greatly improves user experience.
Drawings
Fig. 1 is a schematic flowchart of a first embodiment of a method for tracking a state of an electronic device according to an embodiment of the present invention;
fig. 2 is a schematic flowchart of a second embodiment of a method for tracking a state of an electronic device according to the present invention;
fig. 3 is a schematic flowchart of a third embodiment of a method for tracking a state of an electronic device according to the present invention;
fig. 4 is a schematic flowchart of a fourth embodiment of a method for tracking a state of an electronic device according to the present invention;
fig. 5 is a schematic flowchart of a fifth embodiment of a method for tracking a state of an electronic device according to an embodiment of the present invention;
fig. 6 is a schematic flowchart of a sixth embodiment of a method for tracking a state of an electronic device according to an embodiment of the present invention;
fig. 7 is a schematic flowchart of a seventh embodiment of a method for tracking a state of an electronic device according to an embodiment of the present invention;
fig. 8 is a schematic flowchart of an eighth embodiment of a method for tracking a state of an electronic device according to an embodiment of the present invention;
fig. 9 is a schematic flowchart of a ninth embodiment of a method for tracking a state of an electronic device according to an embodiment of the present invention;
fig. 10 is a schematic flowchart of a tenth embodiment of a method for tracking a state of an electronic device according to an embodiment of the present invention;
fig. 11 is a schematic flowchart of an eleventh embodiment of a method for tracking a state of an electronic device according to the present invention;
fig. 12 is a block diagram of a first embodiment of a device for tracking status of an electronic device according to the present invention;
fig. 13 is a block diagram of a second embodiment of a device for tracking status of electronic equipment according to an embodiment of the present invention;
fig. 14 is a block diagram of a third embodiment of a device for tracking status of an electronic apparatus according to an embodiment of the present invention;
fig. 15 is a block diagram of a fourth embodiment of a device for tracking status of an electronic apparatus according to an embodiment of the present invention;
fig. 16 is a block diagram of a fifth embodiment of a device for tracking status of an electronic device according to an embodiment of the present invention;
fig. 17 is a block diagram of a sixth embodiment of an apparatus for tracking a status of an electronic device according to an embodiment of the present invention;
fig. 18 is a block diagram of a seventh embodiment of a device for tracking status of an electronic apparatus according to an embodiment of the present invention;
fig. 19 is a block diagram of an eighth embodiment of an apparatus for tracking a status of an electronic device according to an embodiment of the present invention;
fig. 20 is a block diagram of a ninth embodiment of an apparatus for tracking status of electronic devices according to the present invention;
fig. 21 is a block diagram of a tenth embodiment of an apparatus for tracking a status of an electronic device according to an embodiment of the present invention;
fig. 22 is a block diagram of an eleventh embodiment of an electronic device state tracking apparatus according to an embodiment of the present invention;
FIG. 23 is a block diagram of an AR engine according to an embodiment of the present invention;
FIG. 24 is a block diagram of an electronic device provided by an embodiment of the invention;
fig. 25 to 27 are schematic structural diagrams of an augmented reality AR control system according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
SLAM technology addresses a fundamental problem in the fields of Augmented Reality (AR), Virtual Reality (VR), robotics, and autonomous driving. For example, in an application scenario in which AR uses the output of the SLAM system, the SLAM system performs visual state tracking of the electronic device, and the AR engine invokes the SLAM system's output to draw virtual objects, thereby achieving the AR visual effect.
An embodiment of the invention provides an electronic device state tracking method: when tracking loss occurs, the lens position and lens orientation of the electronic device are first determined through visual relocalization; based on the determined lens position and lens orientation, the SLAM system can quickly resume normal tracking. Tracking is thus recovered without the user perceiving the interruption, greatly improving user experience.
Fig. 1 is a schematic flowchart of a first embodiment of a method for tracking a state of an electronic device according to an embodiment of the present invention, where an execution subject of the method is the electronic device, as shown in fig. 1, the method includes:
s101, determining a target key frame matched with a current frame with lost state tracking in a key frame set of a map generated by the SLAM system aiming at the environment where the electronic equipment is located.
Specifically, the embodiment of the present invention is applied to a scenario where the tracking loss occurs in the SLAM system, that is, when the tracking loss occurs in the SLAM system, the electronic device performs tracking recovery through the method steps related to the embodiment of the present invention.
As an alternative embodiment, the electronic device may determine whether the SLAM system has lost tracking according to the difference between the current frame and the key frames. Specifically, during operation of the SLAM system, frames produced at runtime are stored according to certain criteria; for example, frames whose image quality or viewing angle meets a given requirement are selected for storage as key frames. While the SLAM system runs, the current frame is compared with the key frames, and if the difference between them exceeds a certain threshold, the SLAM system is determined to have lost tracking. Illustratively, the difference between the current frame and a key frame can be obtained by comparing the pixel points of the images.
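As a minimal illustration of this loss-detection principle, the following Python sketch compares grayscale frames of equal size by mean absolute pixel difference; the threshold value and the per-key-frame comparison are illustrative assumptions:

```python
import numpy as np

def tracking_lost(current_gray, keyframe_grays, diff_threshold=0.35):
    """Declare tracking lost when the current frame differs from every
    stored key frame by more than the threshold."""
    cur = current_gray.astype(np.float32) / 255.0
    for kf in keyframe_grays:
        diff = float(np.mean(np.abs(cur - kf.astype(np.float32) / 255.0)))
        if diff <= diff_threshold:
            return False   # some key frame is still close enough: tracking holds
    return True            # every key frame differs too much: tracking is lost
```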
Furthermore, after it is determined that the SLAM system is lost in tracking, the electronic device may determine a target key frame matched with the current frame lost in tracking according to a key frame set of a map generated by the SLAM system for an environment where the electronic device is located.
The key frame set of the map specifically refers to a set of frames formed by key frames, and the key frames can be obtained through the process of forming the key frames.
Optionally, the electronic device may select a target keyframe matched with the current frame from the keyframe set of the map according to the matching degree, the orientation matching degree, and the like of the pixel points.
And S102, obtaining the lens position and the lens orientation of the electronic equipment according to the target key frame and the current frame.
Based on the determined target key frame and the current frame, a lens position and a lens orientation of the electronic device can be obtained, specifically, the lens position and the lens orientation of the electronic device when the current frame is captured are obtained.
The environment tracked by the SLAM system is a three-dimensional space. Accordingly, the position of the electronic device is given as a three-dimensional coordinate, and the lens orientation of the electronic device is its orientation relative to a reference plane, for example the ground. The lens orientation is formed by the combination of the device's angles about the three coordinate axes, and can be represented by a normal vector.
The lens position of the electronic device is also the position of the electronic device itself: since the electronic device is generally a rigid body, its position may be taken as the lens position.
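A minimal sketch of the pose representation implied above (the field names are assumptions):

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class LensPose:
    position: np.ndarray     # (x, y, z) three-dimensional coordinate in the map
    orientation: np.ndarray  # angles about the three coordinate axes (radians),
                             # equivalently representable as a normal vector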
And S103, tracking the visual state of the electronic equipment based on the determined lens position and lens orientation of the electronic equipment.
When the tracking is lost, the electronic device cannot acquire the lens position and the lens orientation, and after the steps, the electronic device can quickly acquire the lens position and the lens orientation, and can normally track based on the lens position and the lens orientation.
In this embodiment, the electronic device determines a target key frame matched with the current frame in a key frame set of a map generated by the SLAM system for an environment where the electronic device is located, and obtains a lens position and a lens orientation of the electronic device according to the current frame and the target key frame, and then can perform visual state tracking of the electronic device based on the obtained lens position and lens orientation of the electronic device, so that after tracking loss is realized, tracking can be quickly recovered under the condition that a user does not perceive, and user experience is greatly improved.
After that, the electronic device may further restore the state of the IMU, and a specific restoration process will be described in detail in the following embodiments.
On the basis of the above embodiments, the present embodiment relates to a specific method for determining, by an electronic device, a target key frame matching a current frame with a tracking loss from a key frame set of a map generated by a SLAM system for an environment where the electronic device is located.
Fig. 2 is a schematic flowchart of a second embodiment of the method for tracking a state of an electronic device according to the embodiment of the present invention, and as shown in fig. 2, the step S101 includes:
s201, carrying out similarity matching on the image of the current frame and the image of at least one key frame in the key frame set.
Optionally, the similarity matching may be performed through the current frame and the original image of the key frame, or the similarity matching may be performed through the thumbnail of the current frame and the key frame, or the similarity matching may be performed through the original image of the current frame and the key frame and the thumbnail.
In addition, the matching may be performed on the basis of different shooting parameters, such as the lens position, the lens orientation, or the shooting time.
The above-described aspects will be specifically explained in the following examples.
S202, determining a target key frame matched with the current frame which is lost in tracking in the key frame set according to the matching similarity.
In an alternative manner, after determining the matching degree between the current frame and the key frame, the electronic device may use the key frame with the highest similarity in the key frame set as the target key frame.
In another alternative, the electronic device may also use, as the target key frame, a key frame in the key frame set, which has a similarity exceeding a set threshold. Specifically, the electronic device may preset a threshold, and then perform similarity matching between the current frame and the key frames in the key frame set, and once the similarity between a certain key frame and the current frame reaches the threshold, take the key frame as a target key frame, without performing similarity matching on other key frames in the key frame set.
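Both selection strategies can be sketched as follows, where `similarity` stands for any pixel- or image-level score and all names are illustrative:

```python
def select_target_key_frame(current, key_frames, similarity, threshold=None):
    # Variant 2: early exit as soon as one key frame exceeds the threshold,
    # skipping similarity matching for the remaining key frames.
    if threshold is not None:
        for kf in key_frames:
            if similarity(current, kf) >= threshold:
                return kf
        return None   # no key frame qualifies
    # Variant 1: exhaustive search for the most similar key frame.
    return max(key_frames, key=lambda kf: similarity(current, kf))
```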
A specific method of performing similarity matching in step S201 described above is described below.
Fig. 3 is a schematic flowchart of a third embodiment of a method for tracking a state of an electronic device according to an embodiment of the present invention, and as shown in fig. 3, a first optional manner of the step S201 is as follows:
and S301, determining thumbnails of a plurality of key frames in the key frame set of the map, wherein the difference between the thumbnail shooting parameters and the shooting parameters of the current frame meets the tolerance range.
And S302, performing similarity matching according to the similarity between the pixel point on the thumbnail of the current frame and the pixel point on the thumbnail of the key frame.
The thumbnail of the current frame refers to an image formed after the image of the current frame is compressed, and the thumbnail of the key frame refers to an image formed after the image of the key frame is compressed. The thumbnail is obviously reduced in size compared with the original image of the current frame, so that when the key frame is stored in advance, the electronic equipment can store the thumbnail of the key frame besides the original image of the key frame.
Furthermore, in this embodiment, the thumbnails of the key frames in the key frame set of the map and the thumbnail of the current frame are the comparison objects. The key frames in the set are first screened: thumbnails are kept for the key frames whose shooting parameters differ from those of the current frame within the tolerance range. For example, if the shooting parameter is the lens position, the thumbnails of the key frames whose lens-position difference from the current frame is smaller than a specific threshold can be selected. Pixel points of these key-frame thumbnails are then compared with those of the current frame's thumbnail to obtain the similarity-matching result.
In this embodiment, the key frames in the key frame set of the map are first screened according to their shooting parameters, and similarity matching is then performed between the current frame and the screened key frames. Screening greatly reduces the number of key frames that need similarity matching, which reduces the computation and time required and improves processing efficiency. In addition, because thumbnails are small, screening based on thumbnails further improves the efficiency of image screening.
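A sketch of this two-stage screen-then-match flow, assuming key-frame records carry a `lens_position` and a `thumbnail` field (both names hypothetical) and using negative mean absolute difference as the pixel-level similarity:

```python
import numpy as np

def match_by_thumbnails(cur_thumb, cur_lens_position, key_frames, tolerance, top_k=5):
    # Stage 1: screen by shooting parameter (lens position in this example).
    candidates = [kf for kf in key_frames
                  if np.linalg.norm(kf.lens_position - cur_lens_position) < tolerance]
    # Stage 2: rank survivors by thumbnail pixel similarity.
    def score(kf):
        return -float(np.mean(np.abs(cur_thumb.astype(np.float32) -
                                     kf.thumbnail.astype(np.float32))))
    return sorted(candidates, key=score, reverse=True)[:top_k]
```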
Fig. 4 is a schematic flowchart of a fourth embodiment of the method for tracking a state of an electronic device according to the embodiment of the present invention, and as shown in fig. 4, a second optional manner of the step S201 is as follows:
and S401, determining the original images of a plurality of key frames in the key frame set of the map, wherein the difference between the original image shooting parameters and the current frame image shooting parameters meets the tolerance range.
And S402, performing similarity matching according to the similarity between the pixel points on the original image of the current frame and the pixel points on the original image of the key frame.
In the above steps, the original images of the key frames in the key frame set of the map and the original image of the current frame are the comparison objects. The key frames are first screened: the original images of the key frames whose shooting parameters differ from those of the current frame within the tolerance range are selected. For example, if the shooting parameter is the lens orientation, the original images of the key frames whose lens-orientation difference from the current frame is smaller than a specific threshold can be selected. The pixel points of these key-frame originals are then compared with those of the current frame's original image to obtain the similarity-matching result.
In this embodiment, the key frames in the key frame set of the map are first screened according to their shooting parameters, and similarity matching is then performed between the current frame and the screened key frames. Screening greatly reduces the number of key frames that need similarity matching, which reduces the computation and time required and improves processing efficiency. In addition, because the original image contains complete information, screening based on the original images makes the similarity-matching result more accurate.
Fig. 5 is a schematic flowchart of a fifth embodiment of the method for tracking a state of an electronic device according to the embodiment of the present invention, and as shown in fig. 5, a third optional manner of the step S201 is as follows:
and S501, determining thumbnails of a plurality of key frames in the key frame set of the map, wherein the difference between the thumbnail shooting parameters and the shooting parameters of the current frame meets the tolerance range.
And S502, determining the original pictures of the plurality of key frames.
And S503, performing similarity matching according to the similarity between the pixel points on the original image of the current frame and the pixel points on the original image of the key frame.
In this embodiment, thumbnails and original images are combined for similarity matching: a plurality of key frames are first screened out of the key frame set based on their thumbnails, their original images are then obtained, and similarity matching is performed based on the pixel points of those original images.
Alternatively, as described above, the shooting parameters may specifically be lens position, lens orientation, shooting time, and the like. Before executing any of the embodiments of fig. 3-5, the electronic device further determines the at least one key frame in the key frame set according to a shooting parameter, where the shooting parameter includes at least one of: lens orientation, lens position, and shooting time; and/or, the image of the at least one key frame comprises: a thumbnail and/or artwork of the at least one key frame.
On the basis of the above embodiments, the present embodiment relates to a specific method for obtaining a lens position and a lens orientation of an electronic device by the electronic device based on a target key frame and a current frame.
Alternatively, the processing can be done in three ways:
1. processing based on the thumbnail of the target key frame and the thumbnail of the current frame;
2. processing based on the original image of the target key frame and the original image of the current frame;
3. processing based on both the thumbnails and the original images of the target key frame and the current frame.
The following description is made separately.
Fig. 6 is a schematic flowchart of a sixth embodiment of a method for tracking a state of an electronic device according to an embodiment of the present invention, and as shown in fig. 6, a first optional manner of step S102 is as follows:
s601, determining an initial lens position and an initial lens orientation corresponding to the current frame according to the lens position and the lens orientation corresponding to the target key frame.
Specifically, the lens position corresponding to the target key frame is taken as the initial lens position corresponding to the current frame, and the lens orientation corresponding to the target key frame is taken as the initial lens orientation corresponding to the current frame.
And S602, projecting the thumbnail of the target key frame to the thumbnail of the current frame according to the initial lens position and the initial lens orientation corresponding to the current frame.
And S603, if the matching degree of the pixel points of the thumbnail of the target key frame after projection and the pixel points of the thumbnail of the current frame reaches a preset value, taking the lens position and the lens orientation during projection as the lens position and the lens orientation corresponding to the current frame.
Steps S602 to S603 may form an iterative process. When the iteration starts, the thumbnail of the target key frame is projected onto the thumbnail of the current frame according to the initial lens position and initial lens orientation corresponding to the current frame. After each projection, the pixel-point matching degree between the thumbnail of the current frame and the projected thumbnail of the target key frame is calculated, and the lens position and lens orientation corresponding to the current frame are adjusted according to the matching degree and used as input for the next iteration. The iteration ends when the matching degree between the projected thumbnail of the target key frame and the thumbnail of the current frame reaches a preset value; the lens position and lens orientation of that iteration are taken as the final lens position and lens orientation.
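A schematic sketch of this iteration, assuming a `warp(image, pose)` projection routine and a 6-parameter pose vector (both assumptions); the matching score here is one minus the normalized mean absolute difference, and the perturbation search stands in for whatever pose adjustment the system actually uses:

```python
import itertools
import numpy as np

def refine_pose_by_thumbnail(kf_thumb, cur_thumb, warp, pose0,
                             target_match=0.9, step=0.01, max_iters=50):
    def match(pose):
        proj = warp(kf_thumb, pose)  # project key-frame thumbnail with this pose
        return 1.0 - float(np.mean(np.abs(proj.astype(np.float32) -
                                          cur_thumb.astype(np.float32)))) / 255.0
    pose, best = pose0.copy(), match(pose0)
    for _ in range(max_iters):
        if best >= target_match:
            break                        # matching degree reached the preset value
        improved = False
        for axis, sign in itertools.product(range(6), (+1, -1)):
            cand = pose.copy()
            cand[axis] += sign * step    # perturb one pose parameter
            s = match(cand)
            if s > best:
                pose, best, improved = cand, s, True
        if not improved:
            step *= 0.5                  # shrink the search when stuck
    return pose, best
```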
Fig. 7 is a schematic flowchart of a seventh embodiment of a method for tracking a state of an electronic device according to an embodiment of the present invention, and as shown in fig. 7, a second optional manner of the step S102 is as follows:
and S701, determining an initial lens position and an initial lens orientation corresponding to the current frame according to the lens position and the lens orientation corresponding to the target key frame.
Specifically, the lens position corresponding to the target key frame is taken as the initial lens position corresponding to the current frame, and the lens orientation corresponding to the target key frame is taken as the initial lens orientation corresponding to the current frame.
And S702, performing pixel point matching on the current frame and the target key frame according to the original image of the current frame and the original image of the target key frame, and acquiring a matched pixel point set.
In this step, pixel-point matching is performed between the original image of the current frame and the original image of the target key frame. The pixel points used for matching can be called feature points, that is, pixel points that represent the features of the current frame's original image. The electronic device may first determine the feature points in the original image of the current frame and then find the matching pixel points in the original image of the target key frame. The matched pixel points form a matched pixel point set, which contains the matched pixel points of the current frame's original image and those of the target key frame's original image.
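A sketch of this feature-point matching step using ORB features via OpenCV, one concrete detector choice that the text above does not prescribe:

```python
import cv2

def match_feature_points(cur_img, kf_img, max_matches=200):
    """Detect features in both original images, match descriptors, and
    return the matched pixel point set as (current, key-frame) pairs."""
    orb = cv2.ORB_create(nfeatures=1000)
    kp1, des1 = orb.detectAndCompute(cur_img, None)
    kp2, des2 = orb.detectAndCompute(kf_img, None)
    if des1 is None or des2 is None:
        return []                        # no features found in one image
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    return [(kp1[m.queryIdx].pt, kp2[m.trainIdx].pt) for m in matches[:max_matches]]
```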
And S703, projecting the target key frame to the current frame according to the initial lens position, the initial lens orientation and the matched pixel point set.
And S704, determining the lens position and the lens orientation of the electronic equipment according to the projection result.
Steps S703 to S704 may form an iterative process. When the iteration starts, the matched pixel points of the target key frame are projected onto the matched pixel points of the current frame according to the initial lens position and initial lens orientation. After each projection, the matching degree between the matched pixel points of the current frame and the projected matched pixel points of the target key frame is calculated, and the lens position and lens orientation corresponding to the current frame are adjusted according to the matching degree and used as input for the next iteration. The iteration ends when the matching degree reaches a preset target value; the lens position and lens orientation of that iteration are taken as the final lens position and lens orientation.
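A sketch of this projection-and-adjustment loop realized with OpenCV's PnP solver, assuming the key frame's matched pixels correspond to known 3D map points; minimizing reprojection error this way is one standard realization, not necessarily the patented iteration:

```python
import cv2
import numpy as np

def refine_pose_from_matches(map_points_3d, cur_points_2d, K, rvec0, tvec0):
    """Refine the lens pose from the matched pixel point set, starting
    from the initial pose taken from the target key frame."""
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        np.asarray(map_points_3d, dtype=np.float64),   # 3D points seen in key frame
        np.asarray(cur_points_2d, dtype=np.float64),   # matched pixels in current frame
        K, None,                                       # intrinsics, no distortion
        rvec=rvec0, tvec=tvec0, useExtrinsicGuess=True)
    return (rvec, tvec) if ok else (rvec0, tvec0)      # fall back to initial pose
```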
Fig. 8 is a schematic flowchart of an eighth embodiment of a method for tracking a state of an electronic device according to an embodiment of the present invention, and as shown in fig. 8, a third optional manner of the step S102 is as follows:
and S801, determining an initial lens position and an initial lens orientation corresponding to the current frame according to the lens position and the lens orientation corresponding to the target key frame.
First, the processing idea of this embodiment is as follows: an initial lens position and lens orientation are assigned to the current frame, and the target key frame is then repeatedly projected onto the current frame. Each iteration determines a new lens position and lens orientation for the current frame, bringing the estimate closer to the correct lens position and lens orientation. The iteration proceeds in two stages, thumbnail projection and original-image projection: iterating on the smaller thumbnails first yields an intermediate result, which is then used as input to iterations on the original images to obtain an accurate result.
Specifically, the lens position corresponding to the target key frame is taken as the initial lens position corresponding to the current frame, and the initial lens orientation corresponding to the target key frame is taken as the initial lens orientation corresponding to the current frame.
And S802, projecting the thumbnail of the target key frame to the thumbnail of the current frame according to the initial lens position and the initial lens orientation corresponding to the current frame.
And S803, if the matching degree between the projected pixel points of the thumbnail of the target key frame and the pixel points of the thumbnail of the current frame reaches a preset value, taking the lens position and the lens orientation used during that projection as the intermediate lens position and intermediate lens orientation corresponding to the current frame.
As described above, the iterative processing may be divided into two stages: a thumbnail iteration stage and an original-image iteration stage. Steps S801 to S803 constitute the thumbnail iteration stage, of which steps S802 to S803 are the specific iterative loop.
Specifically, when iteration starts, the thumbnail of the target key frame is first projected onto the thumbnail of the current frame according to the initial lens position and initial lens orientation corresponding to the current frame. After projection, the pixel point matching degree between the thumbnail of the current frame and the projected thumbnail of the target key frame is calculated, and the lens position and lens orientation corresponding to the current frame are adjusted according to the matching degree to serve as the input of the next iteration. The end condition of this iteration stage is: when, in a certain iteration, the matching degree between the projected thumbnail of the target key frame and the thumbnail of the current frame reaches the preset value, the iteration ends.
This iteration stage is the thumbnail stage, that is, projection is performed on the thumbnails. Because the thumbnails are small, computation time and effort are saved, and when the stage ends, a result relatively close to the correct lens position and lens orientation is obtained, namely the intermediate lens position and intermediate lens orientation. The original-image iteration stage is then entered, starting from this intermediate lens position and intermediate lens orientation.
S804, according to the original image of the current frame and the original image of the target key frame, pixel point matching is conducted on the current frame and the target key frame, and a matched pixel point set is obtained.
Steps S804 to S806 are iteration stages of the original image.
In this step, pixel point matching is performed between the original image of the current frame and the original image of the target key frame. The pixel points used for matching can be called feature points, that is, pixel points that represent features of the original image of the current frame. The electronic device may first determine feature points in the original image of the current frame, and then find the matched pixel points in the original image of the target key frame. The matched pixel points form a matched pixel point set, which includes the matched pixel points of the original image of the current frame and the matched pixel points of the original image of the target key frame.
And S805, projecting the target key frame onto the current frame according to the intermediate lens position, the intermediate lens orientation and the matched pixel point set.
And S806, determining the lens position and the lens orientation of the electronic equipment according to the projection result.
Steps S805 to S806 are the specific iterative loop. When iteration starts, the matched pixel points of the target key frame are projected onto the current frame according to the intermediate lens position and intermediate lens orientation determined in the thumbnail iteration stage. After projection, the matching degree between the matched pixel points of the current frame and the projected matched pixel points of the target key frame is calculated, and the lens position and lens orientation corresponding to the current frame are adjusted according to the matching degree to serve as the input of the next iteration. The end condition of this iteration stage is: when, in a certain iteration, the matching degree between the projected matched pixel points of the target key frame and the matched pixel points of the current frame reaches a preset target value, the iteration ends.
In this embodiment, the accurate lens position and lens orientation corresponding to the current frame are determined through two stages: thumbnail projection, followed by pixel point projection matching on the original image. Projection is first performed on the small-size thumbnail to obtain lens position and lens orientation information close to the accurate values, and projection is then performed on the original image, which contains all the feature point information, to obtain the accurate lens position and lens orientation. This saves processing time while still ensuring the accuracy of the determined lens position and lens orientation.
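Combining the two stages, the coarse-to-fine scheme can be sketched as follows, reusing the `refine_pose` helper from the earlier sketch. The 1/8 thumbnail scale and the `kf`/`cur` containers holding per-scale map points and matched pixels are illustrative assumptions, not values fixed by the embodiment.

```python
def coarse_to_fine_pose(kf, cur, rvec0, tvec0, K, scale=1.0 / 8.0):
    # Stage 1: thumbnail iteration -- few pixels, cheap, and it lands
    # close to the correct answer (the intermediate lens pose).
    K_thumb = K.copy()
    K_thumb[:2] *= scale                  # intrinsics shrink with the image
    rvec_mid, tvec_mid = refine_pose(kf.thumb_points, cur.thumb_px,
                                     rvec0, tvec0, K_thumb)
    # Stage 2: original-image iteration, seeded with the intermediate pose,
    # uses all feature point information to reach the accurate result.
    return refine_pose(kf.points, cur.px, rvec_mid, tvec_mid, K)
```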
The following is a specific process of the electronic device recovering the status of the IMU.
After the electronic device resumes tracking, the state of the IMU needs to be restored, i.e., re-evaluated. Optionally, the state recovery of the IMU may be performed some time after tracking is recovered, i.e., after the electronic device has undergone a certain amount of movement.
Fig. 9 is a flowchart illustrating a ninth embodiment of a method for tracking a state of an electronic device according to an embodiment of the present invention, where as shown in fig. 9, an IMU state recovery process includes:
S901, determining parameter information of an IMU in the electronic equipment.
Wherein the parameter information includes: at least one of a velocity of the electronic device, an angular velocity bias of the IMU, and an acceleration bias of the IMU.
In particular, the state of the IMU is used to represent a motion state of the electronic device, wherein the state of the IMU is determined according to relevant parameters of the IMU.
Relevant parameters of the IMU include: a lens position of the electronic device, a lens orientation of the electronic device, a direction of gravity, a velocity of the electronic device, an angular velocity bias of the IMU, and an acceleration bias of the IMU.
The lens position and lens orientation of the electronic device can be determined by the method described above, and the gravity direction can be obtained from the visual estimation performed before tracking recovery. Therefore, in this embodiment, the velocity of the electronic device, the angular velocity deviation of the IMU, and the acceleration deviation of the IMU still need to be determined.
And S902, re-determining the state of the IMU according to the lens position and the lens orientation of the electronic equipment and the parameter information.
After the parameter information of the IMU is obtained, the state of the IMU can be obtained.
And S903, tracking the IMU state of the electronic equipment according to the re-determined IMU state of the electronic equipment.
The following describes in detail the process of the electronic device determining the velocity of the electronic device, the angular velocity deviation of the IMU, and the acceleration deviation of the IMU.
Fig. 10 is a schematic flowchart of a tenth embodiment of a method for tracking an electronic device state according to an embodiment of the present invention, and as shown in fig. 10, a process of determining an angular velocity deviation of an IMU by an electronic device includes:
S1001, determining the position and the orientation of an IMU corresponding to each frame of image in a first video sub-segment in a video stream acquired by a lens of the electronic equipment.
Each frame of image here refers specifically to each frame of image in the first video sub-segment captured by the electronic device while the state of the IMU is being restored after tracking recovery. After the SLAM system resumes normal tracking, the lens position and lens orientation of the electronic device at the time each frame of image is captured can be acquired.
Further, in this step, the position and orientation of the IMU corresponding to each frame of image may be determined according to the lens position and lens orientation of the electronic device corresponding to that frame of image.
S1002, determining a first angle variation according to the orientation of the IMU corresponding to the first frame and the orientation of the IMU corresponding to the second frame.
The first frame may be any frame acquired after the IMU state recovery processing starts, and the second frame is the frame immediately before the first frame; that is, in this embodiment the angular velocity deviation of the IMU is acquired by comparing two adjacent frames.
As described above, the lens orientation is composed of angles about three coordinate axes, and the same holds for the orientation of the IMU, so an angle variation, namely the first angle variation, can be determined from the IMU orientations of the first frame and the second frame.
And S1003, integrating the angular speed of the IMU in the time interval corresponding to the image of the first frame and the image of the second frame to obtain a second angle variation.
Wherein the angular velocity of the IMU may be read from the gyroscope.
And S1004, acquiring the angular speed deviation of the IMU corresponding to the first frame according to the first angle variation and the second angle variation.
The angular velocity deviation of the IMU corresponding to the first frame obtained in this step is taken as the angular velocity deviation of the IMU.
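In effect, steps S1002-S1004 compare the rotation observed by vision with the rotation integrated from the gyroscope. A small-angle Python sketch follows; the constant sampling interval `dt` and the rotation-vector arithmetic are simplifying assumptions of the sketch.

```python
import numpy as np
from scipy.spatial.transform import Rotation


def angular_velocity_deviation(q_second, q_first, gyro_samples, dt):
    """Estimate the IMU angular velocity deviation between two adjacent frames.

    q_second, q_first: IMU orientations (quaternions) of the second frame
                       (the earlier one) and the first frame
    gyro_samples:      (N, 3) angular velocities read from the gyroscope
                       during the interval between the two frames
    dt:                IMU sampling interval in seconds
    """
    # First angle variation: relative rotation between the IMU orientations.
    delta_vision = (Rotation.from_quat(q_second).inv()
                    * Rotation.from_quat(q_first)).as_rotvec()
    # Second angle variation: integral of the measured angular velocity.
    delta_gyro = gyro_samples.sum(axis=0) * dt
    # The mismatch, averaged over the integration time, approximates the bias.
    return (delta_gyro - delta_vision) / (len(gyro_samples) * dt)
```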
Fig. 11 is a schematic flowchart of an eleventh embodiment of a method for tracking an electronic device status according to an embodiment of the present invention, where as shown in fig. 11, a process of determining an acceleration deviation of an IMU by an electronic device is as follows:
S1101, determining a first displacement variation according to the position of the IMU corresponding to the first frame and the position of the IMU corresponding to the second frame.
And S1102, according to preset gravity information, integrating the acceleration of the IMU in the time interval corresponding to the first frame image and the second frame image to obtain a second displacement variation.
The acceleration of the IMU may be read from an accelerometer.
And S1103, acquiring the speed corresponding to the first frame and the acceleration deviation of the IMU according to the first displacement variation, the second displacement variation, the first angle variation and the second angle variation.
The speed corresponding to the first frame and the acceleration deviation of the IMU obtained here are the speed of the electronic equipment and the acceleration deviation of the IMU, respectively.
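Analogously, steps S1101-S1103 compare the displacement recovered by vision with the displacement obtained by integrating the accelerometer. One frame pair gives three equations in six unknowns (velocity and acceleration deviation), so the sketch below stacks several consecutive frame pairs into a single linear least-squares problem. The constant interval `T`, the per-interval mean acceleration, and treating the velocity as approximately constant over the short window are simplifying assumptions of this sketch.

```python
import numpy as np


def velocity_and_acceleration_deviation(positions, rotations, accel_means,
                                        gravity, T):
    """Jointly solve for the initial velocity v0 and a constant accelerometer
    deviation b from consecutive frame pairs.

    positions:   list of IMU positions p_k recovered from the lens poses
    rotations:   list of 3x3 body-to-world rotation matrices R_k
    accel_means: mean body-frame accelerometer reading over each interval
    gravity:     gravity vector from the earlier visual estimate
    T:           duration of each frame interval (assumed constant)
    """
    A_rows, b_rows = [], []
    for k in range(len(positions) - 1):
        # Model: p_{k+1} - p_k ~= v0*T + 0.5*(R_k @ (a_k - b) + g)*T^2
        lhs = (positions[k + 1] - positions[k]
               - 0.5 * gravity * T ** 2
               - 0.5 * (rotations[k] @ accel_means[k]) * T ** 2)
        A_rows.append(np.hstack([np.eye(3) * T, -0.5 * rotations[k] * T ** 2]))
        b_rows.append(lhs)
    x, *_ = np.linalg.lstsq(np.vstack(A_rows), np.hstack(b_rows), rcond=None)
    return x[:3], x[3:]   # velocity of the device, accelerometer deviation
```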
Further, as an optional implementation manner, after the electronic device determines the speed of the electronic device, the angular velocity deviation of the IMU, and the acceleration deviation of the IMU, the correctness of the information may also be verified. Namely, the electronic device may verify the parameter information of the IMU, and if the verification is passed, correspondingly adjust the IMU according to the determined IMU state.
And if the verification fails, determining a second video sub-segment in the video stream acquired by the lens of the electronic equipment, and re-determining the parameter information of the IMU based on the second video sub-segment, wherein the second video sub-segment is partially overlapped with or completely different from the first video sub-segment.
Specifically, since hardware signals from the gyroscope, the accelerometer and the like may suffer from excessive noise, a video segment may be reselected to determine the parameter information of the IMU when verification fails. For example, if the first video sub-segment corresponds to frames 1 to 10 and verification fails, the selected second video sub-segment may be frames 4 to 10, i.e., partially overlapping the first video sub-segment.
It should be noted that each of the video sub-segments includes at least two frames.
Optionally, the electronic device may verify whether the parameter information of the IMU satisfies the following condition:
(1) the acceleration deviation of the IMU is smaller than or equal to a first preset threshold, and the difference value between the acceleration deviation of the IMU and the historical offset of the accelerometer stored in the electronic equipment is smaller than a second preset threshold.
The first preset threshold is a preset value range, and the historical offset may be an offset of the accelerometer obtained during SLAM initialization or an offset of the accelerometer obtained at a certain time before the IMU state is restored.
(2) The difference between the speed of the electronic device and the speed determined during tracking is smaller than a third preset threshold.
The speed determined during tracking may be speed information obtained by selecting two frames of images for visual positioning during tracking before the status of the IMU is restored.
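These checks reduce to a few threshold comparisons. A minimal sketch follows; the default threshold values are placeholders standing in for the first, second, and third preset thresholds, which the embodiment leaves to the implementer.

```python
import numpy as np


def verify_imu_parameters(accel_deviation, stored_offset,
                          velocity, tracked_velocity,
                          first_thr=0.5, second_thr=0.1, third_thr=0.2):
    """Return True if the recovered IMU parameter information passes both
    condition (1) and condition (2) above."""
    cond1 = (np.linalg.norm(accel_deviation) <= first_thr and
             np.linalg.norm(accel_deviation - stored_offset) < second_thr)
    cond2 = np.linalg.norm(velocity - tracked_velocity) < third_thr
    return cond1 and cond2
```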
Fig. 12 is a block diagram of a first embodiment of an apparatus for tracking status of electronic devices according to an embodiment of the present invention, as shown in fig. 12, the apparatus includes:
a first determining module 1201, configured to determine, in a key frame set of a map generated by the SLAM system for an environment where the electronic device is located, a target key frame that matches a current frame with a loss of state tracking.
An obtaining module 1202, configured to determine a lens position and a lens orientation of the electronic device according to the target key frame and the current frame.
A first tracking module 1203, configured to perform visual state tracking of the electronic device based on the determined lens position and lens orientation.
Fig. 13 is a block diagram of a second embodiment of a state tracking apparatus for electronic devices according to an embodiment of the present invention, and as shown in fig. 13, the first determining module 1201 includes:
a matching unit 12011, configured to perform similarity matching between the image of the current frame and the image of at least one key frame in the key frame set.
A first determining unit 12012, configured to determine, according to the matching similarity, a target key frame that matches the current frame with the tracking loss in the key frame set.
In another embodiment, the first determining unit 12012 is specifically configured to:
taking the key frame with the highest similarity in the key frame set as the target key frame; or,
and taking the key frames with the similarity exceeding a set threshold value in the key frame set as the target key frames.
In another embodiment, the matching unit 12011 is specifically configured to:
determining thumbnails of a plurality of key frames in the key frame set of the map, wherein the difference between the thumbnail shooting parameters and the shooting parameters of the current frame meets the tolerance range;
and performing similarity matching according to the similarity between the pixel point on the thumbnail of the current frame and the pixel point on the thumbnail of the key frame.
In another embodiment, the matching unit 12011 is further specifically configured to:
determining the original images of a plurality of key frames in the key frame set of the map, wherein the difference between the original image shooting parameters and the current frame shooting parameters meets the tolerance range;
and performing similarity matching according to the similarity between the pixel points on the original image of the current frame and the pixel points on the original image of the key frame.
In another embodiment, the matching unit 12011 is further specifically configured to:
determining thumbnails of a plurality of key frames in the key frame set of the map, wherein the difference between the thumbnail shooting parameters and the shooting parameters of the current frame meets the tolerance range;
determining an original image of the plurality of key frames;
and performing similarity matching according to the similarity between the pixel points on the original image of the current frame and the pixel points on the original image of the key frame.
Fig. 14 is a block diagram of a third module of an embodiment of an apparatus for tracking a state of an electronic device according to an embodiment of the present invention, and as shown in fig. 14, the first determining module 1201 further includes:
a second determining unit 12013, configured to determine the at least one key frame in the key frame set according to a shooting parameter, where the shooting parameter includes at least one of: lens orientation, lens position, and shooting time; and/or the presence of a gas in the gas,
the image of the at least one key frame includes: thumbnails and/or artwork of the at least one key frame.
Fig. 15 is a block diagram of a fourth embodiment of an electronic device state tracking apparatus according to an embodiment of the present invention, and as shown in fig. 15, an obtaining module 1202 includes:
a first determining unit 12021, configured to determine an initial lens position and an initial lens orientation corresponding to the current frame according to the lens position and the lens orientation corresponding to the target key frame;
a first projection unit 12022, configured to project the thumbnail of the target key frame onto the thumbnail of the current frame according to the initial lens position and the initial lens orientation corresponding to the current frame;
the first processing unit 12023 is configured to, when the matching degree between the pixel point of the thumbnail of the target key frame and the pixel point of the thumbnail of the current frame after projection reaches a preset value, take the lens position and the lens orientation during projection as the lens position and the lens orientation corresponding to the current frame.
Fig. 16 is a block diagram of a fifth module structure of an embodiment of an apparatus for tracking a state of an electronic device according to an embodiment of the present invention, and as shown in fig. 16, the obtaining module 1202 further includes:
a second determining unit 12024, configured to determine an initial lens position and an initial lens orientation corresponding to the current frame according to the lens position and the lens orientation corresponding to the target key frame;
a first matching unit 12025, configured to perform pixel matching on the current frame and the target key frame according to the original image of the current frame and the original image of the target key frame, and obtain a matched pixel set;
a second projecting unit 12026, configured to project the target key frame onto the current frame according to the initial lens position, the initial lens orientation, and the matched pixel point set;
a second processing unit 12027, configured to determine a lens position and a lens orientation of the electronic device according to the projection result.
Fig. 17 is a block diagram of a sixth embodiment of an electronic device state tracking apparatus according to an embodiment of the present invention, and as shown in fig. 17, the obtaining module 1202 further includes:
a third determining unit 12028, configured to determine an initial lens position and an initial lens orientation corresponding to the current frame according to the lens position and the lens orientation corresponding to the target key frame;
a third projection unit 12029, configured to project the thumbnail of the target key frame onto the thumbnail of the current frame according to the initial lens position and the initial lens orientation corresponding to the current frame;
a third processing unit 120210, configured to, when a matching degree between a pixel point of the thumbnail of the target key frame and a pixel point of the thumbnail of the current frame after projection reaches a preset value, take a lens position and a lens orientation during projection as a lens position and a lens orientation corresponding to the current frame;
a second matching unit 120211, configured to perform pixel matching on the current frame and the target key frame according to the original image of the current frame and the original image of the target key frame, and obtain a matched pixel set;
a fourth projecting unit 120212, configured to project the target key frame onto the current frame according to the lens position and the lens orientation corresponding to the current frame and the matched pixel point set;
a fourth processing unit 120213, configured to determine a lens position and a lens orientation of the electronic device according to the projection result.
Fig. 18 is a block diagram of a seventh embodiment of a state tracking apparatus for electronic devices according to an embodiment of the present invention, as shown in fig. 18, further including:
a second determining module 1204, configured to determine parameter information of an inertial measurement unit IMU in the electronic device;
a third determining module 1205, configured to re-determine the state of the IMU according to the lens position and lens orientation of the electronic device and the parameter information;
a second tracking module 1206, configured to perform IMU state tracking of the electronic device according to the re-determined state of the IMU of the electronic device.
In another embodiment, the parameter information of the IMU includes at least one of: a velocity of the electronic device, an angular velocity bias of the IMU, and an acceleration bias of the IMU.
Fig. 19 is a block diagram of an eighth embodiment of an apparatus for tracking a status of an electronic device according to an embodiment of the present invention, and as shown in fig. 19, the second determining module 1204 includes:
the first determining unit 12041 is configured to determine a position and an orientation of an IMU corresponding to each frame of image in a first video subsection of a video stream captured by a lens of the electronic device.
The second determining unit 12042 is configured to determine the first angle variation according to the orientation of the IMU corresponding to the first frame and the orientation of the IMU corresponding to the second frame.
A third determining unit 12043, configured to perform integration according to the angular velocity of the IMU in the time interval corresponding to the image of the first frame and the image of the second frame to obtain a second angle variation.
A first obtaining unit 12044, configured to obtain an angular velocity deviation of the IMU corresponding to the first frame according to the first angle variation and the second angle variation;
wherein the second frame is a frame preceding the first frame.
Fig. 20 is a block diagram of a ninth embodiment of an electronic device state tracking apparatus according to an embodiment of the present invention, and as shown in fig. 20, the second determining module 1204 further includes:
a fourth determining unit 12045, configured to determine the first displacement variation according to the position of the IMU corresponding to the first frame and the position of the IMU corresponding to the second frame;
a fifth determining unit 12046, configured to integrate, according to preset gravity information, the acceleration of the IMU in a time interval corresponding to the first frame image and the second frame image to obtain a second displacement variation.
A second obtaining unit 12047, configured to obtain, according to the first displacement variation, the second displacement variation, the first angle variation, and the second angle variation, speed information corresponding to the first frame and an acceleration deviation of the IMU.
Fig. 21 is a block diagram of a tenth embodiment of an electronic device state tracking apparatus according to an embodiment of the present invention, as shown in fig. 21, further including:
a verifying module 1207, configured to verify the parameter information of the IMU.
An adjusting module 1208, configured to correspondingly adjust the IMU according to the determined status of the IMU when the verification passes.
Fig. 22 is a block diagram of an eleventh embodiment of an electronic device state tracking apparatus according to an embodiment of the present invention, as shown in fig. 22, further including:
a fourth determining module 1209, configured to determine, when the verification fails, a second video sub-segment in the video stream captured by the electronic device lens, and re-determine parameter information of the IMU based on the second video sub-segment, where the second video sub-segment is partially overlapped with or completely different from the first video sub-segment.
In another embodiment, the verification module 1207 is specifically configured to:
verifying whether the parameter information of the IMU meets at least one of the following conditions:
the acceleration deviation of the IMU is smaller than or equal to a first preset threshold, and the difference value between the acceleration deviation of the IMU and the historical offset of the accelerometer stored in the electronic equipment is smaller than a second preset threshold;
the difference between the speed of the electronic device and the speed determined at the time of the tracking is less than a third preset threshold.
Fig. 23 is a block diagram of an AR engine according to an embodiment of the present invention, and as shown in fig. 23, the AR engine includes:
an obtaining module 2301, configured to obtain an output result of the SLAM system, where the output result includes state tracking information of the electronic device obtained by using the state tracking method for the electronic device;
a drawing module 2302 is configured to draw a virtual object in a scene of the electronic device or a video stream captured by the scene according to the state tracking information.
Fig. 24 is a block diagram of an electronic device according to an embodiment of the present invention, and as shown in fig. 24, the electronic device includes:
a memory 2401 for storing program instructions; and
a processor for invoking and executing the program instructions in the memory 2401 to perform the method steps in the above method embodiments.
Fig. 25 is a schematic diagram of an architecture of an augmented reality AR control system according to an embodiment of the present invention, and as shown in fig. 25, the system 50 includes:
an electronic device 2501, an AR engine 2502, and a SLAM system 2503 that are communicatively connected.
The AR engine 2502 is the AR engine shown in fig. 23, and the SLAM system 2503 includes the electronic device state tracking apparatus described in the above embodiments. In actual use, the state tracking apparatus of the electronic device in the SLAM system acquires the state tracking information of the electronic device and transmits it to the AR engine, and the AR engine draws a virtual object in the scene of the electronic device, or in the video stream captured of the scene, according to the state tracking information of the electronic device.
Specifically, as shown in fig. 25, the electronic device 2501, the SLAM system 2503, and the AR engine 2502 in the AR control system 50 are communicatively connected to each other, and data can be transmitted.
Alternatively, as shown in fig. 26, the SLAM system 2503 of the present embodiment may be provided in the electronic device 2501, or as shown in fig. 27, both the SLAM system 2503 and the AR engine 2502 of the present embodiment are provided in the electronic device 2501.
Those of ordinary skill in the art will understand that: all or a portion of the steps of implementing the above-described method embodiments may be performed by hardware associated with program instructions. The program may be stored in a computer-readable storage medium. When executed, the program performs steps comprising the method embodiments described above; and the aforementioned storage medium includes: various media that can store program codes, such as ROM, RAM, magnetic or optical disks.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.
Claims (20)
1. An electronic device state tracking method, comprising:
determining images of a plurality of key frames of which the difference between shooting parameters and the current frame shooting parameters meets the tolerance range in a key frame set of a map generated by a simultaneous localization and mapping (SLAM) system aiming at the environment where electronic equipment is located; the image of the current frame and the image of the key frame are thumbnails; or the image of the current frame and the image of the key frame are both original images; or the image of the current frame comprises an original image and a thumbnail, and the image of the key frame comprises the original image and the thumbnail;
performing similarity matching according to the similarity degree of the pixel points on the image of the current frame and the pixel points on the image of the key frame, and determining a target key frame from a plurality of key frames based on the obtained similarity matching result;
determining the lens position and the lens orientation of the electronic equipment according to the target key frame and the current frame;
performing visual state tracking of the electronic device based on the determined lens position and the lens orientation.
2. The method of claim 1, wherein the image of the current frame and the image of the key frame are both thumbnails; the matching of the similarity according to the similarity between the pixel point on the image of the current frame and the pixel point on the image of the key frame comprises the following steps:
determining a plurality of key frames in the key frame set of the map, wherein the difference between the thumbnail shooting parameters and the shooting parameters of the current frame meets the tolerance range;
and performing similarity matching according to the similarity between the pixel point on the thumbnail of the current frame and the pixel point on the thumbnail of the key frame.
3. The method of claim 1, wherein the image of the current frame and the image of the key frame are both original images; the matching of the similarity according to the similarity between the pixel point on the image of the current frame and the pixel point on the image of the key frame comprises the following steps:
determining a plurality of key frames in the key frame set of the map, wherein the difference between the original image shooting parameters and the current frame shooting parameters meets the tolerance range;
and performing similarity matching according to the similarity between the pixel points on the original image of the current frame and the pixel points on the original image of the key frame.
4. The method of claim 1, wherein the image of the current frame comprises artwork and thumbnails and the image of the key frame comprises artwork and thumbnails; the matching of the similarity according to the similarity between the pixel point on the image of the current frame and the pixel point on the image of the key frame comprises the following steps:
determining a plurality of key frames in the key frame set of the map, wherein the difference between the thumbnail shooting parameters and the shooting parameters of the current frame meets the tolerance range;
determining an original image of the plurality of key frames;
and performing similarity matching according to the similarity between the pixel points on the original image of the current frame and the pixel points on the original image of the key frame.
5. The method according to any one of claims 1-4, wherein determining a lens position and a lens orientation of an electronic device from the target key frame and the current frame comprises:
determining an initial lens position and an initial lens orientation corresponding to the current frame according to the lens position and the lens orientation corresponding to the target key frame;
projecting the thumbnail of the target key frame to the thumbnail of the current frame according to the initial lens position and the initial lens orientation corresponding to the current frame;
and if the matching degree of the pixel point of the thumbnail of the target key frame after projection and the pixel point of the thumbnail of the current frame reaches a preset value, taking the lens position and the lens orientation during projection as the lens position and the lens orientation corresponding to the current frame.
6. The method according to any one of claims 1-4, wherein determining a lens position and a lens orientation of an electronic device from the target key frame and the current frame comprises:
determining an initial lens position and an initial lens orientation corresponding to the current frame according to the lens position and the lens orientation corresponding to the target key frame;
according to the original image of the current frame and the original image of the target key frame, performing pixel point matching on the current frame and the target key frame to obtain a matched pixel point set;
projecting the target key frame to the current frame according to the initial lens position, the initial lens orientation and the matched pixel point set;
and determining the lens position and the lens orientation of the electronic equipment according to the projection result.
7. The method according to any one of claims 1-4, wherein determining a lens position and a lens orientation of an electronic device from the target key frame and the current frame comprises:
determining an initial lens position and an initial lens orientation corresponding to the current frame according to the lens position and the lens orientation corresponding to the target key frame;
projecting the thumbnail of the target key frame to the thumbnail of the current frame according to the initial lens position and the initial lens orientation corresponding to the current frame;
if the matching degree of the pixel point of the thumbnail of the target key frame and the pixel point of the thumbnail of the current frame after projection reaches a preset value, taking the lens position and the lens orientation during projection as the lens position and the lens orientation corresponding to the current frame;
according to the original image of the current frame and the original image of the target key frame, performing pixel point matching on the current frame and the target key frame to obtain a matched pixel point set;
projecting the target key frame to the current frame according to the lens position and the lens orientation corresponding to the current frame and the matched pixel point set;
and determining the lens position and the lens orientation of the electronic equipment according to the projection result.
8. The method of any of claims 1-7, further comprising:
determining parameter information of an Inertial Measurement Unit (IMU) in the electronic equipment;
re-determining the state of the IMU of the electronic equipment according to the lens position and the lens orientation of the electronic equipment and the parameter information;
and tracking the IMU state of the electronic equipment according to the redetermined IMU state of the electronic equipment.
9. The method of claim 8, wherein the IMU parameter information includes at least one of: a velocity of the electronic device, an angular velocity bias of the IMU, and an acceleration bias of the IMU.
10. The method of claim 9, wherein determining an angular velocity bias of the IMU comprises:
determining the position and the orientation of an IMU corresponding to each frame of image in a first video sub-segment in a video stream acquired by the lens of the electronic equipment;
determining a first angle variation according to the orientation of the IMU corresponding to the first frame and the orientation of the IMU corresponding to the second frame;
integrating the angular speed of the IMU within a time interval corresponding to the image of the first frame and the image of the second frame to obtain a second angle variation;
acquiring the angular speed deviation of the IMU corresponding to the first frame according to the first angle variation and the second angle variation;
wherein the second frame is a frame preceding the first frame.
11. The method of claim 10, wherein determining the acceleration bias of the IMU comprises:
determining a first displacement variation according to the position of the IMU corresponding to the first frame and the position of the IMU corresponding to the second frame;
according to preset gravity information, integrating the acceleration of the IMU in a time interval corresponding to the first frame image and the second frame image to obtain a second displacement variation;
and acquiring the speed information corresponding to the first frame and the acceleration deviation of the IMU according to the first displacement variation, the second displacement variation, the first angle variation and the second angle variation.
12. The method according to any one of claims 8-10, further comprising:
verifying the parameter information of the IMU;
and if the IMU passes the verification, correspondingly adjusting the IMU according to the determined state of the IMU.
13. The method of claim 12, further comprising:
and if the verification fails, determining a second video sub-segment in the video stream acquired by the lens of the electronic equipment, and re-determining the parameter information of the IMU based on the second video sub-segment, wherein the second video sub-segment is partially overlapped with or completely different from the first video sub-segment.
14. The method of claim 13, wherein verifying the parameter information of the IMU comprises:
verifying whether the parameter information of the IMU meets at least one of the following conditions:
the acceleration deviation of the IMU is smaller than or equal to a first preset threshold, and the difference value between the acceleration deviation of the IMU and the historical offset of the accelerometer stored in the electronic equipment is smaller than a second preset threshold;
the difference between the speed of the electronic device and the speed determined at the time of the tracking is less than a third preset threshold.
15. An Augmented Reality (AR) control method, comprising:
the AR engine acquires an output result of the simultaneous localization and mapping SLAM system, wherein the output result comprises state tracking information of the electronic equipment acquired by the method of any one of claims 1-14;
and the AR engine draws a virtual object in the scene of the electronic equipment or the video stream shot by the scene according to the state tracking information.
16. An electronic device state tracking apparatus, comprising:
the system comprises a first determining module, a second determining module and a third determining module, wherein the first determining module is used for determining images of a plurality of key frames of which the difference between shooting parameters and the shooting parameters of the current frame meets a tolerance range in a key frame set of a map generated by a simultaneous localization and mapping SLAM system aiming at the environment where the electronic equipment is located; the image of the current frame and the image of the key frame are thumbnails; or the image of the current frame and the image of the key frame are both original images; or the image of the current frame comprises an original image and a thumbnail, and the image of the key frame comprises the original image and the thumbnail;
a second determining module, configured to perform similarity matching according to a similarity between a pixel point on the image of the current frame and a pixel point on the image of the key frame, and determine a target key frame from the plurality of key frames based on an obtained similarity matching result;
a third determining module, configured to determine a lens position and a lens orientation of the electronic device according to the target key frame and the current frame;
a tracking module to track a visual state of the electronic device based on the determined lens position and the lens orientation.
17. An Augmented Reality (AR) engine, comprising:
the acquisition module is used for acquiring an output result of the SLAM system, wherein the output result comprises the state tracking information of the electronic equipment acquired by adopting the state tracking method of the electronic equipment;
and the drawing module is used for drawing the virtual object in the scene of the electronic equipment or the video stream shot by the scene according to the state tracking information.
18. An electronic device, comprising:
a memory for storing program instructions;
a processor for invoking and executing program instructions in said memory for performing the method steps of any of claims 1-14.
19. A readable storage medium having a computer program stored therein, wherein when the computer program is executed by at least one processor of an electronic device state tracking apparatus, the electronic device state tracking apparatus performs the electronic device state tracking method of any one of claims 1-14.
20. An Augmented Reality (AR) control system, comprising: communicatively connected electronic device, Augmented Reality (AR) engine, and simultaneous localization and mapping (SLAM) system, the AR engine being the AR engine of claim 17, the SLAM system comprising the electronic device status tracking apparatus of claim 16.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110429852.2A CN113139456A (en) | 2018-02-05 | 2018-02-05 | Electronic equipment state tracking method and device, electronic equipment and control system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810114363.6A CN110119649B (en) | 2018-02-05 | 2018-02-05 | Electronic equipment state tracking method and device, electronic equipment and control system |
CN202110429852.2A CN113139456A (en) | 2018-02-05 | 2018-02-05 | Electronic equipment state tracking method and device, electronic equipment and control system |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810114363.6A Division CN110119649B (en) | 2018-02-05 | 2018-02-05 | Electronic equipment state tracking method and device, electronic equipment and control system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113139456A true CN113139456A (en) | 2021-07-20 |
Family
ID=67519768
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810114363.6A Active CN110119649B (en) | 2018-02-05 | 2018-02-05 | Electronic equipment state tracking method and device, electronic equipment and control system |
CN202110429852.2A Pending CN113139456A (en) | 2018-02-05 | 2018-02-05 | Electronic equipment state tracking method and device, electronic equipment and control system |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810114363.6A Active CN110119649B (en) | 2018-02-05 | 2018-02-05 | Electronic equipment state tracking method and device, electronic equipment and control system |
Country Status (1)
Country | Link |
---|---|
CN (2) | CN110119649B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111862148B (en) * | 2020-06-05 | 2024-02-09 | 中国人民解放军军事科学院国防科技创新研究院 | Method, device, electronic equipment and medium for realizing visual tracking |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140334668A1 (en) * | 2013-05-10 | 2014-11-13 | Palo Alto Research Center Incorporated | System and method for visual motion based object segmentation and tracking |
WO2016095057A1 (en) * | 2014-12-19 | 2016-06-23 | Sulon Technologies Inc. | Peripheral tracking for an augmented reality head mounted device |
CN105953796A (en) * | 2016-05-23 | 2016-09-21 | 北京暴风魔镜科技有限公司 | Stable motion tracking method and stable motion tracking device based on integration of simple camera and IMU (inertial measurement unit) of smart cellphone |
WO2017117675A1 (en) * | 2016-01-08 | 2017-07-13 | Sulon Technologies Inc. | Head mounted device for augmented reality |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101763647A (en) * | 2010-02-02 | 2010-06-30 | 浙江大学 | Real-time video camera tracking method based on key frames |
US20140139635A1 (en) * | 2012-09-17 | 2014-05-22 | Nec Laboratories America, Inc. | Real-time monocular structure from motion |
EP2979246A1 (en) * | 2013-03-27 | 2016-02-03 | Thomson Licensing | Method and apparatus for automatic keyframe extraction |
CN103646391B (en) * | 2013-09-30 | 2016-09-28 | 浙江大学 | A kind of real-time video camera tracking method for dynamic scene change |
CN105513083B (en) * | 2015-12-31 | 2019-02-22 | 新浪网技术(中国)有限公司 | A kind of PTAM video camera tracking method and device |
CN106446815B (en) * | 2016-09-14 | 2019-08-09 | 浙江大学 | A Simultaneous Localization and Map Construction Method |
CN106885574B (en) * | 2017-02-15 | 2020-02-07 | 北京大学深圳研究生院 | Monocular vision robot synchronous positioning and map construction method based on re-tracking strategy |
-
2018
- 2018-02-05 CN CN201810114363.6A patent/CN110119649B/en active Active
- 2018-02-05 CN CN202110429852.2A patent/CN113139456A/en active Pending
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140334668A1 (en) * | 2013-05-10 | 2014-11-13 | Palo Alto Research Center Incorporated | System and method for visual motion based object segmentation and tracking |
WO2016095057A1 (en) * | 2014-12-19 | 2016-06-23 | Sulon Technologies Inc. | Peripheral tracking for an augmented reality head mounted device |
WO2017117675A1 (en) * | 2016-01-08 | 2017-07-13 | Sulon Technologies Inc. | Head mounted device for augmented reality |
CN105953796A (en) * | 2016-05-23 | 2016-09-21 | 北京暴风魔镜科技有限公司 | Stable motion tracking method and stable motion tracking device based on integration of simple camera and IMU (inertial measurement unit) of smart cellphone |
Non-Patent Citations (2)
Title |
---|
LIU Haomin; ZHANG Guofeng; BAO Hujun: "A Survey of Monocular Vision Based Simultaneous Localization and Mapping Methods", Journal of Computer-Aided Design & Computer Graphics, no. 06 *
LIANG Xingjian; LEI Wen; CHEN Chao: "Design of a Target Tracking System Based on Multi-channel Image Fusion", Journal of Sichuan University of Science & Engineering (Natural Science Edition), no. 06 *
Also Published As
Publication number | Publication date |
---|---|
CN110119649B (en) | 2021-03-26 |
CN110119649A (en) | 2019-08-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110264509B (en) | Method, apparatus, and storage medium for determining pose of image capturing device | |
CN107888828B (en) | Space positioning method and device, electronic device, and storage medium | |
US11922658B2 (en) | Pose tracking method, pose tracking device and electronic device | |
CN109643373B (en) | Estimating pose in 3D space | |
CN108805917B (en) | Method, medium, apparatus and computing device for spatial localization | |
EP1978731B1 (en) | Image stabilizing apparatus, image pick-up apparatus and image stabilizing method | |
EP3680808A1 (en) | Augmented reality scene processing method and apparatus, and computer storage medium | |
KR102135770B1 (en) | Method and apparatus for reconstructing 3d face with stereo camera | |
WO2019119328A1 (en) | Vision-based positioning method and aerial vehicle | |
US9516223B2 (en) | Motion-based image stitching | |
US20200265633A1 (en) | Image processing apparatus, image processing method, and storage medium | |
JP2019536170A (en) | Virtually extended visual simultaneous localization and mapping system and method | |
CN112543343B (en) | Live broadcast picture processing method and device based on live broadcast with wheat | |
US10204445B2 (en) | Information processing apparatus, method, and storage medium for determining a failure of position and orientation measurement of an image capturing device | |
EP2851868A1 (en) | 3D Reconstruction | |
US8965105B2 (en) | Image processing device and method | |
KR101804199B1 (en) | Apparatus and method of creating 3 dimension panorama image | |
CN113610918B (en) | Position calculation method and device, electronic device, and readable storage medium | |
EP3718302B1 (en) | Method and system for handling 360 degree image content | |
US20190098215A1 (en) | Image blur correction device and control method | |
EP3522520B1 (en) | Image processing method, electronic device, and non-transitory computer readable storage medium | |
CN110119649B (en) | Electronic equipment state tracking method and device, electronic equipment and control system | |
JP4578653B2 (en) | Depth image generation apparatus, depth image generation method, and computer-readable recording medium storing a program for causing a computer to execute the method | |
KR102211769B1 (en) | Method and Apparatus for correcting geometric correction of images captured by multi cameras | |
WO2021065607A1 (en) | Information processing device and method, and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
AD01 | Patent right deemed abandoned |
Effective date of abandoning: 20240816 |