CN111523474B - Target object tracking method and electronic device - Google Patents
- Publication number
- CN111523474B (application CN202010328930A)
- Authority
- CN
- China
- Prior art keywords
- user
- electronic device
- target object
- information
- position information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/19—Sensors therefor
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/40—Scenes; Scene-specific elements in video content
- G06V20/41—Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
- G06V20/42—Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items of sport video content
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30204—Marker
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Data Mining & Analysis (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Engineering & Computer Science (AREA)
- General Health & Medical Sciences (AREA)
- Evolutionary Biology (AREA)
- Bioinformatics & Computational Biology (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Artificial Intelligence (AREA)
- Health & Medical Sciences (AREA)
- Evolutionary Computation (AREA)
- Ophthalmology & Optometry (AREA)
- Human Computer Interaction (AREA)
- Life Sciences & Earth Sciences (AREA)
- Computational Linguistics (AREA)
- Software Systems (AREA)
- Position Input By Displaying (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The embodiment of the invention provides a target object tracking method and an electronic device, wherein the method is applied to a first electronic device and comprises the following steps: acquiring first position information of the first electronic device and a target object tracked by the eyeballs of a first user of the first electronic device; acquiring second position information of a second electronic device and first positioning information of the target object in a view field of a second user of the second electronic device; determining second positioning information of the target object relative to the first user according to the first position information, the second position information and the first positioning information; and tracking the target object according to the second positioning information. The embodiment of the invention solves the prior-art problem that the visual angle received by a user while using human-eye auxiliary equipment is poorly perceived.
Description
Technical Field
The present invention relates to the field of mobile communications technologies, and in particular, to an object tracking method and an electronic device.
Background
With the rapid development of mobile communication technology, the fifth generation mobile communication technology (5th Generation Mobile Networks, 5G) has accelerated the development of intelligent electronic devices. Such devices not only serve as personal assistants but can also help human beings break through physiological limits, achieving a higher degree of human-machine integration. Taking the eyes as an example, human vision has various limitations. The viewing distance is limited: beyond a certain distance it is difficult to observe clearly without auxiliary equipment such as glasses or cameras, for example when watching a football match on site. In addition, the human eye's ability to perceive moving objects is limited, so moving objects sometimes cannot be captured in time: for example, a mosquito is easily lost from sight when one tries to swat it, and a person is easily lost in a crowded shopping mall.
At present, people can solve some of these problems with auxiliary equipment such as telescopes and cameras, breaking through the limitations through friendlier human-machine interaction: for example, using a telescope to see far, a telephoto lens to shoot distant scenery, a high-frame-rate camera with a gimbal and tracking algorithm to automatically capture moving objects, or a live-broadcast platform to share what one sees. All of these help people overcome the limitations of the human eye to some extent.
However, existing human-eye auxiliary equipment still has limitations: the visual angle is conveyed through spoken directions or network live broadcast, so the visual angle the receiving user obtains is another person's visual angle. It is difficult for the receiving user to convert another person's visual angle into a first-person visual angle, and the perceptibility is therefore poor.
Disclosure of Invention
The embodiment of the invention provides a target object tracking method and an electronic device, which can solve the prior-art problem that the visual angle received by a user while using human-eye auxiliary equipment is poorly perceived.
In order to solve the technical problems, the invention is realized as follows:
in a first aspect, an embodiment of the present invention provides an object tracking method, which is applied to a first electronic device, where the method includes:
acquiring first position information of the first electronic device and a target object tracked by the eyeballs of a first user of the first electronic device; acquiring second position information of a second electronic device and first positioning information of the target object in a view field of a second user of the second electronic device;
determining second positioning information of the target object relative to the first user according to the first position information, the second position information and the first positioning information;
and tracking the target object according to the second positioning information.
In a second aspect, an embodiment of the present invention provides an object tracking method, applied to a second electronic device, where the method includes:
receiving a search request sent by first electronic equipment; the search request carries the identification of the target object;
sending, to the first electronic device in response to the search request, second position information of the second electronic device and first positioning information of the target object in a view field of a second user of the second electronic device, so that the first electronic device determines second positioning information of the target object relative to a first user of the first electronic device according to the first position information of the first electronic device, the second position information and the first positioning information, and tracks the target object according to the second positioning information.
In a third aspect, an embodiment of the present invention further provides an electronic device, where the electronic device is a first electronic device, and the electronic device includes:
an information acquisition module, configured to acquire first position information of the first electronic device and a target object tracked by the eyeballs of a first user of the first electronic device, and to acquire second position information of a second electronic device and first positioning information of the target object in a view field of a second user of the second electronic device;
an information determining module, configured to determine second positioning information of the target object relative to the first user according to the first position information, the second position information, and the first positioning information;
and the target tracking module is used for tracking the target object according to the second positioning information.
In a fourth aspect, an embodiment of the present invention further provides an electronic device, where the electronic device is a second electronic device, and the electronic device includes:
the request receiving module is used for receiving a search request sent by the first electronic equipment; the search request carries the identification of the target object;
an information sharing module, configured to send, to the first electronic device in response to the search request, second position information of the second electronic device and first positioning information of the target object in a view field of a second user of the second electronic device, so that the first electronic device determines second positioning information of the target object relative to a first user of the first electronic device according to the first position information of the first electronic device, the second position information and the first positioning information, and tracks the target object according to the second positioning information.
In a fifth aspect, embodiments of the present invention also provide an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps in the object tracking method as described above when executing the computer program.
In a sixth aspect, embodiments of the present invention also provide a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of an object tracking method as described above.
In the embodiment of the invention, first position information of the first electronic device and a target object tracked by the eyeballs of a first user of the first electronic device are acquired; second position information of a second electronic device and first positioning information of the target object in a view field of a second user of the second electronic device are acquired; second positioning information of the target object relative to the first user is determined according to the first position information, the second position information and the first positioning information; and the target object is tracked according to the second positioning information, realizing view-field tracking by the first user. Because the second electronic device shares its view-field information, the second positioning information of the target object relative to the first electronic device can be determined and directly converted into the first user's own visual angle, and the sense of visual discontinuity of the picture is avoided through the AR glasses. Meanwhile, the first electronic device can quickly re-track the target object based on the information shared by the second electronic device; this process requires no user participation, so the operation is simple and convenient.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings that are needed in the description of the embodiments of the present invention will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 shows one of the flowcharts of an object tracking method according to an embodiment of the present invention;
FIG. 2 shows a schematic diagram of a first example of an embodiment of the invention;
FIG. 3 shows a flow chart of a second example of an embodiment of the present invention;
FIG. 4 shows a schematic diagram of a second example of an embodiment of the invention;
FIG. 5 is a second flowchart of an object tracking method according to an embodiment of the present invention;
FIG. 6 shows a flow chart of a third example of an embodiment of the invention;
FIG. 7 shows one of the block diagrams of the electronic device provided by an embodiment of the invention;
FIG. 8 shows a second block diagram of an electronic device provided by an embodiment of the invention;
fig. 9 shows a third block diagram of the electronic device provided by the embodiment of the invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
It should be appreciated that reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
In various embodiments of the present invention, it should be understood that the sequence numbers of the following processes do not mean the order of execution, and the order of execution of the processes should be determined by the functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present invention.
Referring to fig. 1, an embodiment of the present invention provides an object tracking method applied to a first electronic device, where the first electronic device may be a wearable device, for example, augmented reality (Augmented Reality, AR) glasses.
The method comprises the following steps:
step 101, acquiring first position information of the first electronic equipment and a target object tracked by eyeballs of a first user of the first electronic equipment; and obtaining second position information of the second electronic device and first position information in a view field of a second user of the second electronic device.
The view field is the field of vision, i.e., the range visible to the user. The target object is the object tracked by the eyeballs of the first user. When the first electronic device detects that the target object has disappeared from the view field of the first user, it acquires the first position information of the first user and the first positioning information of the target object in the view field of the second user of the second electronic device; the target object is present in the view field of the second user, so the first electronic device can obtain this first positioning information. Specifically, the first positioning information is the position of the target object relative to the second user, so the current second position information of the second user must also be acquired in order to determine the position of the target object relative to the first electronic device from the second user's view-field information.
Step 102, determining second positioning information of the target object relative to the first user according to the first position information, the second position information and the first positioning information.
The second positioning information of the target object relative to the first user is then calculated from the relative position of the target object and the second user (i.e., the first positioning information) and the relative position of the first user and the second user. For example, a plane coordinate system centred on the second position information of the second user is established; from the positions of the first user, the target object and the second user in this common coordinate system, the relative positional relationship between the first user and the target object, i.e., the second positioning information, can be determined.
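The relative position of the first user with respect to the second user (the third positioning information discussed below) must first be derived from the two devices' latitude/longitude readings. A minimal sketch of that conversion is given here; the function name and the equirectangular flat-earth approximation are assumptions of this illustration (adequate over the ~10 km ranges mentioned later), not part of the patent text.

```python
import math

def third_positioning(lat1, lon1, lat2, lon2):
    """Distance (metres) and bearing (degrees, clockwise from due north)
    of point 1 as seen from point 2, using an equirectangular
    approximation of the Earth's surface."""
    R = 6371000.0  # mean Earth radius in metres
    mean_lat = math.radians((lat1 + lat2) / 2)
    dx = math.radians(lon1 - lon2) * R * math.cos(mean_lat)  # eastward offset
    dy = math.radians(lat1 - lat2) * R                       # northward offset
    distance = math.hypot(dx, dy)
    bearing = math.degrees(math.atan2(dx, dy)) % 360
    return distance, bearing
```

For example, a point 0.001° of latitude due north of the origin resolves to roughly 111 m at bearing 0°.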
And step 103, tracking the target object according to the second positioning information.
According to the second positioning information, the target object can be continuously tracked; for example, after determining the second positioning information, the electronic device automatically switches the view field, or guides the user to switch it, to the area indicated by the second positioning information. Through interaction between the first electronic device and the second electronic device, the view field of the second user is shared with the first user, helping the first user track a target object that has disappeared from his or her own view field. By contrast, tracking the target object through prior-art human-eye auxiliary equipment easily produces a sense of visual discontinuity: taking telephoto shooting as an example, the imaging carrier (e.g., a mobile phone screen) is small, and the extremely narrow view field at long focal lengths makes it impossible to locate an object quickly. In addition, prior-art human-eye auxiliary equipment is inefficient: devices such as telescopes and cameras require manual parameter adjustment, such as focusing, and by the time the cumbersome process is complete the scene to be zoomed or tracked has often already passed.
In the embodiment of the invention, first position information of the first electronic device and a target object tracked by the eyeballs of a first user of the first electronic device are acquired; second position information of a second electronic device and first positioning information of the target object in a view field of a second user of the second electronic device are acquired; second positioning information of the target object relative to the first user is determined according to the first position information, the second position information and the first positioning information; and the target object is tracked according to the second positioning information, realizing view-field tracking by the first user. Because the second electronic device shares its view-field information, the second positioning information of the target object relative to the first electronic device can be determined and directly converted into the first user's own visual angle, and the sense of visual discontinuity of the picture is avoided through the AR glasses. Meanwhile, the first electronic device can quickly re-track the target object based on the information shared by the second electronic device; this process requires no user participation, so the operation is simple and convenient. The embodiment of the invention thus solves the prior-art problem that the visual angle received by a user while using human-eye auxiliary equipment is poorly perceived.
Optionally, in an embodiment of the present invention, the first positioning information includes a first distance value and a first direction angle value;
the step of determining second positioning information of the target object relative to the first user according to the first position information, the second position information and the first positioning information includes:
determining third positioning information of the first user relative to the second user according to the first position information and the second position information; the third positioning information comprises a third distance value and a third direction angle value;
determining a second distance value of the target object relative to the first user according to the first distance value and the third distance value, and determining a second direction angle value of the target object relative to the first user according to the first direction angle value and the third direction angle value; the second positioning information includes the second distance value and a second direction angle value.
The first position information and the second position information may include longitude and latitude information. As a first example, as shown in fig. 2, point A is the second position information, and a first plane coordinate system with the second user's point A as the origin is established; the direction indicated by arrow N in fig. 2 is due north. Point B is the first position information. The specific positions corresponding to the first and second position information are determined from their longitude and latitude, giving the third positioning information of the first user relative to the second user, which comprises a third distance value and a third direction angle value: in fig. 2, L3 is the third distance value and angle b1 is the third direction angle value. Point C is the position of the target object given by the first positioning information; L1 is the first distance value and angle c is the first direction angle value.
Thus, a second plane coordinate system with the point B as the origin is established, and the position of the point C in the second plane coordinate system is determined, so that second positioning information is obtained.
As shown in fig. 2, the second distance value L2 and the second direction angle b2 are obtained to obtain second positioning information.
Specifically, according to the law of cosines: a² = b² + c² − 2bc·cos α,

then:

L2² = L1² + L3² − 2·L1·L3·cos(b1 + c)

where the sum of angle b1 and angle c corresponds to the angle α in the law of cosines; since angles b1 and c are known and L1 and L3 are known, L2 can be calculated.
After determining L2, the value of angle b4 is further determined according to the law of cosines:

b4 = cos⁻¹((L2² + L3² − L1²) / (2·L2·L3))

where cos⁻¹ denotes the inverse cosine function, arccos. Angle b1 is equal to angle b3, so the second direction angle b2 = 90° − angle b3 − angle b4.
Thus, the second positioning information of the target object relative to the first electronic equipment is obtained and is directly converted into the view angle of the first user.
It can be understood that if the first user, the second user and the target object are in different relative positional relationships with those in fig. 2, the included angles between L1 and L3 are slightly different, and the above process may still be used to calculate the second positioning information, which is not described herein in detail.
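The calculation above can be sketched in a few lines. This is a minimal illustration of the geometry of fig. 2 (the function name is hypothetical, and the b2 formula assumes the configuration shown in the figure; other relative positions would need the sign adjustments noted above):

```python
import math

def second_positioning(L1, L3, angle_b1_deg, angle_c_deg):
    """Given the first positioning (L1, angle c) of the target relative to
    the second user and the third positioning (L3, angle b1) of the first
    user relative to the second user, return the second distance value L2
    and the second direction angle b2 (degrees), per fig. 2."""
    alpha = math.radians(angle_b1_deg + angle_c_deg)
    # Law of cosines: L2^2 = L1^2 + L3^2 - 2*L1*L3*cos(b1 + c)
    L2 = math.sqrt(L1**2 + L3**2 - 2 * L1 * L3 * math.cos(alpha))
    # Law of cosines again, solved for the angle b4 at the first user:
    b4 = math.degrees(math.acos((L2**2 + L3**2 - L1**2) / (2 * L2 * L3)))
    b3 = angle_b1_deg            # b1 equals b3 (alternate angles)
    b2 = 90.0 - b3 - b4          # second direction angle
    return L2, b2
```

With L1 = L3 = 100 and b1 = c = 30°, the triangle is equilateral (L2 = 100) and the second direction angle b2 comes out to 0°.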
As a second example, fig. 3 shows a schematic process of applying the above object tracking method.
Optionally, a family group may be built on a communication network, and members within the group may share visual angles. When a user gazes at the target object, the user's AR glasses automatically focus on that point and start automatic tracking. After the user shares the visual angle with one key, the glasses' position can be located in real time through GPS, and the positioning information of the tracked object is shared with matched AR glasses within a certain area. Optionally, the glasses-based telescopic function is generally set to a visual range of 10 km, which may be adjusted to actual needs; the embodiment of the present invention is not limited herein.
When the tracked object is not in the other party's (the shared user's) line of sight, the AR glasses can mark the direction based on the positioning information and assist the other party in rotating the visual angle until the tracked object enters the line of sight. Once it does, the AR glasses identify the tracked object through image recognition and tracking, and mark it with the aid of the accurate positioning information, so that the shared user efficiently acquires the shared visual angle. The premise of visual-angle sharing is that the object is simultaneously in the viewable areas of both parties; this differs from directly transmitting one's own visual angle to others.
In step 301, AR glasses of the shared user detect that the target object disappears from the user's view, and set up a sharing group.
Here the AR glasses of the shared user correspond to the first electronic device, and the AR glasses of the sharing user correspond to the second electronic device.
As shown in fig. 4, the user enters the AR glasses assistant page on the mobile phone and sets up the sharing group.
Step 302, sharing identification information of the tracked object with the group members, and requesting the group members to feed back whether the tracked object is in their respective view fields.
In step 303, the AR glasses of the shared user receive the second position information and the first positioning information shared by the AR glasses of the sharing user.
The sharing user clicks the one-key sharing function on the side of the AR glasses; the positioning information of the tracked object is obtained using GPS, compass, focus-based ranging and similar functions and is shared with the other members in the group, and the marked direction assists the shared user in rotating the visual angle.
And step 304, determining second positioning information of the target object relative to the first user by the AR glasses of the shared user according to the first position information, the second position information and the first positioning information, so as to realize continuous tracking of the target object.
The shared user and the sharing user are in the same scene but need not stand in the same place. Guided by the marked direction until the tracked object enters the shared user's visual angle, the tracked object is then captured and marked for continuous tracking based on image recognition and accurate positioning.
Visual-angle sharing of the same scene is thus realized without switching visual angles: what the sharing user sees in the scene can be quickly and efficiently shared with others while the shared user's own visual angle remains unchanged.
Optionally, in this embodiment of the present invention, before the acquiring of the second position information of the second electronic device and the first positioning information of the target object in the view field of the second user of the second electronic device, the method further includes:
detecting that the target object has disappeared from the view field of the first user of the first electronic device, which triggers tracking of the target object. The detection specifically comprises:
when it is detected that the eyeballs of the first user are focused on a target position, acquiring a first movement track of the eyeballs of the first user and a second movement track of the target object at the target position;

and determining that the target object has disappeared from the view field of the first user of the first electronic device when the first movement track deviates from the second movement track.
The first electronic device first detects the eyeballs of the first user and determines the central line of sight, i.e., the direction of the central gaze; if the central line of sight remains unchanged for more than a first preset time, it is determined that the user's eyeballs are focused on that position.
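The dwell-time criterion can be sketched as follows. This is an illustrative assumption of how such a check might be coded (function name, sample format and thresholds are all hypothetical), not the patent's implementation:

```python
import math

def detect_fixation(gaze_samples, max_drift_deg=1.0, min_duration_s=0.5):
    """Return the (yaw, pitch) fixation direction if the gaze stays within
    max_drift_deg of its first sample for at least min_duration_s;
    otherwise None.  gaze_samples is a chronological list of
    (timestamp_s, yaw_deg, pitch_deg) tuples."""
    if not gaze_samples:
        return None
    t0, yaw0, pitch0 = gaze_samples[0]
    for t, yaw, pitch in gaze_samples[1:]:
        drift = math.hypot(yaw - yaw0, pitch - pitch0)
        if drift > max_drift_deg:
            return None            # gaze moved away before the dwell elapsed
        if t - t0 >= min_duration_s:
            return (yaw0, pitch0)  # fixation confirmed at the initial direction
    return None
```

A steady gaze held past the preset time yields the fixation direction; any larger excursion before that resets the decision to None.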
Upon detecting that the eyeballs of the first user are fixated on a target position, the electronic device acquires the target object at that position and then monitors both the first movement track measured from the user's eyeball focus point and the second movement track of the target object. If the two movement tracks deviate from each other, the first user has either lost the target object or given up tracking it.
Optionally, two types of cameras can be arranged in the AR glasses: one monitors the eyeballs and measures the eye-tracking path, the other monitors moving objects in the view area. If the two movement tracks coincide, the eyeball is tracking a certain moving object; once the previously coinciding tracks deviate, a high-frame-rate camera can be started immediately to track and mark the object in time.
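The deviation test between the two tracks can be sketched as a point-wise comparison of samples taken at the same instants. The function name, pixel units and threshold are assumptions of this illustration:

```python
import math

def tracks_deviate(eye_track, object_track, deviation_px=50.0):
    """Compare a first movement track (eye-gaze points projected into the
    image plane) against a second movement track (detected object
    centroids) sampled at the same instants; report True once the mean
    point-wise distance exceeds deviation_px."""
    n = min(len(eye_track), len(object_track))
    if n == 0:
        return False
    total = sum(math.dist(eye_track[i], object_track[i]) for i in range(n))
    return total / n > deviation_px
```

While the eye follows the object the mean distance stays small and the tracks are considered coincident; a sustained gap flags the deviation that marks the object as lost.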
Optionally, in an embodiment of the present invention, the step of acquiring a second movement track of the target object at the target position includes:
acquiring a moving track of each object at the target position;
and determining, within a first preset time period, the object whose movement track reaches a preset similarity threshold with the first movement track as the target object, the movement track of the target object being the second movement track.
If at least two objects exist at the target position, the object whose movement track has the highest similarity to the first movement track must be selected as the target object.
For example, the first electronic device records a video while the user's eyeballs are focused on the target position, then recognizes the movement tracks of the dynamic objects in the video based on image recognition; several movement tracks may result. By calculating track similarity, the top-ranked moving objects whose similarity exceeds a certain threshold (for example, three of them) are marked. Because the similarity is calculated in real time, the track of the truly tracked object becomes more and more similar to the eye-gaze track, while the tracks of the other moving objects become less and less similar until they fall below the threshold and are removed. The remaining object with high similarity is the object the human eye is tracking, i.e., the target object, which can then be marked to guide the user's gaze to it.
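The winner-take-all selection above can be sketched with a simple similarity score. The scoring function (an inverse of the mean point-wise distance) and all names here are illustrative assumptions; the patent does not fix a particular similarity measure:

```python
import math

def pick_target(eye_track, candidate_tracks, threshold=0.8):
    """Score each candidate object's track against the eye track with a
    simple similarity, 1 / (1 + mean point-wise distance), and return the
    id of the best-scoring candidate above the threshold, or None.
    candidate_tracks maps an object id to its list of (x, y) points."""
    best_id, best_score = None, threshold
    for obj_id, track in candidate_tracks.items():
        n = min(len(eye_track), len(track))
        if n == 0:
            continue
        mean_d = sum(math.dist(eye_track[i], track[i]) for i in range(n)) / n
        score = 1.0 / (1.0 + mean_d)
        if score > best_score:       # keep only the best candidate over threshold
            best_id, best_score = obj_id, score
    return best_id
```

A candidate whose track coincides with the gaze track scores near 1 and is selected; distant tracks score near 0 and drop out, mirroring the real-time pruning described above.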
Optionally, in an embodiment of the present invention, after the step of determining that the target object disappears from the view of the first user of the first electronic device, the method includes:
sending a search request carrying a first image of the target object to third electronic equipment;
If a search response corresponding to the search request sent by the third electronic device is received, and the search response carries a second image of the target object, the third electronic device corresponding to the search request is determined to be the second electronic device.
After the first electronic device detects that the target object has disappeared from the view of the first user, it sends a search request to other electronic devices to request assistance in tracking the target object. Since the search request carries the first image, which includes the target object, the second electronic device, after receiving the first image, searches its own view field for the target object indicated in the first image; if the target object exists there, the second electronic device shoots an image of the target object, namely the second image, and feeds the second image back to the first electronic device through the search response.
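The search request/response exchange above could be sketched as follows; the JSON message layout and field names are assumptions for illustration (the embodiment does not fix a message format), and simple set membership stands in for image recognition of the target object:

```python
import json

def make_search_request(device_id, target_image_ref):
    """Search request carrying a first image of the target object
    (field names are illustrative placeholders)."""
    return json.dumps({"type": "search_request",
                       "from": device_id,
                       "target_image": target_image_ref})

def handle_search_request(request_json, local_view_objects, camera_shot):
    """On the receiving device: look for the target in the local view
    field; if found, answer with a second image of the target, otherwise
    send no response."""
    req = json.loads(request_json)
    if req["target_image"] in local_view_objects:  # stands in for image matching
        return json.dumps({"type": "search_response",
                           "to": req["from"],
                           "second_image": camera_shot(req["target_image"])})
    return None  # target not in this device's view field
```

Any third electronic device that answers with a second image is then treated as the second electronic device.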
If a search response is received and a second image including the target object exists in the search response, the third electronic device is taken as the second electronic device, and the second position information and the first positioning information are further acquired.
In the embodiment of the invention, the first position information of the first electronic device and the target object tracked by the eyeball of the first user of the first electronic device are acquired; the second position information of the second electronic device and the first positioning information in the view field of the second user of the second electronic device are acquired; the second positioning information of the target object relative to the first user is determined according to the first position information, the second position information and the first positioning information; and the target object is tracked according to the second positioning information, so that view tracking is achieved for the first user. Because the second electronic device shares its view information, the second positioning information of the target object relative to the first electronic device is determined and directly converted into the view angle of the first user, which eliminates the sense of visual discontinuity in the picture presented by the AR glasses; meanwhile, the first electronic device can quickly re-track the target object based on the information shared by the second electronic device, and the process requires no user participation, so the operation is simple and convenient.
The object tracking method applied to the first electronic device provided by the embodiment of the invention is described above, and the object tracking method applied to the second electronic device provided by the embodiment of the invention is described below with reference to the accompanying drawings.
Referring to fig. 5, an embodiment of the present invention provides an object tracking method applied to a second electronic device, where the second electronic device may be a wearable device, for example AR (Augmented Reality) glasses; the embodiment of the present invention introduces the object tracking method by taking AR glasses as an example of the second electronic device.
The method comprises the following steps:
step 501, receiving a search request sent by a first electronic device; the search request carries the identification of the target object.
The target object is an object tracked by eyeballs of the first user, and when the first electronic equipment detects that the target object disappears from the view of the first user, the first electronic equipment sends a search request to other equipment; after receiving the search request, the second electronic device obtains the identifier of the target object carried in the search request, where the identifier may be an image, a feature, and the like.
Step 502, in response to the search request, sending second position information of a second user of the second electronic device and first positioning information of the target object in the view field of the second user of the second electronic device to the first electronic device, so that the first electronic device determines second positioning information of the target object relative to the first user according to the first position information of the first electronic device, the second position information and the first positioning information, and tracks the target object according to the second positioning information.
The second electronic device sends the second position information of its second user and the first positioning information of the target object in the view field of that second user to the first electronic device when the target object exists in that view field; the view field is the user's field of view, i.e., the range of the user's vision.
The first electronic equipment determines second positioning information of the target object relative to a first user of the electronic equipment according to the first position information, the second position information and the first positioning information of the first electronic equipment, and tracks the target object according to the second positioning information; for example, a plane coordinate system with the second position information of the second user as the center is established, and the relative position relation between the first user and the target object in the same plane coordinate system, namely the second positioning information, can be determined through the relative position information of the first user, the target object and the second user.
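As a sketch of the plane-coordinate-system computation in the example above: with the second user at the origin, the target object is placed from the first positioning information (first distance value and first direction angle value), the first user is placed from the third positioning information, and the second positioning information is the polar offset between them. The degree-based angle convention and function names are assumptions:

```python
import math

def polar_to_xy(distance, angle_deg):
    """Place a point in the plane coordinate system centred on the second
    user (angle measured in degrees from the system's reference axis)."""
    rad = math.radians(angle_deg)
    return (distance * math.cos(rad), distance * math.sin(rad))

def second_positioning(first_dist, first_angle, third_dist, third_angle):
    """Second positioning information: distance and direction angle of the
    target object relative to the first user, derived from the target's
    and the first user's positions relative to the second user."""
    tx, ty = polar_to_xy(first_dist, first_angle)  # target, seen from second user
    ux, uy = polar_to_xy(third_dist, third_angle)  # first user, seen from second user
    dx, dy = tx - ux, ty - uy
    return math.hypot(dx, dy), math.degrees(math.atan2(dy, dx))
```

For example, a target 10 m in front of the second user and a first user 10 m behind yields a second distance value of 20 m along the same axis.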
Optionally, in an embodiment of the present invention, before the step of sending, to the first electronic device, second location information of a second user of the second electronic device, and the first location information of the target object in a view field of the second user of the second electronic device, the method further includes:
And acquiring first positioning information of the target object in a view field of a second user of the second electronic equipment.
The second electronic device first acquires the first positioning information, and then sends the first positioning information and the second position information to the first electronic device.
Optionally, in an embodiment of the present invention, the first positioning information includes a first distance value and a first direction angle value;
the step of obtaining the first positioning information of the target object in the view field of the second user of the second electronic device includes:
when the eyeball of the second user is fixed to the target object, the zoom multiple of the lens of the second electronic equipment and the visual angle center point of the second user are obtained;
determining the object distance between the target object and the second user according to the zoom multiple, wherein the object distance is the first distance value; and determining the first direction angle value according to an angle between the view angle center point and an initial center point of the view angle of the second user.
The second electronic device first detects the eyeball of the second user and determines the central sight line, namely the direction of the central gaze; if the central sight line remains unchanged for more than a first preset time, it is determined that the user's eyeball is focused on that position.
The second electronic device photographs the object in the viewing direction with a camera. When the human eye looks into the distance, the ciliary muscle is fully relaxed and diopter is at its minimum, so the focal length of the camera is gradually lengthened; as the camera's focal length lengthens and the equivalent viewing distance shortens, the ciliary muscle gradually returns to normal until the ordinary state of looking at an object is fully restored, indicating that the user can now see the distant object normally. Since the focal length of the camera changes with the state of the ciliary muscle, focusing is complete when the ciliary muscle returns to normal and the ordinary viewing state is restored; the viewing distance is then calculated from the telephoto zoom multiple of the camera and the object present on the line of sight.
Specifically, the focal length corresponding to the zoom multiple is determined according to the zoom multiple and a first correspondence, where the first correspondence is a preset correspondence between zoom multiples and focal lengths; the object distance between the target object and the second user is then determined according to a second correspondence between focal lengths and object distances, where the object distance is the first distance value and the second correspondence is a preset correspondence between focal lengths and object distances; and the first direction angle value is determined according to the angle between the view-angle center point and the initial center point of the second user's view angle. Each user may preset an initial center point, for example when first using the second electronic device or when commissioning the device.
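The two preset correspondences could be modelled as lookup tables, as in this sketch; the numeric calibration values are purely hypothetical placeholders, and a real device would interpolate calibrated data:

```python
import math

# Hypothetical calibration tables standing in for the first and second
# correspondences; real values would come from device calibration.
ZOOM_TO_FOCAL = {1.0: 4.0, 2.0: 8.0, 5.0: 20.0}           # zoom multiple -> focal length (mm)
FOCAL_TO_OBJECT_DIST = {4.0: 2.0, 8.0: 10.0, 20.0: 60.0}  # focal length (mm) -> object distance (m)

def first_distance_value(zoom_multiple):
    """First correspondence: zoom multiple -> focal length;
    second correspondence: focal length -> object distance
    (the first distance value)."""
    focal = ZOOM_TO_FOCAL[zoom_multiple]
    return FOCAL_TO_OBJECT_DIST[focal]

def first_direction_angle(view_center, initial_center):
    """First direction angle value: angle between the current view-angle
    centre point and the user's preset initial centre point."""
    dx = view_center[0] - initial_center[0]
    dy = view_center[1] - initial_center[1]
    return math.degrees(math.atan2(dy, dx))
```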
As a third example, referring to fig. 6, an exemplary process of automatic focusing by the AR glasses described above mainly includes the following steps:
step 601, collecting pupil focal length of a user, and measuring and calculating the vision distance.
Step 602, detecting that the viewing distance reaches a preset threshold value and that the gaze direction has been fixed for a preset duration, and determining that the eyeball of the user is focused on the target position.
Step 603, turning on the automatic telephoto function.
Step 604, if the viewing angle changes, determining whether the eyeball of the user is tracking the target object:
if yes, returning to step 603; if not, executing step 605 to turn off the automatic telephoto function and end the flow.
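The flow of steps 601-605 can be condensed into a single decision step per sensor reading, as in this sketch; the threshold value and parameter names are illustrative assumptions:

```python
def telephoto_step(telephoto_on, viewing_distance, direction_fixed,
                   eye_tracks_target, distance_threshold=100.0):
    """One pass of the Fig. 6 flow. Returns the new on/off state of the
    automatic telephoto function (threshold in metres, illustrative)."""
    if not telephoto_on:
        # steps 601-603: enable telephoto once the measured viewing
        # distance reaches the threshold and the gaze direction is fixed
        return viewing_distance >= distance_threshold and direction_fixed
    # step 604: keep telephoto while the eyeball tracks the target;
    # otherwise step 605: turn the function off
    return eye_tracks_target
```

Called in a loop with fresh gaze measurements, this keeps the telephoto function on exactly while the eyeball follows the target object.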
Focusing of the human eye is achieved by the ciliary muscle stretching the crystalline lens to change its shape, i.e. to change its focal length (or diopter), so the glasses can calculate the point on which the user's gaze is focused. When the human eye looks at a distant object, the ciliary muscle relaxes, the crystalline lens flattens, diopter is at its minimum and the focal length is at its maximum; when looking at a near object, the ciliary muscle contracts, the crystalline lens becomes convex, diopter increases and the focal length shortens, ensuring imaging on the retina while the length of the eye axis remains unchanged. The glasses can therefore monitor the changes of the crystalline lens through a depth camera to measure the viewing distance.
When the viewing distance reaches a certain threshold value and the focal length and view-angle direction remain fixed for a certain time, the telephoto lens is started automatically and the telephoto function is executed.
When the glasses detect that the user's gaze stays focused while looking into the distance, the user probably wants to observe a certain object carefully; the glasses then start the magnification function through the telephoto lens and present the photographed video in the glasses. Meanwhile, the user may instead be observing a dynamic object in the distance, such as a football; in that case the automatic tracking function of the glasses detects whether the eyeball moves with the object photographed by the camera, such as the football, and if so, the scene still counts as a telephoto scene even though the viewing angle varies slightly.
In the embodiment of the invention, a search request sent by the first electronic device is received; in response to the search request, the second position information of the second user of the second electronic device and the first positioning information of the target object in the view field of the second user of the second electronic device are sent to the first electronic device; and the first electronic device determines the second positioning information of the target object relative to the first user according to its own first position information, the second position information and the first positioning information, and tracks the target object according to the second positioning information, so that view tracking is achieved for the first user. Because the second electronic device shares its view information, the second positioning information of the target object relative to the first electronic device is determined and directly converted into the view angle of the first user, which eliminates the sense of visual discontinuity in the picture presented by the AR glasses; meanwhile, the first electronic device can quickly re-track the target object based on the information shared by the second electronic device, and the process requires no user participation, so the operation is simple and convenient. The embodiment of the invention thus solves the prior-art problem that the view angle presented to the user by human-eye auxiliary equipment is poorly perceived.
Having described the object tracking method provided by the embodiment of the present invention, an electronic device provided by the embodiment of the present invention will be described below with reference to the accompanying drawings.
Referring to fig. 7, the embodiment of the present invention further provides an electronic device 700, where the electronic device 700 is a first electronic device 700, and the electronic device 700 includes:
an information obtaining module 701, configured to obtain first position information of the first electronic device and a target object tracked by an eyeball of a first user of the first electronic device; and to obtain second position information of the second electronic device and first positioning information in the view field of a second user of the second electronic device.
Wherein the view field is the user's field of view, i.e., the range of the user's vision. The target object is the object tracked by the eyeball of the first user; when the first electronic device 700 detects that the target object has disappeared from the view of the first user, it acquires the first position information of the first user and acquires the first positioning information of the target object in the view field of the second user of the second electronic device. The target object is present in that view field, so the first electronic device 700 can acquire the first positioning information of the target object there. Specifically, the first positioning information is the position of the target object relative to the second user, so the current second position information of the second user also needs to be acquired, so that the position of the target object relative to the first electronic device 700 can be determined from the view information shared by the second user.
An information determining module 702 is configured to determine second positioning information of the target object relative to the first user of the electronic device 700 according to the first position information, the second position information, and the first positioning information.
And then calculating the second positioning information of the target object relative to the first user according to the relative position information (namely, the first positioning information) of the target object and the second user and the relative position information of the first user and the second user. For example, a plane coordinate system with the second position information of the second user as the center is established, and the relative position relation between the first user and the target object in the same plane coordinate system, namely the second positioning information, can be determined through the relative position information of the first user, the target object and the second user.
And a target tracking module 703, configured to track the target object according to the second positioning information.
Wherein the target object is continuously tracked according to the second positioning information; for example, after determining the second positioning information, the electronic device 700 automatically switches, or guides the user to switch, the view to the area indicated by the second positioning information. Through interaction between the first electronic device 700 and the second electronic device, the second user's view is shared with the first user, helping the first user track a target object that has disappeared from the first user's view.
Optionally, in an embodiment of the present invention, the first positioning information includes a first distance value and a first direction angle value;
the information determination module 702 includes:
a first determining sub-module for determining third positioning information of the first user relative to the second user according to the first position information and the second position information; the third positioning information comprises a third distance value and a third direction angle value;
a second determining sub-module, configured to determine a second distance value of the target object with respect to the first user according to the first distance value and the third distance value, and determine a second direction angle value of the target object with respect to the first user according to the first direction angle value and the third direction angle value; the second positioning information includes the second distance value and a second direction angle value.
Optionally, in an embodiment of the present invention, the first electronic device includes:
a detection module, configured to detect that the target object disappears from the view of the first user of the first electronic device before the information acquisition module 701 acquires second location information of a second electronic device and first location information in the view of the second user of the second electronic device.
Optionally, in an embodiment of the present invention, the detection module includes:
the detection sub-module is used for acquiring a first movement track of the eyeball of the first user and acquiring a second movement track of the target object at the target position when detecting that the eyeball of the first user is fixed to the target position;
and a deviation determination sub-module, configured to determine that the target object disappears from the view of the first user of the first electronic device 700 when the first movement track deviates from the second movement track.
Optionally, in an embodiment of the present invention, the detection submodule includes:
a track processing unit, configured to acquire a movement track of each object at the target position;
and determining the object, of which the similarity between the moving track and the first moving track reaches a preset threshold value, as the target object within a first preset time period, wherein the moving track of the target object is the second moving track.
Optionally, in an embodiment of the present invention, the electronic device 700 includes:
the searching module is used for sending a searching request carrying the first image of the target object to the third electronic equipment;
and if a search response about the search request sent by the third electronic device is received, and the search response carries the second image of the target object, determining that the third electronic device corresponding to the search request is the second electronic device.
The electronic device 700 provided in the embodiment of the present invention can implement each process implemented by the electronic device 700 in the method embodiment of fig. 1 to 4, and in order to avoid repetition, a description is omitted here.
In the embodiment of the present invention, the information obtaining module 701, upon detecting that the target object has disappeared from the view of the first user of the first electronic device 700, obtains the first position information of the first user, and obtains the first positioning information of the target object in the view field of the second user of the second electronic device together with the second position information of the second user, where the target object is the object tracked by the eyeball of the first user; the information determining module 702 determines the second positioning information of the target object relative to the first user of the electronic device 700 according to the first position information, the second position information and the first positioning information; and the target tracking module 703 tracks the target object according to the second positioning information, so that view tracking is achieved for the first user. Because the second electronic device shares its view information, the second positioning information of the target object relative to the first electronic device 700 is determined and directly converted into the view angle of the first user, which eliminates the sense of visual discontinuity in the picture presented by the AR glasses; meanwhile, the first electronic device 700 can quickly re-track the target object based on the information shared by the second electronic device, and the process requires no user participation, so the operation is simple and convenient.
Referring to fig. 8, the embodiment of the present invention further provides an electronic device 800, where the electronic device 800 is a second electronic device 800, and the electronic device 800 includes:
a request receiving module 801, configured to receive a search request sent by a first electronic device; the search request carries the identification of the target object.
The target object is an object tracked by eyeballs of the first user, and when the first electronic equipment detects that the target object disappears from the view of the first user, the first electronic equipment sends a search request to other equipment; after receiving the search request, the second electronic device 800 obtains the identifier of the target object carried in the search request, where the identifier may be an image, a feature, or the like.
an information sharing module 802, configured to send, to the first electronic device in response to the search request, second position information of a second user of the second electronic device 800 and first positioning information of the target object in the view field of the second user of the second electronic device 800, so that the first electronic device determines second positioning information of the target object relative to the first user according to the first position information of the first electronic device, the second position information and the first positioning information, and tracks the target object according to the second positioning information.
Wherein, if the target object exists in the view field of the second user of the second electronic device 800, the second electronic device 800 sends the second position information of the second user and the first positioning information of the target object in that view field to the first electronic device; the view field is the user's field of view, i.e., the range of the user's vision.
Determining second positioning information of the target object relative to a first user of the electronic device 800 by the first electronic device according to the first position information, the second position information and the first positioning information of the first electronic device, and tracking the target object according to the second positioning information; for example, a plane coordinate system with the second position information of the second user as the center is established, and the relative position relation between the first user and the target object in the same plane coordinate system, namely the second positioning information, can be determined through the relative position information of the first user, the target object and the second user.
Optionally, in an embodiment of the present invention, the second electronic device 800 includes:
a positioning obtaining module, configured to obtain first positioning information of the target object in a view field of a second user of the second electronic device 800.
Optionally, in an embodiment of the present invention, the first positioning information includes a first distance value and a first direction angle value;
the positioning acquisition module is used for: when the eyeball of the second user is fixed to the target object, the zoom multiple of the lens of the second electronic device 800 and the visual angle center point of the second user are obtained;
determining the object distance between the target object and the second user according to the zoom multiple, wherein the object distance is the first distance value; and determining the first direction angle value according to an angle between the view angle center point and an initial center point of the view angle of the second user.
The electronic device 800 provided in the embodiment of the present invention can implement each process implemented by the electronic device 800 in the method embodiment of fig. 5 to 6, and in order to avoid repetition, a description is omitted here.
In the embodiment of the present invention, the request receiving module 801 receives the search request sent by the first electronic device, and the information sharing module 802 sends, in response to the search request, the second position information of the second user of the second electronic device 800 and the first positioning information of the target object in the view field of the second user of the second electronic device 800 to the first electronic device; the first electronic device then determines the second positioning information of the target object relative to the first user according to its own first position information, the second position information and the first positioning information, and tracks the target object according to the second positioning information, so that view tracking is achieved for the first user. Because the second electronic device 800 shares its view information, the second positioning information of the target object relative to the first electronic device is determined and directly converted into the view angle of the first user, which eliminates the sense of visual discontinuity in the picture presented by the AR glasses; meanwhile, the first electronic device can quickly re-track the target object based on the information shared by the second electronic device 800, and the process requires no user participation, so the operation is simple and convenient. The embodiment of the invention thus solves the prior-art problem that the view angle presented to the user by human-eye auxiliary equipment is poorly perceived.
FIG. 9 is a schematic diagram of a hardware architecture of an electronic device implementing various embodiments of the present invention;
the electronic device 900 includes, but is not limited to: radio frequency unit 901, network module 902, audio output unit 903, input unit 904, sensor 905, display unit 906, user input unit 907, interface unit 908, memory 909, processor 910, and power source 911. It will be appreciated by those skilled in the art that the electronic device structure shown in fig. 9 does not constitute a limitation of the electronic device, and the electronic device may include more or less components than illustrated, or may combine certain components, or may have a different arrangement of components. In the embodiment of the invention, the electronic equipment comprises, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer and the like.
The processor 910 is configured to acquire first position information of the first electronic device and a target object tracked by an eyeball of a first user of the first electronic device; and acquire second position information of a second electronic device and first positioning information in a view field of a second user of the second electronic device;
determining second positioning information of the target object relative to the first user according to the first position information, the second position information and the first positioning information;
And tracking the target object according to the second positioning information.
Or a radio frequency unit 901, configured to receive a search request sent by a first electronic device; the search request carries the identification of the target object;
a processor 910, configured to send, in response to the search request, second position information of a second user of the second electronic device and first positioning information of the target object in the view field of the second user of the second electronic device to the first electronic device, so that the first electronic device determines second positioning information of the target object relative to the first user according to the first position information of the first electronic device, the second position information and the first positioning information, and tracks the target object according to the second positioning information.
In the embodiment of the invention, first position information of the first electronic device and a target object tracked by an eyeball of a first user of the first electronic device are acquired; second position information of a second electronic device and first positioning information in a view field of a second user of the second electronic device are acquired; second positioning information of the target object relative to the first user is determined according to the first position information, the second position information and the first positioning information; and the target object is tracked according to the second positioning information, so that view tracking is achieved for the first user. Because the second electronic device shares its view information, the second positioning information of the target object relative to the first electronic device is determined and directly converted into the view angle of the first user, which eliminates the sense of visual discontinuity in the picture presented by the AR glasses; meanwhile, the first electronic device can quickly re-track the target object based on the information shared by the second electronic device, and the process requires no user participation, so the operation is simple and convenient.
It should be noted that the electronic device 900 in this embodiment can implement each process of the foregoing method embodiments of the present invention and achieve the same beneficial effects; to avoid repetition, details are not described here again.
It should be understood that, in this embodiment of the present invention, the radio frequency unit 901 may be used to receive and transmit signals during information transmission and reception or during a call. Specifically, the radio frequency unit 901 receives downlink data from a base station and delivers it to the processor 910 for processing, and transmits uplink data to the base station. Typically, the radio frequency unit 901 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, and a duplexer. In addition, the radio frequency unit 901 may also communicate with networks and other devices via a wireless communication system.
Through the network module 902, the electronic device provides the user with wireless broadband Internet access, for example, helping the user send and receive e-mail, browse web pages, and access streaming media.
The audio output unit 903 may convert audio data received by the radio frequency unit 901 or the network module 902, or stored in the memory 909, into an audio signal and output it as sound. The audio output unit 903 may also provide audio output related to a specific function performed by the electronic device 900 (for example, a call signal reception sound or a message reception sound). The audio output unit 903 includes a speaker, a buzzer, a receiver, and the like.
The input unit 904 is used to receive an audio or video signal. The input unit 904 may include a graphics processor (Graphics Processing Unit, GPU) 9041 and a microphone 9042. The graphics processor 9041 processes image data of still pictures or video obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 906. The image frames processed by the graphics processor 9041 may be stored in the memory 909 (or another storage medium) or transmitted via the radio frequency unit 901 or the network module 902. The microphone 9042 may receive sound and process it into audio data. In a telephone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 901, and output.
The electronic device 900 also includes at least one sensor 905, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor and a proximity sensor: the ambient light sensor can adjust the brightness of the display panel 9061 according to the brightness of ambient light, and the proximity sensor can turn off the display panel 9061 and/or the backlight when the electronic device 900 is moved to the ear. As one type of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in all directions (generally three axes), and can detect the magnitude and direction of gravity when stationary; it can be used to recognize the posture of the electronic device (such as switching between landscape and portrait modes, related games, and magnetometer posture calibration) and for vibration-recognition-related functions (such as a pedometer and tapping). The sensor 905 may further include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not described here again.
The display unit 906 is used to display information input by a user or information provided to the user. The display unit 906 may include a display panel 9061, and the display panel 9061 may be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 907 is operable to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the electronic device. Specifically, the user input unit 907 includes a touch panel 9071 and other input devices 9072. The touch panel 9071, also referred to as a touch screen, may collect a user's touch operations on or near it (such as operations performed by the user on or near the touch panel 9071 with any suitable object or accessory, such as a finger or a stylus). The touch panel 9071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the user's touch position, detects a signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 910, and receives and executes commands sent by the processor 910. The touch panel 9071 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 9071, the user input unit 907 may also include other input devices 9072. Specifically, the other input devices 9072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and a power switch key), a trackball, a mouse, and a joystick, which are not described in detail here.
Further, the touch panel 9071 may be overlaid on the display panel 9061. When the touch panel 9071 detects a touch operation on or near it, the operation is transmitted to the processor 910 to determine the type of the touch event, and the processor 910 then provides a corresponding visual output on the display panel 9061 according to the type of the touch event. Although in FIG. 9 the touch panel 9071 and the display panel 9061 are two independent components implementing the input and output functions of the electronic device, in some embodiments the touch panel 9071 and the display panel 9061 may be integrated to implement the input and output functions of the electronic device, which is not limited here.
The interface unit 908 is an interface for connecting an external device to the electronic device 900. For example, the external device may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 908 may be used to receive input (for example, data information or power) from an external device and transmit the received input to one or more elements within the electronic device 900, or may be used to transmit data between the electronic device 900 and an external device.
The memory 909 may be used to store software programs as well as various data. The memory 909 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system and an application program required for at least one function (such as a sound playing function and an image playing function), and the data storage area may store data created according to the use of the electronic device (such as audio data and a phonebook). In addition, the memory 909 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 910 is the control center of the electronic device. It connects the various parts of the entire electronic device using various interfaces and lines, and performs the various functions of the electronic device and processes data by running or executing the software programs and/or modules stored in the memory 909 and calling the data stored in the memory 909, thereby monitoring the electronic device as a whole. The processor 910 may include one or more processing units; preferably, the processor 910 may integrate an application processor, which mainly handles the operating system, user interfaces, and applications, and a modem processor, which mainly handles wireless communication. It can be understood that the modem processor may alternatively not be integrated into the processor 910.
The electronic device 900 may also include a power supply 911 (such as a battery) for supplying power to the various components. Preferably, the power supply 911 may be logically coupled to the processor 910 via a power management system, so as to implement functions such as charge management, discharge management, and power consumption management.
In addition, the electronic device 900 includes some functional modules that are not shown, and will not be described herein.
Preferably, an embodiment of the present invention further provides an electronic device, including a processor 910, a memory 909, and a computer program stored in the memory 909 and executable on the processor 910. When executed by the processor 910, the computer program implements the processes of the above object tracking method embodiments and can achieve the same technical effects; to avoid repetition, details are not described here again.
An embodiment of the present invention also provides a computer-readable storage medium on which a computer program is stored. When executed by a processor, the computer program implements the processes of the above object tracking method embodiments and can achieve the same technical effects; to avoid repetition, details are not described here again. The computer-readable storage medium may be, for example, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk, or an optical disc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article, or apparatus that comprises the element.
From the above description of the embodiments, it will be clear to those skilled in the art that the methods of the above embodiments may be implemented by software plus a necessary general-purpose hardware platform, or certainly by hardware, but in many cases the former is the preferred implementation. Based on such understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc), which includes several instructions for causing a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to perform the methods described in the embodiments of the present invention.
The embodiments of the present invention have been described above with reference to the accompanying drawings, but the present invention is not limited to the above specific implementations, which are merely illustrative rather than restrictive. Inspired by the present invention, those of ordinary skill in the art may derive many other forms without departing from the spirit of the present invention and the scope of the claims, all of which fall within the protection of the present invention.
Claims (12)
1. An object tracking method applied to a first electronic device, the method comprising:
acquiring first position information of the first electronic device and a target object tracked by the eyeballs of a first user of the first electronic device; acquiring second position information of a second electronic device and first positioning information of the target object in a view field of a second user of the second electronic device;
determining second positioning information of the target object relative to the first user according to the first position information, the second position information and the first positioning information;
and tracking the target object according to the second positioning information.
2. The object tracking method according to claim 1, wherein the first positioning information includes a first distance value and a first direction angle value;
The step of determining second positioning information of the target object relative to the first user according to the first position information, the second position information and the first positioning information includes:
determining third positioning information of the first user relative to the second user according to the first position information and the second position information; the third positioning information comprises a third distance value and a third direction angle value;
determining a second distance value of the target object relative to the first user according to the first distance value and the third distance value, and determining a second direction angle value of the target object relative to the first user according to the first direction angle value and the third direction angle value; wherein the second positioning information includes the second distance value and the second direction angle value.
3. The object tracking method according to claim 1, wherein before the acquiring second position information of a second electronic device and first positioning information of the target object in a view field of a second user of the second electronic device, the method further comprises: detecting that the target object has disappeared from the view field of the first user of the first electronic device.
4. The object tracking method according to claim 3, wherein the step of detecting that the target object has disappeared from the view field of the first user of the first electronic device comprises:
when it is detected that the eyeballs of the first user are focused on a target position, acquiring a first moving track of the eyeballs of the first user and a second moving track of the target object at the target position;
and determining that the target object has disappeared from the view field of the first user of the first electronic device when the first moving track deviates from the second moving track.
5. The object tracking method according to claim 4, wherein the step of acquiring the second moving track of the target object at the target position comprises:
acquiring a moving track of each object at the target position;
and within a first preset time period, determining an object whose moving track has a similarity with the first moving track reaching a preset threshold as the target object, wherein the moving track of the target object is the second moving track.
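As an illustration only, the track-matching step of claim 5 could be realized as below. The claim does not specify a similarity metric, so the mean point-to-point distance, the conversion to a [0, 1] similarity, the threshold value, and all names are stand-in assumptions.

```python
import math

def track_similarity(track_a, track_b):
    """Turn the mean point-to-point distance between two equal-length
    (x, y) sample tracks into a similarity in (0, 1]: identical tracks
    score 1.0, and the score falls as the tracks diverge."""
    dists = [math.dist(p, q) for p, q in zip(track_a, track_b)]
    return 1.0 / (1.0 + sum(dists) / len(dists))

def match_target(eye_track, candidate_tracks, threshold=0.8):
    """Return the index of the candidate object whose moving track's
    similarity with the eyeball's first moving track reaches the
    preset threshold, or None if no candidate qualifies."""
    best_idx, best_sim = None, threshold
    for i, cand in enumerate(candidate_tracks):
        sim = track_similarity(eye_track, cand)
        if sim >= best_sim:
            best_idx, best_sim = i, sim
    return best_idx
```

The matched candidate's track then serves as the second moving track of claim 4.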
6. The object tracking method according to claim 4, wherein after the step of determining that the target object has disappeared from the view field of the first user of the first electronic device, the method further comprises:
sending a search request carrying a first image of the target object to a third electronic device;
and if a search response corresponding to the search request and sent by the third electronic device is received, and the search response carries a second image of the target object, determining that the third electronic device corresponding to the search response is the second electronic device.
7. An object tracking method applied to a second electronic device, the method comprising:
receiving a search request sent by a first electronic device, wherein the search request carries an identification of a target object;
and sending, to the first electronic device in response to the search request, second position information of the second electronic device and first positioning information of the target object in a view field of a second user of the second electronic device, so that the first electronic device determines second positioning information of the target object relative to a first user of the first electronic device according to first position information of the first electronic device, the second position information, and the first positioning information, and tracks the target object according to the second positioning information.
8. The object tracking method according to claim 7, wherein before the step of sending the second position information of the second electronic device and the first positioning information of the target object in the view field of the second user of the second electronic device to the first electronic device, the method further comprises:
and acquiring first positioning information of the target object in a view field of a second user of the second electronic equipment.
9. The object tracking method according to claim 8, wherein the first positioning information includes a first distance value and a first direction angle value;
the step of obtaining the first positioning information of the target object in the view field of the second user of the second electronic device includes:
when it is detected that the eyeballs of the second user are focused on the target object, acquiring a zoom multiple of a lens of the second electronic device and a view angle center point of the second user;
determining the object distance between the target object and the second user according to the zoom multiple, wherein the object distance is the first distance value; and determining the first direction angle value according to an angle between the view angle center point and an initial center point of the view angle of the second user.
10. An electronic device, the electronic device being a first electronic device, the electronic device comprising:
an information acquisition module, configured to acquire first position information of the first electronic device and a target object tracked by the eyeballs of a first user of the first electronic device, and to acquire second position information of a second electronic device and first positioning information of the target object in a view field of a second user of the second electronic device;
an information determining module, configured to determine second positioning information of the target object relative to the first user according to the first position information, the second position information, and the first positioning information;
and the target tracking module is used for tracking the target object according to the second positioning information.
11. An electronic device, the electronic device being a second electronic device, the electronic device comprising:
the request receiving module is used for receiving a search request sent by the first electronic equipment; the search request carries the identification of the target object;
an information sharing module, configured to send, to the first electronic device in response to the search request, second position information of the second electronic device and first positioning information of the target object in a view field of a second user of the second electronic device, so that the first electronic device determines second positioning information of the target object relative to a first user of the first electronic device according to first position information of the first electronic device, the second position information, and the first positioning information, and tracks the target object according to the second positioning information.
12. An electronic device comprising a processor, a memory, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the object tracking method according to any one of claims 1 to 9.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202010328930.5A CN111523474B (en) | 2020-04-23 | 2020-04-23 | Target object tracking method and electronic device |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN111523474A CN111523474A (en) | 2020-08-11 |
| CN111523474B true CN111523474B (en) | 2023-09-26 |
Family
ID=71904539
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202010328930.5A Active CN111523474B (en) | 2020-04-23 | 2020-04-23 | Target object tracking method and electronic device |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN111523474B (en) |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN112699331A (en) * | 2020-12-31 | 2021-04-23 | 深圳市慧鲤科技有限公司 | Message information display method and device, electronic equipment and storage medium |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN105677024A (en) * | 2015-12-31 | 2016-06-15 | 北京元心科技有限公司 | Eye movement detection tracking method and device, and application of eye movement detection tracking method |
| CN106888212A (en) * | 2017-03-14 | 2017-06-23 | 四川以太原力科技有限公司 | The shared method of target |
| CN109725714A (en) * | 2018-11-14 | 2019-05-07 | 北京七鑫易维信息技术有限公司 | Sight determines method, apparatus, system and wear-type eye movement equipment |
Family Cites Families (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2020069006A1 (en) * | 2018-09-25 | 2020-04-02 | Magic Leap, Inc. | Systems and methods for augmented reality |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN111541845B (en) | Image processing method and device and electronic equipment | |
| CN111182205B (en) | Shooting method, electronic device and medium | |
| CN107592466B (en) | Photographing method and mobile terminal | |
| CN109040643B (en) | Mobile terminal and remote group photo method and device | |
| CN107800967A (en) | A kind of image pickup method and mobile terminal | |
| CN110505408B (en) | Terminal shooting method and device, mobile terminal and readable storage medium | |
| CN109905603B (en) | Shooting processing method and mobile terminal | |
| CN110266957B (en) | Image shooting method and mobile terminal | |
| CN110881105B (en) | A shooting method and electronic device | |
| CN110198413B (en) | A video shooting method, video shooting device and electronic equipment | |
| WO2020020134A1 (en) | Photographing method and mobile terminal | |
| CN110300267B (en) | Photographing method and terminal equipment | |
| CN108449546B (en) | Photographing method and mobile terminal | |
| WO2020216129A1 (en) | Parameter acquisition method and terminal device | |
| CN110602389A (en) | Display method and electronic equipment | |
| CN105245811A (en) | A recording method and device | |
| CN112437172A (en) | Photographing method, terminal and computer readable storage medium | |
| CN111385481A (en) | Image processing method and device, electronic device and storage medium | |
| CN111031248A (en) | Shooting method and electronic equipment | |
| CN110891122A (en) | A wallpaper push method and electronic device | |
| CN110602387B (en) | Shooting method and electronic device | |
| CN109688325B (en) | An image display method and terminal device | |
| CN111523474B (en) | Target object tracking method and electronic device | |
| CN108881721A (en) | A kind of display methods and terminal | |
| WO2021104226A1 (en) | Photographing method and electronic device |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | ||
| SE01 | Entry into force of request for substantive examination | ||
| GR01 | Patent grant | ||