
CN113721665B - Machine vision-based cradle head control method applied to anti-slow small target - Google Patents


Info

Publication number
CN113721665B
Authority
CN
China
Prior art keywords
target
tracking mode
unmanned aerial vehicle
cradle head
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011280436.2A
Other languages
Chinese (zh)
Other versions
CN113721665A (en)
Inventor
林德福
李帆
王辉
宋韬
吴则良
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Technology BIT
Original Assignee
Beijing Institute of Technology BIT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Technology BIT filed Critical Beijing Institute of Technology BIT
Priority to CN202011280436.2A priority Critical patent/CN113721665B/en
Publication of CN113721665A publication Critical patent/CN113721665A/en
Application granted granted Critical
Publication of CN113721665B publication Critical patent/CN113721665B/en


Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/12 Target-seeking control

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a machine vision-based pan-tilt control method applied to countering "low-slow-small" targets. The method sets up an automatic control process for an unmanned aerial vehicle and its pan-tilt, so that the unmanned aerial vehicle can carry the pan-tilt on a cruise within a preset range and promptly discover an intruding low-slow-small target. After the target is found, the pan-tilt rotates at a specially scheduled angular velocity, which reduces the possibility of the target leaving the field of view and keeps the target as close to the center of the field of view as possible, thereby enhancing the tracking effect on the target.

Description

Machine vision-based cradle head control method applied to anti-slow small target
Technical Field
The invention relates to a method for controlling a pan-tilt on an unmanned aerial vehicle, and in particular to a machine vision-based pan-tilt control method applied to countering "low-slow-small" targets.
Background
"Low low-speed small" target (LSST) refers to small aircraft and airborne matter having all or part of the characteristics of low-altitude, ultra-low-altitude flight (flight altitude below 1000 m), flight speeds less than 200km/h, and not easily found by radar. The flying height of the low-speed and small-sized aircraft is below 1000 meters, the flying speed is less than 200km, and the radar reflection area is less than 2 square meters, so that the low-speed and small-sized aircraft is widely used and rapidly developed due to the wide range of the low-speed and small-sized aircraft (including the common aircraft such as small and medium-sized aircraft, helicopters, gliders, hot air balloons, unmanned aerial vehicles and the like, aviation sports equipment and the like) and the development of science and technology.
The development of the 'low-slow small' target improves the national economy development level, but the 'low-slow small' event is obviously improved in recent years, the development of the 'low-slow small' target is also increasingly remarkable in terms of important targets, important areas and important activities, and the development of the 'low-slow small' target can have unexpected results once being utilized by some people with poor mind. Along with the opening of the low-altitude airspace in China, the supervision and prevention of the 'low-speed' target become the problems to be solved urgently, and the accurate detection, interception, tracking and hitting of the 'low-speed' target are very important and urgent.
The 'low-slow small' target has the characteristics of difficult detection and difficult defense, and the current interception mode of the 'low-slow small' target is mainly divided into soft killing and hard killing. Soft-click achieves the combat capability of weakening the "low-low" target by interfering with the communication link, interfering with the navigational positioning system, and interfering with the detection device. Hard killing is by intervention in the form of dispatch helicopter blows, drone blows and destroys ground stations. The unmanned aerial vehicle is utilized to strike a 'low-low' target because of a series of advantages of strong battlefield sensing capability, high flexibility, low cost and the like, and the unmanned aerial vehicle becomes a very considerable means of cost-effectiveness ratio.
The precondition of all the processing of the low-speed small target is real-time identification and tracking of the target, and in the existing scheme, the tracking target is also identified by carrying a camera on a holder, however, in the actual working process, the target is often separated from the field of view of the camera due to improper rotation speed control of the holder, so that the tracking effect is not ideal.
For the above reasons, the present inventors have made intensive studies on the existing pan/tilt control method, in order to expect to design a machine vision-based pan/tilt control method applied to a very slow and small target, which can solve the above problems.
Disclosure of Invention
In order to overcome these problems, the inventors carried out intensive research and designed a machine vision-based pan-tilt control method applied to countering low-slow-small targets. In this method, an automatic control process for the unmanned aerial vehicle and the pan-tilt is set up so that the unmanned aerial vehicle can carry the pan-tilt on a cruise within a preset range and promptly discover an intruding low-slow-small target; after the target is found, the pan-tilt rotates at a specially scheduled angular velocity, which reduces the possibility of the target leaving the field of view and keeps the target as close to the center of the field of view as possible, thereby enhancing the tracking effect. The invention was thus completed.
Specifically, the invention aims to provide a machine vision-based pan-tilt control method applied to countering low-slow-small targets, comprising the following steps:
Step 1, an unmanned aerial vehicle carrying a pan-tilt reaches a preset position and hovers; the pan-tilt and a camera on it are controlled to search for a target, entering a search mode;
Step 2, in the search mode, the camera reads the captured image in real time, judges whether the image contains a target, and, when it does, judges whether to enter a tracking mode;
Step 3, after entering the tracking mode, a control instruction is generated in real time from the pixel deviation of the target in the image to adjust the rotational angular velocity of the pan-tilt.
In the tracking mode, if the target is lost for a short time the tracking mode is maintained; if the target is lost for a long time the tracking mode is terminated, the pan-tilt is controlled to return to its initial angle, and the unmanned aerial vehicle is controlled to return to the preset position.
Wherein, in step 2, the tracking mode is entered when the depth of the target in the image is smaller than a set value and the target is contained in every one of a run of consecutive frames.
Wherein the set value is 30 meters, and the run of consecutive frames is 5 frames or more.
Wherein, in step 3, after entering the tracking mode, the rotational angular velocity of the pan-tilt is first adjusted gently and, after a certain time, adjusted more aggressively.
In step 3, after entering the tracking mode, the desired rotational angular velocity of the pan-tilt is calculated by the following formula (one), and the pan-tilt is controlled to rotate accordingly:

ω_d = k_p(t) · err + k_d(t) · d(err)/dt   (one)

where the pixel deviation weight k_p(t) grows with time from k_pmin to k_pmax and the pixel deviation change rate weight k_d(t) grows from k_dmin to k_dmax,

wherein ω_d represents the desired rotational angular velocity of the pan-tilt, k_pmax represents the maximum value of the pixel deviation term weight, k_pmin represents the minimum value of the pixel deviation term weight, t represents time, err represents the pixel deviation, k_dmax represents the maximum value of the pixel deviation change rate weight, k_dmin represents the minimum value of the pixel deviation change rate weight, and d(err)/dt represents the rate of change of the pixel deviation.
Wherein, in step 3, in the tracking mode, after the target is lost the pan-tilt continues to be controlled by the control instruction corresponding to the last frame before the target was lost,
and if the target has not been recaptured 200 ms after being lost, the tracking mode is terminated and the pan-tilt is controlled to return to its initial angle.
Wherein, after the target has been lost for 1 s, the unmanned aerial vehicle is controlled to return to the preset position.
Wherein, upon entering the tracking mode, a state estimate of the target is computed in real time, and the unmanned aerial vehicle is controlled to track or pursue the target according to the state estimate.
The invention has the beneficial effects that:
(1) In the machine vision-based pan-tilt control method applied to countering low-slow-small targets provided by the invention, the pan-tilt transitions smoothly from a static state to a fast tracking state, so the pan-tilt does not lose the target through motion blur;
(2) the method conveniently provides the navigation information required while a task is executed;
(3) its high-robustness control law keeps the pan-tilt in a controllable state at all times, even when certain unexpected situations occur.
Drawings
FIG. 1 shows the overall logic of a machine vision-based pan-tilt control method applied to countering low-slow-small targets according to a preferred embodiment of the present invention;
FIG. 2 shows the normalized pixel deviation obtained in the example;
FIG. 3 shows a partial enlarged view of FIG. 2;
Fig. 4 shows a comparison of an actual trajectory with an observed trajectory.
Detailed Description
The invention is further described in detail below by means of the figures and examples. The features and advantages of the present invention will become more apparent from the description.
The word "exemplary" is used herein to mean "serving as an example, embodiment, or illustration. Any embodiment described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments. Although various aspects of the embodiments are illustrated in the accompanying drawings, the drawings are not necessarily drawn to scale unless specifically indicated.
The machine vision-based pan-tilt control method applied to countering low-slow-small targets provided by the invention, as shown in FIG. 1, comprises the following steps:
Step 1, the unmanned aerial vehicle carrying the pan-tilt reaches a preset position and hovers, and the pan-tilt and the camera on it are controlled to search for a target, entering the search mode. The preset position may be a coordinate point loaded into the unmanned aerial vehicle before take-off, or a coordinate point transmitted to it in real time by a ground control station; setting the preset position delimits the patrol area of the unmanned aerial vehicle and ensures that low-slow-small targets intruding into the patrol area are detected.
Step 2, in the search mode, the camera reads the captured image in real time, judges whether the image contains a target, and, when it does, judges whether to enter the tracking mode. In the search mode the camera shoots in real time at a preset frequency, e.g. 25 Hz, and a target is searched for in each frame by image recognition; the possible appearances of the target are taught to the image recognition system in advance by machine learning.
Step 3, after entering the tracking mode, a control instruction is generated in real time from the pixel deviation of the target in the image to adjust the rotational angular velocity of the pan-tilt.
In the tracking mode, if the target is lost for a short time the tracking mode is maintained; if the target is lost for a long time the tracking mode is terminated, the pan-tilt is controlled to return to its initial angle, and the unmanned aerial vehicle is controlled to return to the preset position. When the target has only just been lost, it is assumed that it may be re-identified shortly; keeping, for the pan-tilt, the control instruction from the moment before the loss keeps the pan-tilt's motion as consistent as possible with the target's motion and creates the conditions for re-identifying the target. When the target has been lost for a longer time without being re-identified, it is treated as completely lost, to guard against unknown risks to the unmanned aerial vehicle and to prepare for the next task.
In a preferred embodiment, in step 2 the tracking mode is entered when the depth of the target in the image is smaller than a set value and the target is contained in every one of a run of consecutive frames.
Preferably, the set value is 30 meters and the run of consecutive frames is 5 frames or more. These entry conditions ensure that, after the tracking mode is entered, the target is neither blurred by excessive distance nor frequently lost because it sits far from the center of the field of view.
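As an illustration only, the following minimal sketch shows this entry condition; the per-frame detection interface (a `(found, depth_m)` pair per frame) is a hypothetical stand-in for the patent's image recognition system.

```python
SET_DEPTH_M = 30.0   # set value for target depth, in meters
MIN_FRAMES = 5       # required run of consecutive frames containing the target

def should_enter_tracking(frame_results):
    """frame_results: sequence of (target_found, depth_m) tuples, newest last.

    Returns True when the last MIN_FRAMES frames all contain the target and
    the most recent depth estimate is below the set value."""
    recent = list(frame_results)[-MIN_FRAMES:]
    if len(recent) < MIN_FRAMES:
        return False
    if not all(found for found, _ in recent):
        return False
    return recent[-1][1] < SET_DEPTH_M

# Example: 5 consecutive detections, last depth 23 m -> enter tracking mode
history = [(True, 31.0), (True, 29.0), (True, 27.0), (True, 25.0), (True, 23.0)]
assert should_enter_tracking(history)
```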
In a preferred embodiment, in step 3, after entering the tracking mode the rotational angular velocity of the pan-tilt is first adjusted gently, and after a certain time it is adjusted more aggressively. When the pan-tilt has just started to move, the target is likely to be near the edge of the field of view; issuing a large control instruction at that moment would make the pan-tilt move too fast, blur the image with motion, and leave the target unrecognizable even though it is still in the field of view, so the control instruction in the first stage must not be too large. Once the pan-tilt has been tracking beyond a certain time, keeping the control instruction small would lose the target whenever it maneuvers in front of the pan-tilt, because the pan-tilt would lack maneuvering capability; in this stage the maneuvering capability therefore needs to be raised by increasing the control instruction.
Preferably, after entering the tracking mode, the desired rotational angular velocity of the pan-tilt is calculated by the following formula (one), and the pan-tilt is controlled to rotate accordingly:

ω_d = k_p(t) · err + k_d(t) · d(err)/dt   (one)

where the pixel deviation weight k_p(t) grows with time from k_pmin to k_pmax and the pixel deviation change rate weight k_d(t) grows from k_dmin to k_dmax,

wherein ω_d represents the desired rotational angular velocity of the pan-tilt, k_pmax represents the maximum value of the pixel deviation term weight, k_pmin represents the minimum value of the pixel deviation term weight, t represents time, specifically the time elapsed after entering the tracking mode, err represents the pixel deviation, k_dmax represents the maximum value of the pixel deviation change rate weight, k_dmin represents the minimum value of the pixel deviation change rate weight, and d(err)/dt represents the rate of change of the pixel deviation, i.e. the derivative of err.
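For illustration, a minimal sketch of such a gain-scheduled PD law follows; the linear ramp shape, the ramp duration and the numeric gain values are assumptions made for the sketch, since the method only specifies that the weights grow from their minimum to their maximum after the tracking mode is entered.

```python
def desired_gimbal_rate(err, err_rate, t,
                        kp_min=0.5, kp_max=2.0,   # pixel deviation weight bounds (assumed values)
                        kd_min=0.05, kd_max=0.2,  # deviation-rate weight bounds (assumed values)
                        t_ramp=1.0):              # assumed ramp duration, seconds
    """Desired pan-tilt angular velocity from the pixel deviation.

    err      : pixel deviation of the target from the image center
    err_rate : rate of change of the pixel deviation
    t        : time elapsed since entering the tracking mode (s)

    The weights ramp from their minimum to their maximum over t_ramp, so the
    pan-tilt starts gently (avoiding motion blur) and tracks aggressively later."""
    s = min(t / t_ramp, 1.0)             # ramp fraction in [0, 1]
    kp = kp_min + (kp_max - kp_min) * s  # k_p(t)
    kd = kd_min + (kd_max - kd_min) * s  # k_d(t)
    return kp * err + kd * err_rate
```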
In a preferred embodiment, in step 3, in the tracking mode, after the target is lost the pan-tilt is controlled by the control instruction corresponding to the last frame before the target was lost; that is, within 200 ms of the loss the target is still considered recoverable;
if the target has not been recaptured 200 ms after being lost, the tracking mode is terminated and the pan-tilt is controlled to return to its initial angle; that is, the target is considered no longer recoverable at that point.
Setting this time condition maximizes the inspection efficiency of the unmanned aerial vehicle and its camera, and reduces the possibility that a low-slow-small target penetrates the inspection area.
Preferably, after the target has been lost for 1 s, the unmanned aerial vehicle is controlled to return to the preset position, from which the next cruise operation can begin.
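This loss-handling timeline (hold the last instruction for up to 200 ms, then terminate tracking and reset the pan-tilt, then send the vehicle home after 1 s) can be sketched as a small state machine; only the two time thresholds come from the method, the interface and state names below are illustrative.

```python
import time

class TargetLossHandler:
    """Decides the pan-tilt/vehicle response from how long the target has been lost."""

    HOLD_S = 0.2     # keep the last control instruction for up to 200 ms
    RETURN_S = 1.0   # after 1 s of loss, return the vehicle to the preset position

    def __init__(self):
        self.lost_since = None
        self.last_cmd = 0.0

    def update(self, target_found, cmd):
        now = time.monotonic()
        if target_found:
            self.lost_since = None
            self.last_cmd = cmd
            return ("TRACK", cmd)
        if self.lost_since is None:
            self.lost_since = now
        lost_for = now - self.lost_since
        if lost_for < self.HOLD_S:
            # short loss: keep the instruction from the last frame with a target
            return ("HOLD_LAST", self.last_cmd)
        if lost_for < self.RETURN_S:
            # terminate tracking and restore the pan-tilt to its initial angle
            return ("RESET_GIMBAL", 0.0)
        # long loss: send the unmanned aerial vehicle back to the preset position
        return ("RETURN_TO_PRESET", 0.0)
```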
Preferably, upon entering the tracking mode, a state estimate of the target is computed in real time, and the unmanned aerial vehicle is controlled to track or pursue the target according to that estimate. The unmanned aerial vehicle may carry a device that strikes or captures the target, in which case the target also disappears from the field of view once it is knocked down or captured.
In a preferred embodiment, upon entering the tracking mode the unmanned aerial vehicle carries the pan-tilt and camera to track or pursue the discovered low-slow-small target; the specific operation, such as tracking, knocking down or capturing, can be selected in advance by a set instruction. The image recognition system of the unmanned aerial vehicle extracts at least 4 feature points from the target in each frame and computes a state estimate of the target from them; the state estimate comprises the position, attitude and velocity of the target. The feature points are specific, easily identified points on the target, selected according to the target's type and appearance, e.g. the four motor positions of a quadrotor unmanned aerial vehicle. Specifically, the method comprises the following steps:
Step A, obtain a rotation matrix from the pixel coordinates of the target feature points;
Step B, obtain the attitude of the target from the rotation matrix;
Step C, obtain the acceleration of the target from its attitude;
Step D, obtain the actual position and velocity of the target from its acceleration.
Preferably, in said step A, the rotation parameters of the target are obtained by the following formula (two):

R = R′ · rot(Z, α)   (two)

wherein R represents the rotation matrix, i.e. the 3×3 rotation matrix for the transformation from the orthogonal coordinate system O_aX_aY_aZ_a to the camera coordinate system O_cX_cY_cZ_c; the 9 parameters of this rotation matrix are also called the rotation parameters;

R′ represents an arbitrary rotation matrix whose third column [r_7 r_8 r_9]^T equals the rotation axis Z_a, and R′ satisfies the orthogonality constraint of a rotation matrix;

the rotation axis is Z_a = vec(P_i0 P_j0) / |P_i0 P_j0|, wherein vec(P_i0 P_j0) represents the vector from point P_i0 to point P_j0 and |P_i0 P_j0| represents the modulus of that vector;

the depths of the two points P_i0 and P_j0 can be solved from the pixel coordinates of the 4 feature points extracted from the target in each frame, thereby determining the rotation axis Z_a in formula (two), i.e. [r_7 r_8 r_9]^T;

rot(Z, α) represents a rotation of the target about the Z axis by the angle α:

rot(Z, α) =
[ c  -s  0 ]
[ s   c  0 ]
[ 0   0  1 ],

c = cos α, s = sin α;

r_1 to r_9 represent the elements of the arbitrary 3×3 rotation matrix R′, whose third column [r_7 r_8 r_9]^T equals the rotation axis Z_a.
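As a numerical illustration of formula (two), the sketch below composes rot(Z, α) with a matrix R′ whose third column is the unit rotation axis; the particular construction of R′ from the axis is one standard choice made for the sketch, not necessarily the one used in the patent.

```python
import numpy as np

def rot_z(alpha):
    """The rot(Z, alpha) factor of formula (two)."""
    c, s = np.cos(alpha), np.sin(alpha)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def r_prime_from_axis(za):
    """An orthonormal R' whose third column equals the unit axis Za."""
    za = za / np.linalg.norm(za)
    helper = np.array([1.0, 0.0, 0.0])
    if abs(np.dot(helper, za)) > 0.9:   # avoid a helper nearly parallel to Za
        helper = np.array([0.0, 1.0, 0.0])
    xa = np.cross(helper, za)
    xa /= np.linalg.norm(xa)
    ya = np.cross(za, xa)
    return np.column_stack([xa, ya, za])

# Axis from feature points P_i0 -> P_j0 (illustrative values), then R = R' . rot(Z, alpha)
za = np.array([0.1, 0.2, 0.97])
R = r_prime_from_axis(za) @ rot_z(0.3)
assert np.allclose(R @ R.T, np.eye(3))   # R satisfies the orthogonality constraint
```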
In said step B, the attitude of the target is obtained by the following formula (three):

θ_1 = -asin(R_31),  θ_2 = π - θ_1,
ψ_i = atan2(R_21 / cos θ_i, R_11 / cos θ_i),
φ_i = atan2(R_32 / cos θ_i, R_33 / cos θ_i),  i = 1, 2   (three)

wherein θ_1 represents the pitch angle of the target, with range (-90°, 90°);

θ_2 represents the alternative pitch angle of the target: when the pitch angle is greater than 90° or less than -90°, the pitch angle of the target unmanned aerial vehicle is represented by θ_2;

ψ_1 represents the yaw angle of the target solved correspondingly when the pitch angle is θ_1;

ψ_2 represents the yaw angle of the target solved correspondingly when the pitch angle is θ_2;

φ_1 represents the roll angle of the target solved correspondingly when the pitch angle is θ_1;

φ_2 represents the roll angle of the target solved correspondingly when the pitch angle is θ_2;

R_31, R_32 and R_33 represent the three elements of the third row of the rotation matrix R solved in formula (two);

R_21 represents the first element of the second row of the rotation matrix R solved in formula (two);

R_11 represents the first element of the first row of the rotation matrix R solved in formula (two);

asin represents the arcsine function, and atan2 represents the two-argument arctangent function.

The attitude of the target comprises the three angles between the target body coordinate system and the inertial coordinate system, namely the roll, pitch and yaw angles, all of which can be obtained through formula (three).
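A sketch of the formula (three) extraction is given below; it uses the standard Z-Y-X (yaw-pitch-roll) convention, which matches the matrix elements named above.

```python
import numpy as np

def attitude_from_rotation(R):
    """Return the two (roll, pitch, yaw) solutions of formula (three).

    theta1 = -asin(R31) lies in (-90 deg, 90 deg); theta2 = pi - theta1 covers
    pitch angles outside that range. Gimbal lock (|R31| = 1) is not handled."""
    theta1 = -np.arcsin(R[2, 0])          # pitch from R31
    theta2 = np.pi - theta1
    solutions = []
    for theta in (theta1, theta2):
        c = np.cos(theta)
        psi = np.arctan2(R[1, 0] / c, R[0, 0] / c)   # yaw from R21, R11
        phi = np.arctan2(R[2, 1] / c, R[2, 2] / c)   # roll from R32, R33
        solutions.append((phi, theta, psi))
    return solutions
```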
In said step C, the acceleration of the target unmanned aerial vehicle is obtained by the following formula (four):

a = [a_x, a_y, a_z]^T   (four)

wherein a represents the acceleration of the target;

a_x represents the acceleration component along the X axis of the inertial coordinate system;

a_y represents the acceleration component along the Y axis of the inertial coordinate system;

a_z represents the acceleration component in the vertical direction, with a_z = 0;

under the level-flight condition a_z = 0, with the thrust directed along the body z axis, the horizontal components follow from the attitude as

a_x = g (cos φ sin θ cos ψ + sin φ sin ψ) / (cos φ cos θ),
a_y = g (cos φ sin θ sin ψ - sin φ cos ψ) / (cos φ cos θ),

wherein g represents the gravitational acceleration;

θ represents the pitch angle of the target unmanned aerial vehicle solved in formula (three);

φ represents the roll angle of the target unmanned aerial vehicle solved in formula (three);

ψ represents the yaw angle of the target unmanned aerial vehicle solved in formula (three).
In the present application, preferably, the unmanned aerial vehicle controls itself to stay in a horizontal plane parallel to that of the target while tracking it, and the target is taken to keep flying stably in its horizontal plane.
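Under this level-flight assumption (a_z = 0, thrust along the body z axis balancing gravity), the horizontal components of formula (four) can be evaluated from the attitude as sketched below; this standard multirotor model is a plausible reading of the formula rather than a verbatim reproduction of it.

```python
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def accel_from_attitude(roll, pitch, yaw):
    """Target acceleration a = [ax, ay, az]^T for stable level flight (az = 0).

    Assumes the thrust, directed along the body z axis, cancels gravity
    vertically, so thrust/mass = g / (cos(roll) * cos(pitch))."""
    t_over_m = G / (np.cos(roll) * np.cos(pitch))
    ax = t_over_m * (np.cos(roll) * np.sin(pitch) * np.cos(yaw)
                     + np.sin(roll) * np.sin(yaw))
    ay = t_over_m * (np.cos(roll) * np.sin(pitch) * np.sin(yaw)
                     - np.sin(roll) * np.cos(yaw))
    return np.array([ax, ay, 0.0])
```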
In said step D, the actual position and velocity of the target are obtained by the following formula (five):

X̂_(k|k-1) = A·X̂_(k-1) + w_(k-1),
X̂_k = X̂_(k|k-1) + γ_k·K_k·(Z_k - H·X̂_(k|k-1))   (five)

wherein K_k represents the Kalman gain; γ_k represents a binary random variable used to model intermittent measurements: γ_k = 1 if the target is detected in the k-th frame image, and γ_k = 0 if the target unmanned aerial vehicle is not detected in the k-th frame image;

w_k represents the process noise corresponding to the k-th frame image, and w_(k-1) the process noise corresponding to the (k-1)-th frame image;

X̂_(k|k-1) represents the state quantity of the k-th frame image predicted from the (k-1)-th frame image;

X̂_(k-1) represents the optimal estimated state quantity X of the (k-1)-th frame image;

X̂_k represents the optimal estimated state quantity X of the k-th frame image;

Z_k represents the measurement Z corresponding to the k-th frame image;

A represents the process matrix and H the observation matrix; with the state X = [p; v], and the acceleration a solved in step C entering the prediction as a known input,

A =
[ I_3  h·I_3 ]
[ 0    I_3  ],

H = [ I_3  0 ],

wherein p represents the position of the target, v represents the velocity of the target, a represents the acceleration of the target, h represents the sampling period of the image (the camera samples at 25 Hz, so h = 0.04 s), and I_3 represents the 3×3 identity matrix.
According to the above method, a state estimate of the target is obtained as each frame arrives, so the speed of the unmanned aerial vehicle can be controlled accordingly to keep the distance between the target and the unmanned aerial vehicle within a certain range, e.g. within 30 meters, gradually closing on the target or holding the distance constant, which helps the pan-tilt and camera capture the target more clearly.
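A compact sketch of the formula (five) update with intermittent measurements follows; the constant-velocity process model with the step-C acceleration as input, and the noise covariances, are assumptions made to keep the example self-contained.

```python
import numpy as np

h = 0.04                                    # image sampling period (camera at 25 Hz)
I3 = np.eye(3)
A = np.block([[I3, h * I3],
              [np.zeros((3, 3)), I3]])      # process matrix on X = [p; v]
B = np.vstack([0.5 * h**2 * I3, h * I3])    # known acceleration input from step C
H = np.hstack([I3, np.zeros((3, 3))])       # observation matrix: position only
Q = 1e-3 * np.eye(6)                        # process noise covariance (assumed)
Rm = 1e-2 * np.eye(3)                       # measurement noise covariance (assumed)

def kf_step(x, P, a, z, gamma):
    """One predict/update cycle; gamma = 1 if the target was detected in this
    frame (measurement z is used), gamma = 0 otherwise (prediction only)."""
    x_pred = A @ x + B @ a                  # X_{k|k-1}
    P_pred = A @ P @ A.T + Q
    if gamma:
        S = H @ P_pred @ H.T + Rm
        K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain K_k
        x = x_pred + K @ (z - H @ x_pred)     # X_k
        P = (np.eye(6) - K @ H) @ P_pred
    else:
        x, P = x_pred, P_pred
    return x, P
```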
Examples
A low-slow-small target moving in a plane at 12 m/s was selected and tracked by the pan-tilt and camera carried on an unmanned aerial vehicle; its motion trajectory is shown as the solid line in FIG. 4. After entering the search mode, the unmanned aerial vehicle found the target, judged from 5 consecutive frames containing the target that the target depth was 23 m, and entered the tracking mode, i.e. the unmanned aerial vehicle entered the tracking mode at time 0. The tracking mode lasted more than 6 s; during it the pan-tilt was controlled to rotate by formula (one) above, and during periods of loss the pan-tilt was controlled by the control instruction corresponding to the last frame before the target was lost.
In the tracking mode, the unmanned aerial vehicle controls itself to follow the target and maintains a fixed distance, namely the distance between the unmanned aerial vehicle and the target at the moment of entering the tracking mode, i.e. 23 meters. Specifically, the unmanned aerial vehicle obtains the position and velocity information of the target in real time through the following steps:
Step A, extract 4 feature points from the target in each frame and obtain a rotation matrix from their pixel coordinates;
Step B, obtain the attitude of the target from the rotation matrix;
Step C, obtain the acceleration of the target from its attitude;
Step D, obtain the actual position and velocity of the target from its acceleration;
wherein the rotation matrix, the attitude of the target, the acceleration of the target, and the actual position and velocity of the target are obtained by formulas (two) to (five), in the same manner as described in the detailed description above.
The target trajectory obtained by the above method during the first 5 seconds of the tracking mode was selected and compared with the real target trajectory; the resulting trajectory deviation is shown in FIG. 2 and FIG. 3, and the observed target position trajectory is shown as the dotted line in FIG. 4.
As can be seen from FIG. 2 and FIG. 3, from about 1 s after entering the tracking mode the pan-tilt and camera essentially keep the target within a normalized pixel deviation of 0.1, and no motion blur occurs that would make the target unidentifiable (which would show as a pixel deviation of 0). In addition, in the tracking mode, when the target was lost partway through, the pan-tilt continued to be controlled by the control instruction corresponding to the last frame before the loss, the target was recaptured within 200 ms, and tracking then continued.
As can be seen from FIG. 4, the observed target trajectory substantially coincides with the real target trajectory; the deviation between the two is small, and the observed trajectory can be used to characterize the real one.
The invention has been described above in connection with preferred embodiments, which are, however, exemplary and serve illustrative purposes only. On this basis, various substitutions and improvements may be made to the invention, and these all fall within its scope of protection.

Claims (6)

1. A machine vision-based pan-tilt control method applied to countering low-slow-small targets, characterized by comprising the following steps:
Step 1, an unmanned aerial vehicle carrying a pan-tilt reaches a preset position and hovers; the pan-tilt and a camera on it are controlled to search for a target, entering a search mode;
Step 2, in the search mode, the camera reads the captured image in real time, judges whether the image contains a target, and, when it does, judges whether to enter a tracking mode;
Step 3, after entering the tracking mode, a control instruction is generated in real time from the pixel deviation of the target in the image to adjust the rotational angular velocity of the pan-tilt;
in the tracking mode, if the target is lost for a short time the tracking mode is maintained; if the target is lost for a long time the tracking mode is terminated, the pan-tilt is controlled to return to its initial angle, and the unmanned aerial vehicle is controlled to return to the preset position;
in step 3, after entering the tracking mode, the rotational angular velocity of the pan-tilt is first adjusted gently and, after a certain time, adjusted more aggressively;
in step 3, after entering the tracking mode, the desired rotational angular velocity of the pan-tilt is obtained by the following formula (one), and the pan-tilt is controlled to rotate accordingly:

ω_d = k_p(t) · err + k_d(t) · d(err)/dt   (one)

where the pixel deviation weight k_p(t) grows with time from k_pmin to k_pmax and the pixel deviation change rate weight k_d(t) grows from k_dmin to k_dmax,

wherein ω_d represents the desired rotational angular velocity of the pan-tilt, k_pmax represents the maximum value of the pixel deviation term weight, k_pmin represents the minimum value of the pixel deviation term weight, t represents time, err represents the pixel deviation, k_dmax represents the maximum value of the pixel deviation change rate weight, k_dmin represents the minimum value of the pixel deviation change rate weight, and d(err)/dt represents the rate of change of the pixel deviation.
2. The machine vision-based pan-tilt control method applied to countering low-slow-small targets according to claim 1, wherein,
in step 2, the tracking mode is entered when the depth of the target in the image is smaller than a set value and the target is contained in every one of a run of consecutive frames.
3. The machine vision-based pan-tilt control method applied to countering low-slow-small targets according to claim 2, wherein,
the set value is 30 meters, and the run of consecutive frames is 5 frames or more.
4. The machine vision-based pan-tilt control method applied to countering low-slow-small targets according to claim 1, wherein,
in step 3, in the tracking mode, after the target is lost the pan-tilt is controlled by the control instruction corresponding to the last frame before the target was lost,
and if the target has not been recaptured 200 ms after being lost, the tracking mode is terminated and the pan-tilt is controlled to return to its initial angle.
5. The machine vision-based pan-tilt control method applied to countering low-slow-small targets according to claim 1, wherein,
after the target has been lost for 1 s, the unmanned aerial vehicle is controlled to return to the preset position.
6. The machine vision-based pan-tilt control method applied to countering low-slow-small targets according to claim 1, wherein:
upon entering the tracking mode, a state estimate of the target is computed in real time, and the unmanned aerial vehicle is controlled to track or pursue the target according to the state estimate.
CN202011280436.2A 2020-11-16 2020-11-16 Machine vision-based cradle head control method applied to anti-slow small target Active CN113721665B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011280436.2A CN113721665B (en) 2020-11-16 2020-11-16 Machine vision-based cradle head control method applied to anti-slow small target

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011280436.2A CN113721665B (en) 2020-11-16 2020-11-16 Machine vision-based cradle head control method applied to anti-slow small target

Publications (2)

Publication Number Publication Date
CN113721665A CN113721665A (en) 2021-11-30
CN113721665B (en) 2024-06-14

Family

ID=78672358

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011280436.2A Active CN113721665B (en) 2020-11-16 2020-11-16 Machine vision-based cradle head control method applied to anti-slow small target

Country Status (1)

Country Link
CN (1) CN113721665B (en)


Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8041077B2 (en) * 2007-12-18 2011-10-18 Robert Bosch Gmbh Method of motion detection and autonomous motion tracking using dynamic sensitivity masks in a pan-tilt camera
JP5498454B2 (en) * 2011-09-15 2014-05-21 株式会社東芝 TRACKING DEVICE, TRACKING METHOD, AND PROGRAM
CN111656403A (en) * 2019-06-27 2020-09-11 深圳市大疆创新科技有限公司 Method and device for tracking target and computer storage medium
CN111932588B * 2020-08-07 2024-01-30 Zhejiang University A tracking method for an airborne UAV multi-target tracking system based on deep learning

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109753076A (en) * 2017-11-03 2019-05-14 南京奇蛙智能科技有限公司 A kind of unmanned plane vision tracing implementing method
CN110322474A (en) * 2019-07-11 2019-10-11 史彩成 A kind of image motive target real-time detection method based on unmanned aerial vehicle platform

Also Published As

Publication number Publication date
CN113721665A (en) 2021-11-30

Similar Documents

Publication Publication Date Title
CN108227751B (en) Landing method and system of unmanned aerial vehicle
US11604479B2 (en) Methods and system for vision-based landing
US20170059692A1 (en) Mitigation of Small Unmanned Aircraft Systems Threats
CN108459618A (en) A kind of flight control system and method that unmanned plane automatically launches mobile platform
US20190068829A1 (en) Systems and Methods for Improving Performance of a Robotic Vehicle by Managing On-board Camera Obstructions
CN109753076A (en) A kind of unmanned plane vision tracing implementing method
CN105644785B (en) A UAV landing method based on optical flow method and horizon detection
CN107463181A (en) A kind of quadrotor self-adoptive trace system based on AprilTag
CN105446351A (en) Robotic airship system capable of locking target area for observation based on autonomous navigation
CN110132060A (en) A method of intercepting drones based on visual navigation
CN110058604A (en) A kind of accurate landing system of unmanned plane based on computer vision
KR20200083951A (en) Control system and method to patrol an RFID tag path of a drone having a camera and embedded with a directional speaker
CN114564034A (en) Unmanned aerial vehicle autonomous landing strategy based on holder visual servo in GNSS-free environment
CN115857520B (en) Unmanned aerial vehicle landing state monitoring method based on combination of vision and ship state
CN110068827A (en) A kind of method of the autonomous object ranging of unmanned plane
KR101980095B1 (en) Method and system for controlling movement of a UAV by predicting the trajectory of a spherical target through a camera
CN114296479B (en) An image-based UAV tracking method and system for ground vehicles
CN113721665B (en) Machine vision-based cradle head control method applied to anti-slow small target
Lee et al. Autonomous target following with monocular camera on uas using recursive-ransac tracker
Morais et al. Trajectory and Guidance Mode for autonomously landing an UAV on a naval platform using a vision approach
CN114859960A (en) Method for continuously tracking and reconnaissance fixed-wing unmanned aerial vehicle photoelectric pod to fixed-point target
Cao et al. Research on application of computer vision assist technology in high-precision UAV navigation and positioning
CN113075937B (en) Control method for capturing target by unmanned aerial vehicle based on target acceleration estimation
Ho et al. Automatic landing system of a quadrotor UAV using visual servoing
CN112198894B (en) Rotor UAV autonomous mobile landing guidance method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant