CN116193256A - Determining a focal position of a camera - Google Patents
- Publication number: CN116193256A (application CN202310184881.6A)
- Authority: CN (China)
- Prior art keywords: target, sample, distance, PTZ camera, determining
- Legal status: Pending (an assumption, not a legal conclusion)
Landscapes
- Studio Devices (AREA)
- Automatic Focus Adjustment (AREA)
Abstract
In one embodiment, the application provides a method, an apparatus, and a device for determining a focus position. The method comprises the following steps: acquiring the target deflection angle of horizontal rotation and the target pitch angle of vertical rotation of a PTZ camera; determining the target distance between a target to be detected and the PTZ camera from the coordinates of the intersection between the direction vector defined by the target deflection angle and target pitch angle and a target plane equation; querying a configured mapping relationship with the target distance to obtain the target focus position corresponding to that distance, where the mapping relationship represents the relationship between distance and focus position; and focusing the PTZ camera at the target focus position, then acquiring a target image of the target to be detected through the PTZ camera once focusing is complete. With this technical scheme, the target to be detected can be focused accurately, the search for the target focus position takes relatively little time, and a sharp focus position can be found, so that the target to be detected appears clear.
Description
Technical Field
The present invention relates to the field of image processing, and in particular, to a method, an apparatus, and a device for determining a focal position of a camera.
Background
A video camera generally has a zoom function and a focus function. For example, the camera includes a zoom motor, a focus motor, a zoom lens, and a focus lens: the zoom motor drives the zoom lens to move to realize the zoom function, and the focus motor drives the focus lens to move to realize the focus function.
To realize the focus function, the focus position must first be determined, and the focus lens is then driven to that position. To ensure image sharpness, determining the focus position also requires reference to an AFD (auto-focus image sharpness evaluation parameter) statistic: the focus lens is driven to oscillate near the sharp focus position based on the AFD statistic, gradually approaching the position where the statistic is maximal. That position is the sharpest focus position, and when the focus lens is driven to it to realize the focus function, an image of very high sharpness can be acquired, ensuring image clarity.
However, in the above approach the search for the focus position takes relatively long, the sharpest focus position cannot be found for a moving target to be detected, and the target to be detected usually occupies only a small proportion of the image.
Disclosure of Invention
The application provides a focus position determining method applied to a PTZ camera, comprising the following steps:

acquiring the target deflection angle of horizontal rotation and the target pitch angle of vertical rotation of the PTZ camera;

determining the target distance between a target to be detected and the PTZ camera from the coordinates of the intersection between the direction vector defined by the target deflection angle and target pitch angle and a target plane equation;

querying a configured mapping relationship with the target distance to obtain the target focus position corresponding to the target distance, where the mapping relationship represents the relationship between distance and focus position;

focusing the PTZ camera at the target focus position, and, after focusing is complete, acquiring a target image of the target to be detected through the PTZ camera.
As the above technical solution shows, in the embodiments of the present application the target distance between the target to be detected and the PTZ camera can be determined from the target plane equation, the target focus position can be determined from that distance, and the PTZ camera can be focused at the target focus position, so that the target to be detected is focused accurately. The search for the target focus position takes relatively little time, and a sharp focus position can be found even for a moving target. Even when the target occupies only a small proportion of the image, the sharp target focus position found falls on the target to be detected rather than on the background, so the target remains in a clear state at all times. The focus position of every point in physical space can be calculated accurately, and when a target to be detected appears in the picture, its sharp focus position can be computed in real time, making focusing fast.
Drawings
To illustrate the embodiments of the present application or the prior-art technical solutions more clearly, the drawings needed in their description are briefly introduced below. Evidently, the drawings described below are only some embodiments of the present application; a person of ordinary skill in the art may also derive other drawings from them.
FIG. 1 is a flow chart of a method of focal position determination in one embodiment of the present application;
FIG. 2 is a schematic diagram of a plane equation construction process in one embodiment of the present application;
FIG. 3A is a schematic diagram of frame acquisition in one embodiment of the present application;
FIG. 3B is a schematic illustration of ZOY in cross-section in one embodiment of the present application;
FIG. 4 is a schematic view of focal position versus distance in one embodiment of the present application;
FIG. 5 is a schematic diagram of sharpness scores versus focus positions in one embodiment of the present application;
FIG. 6 is a schematic diagram of a plane equation based focal position determination process in one embodiment of the present application.
Detailed Description
The terminology used in the embodiments of the application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this application and the claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to any or all possible combinations including one or more of the associated listed items.
It should be understood that although the terms first, second, third, etc. may be used in embodiments of the present application to describe various information, the information should not be limited by these terms; the terms serve only to distinguish one item of information from another. For example, without departing from the scope of the present application, first information may also be referred to as second information, and similarly second information as first information. Furthermore, depending on the context, the word "if" as used may be interpreted as "when", "upon", or "in response to determining".
In the embodiments of the present application, a focus position determining method is provided. The method may be applied to a PTZ (Pan-Tilt-Zoom) camera (which may also be referred to as a dome camera) or to other types of cameras. Fig. 1 is a schematic flow chart of the method, which may include the following steps:
Step 101: acquire the target deflection angle of horizontal rotation and the target pitch angle of vertical rotation of the PTZ camera.

Step 102: determine the target distance between the target to be detected and the PTZ camera based on the coordinates of the intersection between the direction vector defined by the target deflection angle and target pitch angle and the target plane equation.

Step 103: query the configured mapping relationship with the target distance to obtain the target focus position corresponding to the target distance, where the mapping relationship represents the relationship between distance and focus position.

Step 104: focus the PTZ camera based on the target focus position, and, after focusing of the PTZ camera is complete, acquire a target image of the target to be detected through the PTZ camera.
In one possible implementation, the target plane equation may be obtained as follows. First, a plurality of sample space coordinates are acquired. Each sample space coordinate is obtained by: determining a first sample focus position based on a first sample distance between the sample target and the PTZ camera; searching the peripheral focus positions of the first sample focus position to obtain a second sample focus position; determining a second sample distance based on the second sample focus position; and acquiring the sample space coordinate based on the second sample distance. After the plurality of sample space coordinates are obtained, plane equation parameters may be determined from them, and an initial plane equation determined from those parameters. The initial plane equation may then either be taken directly as the target plane equation, or its parameters may be corrected and the corrected plane equation taken as the target plane equation.
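The plane-fitting step above can be sketched with an ordinary least-squares fit. The patent does not specify a fitting algorithm, so the function name and the SVD-based approach below are illustrative assumptions only:

```python
import numpy as np

def fit_plane(points):
    """Least-squares fit of a plane a*x + b*y + c*z + d = 0 to N >= 3 sample
    space coordinates; returns the unit normal (a, b, c) and the offset d.
    Illustrative sketch: the patent only says plane equation parameters are
    determined from the sample space coordinates."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # SVD of the centred points: the right singular vector belonging to the
    # smallest singular value is the best-fit plane normal.
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]
    d = -normal.dot(centroid)
    return normal, d

# Sample coordinates lying (almost) on the ground plane z = 0:
samples = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0), (0.0, 10.0, 0.0), (10.0, 10.0, 0.01)]
normal, d = fit_plane(samples)
```

With near-coplanar samples the recovered normal is close to the z axis and the offset close to zero, i.e. the initial plane equation approximates the ground plane.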
For example, the pitch angle range of the PTZ camera may include a plurality of pitch angle intervals, for each pitch angle interval, a sample space coordinate corresponding to the pitch angle interval is selected from all sample space coordinates, and a target plane equation corresponding to the pitch angle interval is determined based on the sample space coordinate corresponding to the pitch angle interval.
Determining the target distance between the target to be detected and the PTZ camera based on the coordinates of the intersection between the direction vector defined by the target deflection angle and target pitch angle and the target plane equation may include, but is not limited to: determining, from the plurality of pitch angle intervals, the target pitch angle interval in which the target pitch angle lies; and determining the target distance based on the coordinates of the intersection between that direction vector and the target plane equation corresponding to the target pitch angle interval.
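The ray–plane intersection underlying step 102 can be sketched as follows. The coordinate conventions (pitch measured from the downward vertical, camera centre at its suspension height above the origin) follow the construction described later in this document; the function name and default height are assumptions:

```python
import math

def target_distance(pan_deg, tilt_deg, plane, cam_pos=(0.0, 0.0, 6.0)):
    """Distance from the PTZ camera to the intersection of its optical-axis
    ray with the target plane a*x + b*y + c*z + d = 0. tilt_deg is measured
    from the downward vertical (0 = straight down), matching the patent's
    polar convention; a 6 m suspension height is assumed for cam_pos."""
    a, b, c, d = plane
    th = math.radians(tilt_deg)
    ph = math.radians(pan_deg)
    # Unit direction vector of the optical axis in the XYZ frame.
    dir_vec = (math.sin(th) * math.cos(ph),
               math.sin(th) * math.sin(ph),
               -math.cos(th))
    denom = a * dir_vec[0] + b * dir_vec[1] + c * dir_vec[2]
    if abs(denom) < 1e-9:
        return None  # ray parallel to the plane, no intersection
    t = -(a * cam_pos[0] + b * cam_pos[1] + c * cam_pos[2] + d) / denom
    return t if t > 0 else None  # t is the distance along the unit ray

# Ground plane z = 0, camera 6 m up, looking 45 degrees from the vertical:
dist = target_distance(0.0, 45.0, (0.0, 0.0, 1.0, 0.0))
```

For the ground plane this reduces to the expected L = H / cos(theta), i.e. about 8.49 m in the example.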
For example, determining the first sample focus position based on the first sample distance between the sample target and the PTZ camera may include, but is not limited to: determining the first sample distance from the longitudinal field angle of the PTZ camera, the sample pitch angle of its vertical rotation, its suspension height, the height of the sample target, the sample target's longitudinal pixel coordinate in the picture, and the total number of longitudinal pixels of the picture; and then querying the mapping relationship with the first sample distance to obtain the first sample focus position.
For example, searching the peripheral focus positions of the first sample focus position to obtain the second sample focus position may include, but is not limited to: acquiring a plurality of scan steps corresponding to a plurality of object-distance segments. For each object-distance segment, the focus position corresponding to a first object distance is obtained by querying the mapping relationship with that first object distance, and the focus position corresponding to a second object distance is obtained by querying it with that second object distance; the scan step corresponding to the segment is then determined from the first object distance, the second object distance, and their corresponding focus positions. Next, for each scan step, the peripheral focus positions of the first sample focus position are searched at that step; the PTZ camera is focused at each searched position, a sample image of the sample target is collected, and a sharpness score is determined from the sample image. Finally, based on the sharpness scores of all searched focus positions, the focus position with the maximum sharpness score may be determined as the second sample focus position.
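The peripheral search above can be sketched as a simple scan that keeps the focus position with the best sharpness score. The function and parameter names are assumptions, and `sharpness` stands in for the real focus-capture-score loop on the camera:

```python
def search_second_sample_position(first_pos, scan_step, span, sharpness):
    """Scan the focus positions around first_pos (first_pos - span up to
    first_pos + span) at the given scan step and return the position whose
    sharpness score is highest. `sharpness` is a stand-in for focusing the
    PTZ camera, collecting a sample image, and scoring its sharpness."""
    best_pos, best_score = first_pos, sharpness(first_pos)
    pos = first_pos - span
    while pos <= first_pos + span:
        score = sharpness(pos)
        if score > best_score:
            best_pos, best_score = pos, score
        pos += scan_step
    return best_pos

# Toy sharpness curve peaking at focus position 120:
peak = search_second_sample_position(100, 5, 50, lambda p: -(p - 120) ** 2)
```

In the toy example the scan covers positions 50 to 150 in steps of 5 and lands on the peak of the curve.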
For example, determining the second sample distance based on the second sample focus position, and acquiring the sample space coordinate based on the second sample distance, may include, but is not limited to: querying the mapping relationship with the second sample focus position to obtain the second sample distance corresponding to it; determining the spherical coordinates of the sample target in the polar coordinate system from the second sample distance, the sample deflection angle of the PTZ camera's horizontal rotation, and the sample pitch angle of its vertical rotation; and converting those spherical coordinates into the sample target's space coordinates in the three-dimensional coordinate system using the conversion relationship between the polar coordinate system and the three-dimensional coordinate system.
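The polar-to-Cartesian conversion above can be written out directly. This sketch assumes the conventions used later in the document: origin O at the camera's ground projection, camera centre O' at the suspension height, pitch measured from the downward vertical:

```python
import math

def sample_space_coord(r, pan_deg, tilt_deg, mount_height):
    """Convert the sample target's spherical coordinates (r, pan, tilt) about
    the camera centre O' into space coordinates in the XYZ frame whose origin
    O is the camera's projection on the ground. Tilt is measured from the
    downward vertical, per the patent's convention."""
    th = math.radians(tilt_deg)
    ph = math.radians(pan_deg)
    x = r * math.sin(th) * math.cos(ph)
    y = r * math.sin(th) * math.sin(ph)
    z = mount_height - r * math.cos(th)  # O' sits at height H above O
    return (x, y, z)

# A target 6/cos(45°) m away at tilt 45 degrees from a camera mounted 6 m up
# should land on the ground (z close to 0):
x, y, z = sample_space_coord(6 / math.cos(math.radians(45)), 0.0, 45.0, 6.0)
```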
For example, correcting the plane equation parameters of the initial plane equation and determining the corrected plane equation as the target plane equation may include, but is not limited to: acquiring parameter correction data corresponding to a correction target, which may be a target preceding the target to be detected; the parameter correction data may include the target distance between the correction target and the PTZ camera, the target deflection angle and target pitch angle of the PTZ camera, and the target focus position corresponding to that target distance. The peripheral focus positions of the target focus position are searched, a sharp focus position is selected from the searched positions, and a corrected distance between the correction target and the PTZ camera is determined from that sharp focus position. A first space coordinate of the correction target in the three-dimensional coordinate system is determined from the target distance, the target deflection angle, and the target pitch angle; a second space coordinate is determined from the corrected distance, the target deflection angle, and the target pitch angle. A deviation vector is then determined from the first and second space coordinates, and the plane equation parameters are corrected based on the deviation vector and a configured correction coefficient to obtain the target plane equation.
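One way the deviation-vector correction could work is to project the deviation onto the plane normal and fold a fraction of it into the plane offset. The patent does not give the exact update rule, so the rule below (and the function name and coefficient) is an illustrative assumption:

```python
import numpy as np

def correct_plane(plane, predicted_pt, measured_pt, k=0.5):
    """Nudge the plane a*x + b*y + c*z + d = 0 toward the measured position
    of the correction target. The deviation vector between the measured and
    predicted space coordinates is projected onto the plane normal, and a
    fraction k of it (the configured correction coefficient) is folded into
    the offset d. The update rule here is an assumption, not the patent's."""
    a, b, c, d = plane
    normal = np.array([a, b, c], dtype=float)
    deviation = np.asarray(measured_pt, dtype=float) - np.asarray(predicted_pt, dtype=float)
    # Shift the plane along its normal toward the measured point by fraction k.
    d_new = d - k * normal.dot(deviation)
    return (a, b, c, d_new)

# Plane z = 0; the target was measured 0.4 m above where the plane predicted:
plane2 = correct_plane((0.0, 0.0, 1.0, 0.0), (5.0, 0.0, 0.0), (5.0, 0.0, 0.4), k=0.5)
```

With k = 0.5 the corrected plane is z = 0.2, i.e. moved halfway toward the measured point, which damps single-measurement noise.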
The above technical solutions of the embodiments of the present application are described below with reference to specific application scenarios.
During camera focusing, to improve image sharpness the AFD statistic must be referred to: the focus lens is driven to oscillate near the sharp focus position based on the AFD statistic, gradually approaching the position where the statistic is maximal. That position is the sharpest focus position, and an image of very high sharpness can be acquired when the focus lens is driven based on it, ensuring image clarity.
However, in that manner the search for the focus position takes relatively long, the sharpest focus position cannot be found for a moving target to be detected, and the target usually occupies only a small proportion of the image.
In view of the above findings, the embodiments of the present application provide an image-sharpness-based adaptive focusing method that can determine the sharp focus position of a target to be detected, focus on it accurately, take less time in the focusing process, and accurately place the focus position on the target to be detected.
The adaptive focusing method of the embodiments of the application may be applied to a PTZ camera or to other types of cameras, without limitation; a PTZ camera, which may also be referred to as a dome camera, is taken as the example here. The PTZ camera can rotate in the horizontal direction, and its angle of horizontal rotation is referred to as the deflection angle; it can also rotate in the vertical direction, and its angle of vertical rotation is referred to as the pitch angle.
The adaptive focusing method performs adaptive focusing on a target: the PTZ camera is rotated so that the target lies within its field of view, and the camera is focused so that the focus position falls on the target, ensuring the target stays in a clear state. The "target" in this embodiment may be a part of a human body, a region of a vehicle (such as a license plate), or the like, without limitation.
The adaptive focusing method of the embodiment of the present application may relate to a construction process of a plane equation (may also be referred to as a model construction process) and a focus position determination process based on the plane equation (may also be referred to as a model use process). In the construction of the plane equation, an initial plane equation may be constructed, and in the determination of the focal position based on the plane equation, the focal position may be determined based on the initial plane equation.
Referring to Fig. 2, the construction process of the plane equation may include:
For example, during construction of the plane equation, the target may be moved to the field of view of the PTZ camera, data about the target is acquired by the PTZ camera, and the plane equation is constructed based on the data. For convenience of distinction, the "target" in the construction process of the plane equation may be referred to as a sample target.
For example, a polar coordinate system (also referred to as a spherical coordinate system) is established with the PTZ camera as the sphere centre O'. A position P in space (i.e., the intersection position of the optical axis) is represented by spherical coordinates (r, θ, φ) in this polar coordinate system: θ denotes the sample pitch angle of vertical rotation of the PTZ camera, φ denotes the sample deflection angle of its horizontal rotation, and r denotes the distance between position P and the PTZ camera (i.e., between P and the sphere centre O'). θ can be obtained from the coordinates of the motor driving the camera's vertical rotation and may be a known value; φ can be obtained from the coordinates of the motor driving its horizontal rotation and may likewise be a known value.
The projection point of the PTZ camera on the ground is taken as an origin O to establish a three-dimensional coordinate system XYZ (also called a three-dimensional plane coordinate system), and when a sample target moves in the plane of the three-dimensional coordinate system XYZ, the PTZ camera can rotate around the sphere center O' in multiple directions so as to acquire pictures of different scenes. Referring to fig. 3A, a picture of two scenes (scene a and scene B) acquired when the PTZ camera rotates around the center O' is shown.
When the sample target is relatively close to the PTZ camera, the camera captures it "looking down" steeply; when the sample target is relatively far, the camera captures it at a shallower, "looking out" angle. The suspension height of the PTZ camera may be between 2 and 8 metres (other heights are of course possible; the suspension height should be greater than the height of the sample target). For convenience of description, let the direction from the sphere centre O' to its ground projection point O be 0 degrees; the sample pitch angle θ then ranges over 0° < θ < 90°, while the sample deflection angle φ ranges from 0° to 360°. Fixing the deflection angle so that the optical axis lies in the ZOY plane, a schematic of the ZOY cross-section is shown in Fig. 3B.
Referring to fig. 3B, the position Q is the position of the sample object, α is the longitudinal angle of view of the PTZ camera, α is a known parameter after the lens optical design is completed, β is the transverse angle of view of the PTZ camera (not shown in fig. 3B), and β is a known parameter after the lens optical design is completed.
Referring to Fig. 3B, θ is the sample pitch angle of vertical rotation of the PTZ camera, i.e., the angle between O'O and the longitudinal field-of-view centreline O'S; MN is the line perpendicular to the centreline O'S; and ε is the angle by which the sample target deviates from the centreline. H denotes the suspension height of the PTZ camera and h the height of the sample target; both may be constants.

Let vp be the longitudinal pixel coordinate of position Q in the picture and Vp half the total number of longitudinal pixels of the picture (likewise, hp the transverse pixel coordinate of Q and Hp half the total number of transverse pixels), and let γ be the ratio of the number of QS pixels to the number of SN pixels in the imaged picture. The relationship of formula (1) then holds, and rearranging it gives formula (2):

tan(ε) / tan(α/2) = SQ/SN = (vp − Vp)/Vp = γ    formula (1)

ε = arctan(γ · tan(α/2)) = arctan(((vp − Vp)/Vp) · tan(α/2))    formula (2)

As can be seen from Fig. 3B, the distance L between the sample target and the PTZ camera (i.e., between positions Q and O') is given by formula (3), which formula (2) turns into formula (4):

L = (H − h)/cos(θ − ε)    formula (3)

L = (H − h)/cos(θ − arctan(((vp − Vp)/Vp) · tan(α/2)))    formula (4)

As formula (4) shows, the first sample distance L between the sample target and the PTZ camera can be determined from the longitudinal field angle α of the PTZ camera, the sample pitch angle θ of its vertical rotation, its suspension height H, the height h of the sample target, the target's longitudinal pixel coordinate vp in the picture, and the total number of longitudinal pixels of the picture (half of which is Vp).
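Formulas (2) and (4) can be computed directly. The function and parameter names in this sketch are assumptions; the formulas themselves follow the geometry of Fig. 3B:

```python
import math

def first_sample_distance(alpha_deg, theta_deg, mount_h, target_h, vp, Vp):
    """Formula (4): the first sample distance between the sample target and
    the PTZ camera. alpha_deg is the longitudinal field angle; theta_deg the
    sample pitch angle from the downward vertical; mount_h / target_h the
    suspension height H and target height h; vp the target's longitudinal
    pixel coordinate; Vp half the total number of longitudinal pixels."""
    gamma = (vp - Vp) / Vp
    epsilon = math.atan(gamma * math.tan(math.radians(alpha_deg) / 2))  # formula (2)
    return (mount_h - target_h) / math.cos(math.radians(theta_deg) - epsilon)

# Target centred in the frame (vp == Vp, so epsilon == 0), pitch 60 degrees,
# camera mounted 6 m up, target 1.5 m tall:
L1 = first_sample_distance(40.0, 60.0, 6.0, 1.5, 540, 540)
```

With the target on the centreline the result reduces to (H − h)/cos(θ) = 4.5/0.5 = 9 m.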
For example, the key to sharp focusing of the PTZ camera is determining the focus position F, which is related to the distance L (i.e., the distance between the target and the PTZ camera). Fig. 4 plots focus position F against distance L: the abscissa is the distance L between the target and the PTZ camera, and the ordinate is the focus position F.
By way of example, the relationship shown in Fig. 4 can be represented by a function f, so the relationship between the focus position F and the distance L can also be written as: F = f(L).
By way of example, the mapping relationship may be the relationship shown in fig. 4, and by looking up the relationship shown in fig. 4 with the first sample distance (i.e., the first sample distance is taken as L), the first sample focal position corresponding to the first sample distance (the first sample focal position is taken as F) may be obtained. Alternatively, the mapping relationship may be the formula f=f (L), and substituting the first sample distance (i.e., the first sample distance is taken as L) into the formula, the first sample focal position corresponding to the first sample distance (the first sample focal position is taken as F) may be obtained.
By way of example, combining the formula F = f(L) with formula (4) gives formula (5) below, by which the first sample focus position corresponding to the first sample distance may be determined.

F = f(L) = f((H − h)/cos(θ − ε)) = f((H − h)/cos(θ − arctan(((vp − Vp)/Vp) · tan(α/2))))    formula (5)
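In practice the mapping of Fig. 4 can be held as a calibrated lookup table and interpolated. This is a minimal sketch under that assumption — the patent does not prescribe linear interpolation, and the function names and sample calibration values are illustrative:

```python
import bisect

def make_focus_map(distances, focus_positions):
    """Build F = f(L) from calibrated (distance, focus position) pairs by
    linear interpolation; `distances` must be sorted ascending. A stand-in
    for the configured mapping relationship shown in Fig. 4."""
    def f(L):
        if L <= distances[0]:
            return focus_positions[0]
        if L >= distances[-1]:
            return focus_positions[-1]
        i = bisect.bisect_right(distances, L)
        t = (L - distances[i - 1]) / (distances[i] - distances[i - 1])
        return focus_positions[i - 1] + t * (focus_positions[i] - focus_positions[i - 1])
    return f

# Hypothetical calibration points (metres -> focus motor position):
f = make_focus_map([1.0, 2.0, 5.0, 10.0], [1000.0, 1400.0, 1700.0, 1800.0])
focus = f(3.5)  # interpolated between the 2 m and 5 m calibration points
```

Querying with the first sample distance then returns the first sample focus position; out-of-range distances clamp to the nearest calibrated value.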
For example, after the first sample focus position is obtained, since it may not be a sharp focus position, it may be taken as an initial focus position: a sharp focus position is searched for among the peripheral focus positions of the first sample focus position, and that sharp focus position is taken as the second sample focus position.
In one possible embodiment, the second sample focus position may be obtained by:
step 2031, obtaining a plurality of scanning step sizes corresponding to the plurality of object distance segments.
For example, the entire object distance range can be divided into m object distance segments, m can be a positive integer greater than 1, and the i-th object distance segment is denoted as (L1 i ,L2 i ),L1 i The initial object distance, L2, representing the i-th object distance segment i Representing the ending object distance of the i-th object distance segment. The focus position range corresponding to the i-th object distance segment is denoted as (F1 i ,F2 i ),F1 i Representing the initial focus position of the ith object distance segment, F2 i Indicating the end focus position of the i-th object distance segment. Wherein, through L1 i By looking up the relationship shown in fig. 4 or the formula f=f (L), F1 can be obtained i By L2 i F2 can be obtained by looking up the relationship shown in fig. 4 or the formula f=f (L) i 。
From the above, it can be seen that for each object distance segment (e.g., the i-th object distance segment), the first object distance (e.g., the initial object distance L1 i Or at a starting material distance L1 i Distance from end object L2 i Between) the first object distance and the second object distance to obtain a first object distance, such as a first object distance F1 i Can be obtained by the corresponding second object distance (such as the end object distance L2 i Or at a starting material distance L1 i Distance from end object L2 i Between) the second distance is obtained by inquiring the mapping relation, such as the end focusing position F2 i 。
For each object distance segment, a scanning step corresponding to the segment may be determined based on the first object distance, the second object distance, the focus position corresponding to the first object distance, and the focus position corresponding to the second object distance. For example, taking the first object distance as the starting object distance, the second object distance as the ending object distance, the focus position corresponding to the first object distance as the starting focus position, and the focus position corresponding to the second object distance as the ending focus position, the scanning step STEP_i corresponding to the i-th object distance segment can be expressed by the following formula: STEP_i = K * (F1_i - F2_i) / (L1_i - L2_i). Obviously, for each object distance segment, the corresponding scanning step can be obtained through this formula, thereby obtaining a plurality of scanning steps corresponding to the plurality of object distance segments, such as m scanning steps for m object distance segments.
In the above formula, K is a focusing operation coefficient, which can be configured empirically, and (F1_i - F2_i) / (L1_i - L2_i) is the sensitivity G of the lens focus to the object distance, i.e. G = (F1 - F2) / (L1 - L2).
For example, the plurality of scanning steps corresponding to the plurality of object distance segments may be recorded in a configuration table, i.e. the configuration table may include the plurality of scanning steps. For example, if the frame rate of the PTZ camera is 25 fps, the configuration table may include p scanning steps, and p may be less than 25, which is not limited.
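The scanning-step computation in step 2031 can be sketched as follows; the function name, the toy mapping `f` and the value of K are illustrative assumptions, not part of the embodiment:

```python
# Sketch of step 2031: one scanning step per object distance segment,
# STEP_i = K * (F1_i - F2_i) / (L1_i - L2_i).

def scan_steps(segments, f, K=1.0):
    """segments: list of (L1_i, L2_i) object distance segments.
    f: the configured mapping from object distance L to focus position F.
    K: empirically configured focusing operation coefficient."""
    steps = []
    for L1, L2 in segments:
        F1, F2 = f(L1), f(L2)
        steps.append(K * (F1 - F2) / (L1 - L2))
    return steps

# Hypothetical monotonic focus curve, standing in for fig. 4 / F = f(L):
f = lambda L: 1000.0 / (1.0 + L)
table = scan_steps([(1.0, 2.0), (2.0, 5.0), (5.0, 10.0)], f, K=0.5)
```

The resulting `table` plays the role of the configuration table described above, one entry per object distance segment.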
Step 2032, for each scanning step, searching for a peripheral focus position of the first sample focus position based on the scanning step, focusing the PTZ camera based on the searched focus position, collecting a sample image for the sample target, and determining a sharpness score based on the sample image.
For example, for each scan step, one focus position is searched by subtracting the scan step from the first sample focus position, and another focus position is searched by adding the scan step to the first sample focus position, and obviously, a plurality of focus positions can be searched based on a plurality of scan steps corresponding to a plurality of object distance segments.
For each of the searched focusing positions, when the focusing position is searched, the PTZ camera can be focused based on the focusing position, and the focusing process is not limited. During the focusing process of the PTZ camera, a sample image aiming at a sample target is acquired, so that a sample image corresponding to the focusing position is obtained. Obviously, when a plurality of focus positions are searched, a sample image corresponding to each focus position can be obtained.
After obtaining the sample image corresponding to the focus position, a sharpness score corresponding to the focus position may be determined based on the sample image, thereby obtaining the sharpness score corresponding to the focus position. Obviously, when a plurality of focus positions are searched, a definition score corresponding to each focus position can be obtained.
In one possible embodiment, after obtaining the sample image corresponding to the focal position, the sharpness score corresponding to the focal position may be determined using the following formula (6):
Score = r * Score1 + (1 - r) * Score2    formula (6)
In formula (6), score represents a sharpness Score corresponding to the focus position. Score1 represents a first sharpness Score corresponding to the focus position, for example, a region where a sample target is located is determined from a sample image, the region where the sample target is located is scratched out from the sample image, and the first sharpness Score1 of the region where the sample target is located is calculated through a filtering algorithm, so that the calculation mode is not limited. Score2 represents the second sharpness Score corresponding to the focus position, for example, an identification algorithm is adopted, and the second sharpness Score2 of the area where the sample target is located is directly determined based on the sample image (i.e. the area where the sample target is located does not need to be scratched out of the sample image), which is not limited in the determination manner. r represents a weight coefficient of Score1, 1-r represents a weight coefficient of Score2, and r may be greater than or equal to 0 and less than or equal to 1.
Step 2033, determining, based on the sharpness scores corresponding to each of the searched focus positions, the focus position corresponding to the maximum sharpness score as the second sample focus position.
For example, referring to fig. 5, after the sharpness score corresponding to each focus position is obtained, the focus position corresponding to the sharpness score having the highest score may be regarded as the second sample focus position.
For example, assuming the sharpness score ranges from 0 to 100, the image may be judged unclear when the score is smaller than 20 and clear when the score is higher than 80; the higher the score, the sharper the image is considered. Each target has an independent ID, the ID does not change as the target moves in the picture, and a single picture supports calculating sharpness for a plurality of targets simultaneously.
For example, assuming the image capturing process of the PTZ camera lasts k seconds, in order to search the peripheral focus positions of the first sample focus position and obtain the second sample focus position, the PTZ camera may complete the focusing process in the first second of those k seconds, or even less, obtaining the second sample focus position in the manner of steps 2032-2033. This process is an active focusing action that affects the image preview display, but it lasts at most 1 second and outputs a clear point at its end so that the image becomes clear; that is, the time remaining in the k seconds ensures a clear image, meeting the image capturing requirement.
At this point, step 203 is completed: a second sample focus position has been found among the peripheral focus positions of the first sample focus position, and the second sample focus position is the clear focus position.
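Steps 2032-2033 can be sketched as the following search loop; `sharpness_at` stands in for the focus-capture-score cycle (focusing the PTZ camera at a position, collecting a sample image of the target and scoring it), and all names are illustrative:

```python
def second_sample_focus(first_focus, steps, sharpness_at):
    """For every scanning step, probe the two peripheral focus positions
    first_focus - step and first_focus + step (step 2032), score each
    probed position, and return the position with the maximum sharpness
    score (step 2033)."""
    candidates = []
    for step in steps:
        candidates.append(first_focus - step)
        candidates.append(first_focus + step)
    return max(candidates, key=sharpness_at)
```

With scanning steps [2, 5, 10] around position 100, the candidates probed are 98, 102, 95, 105, 90 and 110.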
For example, the mapping relationship may be the relationship shown in fig. 4, and the second sample distance may be obtained by querying the relationship shown in fig. 4 through the second sample focusing position F. Alternatively, the mapping relationship may be a formula f=f (L), and the second sample distance may be obtained by substituting the second sample focal position F into the formula.
For example, the spherical position of the sample target in the polar coordinate system may be denoted as (L, ω, ρ), where L represents the distance between the sample target and the PTZ camera, i.e. the above-mentioned second sample distance, ω represents the longitudinal rotation angle of the sample target's position relative to the PTZ camera, and ρ represents the lateral rotation angle of the sample target's position relative to the PTZ camera. The longitudinal rotation angle ω can be determined based on the sample pitch angle θ of the PTZ camera's vertical rotation, and the lateral rotation angle ρ can be determined based on the sample deflection angle (denoted φ) of the PTZ camera's horizontal rotation.
For example, based on the sample pitch angle θ, the longitudinal rotation angle ω can be determined using equation (7):
ω = θ - ε = θ - arctan((Vp - vp) / Vp * tan(α/2))    formula (7)
In formula (7), ε is the angle by which the sample target deviates from the longitudinal field-of-view center line, Vp is half of the total number of longitudinal pixels of the picture, vp is the longitudinal pixel coordinate of the sample target in the picture, and α is the longitudinal field angle of the PTZ camera. In summary, the longitudinal rotation angle ω can be determined based on the sample pitch angle θ, the total number of longitudinal pixels, the longitudinal pixel coordinate vp of the sample target, and the longitudinal field angle α of the PTZ camera.
For example, based on the sample deflection angle φ, the lateral rotation angle ρ can be determined using formula (8):

ρ = φ - λ = φ - arctan((Hp - hp) / Hp * tan(β/2))    formula (8)

In formula (8), λ is the angle by which the sample target deviates from the lateral field-of-view center line, Hp is half of the total number of lateral pixels of the picture, hp is the lateral pixel coordinate of the sample target in the picture, and β is the lateral field angle of the PTZ camera. In summary, the lateral rotation angle ρ can be determined based on the sample deflection angle φ, the total number of lateral pixels, the lateral pixel coordinate hp of the sample target, and the lateral field angle β of the PTZ camera.
In formulas (7) and (8), the sample pitch angle θ is obtained from the coordinates of the motor driving the PTZ camera's vertical rotation and is a known value; the sample deflection angle φ is obtained from the coordinates of the motor driving the PTZ camera's horizontal rotation and is also a known value. The longitudinal field angle α and the lateral field angle β are known parameters once the lens optical design is complete. The total number of longitudinal pixels and the total number of lateral pixels are related to the image resolution, i.e. they are the image's vertical and horizontal resolutions respectively, so Vp and Hp are known values. The longitudinal pixel coordinate vp and the lateral pixel coordinate hp are related to the sample target's position in the picture and may be obtained by a recognition algorithm, which is not limited.
From the above, it can be seen that the spherical coordinates of the sample target in the polar coordinate system can be determined based on the parameters (θ, φ, vp, hp, F): the sample pitch angle θ and the longitudinal pixel coordinate vp are used to determine the longitudinal rotation angle ω, the sample deflection angle φ and the lateral pixel coordinate hp are used to determine the lateral rotation angle ρ, and the focus position F is used to determine the distance L between the sample target and the PTZ camera, i.e. the second sample distance described above.
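Formulas (7) and (8) can be written out directly; the function below assumes all angles are in radians and denotes the deflection angle φ as above (the function and parameter names are ours):

```python
import math

def rotation_angles(theta, phi, vp, hp, Vp, Hp, alpha, beta):
    """theta/phi: sample pitch/deflection angles from the motor coordinates.
    vp/hp: longitudinal/lateral pixel coordinates of the target.
    Vp/Hp: half the total number of longitudinal/lateral pixels.
    alpha/beta: longitudinal/lateral field angles of the PTZ camera."""
    epsilon = math.atan((Vp - vp) / Vp * math.tan(alpha / 2))  # formula (7)
    lam = math.atan((Hp - hp) / Hp * math.tan(beta / 2))       # formula (8)
    return theta - epsilon, phi - lam                          # (omega, rho)
```

When the target sits at the picture centre (vp = Vp, hp = Hp), both deviation angles vanish and (ω, ρ) = (θ, φ).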
In a possible embodiment, referring to step 201, the distance between the sample target and the PTZ camera may already be available, i.e. the first sample distance between the sample target and the PTZ camera is determined based on the longitudinal field angle α, the sample pitch angle θ, the suspension height H of the PTZ camera, the height of the sample target, the longitudinal pixel coordinate vp of the sample target in the picture and half Vp of the total number of longitudinal pixels, after which the focus position F is accurately calculated from that distance. However, since the installation conditions of the PTZ camera are complicated and variable (in particular the inclination of the PTZ camera, the flatness of the ground, etc.), it is difficult to ensure image sharpness at the finally calculated focus position F. To solve this problem and improve the stability and adaptability of the prediction model, this embodiment introduces the lateral parameters Hp, hp and β, so that the prediction model is finally associated with the parameters (θ, φ, vp, hp, F), which are converted into the spherical coordinates (L, ω, ρ).
For example, the polar coordinates of any position in physical space and the three-dimensional coordinates (x, y, z) of that position can be converted into each other based on the conversion relationship between the polar coordinate system and the three-dimensional coordinate system, which may be shown by referring to formula (9).
In formula (9), L represents the distance between the sample target and the PTZ camera, i.e. the second sample distance, ω represents the longitudinal rotation angle of the sample target's position relative to the PTZ camera, and ρ represents the lateral rotation angle of the sample target's position relative to the PTZ camera. Obviously, once the spherical coordinates (L, ω, ρ) of the sample target in the polar coordinate system are known, they can be converted into the three-dimensional coordinates (x, y, z) of the sample target in the three-dimensional coordinate system based on the conversion relationship shown in formula (9); these three-dimensional coordinates (x, y, z) may be referred to as the sample space coordinates.
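Formula (9) itself is not reproduced in this text, so the sketch below uses one common spherical-to-Cartesian convention (ω as elevation from the horizontal plane, ρ as azimuth); the actual convention of formula (9) may differ:

```python
import math

def spherical_to_cartesian(L, omega, rho):
    """Convert spherical coordinates (L, omega, rho) into sample space
    coordinates (x, y, z).  The axis convention here is an assumption."""
    x = L * math.cos(omega) * math.cos(rho)
    y = L * math.cos(omega) * math.sin(rho)
    z = L * math.sin(omega)
    return x, y, z
```

Whatever the exact convention, the conversion preserves the range: x² + y² + z² = L².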
For example, in the construction process of the plane equation, a plurality of sample targets may be moved to the field of view of the PTZ camera, the PTZ camera collects data related to the sample targets, and when each sample target moves to the field of view of the PTZ camera, the sample space coordinates corresponding to the sample target may be obtained based on steps 201 to 206, so as to obtain a plurality of sample space coordinates. After the plurality of sample space coordinates are obtained, plane equation parameters are determined based on the plurality of sample space coordinates, and an initial plane equation is determined based on the plane equation parameters.
In one possible implementation, an example of a plane equation may be shown with reference to equation (10):
z = a * x + b * y + c    formula (10)
In formula (10), a, b, and c are plane equation parameters, and in order to solve the plane equation parameters, a plane fitting may be performed on a plurality of sample space coordinates (x, y, z), and the plane equation parameters may be solved by using a least square method, as shown in formula (11), which is an example of solving the plane equation parameters.
In formula (11), n represents the total number of sample space coordinates, n may be a positive integer greater than or equal to 3, and (x_j, y_j, z_j) represents the j-th sample space coordinate. Obviously, after substituting the n sample space coordinates (x, y, z) into formula (11), the plane equation parameters a, b, c can be obtained, and with them the initial plane equation shown in formula (10).
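A dependency-free least-squares fit of z = a*x + b*y + c is one way to realise formula (11); this sketch builds the 3x3 normal equations and solves them by Gaussian elimination (names are ours):

```python
def fit_plane(points):
    """Least-squares fit of z = a*x + b*y + c to n >= 3 sample space
    coordinates (x_j, y_j, z_j)."""
    n = len(points)
    sx = sum(p[0] for p in points); sy = sum(p[1] for p in points)
    sz = sum(p[2] for p in points)
    sxx = sum(p[0] * p[0] for p in points)
    syy = sum(p[1] * p[1] for p in points)
    sxy = sum(p[0] * p[1] for p in points)
    sxz = sum(p[0] * p[2] for p in points)
    syz = sum(p[1] * p[2] for p in points)
    # Normal equations M @ (a, b, c) = v of the least-squares problem.
    M = [[sxx, sxy, sx], [sxy, syy, sy], [sx, sy, n]]
    v = [sxz, syz, sz]
    # Forward elimination, then back substitution (well-conditioned data
    # assumed; no pivoting).
    for i in range(3):
        for j in range(i + 1, 3):
            factor = M[j][i] / M[i][i]
            M[j] = [mj - factor * mi for mj, mi in zip(M[j], M[i])]
            v[j] -= factor * v[i]
    c = v[2] / M[2][2]
    b = (v[1] - M[1][2] * c) / M[1][1]
    a = (v[0] - M[0][1] * b - M[0][2] * c) / M[0][0]
    return a, b, c
```

Four points taken from the plane z = 2x + 3y + 1 recover a = 2, b = 3, c = 1 exactly.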
In one possible embodiment, in order to ensure the stability and adaptability of the initial plane equation, the pitch angle range of the PTZ camera may further be divided into a plurality of pitch angle intervals. For example, if the pitch angle θ satisfies 0 degrees < θ < 90 degrees, the range may be divided into pitch angle interval 1, pitch angle interval 2 and pitch angle interval 3, where pitch angle interval 1 is (0, 30), pitch angle interval 2 is (30, 60) and pitch angle interval 3 is (60, 90). Of course, this is only one example of dividing the pitch angle intervals, which is not limited.
For each pitch interval, in step 207, the sample space coordinates corresponding to the pitch interval may be selected from all the sample space coordinates. For example, if the vertically rotated sample pitch angle of the PTZ camera is located in the pitch angle interval 1, the sample space coordinate corresponding to the sample target belongs to the coordinate set 1 corresponding to the pitch angle interval 1, if the vertically rotated sample pitch angle of the PTZ camera is located in the pitch angle interval 2, the sample space coordinate corresponding to the sample target belongs to the coordinate set 2 corresponding to the pitch angle interval 2, and if the vertically rotated sample pitch angle of the PTZ camera is located in the pitch angle interval 3, the sample space coordinate corresponding to the sample target belongs to the coordinate set 3 corresponding to the pitch angle interval 3. In summary, the coordinate set 1 corresponding to the pitch angle interval 1, the coordinate set 2 corresponding to the pitch angle interval 2, and the coordinate set 3 corresponding to the pitch angle interval 3 can be obtained.
For each pitch interval, in step 207, a plane equation parameter corresponding to the pitch interval may be determined based on a plurality of sample space coordinates corresponding to the pitch interval, and an initial plane equation corresponding to the pitch interval may be determined based on the plane equation parameter. For example, plane equation parameters corresponding to the pitch angle interval 1 may be determined based on a plurality of sample space coordinates in the coordinate set 1 corresponding to the pitch angle interval 1, and an initial plane equation corresponding to the pitch angle interval 1 may be determined based on the plane equation parameters, and so on.
For example, an example of the plane equation for the ith pitch angle interval can be shown by equation (12):
z = a_i * x + b_i * y + c_i    formula (12)
In formula (12), a_i, b_i and c_i are the plane equation parameters of the i-th pitch angle interval. To solve for them, a plane fit may be performed on the sample space coordinates (x, y, z) of the i-th pitch angle interval, and the parameters may be solved using the least squares method, as shown in formula (13).
In formula (13), n represents the total number of sample space coordinates corresponding to the i-th pitch angle interval, n may be a positive integer greater than or equal to 3, and (x_j, y_j, z_j) represents the j-th sample space coordinate corresponding to the i-th pitch angle interval. The initial plane equation of the i-th pitch angle interval is finally obtained based on formula (13).
Thus, the construction process of the plane equation is completed, an initial plane equation may be obtained, after the initial plane equation is obtained, the determination of the focal position may be implemented based on the plane equation, and for the focal position determination process based on the plane equation, as shown in fig. 6, the process may include the steps of:
and 601, acquiring a target deflection angle of horizontal rotation of the PTZ camera and a target pitch angle of vertical rotation of the PTZ camera when the target to be detected moves to the visual field of the PTZ camera.
In one possible implementation, the initial plane equation may be determined as the target plane equation. In another possible implementation manner, if the pitch angle range of the PTZ camera is divided into a plurality of pitch angle intervals, and an initial plane equation corresponding to each pitch angle interval has been obtained, then a target pitch angle interval in which the target pitch angle is located (i.e., the target pitch angle is in the target pitch angle interval) may be determined from the plurality of pitch angle intervals, and the initial plane equation corresponding to the target pitch angle interval may be determined as the target plane equation.
Step 603, determining the target distance between the target to be detected and the PTZ camera based on the coordinates of the intersection point between the direction vector pointed to by the target deflection angle and target pitch angle and the target plane equation.
For example, based on the target deflection angle and the target pitch angle, a direction vector may be determined, i.e. the direction vector in which the target deflection angle and the target pitch angle point. Since the target plane equation represents a plane and the direction vector intersects that plane, the coordinates of the intersection point between the direction vector and the target plane equation can be determined; these coordinates represent the three-dimensional coordinates (x, y, z) of the target to be detected in the three-dimensional coordinate system, and the target distance between the target to be detected and the PTZ camera can then be determined from them.
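The intersection in step 603 reduces to a ray-plane test; below is a sketch under the plane form z = a*x + b*y + c of formula (10), with the camera at the origin (function and parameter names are ours):

```python
import math

def target_distance(direction, plane):
    """Intersect the ray from the camera origin along `direction` (a
    direction vector built from the target deflection/pitch angles) with
    the plane z = a*x + b*y + c, and return the distance from the camera
    to the intersection point, or None if there is no usable intersection."""
    dx, dy, dz = direction
    a, b, c = plane
    # Point t*(dx, dy, dz) lies on the plane when t*dz = a*t*dx + b*t*dy + c.
    denom = dz - a * dx - b * dy
    if abs(denom) < 1e-12:
        return None  # ray parallel to the plane
    t = c / denom
    if t <= 0:
        return None  # intersection behind the camera
    return t * math.hypot(dx, dy, dz)
```

For a horizontal ground plane z = -2 (camera suspended 2 units above it), a ray pointing straight down, direction (0, 0, -1), hits the plane at distance 2.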
For example, the mapping relationship may be the relationship curve shown in fig. 4, and the target focusing position F corresponding to the target distance may be obtained by querying the relationship curve shown in fig. 4 through the target distance L. Alternatively, the mapping relationship may be a formula f=f (L), and the target focal position F may be obtained by substituting the target distance L into the formula.
In one possible embodiment, when the spherical coordinates of the sample target in the polar coordinate system are determined based on the parameters (θ, φ, vp, hp, F), sharpness is a relative quantity subject to random noise, and the scanning step in the focusing process cannot be made arbitrarily small; as a result there is a certain error in the focus position F, hence in the spherical coordinates, and hence in the initial plane equation. Based on this, during the plane-equation-based focus position determination process, the plane equation can be continuously corrected. The correction method superimposes an offset on the normal vector; as the plane equation is iteratively corrected, it can be determined whether the deviation vector tends to be unchanged, and if not, the correction process continues, otherwise it stops.
Illustratively, to iteratively correct the plane equation, the following steps may be employed:
step S11, acquiring parameter correction data corresponding to a correction target, where the correction target may be a target before the target to be detected, and the parameter correction data may include a target distance between the correction target and the PTZ camera, a target yaw angle and a target pitch angle of the PTZ camera, and a target focus position corresponding to the target distance.
For example, assuming that the current target to be detected is the target a and the target to be detected before the target a is the target B, the target B may be referred to as a correction target, and when determining the target focus position corresponding to the correction target in steps 601-605, parameter correction data corresponding to the correction target may be collected. Referring to steps 601-605, parameter correction data such as a target distance between the correction target and the PTZ camera, a target yaw angle and a target pitch angle of the PTZ camera, a target focus position corresponding to the target distance, and the like can be obtained.
Step S12, searching the peripheral focus positions of the target focus position, selecting a clear focus position from the searched positions, and determining the correction distance between the correction target and the PTZ camera based on the clear focus position.
Illustratively, to obtain a clear focus position, a plurality of scan steps corresponding to a plurality of object distance segments are acquired, for each scan step, a peripheral focus position of a target focus position is searched based on the scan step, a PTZ camera is focused based on the searched focus position, after focusing is completed, an image for a correction target is acquired, and a sharpness score is determined based on the image. Based on the definition score corresponding to each of the searched focus positions, the focus position corresponding to the maximum definition score may be determined as the clear focus position. For the implementation process of obtaining the clear focus position, reference may be made to step 203, and the detailed description will not be repeated here.
After the clear focusing position is obtained, the mapping relation can be queried through the clear focusing position to obtain the distance corresponding to the clear focusing position, wherein the distance is the correction distance between the correction target and the PTZ camera.
For example, assuming the image capturing process of the PTZ camera lasts k seconds, in order to search the peripheral focus positions of the target focus position and obtain the clear focus position, the focusing process can be completed in the first second of those k seconds, or even less. This is an active focusing action that affects the image preview display, but it lasts at most 1 second and outputs a clear point at its end so that the image becomes clear; that is, the time remaining in the k seconds ensures a clear image, meeting the image capturing requirement.
Step S13, determining a first space coordinate of the correction target under a three-dimensional coordinate system based on the target distance, the target deflection angle and the target pitch angle; and determining a second space coordinate of the correction target in a three-dimensional coordinate system based on the correction distance, the target deflection angle and the target pitch angle.
For example, since the parameter correction data includes the target distance between the correction target and the PTZ camera and the target deflection angle and target pitch angle of the PTZ camera, the first spatial coordinate of the correction target in the three-dimensional coordinate system can be determined based on the target distance, the target deflection angle and the target pitch angle; the determination manner may refer to step 205 and step 206 and is not repeated here. The first spatial coordinate can be understood as a predicted spatial coordinate: once the predicted focusing parameters (θ, φ, F_P) are obtained, they are converted into the first spatial coordinate (x_P, y_P, z_P).
For example, since the parameter correction data includes the target focus position, a clear focus position is searched based on it and the correction distance between the correction target and the PTZ camera is determined from that clear focus position; the second spatial coordinate of the correction target in the three-dimensional coordinate system can then be determined based on the correction distance, the target deflection angle and the target pitch angle, again referring to step 205 and step 206. The second spatial coordinate can be understood as the real spatial coordinate: the real focusing parameters (θ, φ, F_R) are converted into the second spatial coordinate (x_R, y_R, z_R), where F_R denotes the clear focus position obtained by searching based on the target focus position F_P.
Step S14, determining a deviation vector based on the first spatial coordinate and the second spatial coordinate. For example, the deviation vector may be computed as formula (14):

D = (x_R - x_P, y_R - y_P, z_R - z_P)    formula (14)

where (x_R, y_R, z_R) is the second spatial coordinate and (x_P, y_P, z_P) is the first spatial coordinate.
Step S15, correcting the plane equation parameters based on the deviation vector and the configured correction coefficient to obtain a corrected plane equation, and determining the corrected plane equation as the target plane equation.
Illustratively, the plane equation parameters may be corrected using formula (15):

N' = N + k * D    formula (15)

In formula (15), N' represents the corrected plane equation parameters and N the plane equation parameters before correction; both may take the form of the plane's normal vector. For example, assuming the plane equation before correction is z = a0_i * x + b0_i * y + c0_i, the parameters before correction are the normal vector N = (a0_i, b0_i, -1). In addition, k represents the configured correction coefficient, which may be a value greater than 0 and less than 1, and D represents the deviation vector.
In one possible implementation, assuming that the current target to be detected is the target a and the target to be detected before the target a is the target B, after the processing of the target B is finished, the plane equation may be modified based on the parameter correction data of the target B to obtain a modified plane equation, and when the target a is processed, the modified plane equation is used as the target plane equation of the target a.
In one possible embodiment, after the deviation vector is obtained, it may also be determined whether the plane equation tends to stabilize based on the deviation vector, for example, if the deviation vector is smaller than a preset threshold, it is determined that the plane equation tends to stabilize, otherwise, it is determined that the plane equation does not tend to stabilize. If the plane equation tends to be stable, the correction process of the plane equation is ended (i.e. the correction process of the plane equation is exited), and in the subsequent process, the plane equation is not corrected any more. If the plane equation does not tend to be stable, the plane equation correction process may be continued, for example, after the processing process of the target a is finished, the plane equation is corrected based on the parameter correction data of the target a, so as to obtain a corrected plane equation, and when the target to be detected is processed, the corrected plane equation is used as the target plane equation of the target to be detected, and so on.
As the image acquisition process continues, the plane equation (also called the focus prediction model) can be continuously corrected, and finally focus prediction models for the plurality of pitch angle intervals can be output, completing model establishment for the whole area and finally realising a fast and accurate target acquisition and focusing process.
In one possible implementation, when the plane equation is corrected based on the parameter correction data, a pitch angle interval in which the target pitch angle in the parameter correction data is located may be determined first, and then the plane equation corresponding to the pitch angle interval is corrected based on the parameter correction data.
As can be seen from the above technical solutions, in the embodiments of the present application the target distance between the target to be detected and the PTZ camera can be determined based on the target plane equation, the target focus position can be determined based on that distance, and the PTZ camera can be focused based on the target focus position, so that the target to be detected is focused accurately. Searching for the target focus position takes relatively little time, and a clear focus position can be found even for a moving target. Even if the target occupies a small proportion of the image, the search converges on the target rather than on the background; that is, the focus position falls accurately on the target to be detected, which is kept in a clear state at all times. The focus position of each point in physical space can be accurately calculated, and when the target to be detected appears in the picture, its clear focus position can be calculated in real time, so focusing is fast.
Because the object distance of the target is known, focus can be placed on the target accurately. Traditional focusing is easily disturbed by the external environment and tends to focus on the background, blurring the region of interest; in this embodiment, target sharpness is the premise of model parameter adaptation, so external interference factors are avoided, the focus falls accurately on the target, and the region of interest in the picture stays sharp at all times. Focusing is fast: once the model is built, the focus position for every position in space can be calculated accurately, and when a target appears in the picture the required focus position F can be predicted in real time and the focus motor driven directly to the predicted position, so the target is sharp. In particular, a rapidly moving target can be captured quickly and kept sharp throughout the acquisition process. Maintenance cost is also reduced: model calibration and establishment require no human intervention and are not limited by installation conditions, and the model adaptively re-corrects and refines itself after the installation position changes.
Based on the same application concept as the above method, an embodiment of the present application proposes a focus position determining apparatus applied to a PTZ camera, the apparatus including:
An acquisition module 71 for acquiring a target yaw angle of horizontal rotation of the PTZ camera and acquiring a target pitch angle of vertical rotation of the PTZ camera;
a determining module 72, configured to determine a target distance between the target to be detected and the PTZ camera based on the coordinates of the intersection point between the target plane equation and the direction vector defined by the target yaw angle and the target pitch angle;
a query module 73, configured to query the configured mapping relationship based on the target distance, and obtain a target focus position corresponding to the target distance; the mapping relationship represents a relationship between distance and focus position;
a processing module 74, configured to focus the PTZ camera based on the target focus position, and acquire, by the PTZ camera, a target image for the target to be detected after the PTZ camera is focused.
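The query module's lookup of the configured mapping relationship (distance to focus position) is, in essence, interpolation over a calibration table. The sketch below assumes a small hypothetical table and linear interpolation with clamping; the table values and the interpolation scheme are illustrative, not taken from the patent.

```python
import bisect

# Hypothetical calibration table: object distance (m) -> focus motor position.
FOCUS_MAP = [(1.0, 120), (2.0, 340), (5.0, 610), (10.0, 780), (30.0, 900)]

def focus_for_distance(distance):
    """Linearly interpolate the focus position for a given distance,
    clamping at the ends of the table."""
    dists = [d for d, _ in FOCUS_MAP]
    if distance <= dists[0]:
        return FOCUS_MAP[0][1]
    if distance >= dists[-1]:
        return FOCUS_MAP[-1][1]
    i = bisect.bisect_left(dists, distance)
    (d0, f0), (d1, f1) = FOCUS_MAP[i - 1], FOCUS_MAP[i]
    return f0 + (f1 - f0) * (distance - d0) / (d1 - d0)

focus_for_distance(3.5)  # halfway between the 2 m and 5 m entries -> 475.0
```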
Illustratively, the obtaining module 71 is further configured to obtain a target plane equation; the acquiring module 71 is specifically configured to, when acquiring the target plane equation: acquiring a plurality of sample space coordinates, wherein the acquiring mode of each sample space coordinate comprises the following steps: determining a first sample focus position based on a first sample distance between the sample target and the PTZ camera; searching the peripheral focusing positions of the first sample focusing position to obtain a second sample focusing position; determining a second sample distance based on the second sample focus position, and acquiring sample space coordinates based on the second sample distance; determining plane equation parameters based on the plurality of sample space coordinates; determining an initial plane equation based on the plane equation parameters; and determining the initial plane equation as a target plane equation, or correcting plane equation parameters of the initial plane equation, and determining the corrected plane equation as the target plane equation.
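Determining the plane equation parameters from the collected sample space coordinates can be done with an ordinary least-squares fit. The sketch below fits z as a linear function of x and y and returns the parameters in (a, b, c, d) form; the choice of least squares is an assumption — the patent does not specify the fitting method in this excerpt.

```python
import numpy as np

def fit_plane(points):
    """Least-squares fit of z = alpha*x + beta*y + gamma to the sample
    space coordinates; returns (a, b, c, d) of a*x + b*y + c*z + d = 0."""
    pts = np.asarray(points, dtype=float)
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    (alpha, beta, gamma), *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
    return (alpha, beta, -1.0, gamma)

# Three noise-free samples on the horizontal plane z = -5:
plane = fit_plane([(1, 0, -5), (0, 1, -5), (2, 3, -5)])
```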
Illustratively, the pitch angle range of the PTZ camera includes a plurality of pitch angle intervals, for each pitch angle interval, a sample space coordinate corresponding to the pitch angle interval is selected from all sample space coordinates, and a target plane equation corresponding to the pitch angle interval is determined based on the sample space coordinate corresponding to the pitch angle interval; the determining module 72 is specifically configured to determine a target distance between the target to be detected and the PTZ camera based on coordinates of an intersection point between the target yaw angle and a direction vector pointed by the target pitch angle and a target plane equation: determining a target pitch angle interval in which the target pitch angle is located from the plurality of pitch angle intervals; and determining the target distance between the target to be detected and the PTZ camera based on the intersection point coordinates between the direction vector and the target plane equation corresponding to the target pitch angle interval.
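Selecting the target pitch angle interval amounts to a lookup over the interval edges. The interval edges below are hypothetical; the patent does not state how the pitch angle range is partitioned.

```python
import bisect

# Hypothetical interval edges (degrees); one fitted plane per interval.
EDGES = [0, 15, 30, 45, 60, 90]

def interval_for_pitch(pitch_deg):
    """Index of the pitch angle interval [EDGES[i], EDGES[i+1]) that
    contains pitch_deg, clamped to the valid range."""
    i = bisect.bisect_right(EDGES, pitch_deg) - 1
    return min(max(i, 0), len(EDGES) - 2)
```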
Illustratively, the acquisition module 71 is specifically configured to determine a first sample focus position based on a first sample distance between the sample target and the PTZ camera: determining the first sample distance based on a longitudinal field angle of the PTZ camera, a sample pitch angle of a vertical rotation of the PTZ camera, a suspension height of the PTZ camera, a height of a sample target, a corresponding longitudinal pixel coordinate of the sample target in a picture, and a total number of longitudinal pixels of the picture; and inquiring the mapping relation based on the first sample distance to obtain the first sample focusing position.
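One plausible reading of the first-sample-distance geometry — the patent lists the inputs but does not spell out the formula in this excerpt — is to convert the target's pixel row offset from the image centre into an angular offset within the vertical field of view, add it to the camera pitch, and derive the slant distance from the height difference between camera and target. Everything below, including the pixel-direction convention, is an assumption.

```python
import math

def first_sample_distance(fov_v_deg, pitch_deg, cam_height, target_height,
                          pixel_y, total_pixels_y):
    """Slant distance to the sample target, assuming pixel_y grows downward
    from the top of the frame and pitch is measured downward from horizontal."""
    # Angular offset of the target's row from the optical axis.
    offset_deg = (pixel_y - total_pixels_y / 2) / total_pixels_y * fov_v_deg
    angle = math.radians(pitch_deg + offset_deg)  # downward angle to target
    if angle <= 0:
        raise ValueError("ray does not descend toward the target")
    return (cam_height - target_height) / math.sin(angle)

# Target at the image centre, camera 6 m up, target 1 m tall, pitch 30 deg:
# height difference 5 m, so slant distance 5 / sin(30) = 10 m.
d = first_sample_distance(40.0, 30.0, 6.0, 1.0, 540, 1080)
```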
Illustratively, the acquiring module 71 searches for the peripheral focus position of the first sample focus position, and is specifically configured to: acquiring a plurality of scanning step sizes corresponding to a plurality of object distance segments; for each object distance segment, inquiring the mapping relation through a first object distance corresponding to the object distance segment to obtain a focusing position corresponding to the first object distance, and inquiring the mapping relation through a second object distance corresponding to the object distance segment to obtain a focusing position corresponding to the second object distance; determining a scanning step length corresponding to the object distance section based on the first object distance, the second object distance, a focusing position corresponding to the first object distance and a focusing position corresponding to the second object distance; for each scanning step, searching for a peripheral focusing position of the first sample focusing position based on the scanning step, focusing the PTZ camera based on the searched focusing position, collecting a sample image for a sample target, and determining a sharpness score based on the sample image; and determining the focusing position corresponding to the maximum definition fraction as the second sample focusing position.
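The coarse-to-fine search around the first sample focus position can be sketched as below. The sharpness function is passed in as a stand-in for the real loop of driving the motor, capturing a frame, and scoring it; the step values and toy sharpness curve are illustrative assumptions.

```python
def refine_focus(first_pos, scan_steps, sharpness_at):
    """Probe positions around first_pos at each scan step, score each with
    the supplied sharpness function, and return the sharpest position."""
    best_pos, best_score = first_pos, sharpness_at(first_pos)
    for step in scan_steps:
        for pos in (first_pos - step, first_pos + step):
            score = sharpness_at(pos)
            if score > best_score:
                best_pos, best_score = pos, score
    return best_pos

# Toy sharpness curve peaking at motor position 510:
best = refine_focus(500, [20, 10, 5], lambda p: -abs(p - 510))
```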
Illustratively, the acquiring module 71 determines a second sample distance based on the second sample focus position, and is specifically configured to, when acquiring the sample space coordinates based on the second sample distance: inquiring the mapping relation through a second sample focusing position to obtain a second sample distance; determining spherical coordinates of the sample target in a polar coordinate system based on the second sample distance, a horizontally rotated sample deflection angle of the PTZ camera, and a vertically rotated sample pitch angle; and converting the spherical coordinates of the sample target in the polar coordinate system into sample space coordinates of the sample target in the three-dimensional coordinate system based on the conversion relation between the polar coordinate system and the three-dimensional coordinate system.
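The conversion from the camera's polar/spherical reading (distance, yaw, pitch) to sample space coordinates in the three-dimensional coordinate system can be sketched as follows. The axis convention (x east, y north, z up, pitch downward from the horizontal, camera at the origin) is an assumption.

```python
import math

def spherical_to_cartesian(distance, yaw_deg, pitch_deg):
    """Convert (distance, yaw, pitch) to (x, y, z) with the camera at the
    origin, under the assumed axis convention described above."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    horiz = distance * math.cos(pitch)  # projection onto the ground plane
    return (horiz * math.sin(yaw),
            horiz * math.cos(yaw),
            -distance * math.sin(pitch))

# 10 m away, straight ahead (yaw 0), 30 deg below horizontal:
x, y, z = spherical_to_cartesian(10.0, 0.0, 30.0)  # -> (0.0, 8.66..., -5.0)
```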
Illustratively, the obtaining module 71 corrects the plane equation parameters of the initial plane equation, and determines the corrected plane equation as the target plane equation specifically for: acquiring parameter correction data corresponding to a correction target, wherein the correction target is a target before the target to be detected, and the parameter correction data comprises a target distance between the correction target and the PTZ camera, a target deflection angle and a target pitch angle of the PTZ camera and a target focusing position corresponding to the target distance; searching a peripheral focusing position of the target focusing position, selecting a clear focusing position from the searched focusing positions, and determining a correction distance between the correction target and the PTZ camera based on the clear focusing position; determining a first spatial coordinate of the correction target in a three-dimensional coordinate system based on the target distance, the target deflection angle and the target pitch angle; determining a second spatial coordinate of the correction target in a three-dimensional coordinate system based on the correction distance, the target deflection angle and the target pitch angle; and determining a deviation vector based on the first space coordinate and the second space coordinate, and correcting the plane equation parameter based on the deviation vector and the configured correction coefficient to obtain a target plane equation.
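The parameter correction from the deviation vector and the configured correction coefficient can be sketched as a damped update. The patent only states that the parameters are corrected from the deviation vector and a configured coefficient; the specific update rule below (adjusting only the offset term by the deviation's projection onto the plane normal) and the coefficient value k=0.3 are illustrative assumptions.

```python
def correct_plane(params, first_coord, second_coord, k=0.3):
    """Nudge the plane a*x + b*y + c*z + d = 0 toward the re-measured
    (second) coordinate by a fraction k of the normal component of the
    deviation vector (second minus first spatial coordinate)."""
    a, b, c, d = params
    dev = [s - f for s, f in zip(second_coord, first_coord)]
    # If first_coord lay on the old plane, the corrected point's residual
    # equals the deviation projected on the normal; shift d by k times that.
    residual = a * dev[0] + b * dev[1] + c * dev[2]
    return (a, b, c, d - k * residual)

# Plane z = -5 as (0, 0, 1, 5); the corrected point is 1 m higher, so the
# offset term moves 30 % of the way toward the new measurement.
new_plane = correct_plane((0.0, 0.0, 1.0, 5.0), (0, 0, -5), (0, 0, -4))
```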
Based on the same application concept as the above method, an electronic device is proposed in the embodiments of the present application, where the electronic device includes a processor 81 and a machine-readable storage medium 82, where the machine-readable storage medium 82 stores machine-executable instructions that can be executed by the processor 81; the processor 81 is configured to execute machine executable instructions to implement the focus position determination method disclosed in the above examples of the present application.
The system, apparatus, module or unit set forth in the above embodiments may be implemented by a computer chip or entity, or by a product having a certain function. A typical implementation device is a computer, which may take the form of a personal computer, laptop computer, cellular telephone, camera phone, smart phone, personal digital assistant, media player, navigation device, email device, game console, tablet computer, wearable device, or a combination of any of these devices.
For convenience of description, the above apparatus is described as divided into various units by function. Of course, when the present application is implemented, the functions of the units may be realized in one or more pieces of software and/or hardware.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present application may take the form of a computer program product on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
Moreover, these computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The foregoing is merely exemplary of the present application and is not intended to limit the present application. Various modifications and changes may be made to the present application by those skilled in the art. Any modifications, equivalent substitutions, improvements, etc. which are within the spirit and principles of the present application are intended to be included within the scope of the claims of the present application.
Claims (10)
1. A focus position determination method, applied to a PTZ camera, comprising:
acquiring a target deflection angle of horizontal rotation and a target pitch angle of vertical rotation of the PTZ camera;
determining a target distance between a target to be detected and the PTZ camera based on the coordinates of the intersection point between a target plane equation and the direction vector defined by the target deflection angle and the target pitch angle;
inquiring the configured mapping relation based on the target distance to obtain a target focusing position corresponding to the target distance; wherein the mapping relationship represents a relationship between distance and focus position;
focusing the PTZ camera based on the target focusing position, and acquiring a target image aiming at the target to be detected through the PTZ camera after the PTZ camera is focused.
2. The method according to claim 1, wherein the method further comprises: the target plane equation is acquired, and the acquisition process of the target plane equation comprises the following steps:
acquiring a plurality of sample space coordinates; the method for acquiring the space coordinates of each sample comprises the following steps: determining a first sample focus position based on a first sample distance between a sample target and the PTZ camera; searching the peripheral focusing positions of the first sample focusing position to obtain a second sample focusing position; determining a second sample distance based on the second sample focus position, and acquiring sample space coordinates based on the second sample distance;
Determining plane equation parameters based on the plurality of sample space coordinates;
determining an initial plane equation based on the plane equation parameters;
determining the initial plane equation as a target plane equation; or, correcting the plane equation parameters of the initial plane equation, and determining the corrected plane equation as a target plane equation.
3. The method of claim 2, wherein:
the pitch angle range of the PTZ camera comprises a plurality of pitch angle intervals, sample space coordinates corresponding to the pitch angle intervals are selected from all sample space coordinates for each pitch angle interval, and a target plane equation corresponding to the pitch angle intervals is determined based on the sample space coordinates corresponding to the pitch angle intervals;
the determining the target distance between the target to be detected and the PTZ camera based on the coordinates of the intersection point between the target plane equation and the direction vector defined by the target deflection angle and the target pitch angle comprises:
determining a target pitch angle interval in which the target pitch angle is located from the plurality of pitch angle intervals;
and determining the target distance between the target to be detected and the PTZ camera based on the intersection point coordinates between the direction vector and the target plane equation corresponding to the target pitch angle interval.
4. The method of claim 2, wherein the determining a first sample focus position based on a first sample distance between a sample target and the PTZ camera comprises:
determining the first sample distance based on a longitudinal field angle of the PTZ camera, a sample pitch angle of a vertical rotation of the PTZ camera, a suspension height of the PTZ camera, a height of the sample target, a corresponding longitudinal pixel coordinate of the sample target in a picture, and a total number of longitudinal pixels of the picture;
and inquiring the mapping relation based on the first sample distance to obtain the first sample focusing position.
5. The method of claim 2, wherein the searching for the peripheral focal position of the first sample focal position to obtain a second sample focal position comprises:
acquiring a plurality of scanning step sizes corresponding to a plurality of object distance segments; for each object distance segment, inquiring the mapping relation through a first object distance corresponding to the object distance segment to obtain a focusing position corresponding to the first object distance, and inquiring the mapping relation through a second object distance corresponding to the object distance segment to obtain a focusing position corresponding to the second object distance; determining a scanning step length corresponding to the object distance section based on the first object distance, the second object distance, a focusing position corresponding to the first object distance and a focusing position corresponding to the second object distance;
For each scanning step, searching for a peripheral focusing position of the first sample focusing position based on the scanning step, focusing the PTZ camera based on the searched focusing position, collecting a sample image for a sample target, and determining a sharpness score based on the sample image;
and determining the focusing position corresponding to the maximum definition fraction as the second sample focusing position.
6. The method of claim 2, wherein the determining a second sample distance based on the second sample focus position, obtaining sample space coordinates based on the second sample distance, comprises:
inquiring the mapping relation through the second sample focusing position to obtain the second sample distance;
determining spherical coordinates of the sample target in a polar coordinate system based on the second sample distance, a horizontally rotated sample deflection angle of the PTZ camera, and a vertically rotated sample pitch angle;
and converting the spherical coordinates of the sample target in the polar coordinate system into sample space coordinates of the sample target in the three-dimensional coordinate system based on the conversion relation between the polar coordinate system and the three-dimensional coordinate system.
7. The method of claim 2, wherein modifying the plane equation parameters of the initial plane equation and determining the modified plane equation as the target plane equation comprises:
Acquiring parameter correction data corresponding to a correction target, wherein the correction target is a target before the target to be detected, and the parameter correction data comprises a target distance between the correction target and the PTZ camera, a target deflection angle and a target pitch angle of the PTZ camera, and a target focusing position corresponding to the target distance;
searching a peripheral focusing position of the target focusing position, selecting a clear focusing position from the searched focusing positions, and determining a correction distance between the correction target and the PTZ camera based on the clear focusing position;
determining a first spatial coordinate of the correction target in a three-dimensional coordinate system based on the target distance, the target deflection angle and the target pitch angle; determining a second spatial coordinate of the correction target in a three-dimensional coordinate system based on the correction distance, the target deflection angle and the target pitch angle;
and determining a deviation vector based on the first space coordinate and the second space coordinate, and correcting the plane equation parameter based on the deviation vector and the configured correction coefficient to obtain a target plane equation.
8. A focus position determining apparatus, characterized by being applied to a PTZ camera, comprising:
The acquisition module is used for acquiring a target deflection angle of the horizontal rotation of the PTZ camera and acquiring a target pitch angle of the vertical rotation of the PTZ camera;
the determining module is used for determining the target distance between the target to be detected and the PTZ camera based on the coordinates of the intersection point between the target plane equation and the direction vector defined by the target deflection angle and the target pitch angle;
the query module is used for querying the configured mapping relation based on the target distance to obtain a target focusing position corresponding to the target distance; the mapping relationship represents a relationship between distance and focus position;
and the processing module is used for focusing the PTZ camera based on the target focusing position, and acquiring a target image aiming at the target to be detected through the PTZ camera after the PTZ camera is focused.
9. The apparatus of claim 8, wherein:
the acquisition module is further used for acquiring the target plane equation; the obtaining module is specifically configured to: acquiring a plurality of sample space coordinates, wherein the acquiring mode of each sample space coordinate comprises the following steps: determining a first sample focus position based on a first sample distance between the sample target and the PTZ camera; searching the peripheral focusing positions of the first sample focusing position to obtain a second sample focusing position; determining a second sample distance based on the second sample focus position, and acquiring sample space coordinates based on the second sample distance; determining plane equation parameters based on the plurality of sample space coordinates; determining an initial plane equation based on the plane equation parameters; determining an initial plane equation as a target plane equation, or correcting plane equation parameters of the initial plane equation, and determining the corrected plane equation as the target plane equation;
The pitch angle range of the PTZ camera comprises a plurality of pitch angle intervals, wherein for each pitch angle interval, sample space coordinates corresponding to the pitch angle interval are selected from all sample space coordinates, and a target plane equation corresponding to the pitch angle interval is determined based on the sample space coordinates corresponding to the pitch angle interval; the determining module is specifically configured to determine a target distance between a target to be detected and the PTZ camera based on the coordinates of the intersection point between the target plane equation and the direction vector defined by the target yaw angle and the target pitch angle: determining a target pitch angle interval in which the target pitch angle is located from the plurality of pitch angle intervals; determining a target distance between the target to be detected and the PTZ camera based on the coordinates of the intersection point between the direction vector and the target plane equation corresponding to the target pitch angle interval;
wherein the acquisition module is specifically configured to, when determining the first sample focus position based on the first sample distance between the sample target and the PTZ camera: determining the first sample distance based on a longitudinal field angle of the PTZ camera, a sample pitch angle of a vertical rotation of the PTZ camera, a suspension height of the PTZ camera, a height of the sample target, a corresponding longitudinal pixel coordinate of the sample target in a picture, and a total number of longitudinal pixels of the picture; inquiring the mapping relation based on the first sample distance to obtain the first sample focusing position;
The acquiring module searches the peripheral focusing position of the first sample focusing position, and is specifically used for obtaining a second sample focusing position when the second sample focusing position is obtained: acquiring a plurality of scanning step sizes corresponding to a plurality of object distance segments; for each object distance segment, inquiring the mapping relation through a first object distance corresponding to the object distance segment to obtain a focusing position corresponding to the first object distance, and inquiring the mapping relation through a second object distance corresponding to the object distance segment to obtain a focusing position corresponding to the second object distance; determining a scanning step length corresponding to the object distance section based on the first object distance, the second object distance, a focusing position corresponding to the first object distance and a focusing position corresponding to the second object distance; for each scanning step, searching for a peripheral focusing position of the first sample focusing position based on the scanning step, focusing the PTZ camera based on the searched focusing position, collecting a sample image for a sample target, and determining a sharpness score based on the sample image; determining a focusing position corresponding to the maximum definition fraction as the second sample focusing position;
the acquiring module determines a second sample distance based on the second sample focusing position, and is specifically configured to, when acquiring the sample space coordinate based on the second sample distance: inquiring the mapping relation through the second sample focusing position to obtain the second sample distance; determining spherical coordinates of the sample target in a polar coordinate system based on the second sample distance, a horizontally rotated sample deflection angle of the PTZ camera, and a vertically rotated sample pitch angle; based on a conversion relation between a polar coordinate system and a three-dimensional coordinate system, converting spherical coordinates of the sample target in the polar coordinate system into sample space coordinates of the sample target in the three-dimensional coordinate system;
The acquiring module corrects the plane equation parameters of the initial plane equation, and is specifically used for determining the corrected plane equation as the target plane equation: acquiring parameter correction data corresponding to a correction target, wherein the correction target is a target before the target to be detected, and the parameter correction data comprises a target distance between the correction target and the PTZ camera, a target deflection angle and a target pitch angle of the PTZ camera, and a target focusing position corresponding to the target distance; searching a peripheral focusing position of the target focusing position, selecting a clear focusing position from the searched focusing positions, and determining a correction distance between the correction target and the PTZ camera based on the clear focusing position; determining a first spatial coordinate of the correction target in a three-dimensional coordinate system based on the target distance, the target deflection angle and the target pitch angle; determining a second spatial coordinate of the correction target in a three-dimensional coordinate system based on the correction distance, the target deflection angle and the target pitch angle; and determining a deviation vector based on the first space coordinate and the second space coordinate, and correcting the plane equation parameter based on the deviation vector and the configured correction coefficient to obtain a target plane equation.
10. An electronic device, comprising: a processor and a machine-readable storage medium storing machine-executable instructions executable by the processor; the processor is configured to execute machine executable instructions to implement the method of any of claims 1-7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310184881.6A CN116193256A (en) | 2023-02-17 | 2023-02-17 | Determining a focal position of a camera |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116193256A true CN116193256A (en) | 2023-05-30 |
Family
ID=86440229
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310184881.6A Pending CN116193256A (en) | 2023-02-17 | 2023-02-17 | Determining a focal position of a camera |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116193256A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN118381999A (en) * | 2024-06-21 | 2024-07-23 | 浙江大华技术股份有限公司 | Focusing method, device, equipment and medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||