CN118301474B - Image forming method, system, electronic device and computer readable storage medium - Google Patents
Image forming method, system, electronic device and computer readable storage medium
- Publication number
- CN118301474B (application CN202410705918.XA)
- Authority
- CN
- China
- Prior art keywords
- point
- lens
- imaging
- target
- far
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/67—Focus control based on electronic image sensor signals
- H04N23/671—Focus control based on electronic image sensor signals in combination with active ranging signals, e.g. using light or sound signals emitted toward objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/54—Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/55—Optical parts specially adapted for electronic image sensors; Mounting thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/95—Computational photography systems, e.g. light-field imaging systems
- H04N23/958—Computational photography systems, e.g. light-field imaging systems for extended depth of field imaging
- H04N23/959—Computational photography systems, e.g. light-field imaging systems for extended depth of field imaging by adjusting depth of field during image capture, e.g. maximising or setting range based on scene characteristics
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computing Systems (AREA)
- Theoretical Computer Science (AREA)
- Studio Devices (AREA)
Abstract
The application discloses an image imaging method, an image imaging system, an electronic device, and a computer-readable storage medium. The method includes: acquiring a shooting area of an imaging device, and determining a near point object distance corresponding to a near point position of a target in the shooting area and a far point object distance corresponding to a far point position of the target in the shooting area; determining a near point defocus amount between an imaging point of the near point position and a target image plane and a far point defocus amount between an imaging point of the far point position and the target image plane, based on the near point object distance, the far point object distance, and the focal length of the imaging device, where the target image plane passes through a focus of the imaging device and forms a preset angle with the optical axis of the imaging device; and determining lens parameters of an encoding lens corresponding to the imaging device based on the near point defocus amount and the far point defocus amount, and imaging a target located in the shooting area onto the target image plane using the encoding lens. In this way, the depth of field of the imaging device can be expanded while the imaging quality of the image is guaranteed.
Description
Technical Field
The present application relates to the field of image acquisition technology, and in particular, to an image imaging method, an image imaging system, an electronic device, and a computer readable storage medium.
Background
With the popularity of imaging devices, how to acquire a clear image of a target has drawn increasing attention, and the depth of field of the imaging device is an important factor affecting the sharpness of targets at different positions in an image. In the prior art, the depth of field is usually expanded by reducing the amount of light entering the imaging device, but such a large attenuation of incoming light makes it difficult to obtain high-quality imaging under different illumination environments, so depth-of-field expansion and imaging quality are difficult to balance. In view of this, how to expand the depth of field of the imaging device while ensuring the imaging quality of the image is a technical problem to be solved.
Disclosure of Invention
The application mainly solves the technical problem of providing an image imaging method, an image imaging system, electronic equipment and a computer readable storage medium, which can expand the depth of field of the imaging equipment and ensure the imaging quality of images.
To solve the above technical problem, a first aspect of the present application provides an image imaging method, including: acquiring a shooting area of imaging equipment, and determining a near point object distance corresponding to a near point position of a target in the shooting area and a far point object distance corresponding to a far point position of the target in the shooting area; determining a near point defocus amount between an imaging point of the near point position and a target image plane and a far point defocus amount between an imaging point of the far point position and a target image plane based on the near point object distance, the far point object distance, and a focal length of the imaging device; the target image plane passes through a focus of the imaging device and forms a preset angle with an optical axis of the imaging device; and determining lens parameters of a coding lens corresponding to the imaging device based on the near point defocus amount and the far point defocus amount, and imaging a target positioned in the shooting area to the target image plane by using the coding lens.
To solve the above technical problem, a second aspect of the present application provides an image imaging system, including: the imaging device comprises a lens and an image sensor, wherein the encoding lens is arranged at the exit pupil position of the lens, and the lens parameters of the encoding lens are determined based on the method of the first aspect.
To solve the above technical problem, a third aspect of the present application provides an electronic device, including: a memory and a processor coupled to each other, wherein the memory stores program data, and the processor invokes the program data to perform the method of the first aspect.
To solve the above technical problem, a fourth aspect of the present application provides a computer-readable storage medium having stored thereon program data which, when executed by a processor, implements the method described in the first aspect.
According to the above scheme, the shooting area of the imaging device is acquired, the near point object distance corresponding to the target at the near point position in the shooting area is determined, and the far point object distance corresponding to the target at the far point position in the shooting area is determined. Based on the near point object distance, the far point object distance, and the focal length of the imaging device, the imaging point of the near point position is determined to obtain the near point defocus amount between that imaging point and the target image plane, and the imaging point of the far point position is determined to obtain the far point defocus amount between that imaging point and the target image plane. The target image plane passes through a focus of the imaging device, the included angle between the target image plane and the optical axis of the imaging device is a preset angle, and the target image plane is used for realizing clear imaging of the target over the full range. Based on the near point defocus amount and the far point defocus amount, lens parameters of an encoding lens capable of adjusting the imaging position of the target are determined. When the imaging device needs to image a target in the shooting area, the target is imaged onto the target image plane using the encoding lens. By providing the encoding lens, targets from the near point position to the far point position can all be imaged onto the target image plane, which expands the depth of field of the imaging device without sharply reducing the amount of incoming light, thereby ensuring the imaging quality of the image.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present application, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art. Wherein:
FIG. 1 is a flow chart of an embodiment of an image forming method according to the present application;
FIG. 2 is a schematic diagram of the imaging device of the present application imaging while facing a plane in which a target is located;
FIG. 3 is a schematic view of the installation of the imaging device of the present application in a practical application scenario;
FIG. 4 is a schematic diagram of the imaging device of the present application when arranged in a practical application scenario for imaging;
FIG. 5 is a flow chart of another embodiment of the image forming method of the present application;
FIG. 6 is a schematic diagram of an embodiment of an image imaging system according to the present application;
FIG. 7 is a schematic diagram of an embodiment of an electronic device of the present application;
fig. 8 is a schematic diagram of a computer-readable storage medium according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only some embodiments of the present application, but not all embodiments, and that adaptive combinations may be made between different embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
The terms "system" and "network" are often used interchangeably herein. The term "and/or" herein merely describes an association relationship between associated objects, indicating that three relationships may exist; for example, A and/or B may represent: A exists alone, A and B exist together, or B exists alone. In addition, the character "/" herein generally indicates that the associated objects before and after it are in an "or" relationship. Further, "a plurality" herein means two or more.
The image imaging method provided by the application is applied to imaging equipment, and the corresponding execution main body is a processing unit in the imaging equipment.
Referring to fig. 1, fig. 1 is a flowchart of an embodiment of an image forming method according to the present application, the method includes:
S101: and acquiring a shooting area of the imaging equipment, and determining a near point object distance corresponding to a near point position of the target in the shooting area and a far point object distance corresponding to a far point position of the target in the shooting area.
Specifically, a shooting area of the imaging device is acquired, a near point object distance corresponding to a target in a near point position in the shooting area is determined, and a far point object distance corresponding to the target in a far point position in the shooting area is determined.
In some implementations, the shooting area of the imaging device is acquired, and two endpoints at which a target can be acquired are determined within the shooting area; the endpoint close to the imaging device is taken as the near point position and the endpoint far from the imaging device as the far point position, and the near point object distance corresponding to the near point position and the far point object distance corresponding to the far point position are determined.
In some implementation scenes, a shooting area of imaging equipment is acquired, a reference plane where a target is located is determined in the shooting area, two points are arbitrarily calibrated on the reference plane, a point position close to the imaging equipment is taken as a near point position, a point position far away from the imaging equipment is taken as a far point position, and a near point object distance corresponding to the near point position and a far point object distance corresponding to the far point position are determined.
It will be appreciated that the shooting area is related to the mounting position and field angle of the imaging device, and that the shooting area can be determined once the mounting position of the imaging device is determined.
S102: determining a near point defocus amount between an imaging point of the near point position and the target image plane and a far point defocus amount between an imaging point of the far point position and the target image plane based on the near point object distance, the far point object distance and the focal length of the imaging device; the target image plane passes through the focus of the imaging device and forms a preset angle with the optical axis of the imaging device.
Specifically, based on the near point object distance, the far point object distance, and the focal length of the imaging device, the imaging point of the near point position is determined to obtain the near point defocus amount between that imaging point and the target image plane, and the imaging point of the far point position is determined to obtain the far point defocus amount between that imaging point and the target image plane. The target image plane passes through a focus of the imaging device, the included angle between the target image plane and the optical axis of the imaging device is a preset angle, and the target image plane is used for realizing clear imaging of the target over the full range.
It should be noted that, when the imaging device directly faces the plane where the object is located, the object can be imaged without aberration. For ease of description, please refer to fig. 2, which is a schematic diagram of the imaging device of the present application imaging while directly facing the plane where the target is located. In fig. 2, the optics from the entrance pupil to the exit pupil of the imaging device are simplified as a black box; the object plane coordinate system is (x0, y0), the image plane coordinate system is (xi, yi), the entrance pupil and the exit pupil are conjugate, the coordinate system of the imaging device (the pupil plane) is (x, y), and there is a corresponding pupil function Q(x, y).
Further, a real object can be regarded as being composed of many points of different spatial frequencies, each with a different light intensity and amplitude. After a point on the object plane passes through the exit pupil, the complex amplitude distribution formed on the image plane is the impulse response function h(xi, yi; x0, y0), which is obtained by Fraunhofer diffraction of the pupil function Q(x, y). Fourier transformation of the point spread function yields the optical transfer function H(fx, fy), a normalized function for comprehensively and objectively evaluating the imaging quality of an optical system. The relation between the optical transfer function and the pupil function is:
H(fx, fy) = ∬ Q(x + λ·zi·fx/2, y + λ·zi·fy/2)·Q*(x − λ·zi·fx/2, y − λ·zi·fy/2) dx dy / ∬ |Q(x, y)|² dx dy (1)
where z0 is the object distance, zi is the image distance conjugate to z0, and λ is the wavelength.
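For illustration only, and not as part of the disclosed method, the following minimal Python sketch evaluates relation (1) numerically for an assumed clear circular pupil: the point spread function is the squared modulus of the Fraunhofer diffraction pattern of the pupil, and its Fourier transform, normalized at zero frequency, gives the optical transfer function. The grid size, pupil radius, and variable names are all assumptions.

```python
import numpy as np

# Minimal sketch (assumed parameters): a clear circular pupil, no aberration.
N = 256
coords = np.linspace(-1.0, 1.0, N)
X, Y = np.meshgrid(coords, coords)
pupil = (X**2 + Y**2 <= 0.5**2).astype(float)  # pupil function Q(x, y)

# The impulse response h is the Fraunhofer diffraction pattern of Q;
# the point spread function (PSF) is its squared modulus.
h = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(pupil)))
psf = np.abs(h) ** 2

# The OTF is the Fourier transform of the PSF, normalized at zero frequency;
# equivalently, the scaled pupil autocorrelation of relation (1).
otf = np.fft.fft2(np.fft.ifftshift(psf))
otf /= otf[0, 0]
print(np.abs(otf[0, 0]))  # 1.0 at zero spatial frequency by construction
```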
Further, in a practical application scene, the imaging device is usually not mounted directly facing the plane where the target is located. For ease of description, please refer to fig. 3, a schematic installation diagram of the imaging device in a practical application scene. Because the imaging device does not directly face the plane where the target is located, taking the target shown as a rectangle in the figure as an example, when the plane at the top of the target needs to be imaged clearly, a certain aberration exists on the wavefront of the oblique plane. The exit pupil function then becomes P(x, y) = Q(x, y)·exp[i·k·W(x, y)]. Due to the wave aberration W(x, y), there is a certain defocus between the ideal image plane and the actual image plane; the optical transfer function H(fx, fy) on the actual image plane changes accordingly, so the imaging quality on the actual image plane is reduced and clear imaging cannot be achieved over the full depth-of-field range.
It should be noted that, referring to fig. 4, a schematic diagram of the imaging device of the present application when set up in a practical application scene for imaging: after the near point object distance and the far point object distance are determined, the focal length of the imaging device is acquired. Denote the near point object distance as L1, the near point image distance as L1', the far point object distance as L2, the far point image distance as L2', and the focal length as f. The near point image distance can be determined from the object-image relation 1/L1 + 1/L1' = 1/f, giving the imaging point corresponding to the near point position, and the far point image distance can be determined from the object-image relation 1/L2 + 1/L2' = 1/f, giving the imaging point corresponding to the far point position.
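As a hedged illustration of this object-image computation, a short Python sketch follows; the focal length and object distances are assumed example values, not taken from the patent.

```python
def image_distance(object_distance: float, focal_length: float) -> float:
    """Solve the object-image relation 1/L + 1/L' = 1/f for the image distance L'."""
    return 1.0 / (1.0 / focal_length - 1.0 / object_distance)

# Assumed example values: a 25 mm lens, near point at 2 m, far point at 10 m
# (all distances in millimetres).
f = 25.0
L1_image = image_distance(2000.0, f)   # near point image distance L1' ~ 25.32 mm
L2_image = image_distance(10000.0, f)  # far point image distance L2' ~ 25.06 mm
print(L1_image, L2_image)              # L1' > L2': the near point images farther back
```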
In some implementation scenes, the near point image distance corresponding to the near point position is determined based on the near point object distance and the focal length of the imaging device, and the distance between the imaging point corresponding to the near point image distance and the target image plane is acquired to obtain the near point defocus amount; the far point image distance corresponding to the far point position is determined based on the far point object distance and the focal length of the imaging device, and the distance between the imaging point corresponding to the far point image distance and the target image plane is acquired to obtain the far point defocus amount.
In some implementations, a current image plane of the imaging device is determined based on the near point image distance, the far point image distance, and the focal length; the imaging point of the near point position on the current image plane and the imaging point of the far point position on the current image plane are acquired; the near point defocus amount corresponding to the near point position is determined based on the imaging point of the near point position on the current image plane and the target image plane, and the far point defocus amount corresponding to the far point position is determined based on the imaging point of the far point position on the current image plane and the target image plane.
It is understood that in a conventional imaging apparatus the preset angle between the target image plane and the optical axis of the imaging apparatus is 90 degrees; the present application does not specifically limit the value of the preset angle.
S103: and determining lens parameters of a coding lens corresponding to the imaging device based on the near point defocus amount and the far point defocus amount, and imaging a target positioned in a shooting area to a target image plane by using the coding lens.
Specifically, based on the near-point defocus amount and the far-point defocus amount, lens parameters of a coding lens capable of adjusting the imaging position of the target are determined, and when the imaging apparatus needs to image the target in the photographing region, the target located in the photographing region is imaged to the target image plane using the coding lens.
In some implementations, an optical path adjustment function between the current image plane and the target image plane is determined based on the near point defocus amount and the far point defocus amount, and refractive index and thickness parameters of the encoding lens are determined based on the optical path adjustment function, resulting in lens parameters of the encoding lens. Wherein the thickness parameter of the encoded lens is matched with the optical path adjustment function.
In some implementations, the refractive index corresponding to the encoding lens is obtained, the thickness corresponding to the near point position on the encoding lens is determined based on the near point defocus amount and the refractive index of the encoding lens, the thickness corresponding to the far point position on the encoding lens is determined based on the far point defocus amount and the refractive index of the encoding lens, and the thickness parameter of the encoding lens is determined based on the thicknesses of the near point position and the far point position on the encoding lens. Wherein the lens parameters include refractive index and thickness parameters.
It should be noted that, the encoding lens is configured to be disposed at an exit pupil position of the imaging device, so as to adjust light rays after the exit pupil, and adjust different spatial frequency points on the object plane to the target image plane, thereby realizing clear imaging in a full depth of field range.
It can be understood that, by arranging the encoding lens, targets from the near point position to the far point position can all be imaged onto the target image plane, which expands the depth of field of the imaging device without sharply reducing the amount of incoming light, thereby ensuring the imaging quality of the image.
According to the above scheme, the shooting area of the imaging device is acquired, the near point object distance corresponding to the target at the near point position in the shooting area is determined, and the far point object distance corresponding to the target at the far point position in the shooting area is determined. Based on the near point object distance, the far point object distance, and the focal length of the imaging device, the imaging point of the near point position is determined to obtain the near point defocus amount between that imaging point and the target image plane, and the imaging point of the far point position is determined to obtain the far point defocus amount between that imaging point and the target image plane. The target image plane passes through a focus of the imaging device, the included angle between the target image plane and the optical axis of the imaging device is a preset angle, and the target image plane is used for realizing clear imaging of the target over the full range. Based on the near point defocus amount and the far point defocus amount, lens parameters of an encoding lens capable of adjusting the imaging position of the target are determined. When the imaging device needs to image a target in the shooting area, the target is imaged onto the target image plane using the encoding lens. By providing the encoding lens, targets from the near point position to the far point position can all be imaged onto the target image plane, which expands the depth of field of the imaging device without sharply reducing the amount of incoming light, thereby ensuring the imaging quality of the image.
Referring to fig. 5, fig. 5 is a flowchart illustrating another embodiment of an image forming method according to the present application, the method includes:
s501: and acquiring a shooting area of the imaging equipment, and determining a near point object distance corresponding to a near point position of the target in the shooting area and a far point object distance corresponding to a far point position of the target in the shooting area.
Specifically, a shooting area of the imaging device is acquired, a near point object distance corresponding to a target in a near point position in the shooting area is determined, and a far point object distance corresponding to the target in a far point position in the shooting area is determined.
In some implementation scenarios, the installation position of the imaging device is acquired, and the shooting area matched with the installation position is determined; the near point object distance is determined based on the installation position and the target located at the near point position in the shooting area, and the far point object distance is determined based on the installation position and the target located at the far point position in the shooting area.
Specifically, the installation position of the imaging device is acquired, and the shooting area matched with the imaging device at that installation position is determined based on the installation position and the field angle of the imaging device. The target located at the near point position in the shooting area is acquired, and an accurate near point object distance is obtained based on the installation position and that target; likewise, the target located at the far point position in the shooting area is acquired, and an accurate far point object distance is obtained based on the installation position and that target.
It can be understood that the installation position is calibrated in advance. Once the accurate installation position is obtained, the corresponding shooting area can be calibrated, so that the accurate object distance corresponding to a target in the shooting area is obtained; an accurate image distance is subsequently obtained from it, on the basis of which the optical path can be adjusted.
It will be appreciated that the mounting location includes the mounting height of the imaging device relative to the target plane in which the target is located, as well as the mounting angle between the optical axis of the imaging device and the target plane in which the target is located.
Specifically, referring to fig. 3 again, the horizontal axis indicated by the solid line is the target plane where the target is located. The installation position of the imaging device includes the installation height of the imaging device relative to the target plane, that is, the height difference between the imaging device and the horizontal axis along the vertical solid axis in fig. 3. The installation position further includes, as the installation angle, the included angle between the optical axis of the imaging device and the target plane, that is, the angle between the solid line at the imaging device and the horizontal axis in fig. 3.
Further, determining a near point object distance based on the mounting position and the target located at the near point position in the photographing region, and determining a far point object distance based on the mounting position and the target located at the far point position in the photographing region, includes: acquiring a reference height of a target, and determining a reference plane corresponding to the reference height; the near point object distance is determined based on the mounting height, the mounting angle, and the intersection point of the reference plane and the photographing region near the imaging device, and the far point object distance is determined based on the mounting height, the mounting angle, and the intersection point of the reference plane and the photographing region far from the imaging device.
Specifically, please continue to refer to fig. 3. The area covered by the dashed lines emitted from the imaging device in fig. 3 is the shooting area. The reference height of the target is acquired and the reference plane corresponding to the reference height is determined, where the reference plane is parallel to the target plane; that is, the dashed line at the top of the target represented by the rectangular frame in fig. 3 is the reference plane. The intersection point of the reference plane and the shooting area boundary close to the imaging device, i.e. the first point in fig. 3, is acquired, and the near point object distance corresponding to the near point position is determined based on the installation height, the installation angle, and this intersection point. Likewise, the intersection point of the reference plane and the shooting area boundary far from the imaging device, i.e. the second point in fig. 3, is acquired, and the far point object distance corresponding to the far point position is determined based on the installation height, the installation angle, and this intersection point.
It can be understood that the reference plane of the target and the shooting area are used to calibrate the near point object distance of the near point position and the far point object distance of the far point position. Since the near point position and the far point position correspond to the two end points, the accurate near point and far point object distances corresponding to the extreme edge positions can be obtained, so that the thicknesses at the two end points can be obtained when the thickness parameter of the encoding lens is determined later.
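For illustration, a simplified sketch of how the two edge object distances could be derived from this mounting geometry, assuming a point camera, an optical axis tilted toward the target plane by the installation angle, and a field of view symmetric about the axis; none of these simplifications, symbols, or numbers are stated in the patent.

```python
import math

def edge_object_distances(mount_height_mm: float, mount_angle_deg: float,
                          half_fov_deg: float, ref_height_mm: float):
    """Slant distances from the camera to the points where the near and far
    boundary rays of the shooting area cross the reference plane."""
    drop = mount_height_mm - ref_height_mm                   # vertical drop to the plane
    near_ray = math.radians(mount_angle_deg + half_fov_deg)  # steeper boundary ray
    far_ray = math.radians(mount_angle_deg - half_fov_deg)   # shallower boundary ray
    near_object_distance = drop / math.sin(near_ray)         # to the first point (L1)
    far_object_distance = drop / math.sin(far_ray)           # to the second point (L2)
    return near_object_distance, far_object_distance

# Assumed example: camera 6 m high, axis tilted 30 degrees toward the target
# plane, 15 degree half field of view, targets about 1.5 m tall.
print(edge_object_distances(6000.0, 30.0, 15.0, 1500.0))  # ~ (6364 mm, 17387 mm)
```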
Optionally, the reference height of the target corresponds to the category of the target, and the reference plane changes adaptively with the category of the target, so that the target can be adaptively calibrated when the target category is switched.
S502: determining a near point defocus amount between an imaging point of the near point position and the target image plane and a far point defocus amount between an imaging point of the far point position and the target image plane based on the near point object distance, the far point object distance and the focal length of the imaging device; the target image plane passes through the focus of the imaging device and forms a preset angle with the optical axis of the imaging device.
Specifically, based on the near point object distance, the far point object distance, and the focal length of the imaging device, the imaging point of the near point position is determined to obtain the near point defocus amount between that imaging point and the target image plane, and the imaging point of the far point position is determined to obtain the far point defocus amount between that imaging point and the target image plane.
In some implementations, determining a near point defocus amount between an imaging point of a near point location and a target image plane and a far point defocus amount between an imaging point of a far point location and the target image plane based on the near point object distance, the far point object distance, and a focal length of the imaging device includes: determining an imaging point of a near point position and a corresponding near point image distance based on the near point object distance and the focal distance, and determining a near point defocus amount between the imaging point of the near point position and a target image surface based on the focal distance and the near point image distance; and determining an imaging point of the far point position and a corresponding far point image distance based on the far point object distance and the focal distance, and determining a far point defocus amount between the imaging point of the far point position and the target image surface based on the focal distance and the far point image distance.
Specifically, based on the object-image relation, the imaging point of the near point position and its corresponding near point image distance are determined using the near point object distance and the focal length, and the difference between the near point image distance and the focal length is taken as the near point defocus amount between the imaging point of the near point position and the target image plane. Likewise, the imaging point of the far point position and its corresponding far point image distance are determined using the far point object distance and the focal length, and the difference between the far point image distance and the focal length is taken as the far point defocus amount between the imaging point of the far point position and the target image plane. Accurate defocus amounts at different positions are thus determined; because each defocus amount is obtained as the same kind of difference between image distance and focal length, the sign of the defocus amount also indicates the direction of the deviation.
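A minimal sketch of this defocus computation follows, with assumed example numbers; the per-field-point sign convention relative to the tilted target image plane of fig. 4 is a refinement not modelled here.

```python
def defocus_amount(object_distance: float, focal_length: float) -> float:
    """Defocus amount = image distance - focal length; the sign encodes the
    direction of the deviation relative to the target image plane."""
    image_dist = 1.0 / (1.0 / focal_length - 1.0 / object_distance)
    return image_dist - focal_length

f = 25.0  # mm, assumed focal length
dL_near = defocus_amount(2000.0, f)   # near point defocus amount (~ +0.32 mm here)
dL_far = defocus_amount(10000.0, f)   # far point defocus amount (~ +0.06 mm here)
print(dL_near, dL_far)
```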
S503: and determining lens parameters of the coded lens corresponding to the imaging device based on the near point defocus amount and the far point defocus amount.
Specifically, lens parameters of the encoding lens capable of adjusting the imaging position of the target are determined based on the near-point defocus amount and the far-point defocus amount.
In some implementations, determining lens parameters of a coded lens corresponding to an imaging device based on a near point defocus amount and a far point defocus amount includes: acquiring the refractive index of the coded lens, determining the near point lens adjustment thickness corresponding to the near point position based on the near point defocus amount and the refractive index, and determining the far point lens adjustment thickness corresponding to the far point position based on the far point defocus amount and the refractive index; acquiring the center thickness of the coding lens, and acquiring thickness parameters of the coding lens based on the center thickness, the near point lens adjustment thickness and the far point lens adjustment thickness; wherein the lens parameters include refractive index and thickness parameters.
Specifically, for points on the oblique object plane at different object distances, there are different defocus amounts ΔL, and the wave aberration is W(x, y) = ΔL. The corresponding phase change is ψ = (2π/λ)·ΔL. If the phase after encoding by the encoding lens is changed to ψ' = −ψ = −(2π/λ)·ΔL, the phase change caused by the wave aberration can be cancelled.
The relation between the phase change of the encoding lens and the thickness to be adjusted is ψ' = (2π/λ)·n·Δd, where n is the refractive index of the encoding lens. Therefore, the relation between the thickness of the encoding lens and the defocus amount is Δd = (−ΔL)/n. After the refractive index of the encoding lens is obtained, the near point lens adjustment thickness corresponding to the near point position can be determined based on the near point defocus amount and the refractive index, and the far point lens adjustment thickness corresponding to the far point position can be determined based on the far point defocus amount and the refractive index, so that the thicknesses to be adjusted on the encoding lens at the mutually spaced near point and far point positions are respectively determined.
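The phase-cancellation bookkeeping can be sketched as follows; the wavelength, refractive index, and defocus values are assumptions, with signs matching the discussion of fig. 4 below.

```python
import math

n = 1.5  # assumed refractive index of the encoding lens

def phase_change(defocus_mm: float, wavelength_mm: float = 550e-6) -> float:
    """Wave-aberration phase: psi = (2*pi / lambda) * delta_L."""
    return 2.0 * math.pi / wavelength_mm * defocus_mm

def adjustment_thickness(defocus_mm: float) -> float:
    """Thickness change cancelling the defocus phase: delta_d = -delta_L / n,
    from psi' = (2*pi / lambda) * n * delta_d = -psi."""
    return -defocus_mm / n

# Assumed edge defocus amounts with the signs described in the text:
dd_near = adjustment_thickness(+0.3)  # near point: -0.2 mm (lens thinner here)
dd_far = adjustment_thickness(-0.3)   # far point:  +0.2 mm (lens thicker here)
print(dd_near, dd_far)
```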
Further, the center thickness of the encoding lens is obtained, and the thickness parameter of the encoding lens is set based on the center thickness, the near point lens adjustment thickness, and the far point lens adjustment thickness, so that the lens parameters composed of the refractive index and the thickness parameter are obtained.
Optionally, obtaining the thickness parameter of the encoding lens based on the center thickness, the near point lens adjustment thickness, and the far point lens adjustment thickness includes: obtaining the near point lens thickness based on the center thickness and the near point lens adjustment thickness, and obtaining the far point lens thickness based on the center thickness and the far point lens adjustment thickness; and obtaining a transformation function from the near point lens thickness to the far point lens thickness, and obtaining the thickness parameter of the encoding lens based on the transformation function and the center thickness.
Specifically, the sum of the center thickness and the near point lens adjustment thickness is taken as the near point lens thickness, and the sum of the center thickness and the far point lens adjustment thickness is taken as the far point lens thickness.
It should be noted that, referring to fig. 4 again, since the defocus amount corresponds to the difference between the image distance and the focal length, the defocus amount at the near point is positive and the defocus amount at the far point is negative, and the relation between the thickness to be adjusted on the encoding lens and the defocus amount is Δd = (−ΔL)/n. The near point lens adjustment thickness is therefore negative and the far point lens adjustment thickness positive, so the near point lens thickness is finally smaller than the far point lens thickness. After the near point lens thickness and the far point lens thickness are obtained, a transformation function from the near point lens thickness to the far point lens thickness can be determined, where the transformation function is a linear transformation. The thickness parameter of the entire encoding lens can then be determined based on the transformation function and the center thickness, so that the encoding lens matches the current target and clear imaging over the full depth-of-field range is realized.
It will be appreciated that, because the thickness varies according to the transformation function, the cross-section of the side of the encoding lens facing the lens is trapezoidal.
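Under these relations, the thickness profile can be illustrated as a linear interpolation between the two edge thicknesses, one hedged reading of the transformation function; the numeric values continue the assumed example above and are not from the patent.

```python
import numpy as np

center_thickness = 2.0                 # mm, assumed center thickness
near_edge = center_thickness + (-0.2)  # center + near point adjustment = 1.8 mm
far_edge = center_thickness + (+0.2)   # center + far point adjustment  = 2.2 mm

# Linear transformation function from the near point edge to the far point edge:
u = np.linspace(0.0, 1.0, 11)          # normalized position across the lens
thickness = near_edge + (far_edge - near_edge) * u
print(thickness)  # a monotonic ramp, i.e. a trapezoidal facing cross-section
```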
S504: and arranging the coding lens at the exit pupil position of the lens to obtain the image imaging system.
Specifically, an encoding lens matching the lens parameters is obtained and arranged at the exit pupil position of the lens for adjusting the light, yielding an image imaging system composed of the imaging device and the encoding lens. In this way, through optical encoding, an encoding lens for depth-of-field expansion is designed, realizing depth-of-field expansion of the whole image imaging system, with a simple structure, low cost, and ease of use, while remaining imperceptible in use.
Optionally, the reference height of the target corresponds to the category of the target, and the reference plane changes adaptively with the category of the target, so that the target can be adaptively calibrated when the target category is switched. A change of the reference height causes a change of the object distance and thus ultimately affects the lens parameters of the encoding lens. The image imaging system may therefore include encoding lenses matched to targets of various categories; after the target category is identified, the encoding lens matched with that category is acquired, so that targets of different categories are adaptively matched.
S505: and acquiring an image of the target in the shooting area by using an image imaging system, and imaging the target to a target image plane.
Specifically, an image imaging system is used for acquiring an image of a target in a shooting area, so that the target in the shooting area is imaged to a target image plane, and the target is clearly imaged in a full depth range.
In this embodiment, the installation position of the imaging device is acquired and the corresponding shooting area is calibrated, so that the accurate object distances corresponding to targets located in the shooting area are acquired. Accurate defocus amounts at different positions are determined based on the object-image relation; since each defocus amount is obtained as the same kind of difference between image distance and focal length, the direction of the defocus deviation can also be determined. The refractive index of the encoding lens is acquired, the near point lens adjustment thickness corresponding to the near point position is determined based on the near point defocus amount and the refractive index, and the far point lens adjustment thickness corresponding to the far point position is determined based on the far point defocus amount and the refractive index, so that the thicknesses to be adjusted on the encoding lens at the mutually spaced near point and far point positions are determined, the thickness parameter of the encoding lens is obtained, and the lens parameters composed of the refractive index and the thickness parameter are obtained. An encoding lens matching the lens parameters is acquired and arranged at the exit pupil position of the lens for adjusting the light, yielding an image imaging system composed of the imaging device and the encoding lens. In this way, through optical encoding, an encoding lens for depth-of-field expansion is designed, realizing depth-of-field expansion of the whole image imaging system, with a simple structure, low cost, and ease of use, while remaining imperceptible in use.
Referring to fig. 6, fig. 6 is a schematic structural diagram of an embodiment of an image imaging system according to the present application, the image imaging system 60 includes an imaging device 601 and a coding lens 602, wherein the imaging device 601 includes a lens 6010 and an image sensor 6011, the coding lens 602 is disposed at an exit pupil position of the lens 6010, lens parameters of the coding lens 602 are determined based on the method in any of the foregoing embodiments, and details of the related embodiments are described in the foregoing method embodiments and will not be repeated herein. As can be seen from any of the above embodiments, the image imaging system 60 can expand the depth of field of the imaging device and ensure the imaging quality of the image.
Referring to fig. 7, fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application, the electronic device 70 includes a memory 701 and a processor 702 coupled to each other, wherein the memory 701 stores program data (not shown), and the processor 702 invokes the program data to implement the method in any of the above embodiments, and the description of the related content is referred to the detailed description of the above method embodiments and is not repeated herein.
Referring to fig. 8, fig. 8 is a schematic structural diagram of an embodiment of a computer readable storage medium 80 according to the present application, where the computer readable storage medium 80 stores program data 800, and the program data 800 when executed by a processor implements the method in any of the above embodiments, and details of the related content are described in the above embodiments, which are not repeated herein.
The units described as separate units may or may not be physically separate, and units displayed as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) or a processor to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or other media capable of storing program code.
The foregoing description is only of embodiments of the present application, and is not intended to limit the scope of the application, and all equivalent structures or equivalent processes using the descriptions and the drawings of the present application or directly or indirectly applied to other related technical fields are included in the scope of the present application.
Claims (10)
1. An image imaging method, the image imaging method comprising:
acquiring a shooting area of imaging equipment, and determining a near point object distance corresponding to a near point position of a target in the shooting area and a far point object distance corresponding to a far point position of the target in the shooting area; wherein the photographing region is related to a mounting position and a viewing angle of the imaging apparatus;
determining a near point defocus amount between an imaging point of the near point position and a target image plane and a far point defocus amount between an imaging point of the far point position and a target image plane based on the near point object distance, the far point object distance, and a focal length of the imaging device; the target image plane passes through a focus of the imaging device and forms a preset angle with an optical axis of the imaging device;
and determining lens parameters of a coding lens corresponding to the imaging device based on the near point defocus amount and the far point defocus amount, and imaging a target located from the near point position to the far point position in the shooting area to the target image plane by using the coding lens.
2. The method of claim 1, wherein the determining lens parameters of the encoded lens corresponding to the imaging device based on the near point defocus amount and the far point defocus amount comprises:
acquiring the refractive index of the coded lens, determining the near point lens adjustment thickness corresponding to the near point position based on the near point defocus amount and the refractive index, and determining the far point lens adjustment thickness corresponding to the far point position based on the far point defocus amount and the refractive index;
Acquiring the center thickness of the coding lens, and obtaining thickness parameters of the coding lens based on the center thickness, the near point lens adjustment thickness and the far point lens adjustment thickness; wherein the lens parameters include the refractive index and the thickness parameter.
3. The method of claim 2, wherein the obtaining the thickness parameter of the encoded lens based on the center thickness, the near-point lens adjustment thickness, and the far-point lens adjustment thickness comprises:
obtaining the near point lens thickness based on the center thickness and the near point lens adjustment thickness, and obtaining the far point lens thickness based on the center thickness and the far point lens adjustment thickness;
and obtaining a transformation function from the thickness of the near point lens to the thickness of the far point lens, and obtaining the thickness parameter of the coded lens based on the transformation function and the center thickness.
4. The method according to claim 1, wherein the acquiring a shooting area of the imaging apparatus, determining a near-point object distance corresponding to a near-point position of a target in the shooting area, and a far-point object distance corresponding to a far-point position of the target in the shooting area, comprises:
acquiring the installation position of the imaging equipment, and determining the shooting area matched with the installation position;
the near point object distance is determined based on the mounting position and the target located at the near point position in the shooting area, and the far point object distance is determined based on the mounting position and the target located at the far point position in the shooting area.
5. The method of claim 4, wherein the mounting location comprises a mounting height of the imaging device relative to a target plane in which the target is located, and a mounting angle between an optical axis of the imaging device and the target plane in which the target is located;
the determining the near point object distance based on the mounting position and the target located at the near point position in the shooting area, and determining the far point object distance based on the mounting position and the target located at the far point position in the shooting area includes:
acquiring a reference height of a target, and determining a reference plane corresponding to the reference height;
the near point object distance is determined based on the mounting height, the mounting angle, and an intersection point of the reference plane and the photographing region near the imaging device, and the far point object distance is determined based on the mounting height, the mounting angle, and an intersection point of the reference plane and the photographing region far from the imaging device.
6. The method of claim 1, wherein the determining a near point defocus amount between the imaging point of the near point location and the target image plane and a far point defocus amount between the imaging point of the far point location and the target image plane based on the near point object distance, the far point object distance, and the focal length of the imaging device comprises:
determining an imaging point of the near point position and a corresponding near point image distance thereof based on the near point object distance and the focal distance, and determining a near point defocus amount between an imaging point of the near point position and a target image plane based on the focal distance and the near point image distance; and
And determining an imaging point of the far point position and a corresponding far point image distance based on the far point object distance and the focal distance, and determining a far point defocus amount between an imaging point of the far point position and a target image surface based on the focal distance and the far point image distance.
7. The method of claim 1, wherein the imaging device comprises a lens and an image sensor arranged in sequence, the imaging the subject located in the photographing region to the subject image plane using the encoding lens, comprising:
The coding lens is arranged at the exit pupil position of the lens, and an image imaging system is obtained;
And acquiring an image of the target in the shooting area by using the image imaging system, and imaging the target to the target image plane.
8. An image imaging system, the image imaging system comprising: an imaging device and a coded lens, wherein the imaging device comprises a lens and an image sensor, the coded lens is arranged at an exit pupil position of the lens, and lens parameters of the coded lens are determined based on the method of any one of claims 1-7.
9. An electronic device, comprising: a memory and a processor coupled to each other, wherein the memory stores program data that the processor invokes to perform the method of any of claims 1-7.
10. A computer readable storage medium having stored thereon program data, which when executed by a processor implements the method of any of claims 1-7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410705918.XA CN118301474B (en) | 2024-06-03 | 2024-06-03 | Image forming method, system, electronic device and computer readable storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410705918.XA CN118301474B (en) | 2024-06-03 | 2024-06-03 | Image forming method, system, electronic device and computer readable storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN118301474A CN118301474A (en) | 2024-07-05 |
CN118301474B (en) | 2024-08-30
Family
ID=91686180
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202410705918.XA Active CN118301474B (en) | 2024-06-03 | 2024-06-03 | Image forming method, system, electronic device and computer readable storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN118301474B (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101390009A (en) * | 2006-02-22 | 2009-03-18 | 诺基亚公司 | Hydraulic Optical Focus Stabilizer |
CN113014790A (en) * | 2019-12-19 | 2021-06-22 | 华为技术有限公司 | Defocus conversion coefficient calibration method, PDAF method and camera module |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5576739B2 (en) * | 2010-08-04 | 2014-08-20 | オリンパス株式会社 | Image processing apparatus, image processing method, imaging apparatus, and program |
TWI605418B (en) * | 2013-04-15 | 2017-11-11 | 晨星半導體股份有限公司 | Image editing method and image processing device |
FR3017765B1 (en) * | 2014-02-14 | 2017-05-19 | Foretec Soc Forezienne De Tech | VIDEO DEVICE AND VIEWING METHOD |
CN103869484B (en) * | 2014-03-07 | 2016-01-13 | 南开大学 | Method for Determining Imaging Depth in Large Imaging Depth 3D Display System of Optical 4f System |
CN105334681B (en) * | 2014-06-25 | 2018-01-30 | 深圳市墨克瑞光电子研究院 | Liquid crystal lens imaging device and liquid crystal lens imaging method |
JP2018101055A (en) * | 2016-12-20 | 2018-06-28 | オリンパス株式会社 | Focus adjustment device, imaging device and focus adjustment method |
CN114758290A (en) * | 2020-12-29 | 2022-07-15 | 浙江宇视科技有限公司 | Fire detection method, fire detection device, electronic equipment and storage medium |
CN112822402B (en) * | 2021-01-08 | 2023-04-18 | 重庆创通联智物联网有限公司 | Image shooting method and device, electronic equipment and readable storage medium |
CN114554085A (en) * | 2022-02-08 | 2022-05-27 | 维沃移动通信有限公司 | Focusing method and device, electronic equipment and storage medium |
- 2024-06-03: application CN202410705918.XA granted as CN118301474B (status: active)
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101390009A (en) * | 2006-02-22 | 2009-03-18 | 诺基亚公司 | Hydraulic Optical Focus Stabilizer |
CN113014790A (en) * | 2019-12-19 | 2021-06-22 | 华为技术有限公司 | Defocus conversion coefficient calibration method, PDAF method and camera module |
Also Published As
Publication number | Publication date |
---|---|
CN118301474A (en) | 2024-07-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9142582B2 (en) | Imaging device and imaging system | |
CN110689581B (en) | Structured light module calibration method, electronic device, and computer-readable storage medium | |
US8773550B2 (en) | Range measurement using multiple coded apertures | |
US9742994B2 (en) | Content-aware wide-angle images | |
US8432479B2 (en) | Range measurement using a zoom camera | |
US7929801B2 (en) | Depth information for auto focus using two pictures and two-dimensional Gaussian scale space theory | |
WO2018153149A1 (en) | Automatic focusing method and apparatus based on region of interest | |
US8305485B2 (en) | Digital camera with coded aperture rangefinder | |
JP5358039B1 (en) | Imaging device | |
US11209268B2 (en) | Depth measuring method and system | |
US20110157387A1 (en) | Method and apparatus for generating image data | |
US20110267485A1 (en) | Range measurement using a coded aperture | |
JP2008511859A (en) | Extended depth of focus using a multifocal length lens with a controlled spherical aberration range and a concealed central aperture | |
US20100171815A1 (en) | Image data obtaining method and apparatus therefor | |
CN109191506B (en) | Depth map processing method, system and computer readable storage medium | |
JP6479666B2 (en) | Design method of passive single channel imager capable of estimating depth of field | |
CN118301474B (en) | Image forming method, system, electronic device and computer readable storage medium | |
CN110290395A (en) | A kind of image processing method, device and computer readable storage medium | |
Takemura et al. | Depth from defocus technique based on cross reblurring | |
US10176559B2 (en) | Image processing method applied to an electronic device with an image acquiring unit and electronic device | |
JP2004077914A (en) | Imaging device and its optical system | |
CN117528236B (en) | Adjustment method and device for virtual camera | |
JP2000249543A (en) | Distance measurement method using defocus | |
CN119629475A (en) | Focusing method and imaging device | |
CN119784822A (en) | A method, device and system for positioning based on a monocular fixed-focus pan-tilt camera |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||