
CN111854981B - Deep learning wavefront restoration method based on single-frame focal plane light intensity image

Info

Publication number: CN111854981B
Application number: CN202010660807.3A
Authority: CN (China)
Prior art keywords: far field, wavefront, unique, light intensity
Other languages: Chinese (zh)
Other versions: CN111854981A
Inventors: 孔令曦, 程涛, 邱学晶, 杨超, 王帅, 杨平
Assignee (current and original): Institute of Optics and Electronics of CAS
Application filed by: Institute of Optics and Electronics of CAS
Priority and filing date: 2020-07-10
Publication of CN111854981A: 2020-10-30
Application granted; publication of CN111854981B: 2022-09-20
Legal status: Active

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J 9/00 Measuring optical phase difference; Determining degree of coherence; Measuring optical wavelength
    • G01J 2009/002 Wavefront phase distribution
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks
    • G06N 3/08 Learning methods
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/77 Retouching; Inpainting; Scratch removal
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20081 Training; Learning
    • G06T 2207/20084 Artificial neural networks [ANN]

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Biomedical Technology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Image Analysis (AREA)
  • Photometry And Measurement Of Optical Pulse Characteristics (AREA)

Abstract

The invention discloses a deep learning wavefront restoration method based on a single-frame focal plane light intensity image. In an adaptive optics system, two wavefronts related by a 180°-rotation complex-conjugate relationship produce the same far-field spot distribution, so inverting the near-field wavefront from a single far-field spot has multiple solutions. A wavefront restoration method based on Walsh-function phase modulation can guarantee that a far-field spot distribution corresponds to a unique near-field wavefront, but its speed is still limited by the number of iterations and the computation time of each iteration. A deep learning algorithm can extract deep image features by itself and can therefore learn, on top of the Walsh-function phase modulation, the mapping from the far-field light intensity image to the near-field wavefront; this is an end-to-end computation from far-field image to near-field wavefront that avoids the iterative computation of traditional wavefront restoration methods. On this basis, the invention uses a deep learning algorithm to bypass the iterative computation of traditional wavefront restoration, improve computational efficiency, and realize fast wavefront restoration from a single frame of focal plane light intensity image.

Description

A deep learning wavefront restoration method based on a single-frame focal plane light intensity image

Technical Field

The invention relates to a wavefront restoration method, and in particular to a deep learning wavefront restoration method based on a single-frame focal plane light intensity image.

Background Art

In adaptive optics systems, Zernike polynomials are commonly used to represent the wavefront over a circular domain, and different sets of Zernike coefficients generate different random wavefront aberrations. However, when two wavefronts are related by a 180°-rotation complex-conjugate relationship, they produce the same far-field spot distribution; in this case, inverting the near-field wavefront phase from a single far-field spot with a traditional wavefront restoration algorithm does not yield a unique wavefront, so the restoration converges to a wrong solution or fails to converge at all. The wavefront restoration algorithm based on Walsh-function phase modulation can break the spatial symmetry of the near-field wavefront with fewer iterations than traditional algorithms, so that a far-field spot distribution corresponds to a unique near-field wavefront, i.e., to a unique set of Zernike coefficients (see Kong Qingfeng, Research on wavefront phase inversion method based on single-frame focal plane image [D], University of Electronic Science and Technology of China, 2019). The Walsh-function-modulated wavefront restoration method has many advantages: it exploits the asymmetry of the function to resolve the multi-solution problem caused by the 180°-rotation complex-conjugate relationship of near-field wavefronts, and at the same time it reduces the number of iterations and hence the running time compared with traditional algorithms.
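This ambiguity is easy to reproduce numerically. The following minimal sketch (grid size, Zernike terms, and coefficient values are illustrative assumptions, not values from the disclosure) builds an aberration from a few low-order Zernike-type terms over a circular pupil, forms its 180°-rotation complex-conjugate counterpart φ′(x, y) = −φ(−x, −y), and compares the two focal-plane intensities obtained by Fourier transform:

```python
import numpy as np

# Pupil-plane grid over the unit circle (size is an illustrative choice)
N = 256
x = np.linspace(-1, 1, N)
X, Y = np.meshgrid(x, x)
pupil = (X**2 + Y**2) <= 1.0
r2 = X**2 + Y**2

# Wavefront built from a few low-order Zernike-type terms
# (defocus, astigmatism, coma) with arbitrarily chosen coefficients.
phi = 1.5 * (2*r2 - 1) + 0.8 * (X**2 - Y**2) + 0.5 * (3*r2 - 2) * X

# Its 180-degree-rotation complex-conjugate counterpart: phi'(x, y) = -phi(-x, -y)
phi_conj = -np.rot90(phi, 2)

def focal_plane_intensity(phase):
    """Far-field (focal-plane) intensity of the pupil field exp(i*phase)."""
    field = pupil * np.exp(1j * phase)
    return np.abs(np.fft.fftshift(np.fft.fft2(field)))**2

I_pos = focal_plane_intensity(phi)
I_neg = focal_plane_intensity(phi_conj)
print(np.allclose(I_pos, I_neg))   # True: both wavefronts give the same focal-plane spot
```

The two focal-plane spots agree to machine precision, which is precisely why a single unmodulated focal-plane image cannot be inverted to a unique near-field wavefront.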

However, the effectiveness of Walsh-function phase-plate modulation of the far field for wavefront restoration depends not only on choosing a correct Walsh-function shape without 180°-rotation-flip symmetry; the choice of the Walsh-function order and of the phase-step depth of the phase plate also affects the restoration accuracy. Moreover, the improved algorithm still follows the iterative solution of the traditional phase-retrieval method: although the number of iterations is reduced, it remains lacking in speed. Therefore, how to improve computational efficiency while ensuring that the far-field spot distribution corresponds to a unique near-field wavefront is a problem that urgently needs to be solved.

Summary of the Invention

The technical problem to be solved by the present invention is: to further increase the computation speed while guaranteeing a unique and accurate near-field wavefront phase solution when inverting from a far-field spot.

The technical solution adopted by the present invention to solve the above problem is: a deep learning wavefront restoration method based on a single-frame focal plane light intensity image, which uses a diverse sample data set to learn the mapping between the far-field spot distribution and the near-field wavefront. After the network training converges, inputting a far-field spot image yields its unique corresponding wavefront aberration; this mapping is evaluated without iterative computation, which reduces the solution time. The specific implementation steps are as follows:

Step 1: design a wavefront sensor based on Walsh-function phase modulation;

Step 2: verify whether the sensor designed in Step 1 can ensure that a far-field spot distribution corresponds to only one wavefront, i.e., that the solution is unique;

Step 3: if the solution is unique, obtain one-to-one paired far-field spot and near-field wavefront data according to the design of Step 1 and use them as the data set; if the solution is not unique, repeat Step 1 and redesign the sensor until the solution is unique;

Step 4: configure the deep learning environment and build the learning network;

Step 5: use 90% of the data set as the training set so that the network learns the correspondence between far-field spots and near-field wavefronts, and use the remaining 10% of the samples as the validation set to tune and verify the network.

The Walsh-function shape should be chosen such that its modulation phase is not 180°-rotation-flip symmetric.
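A candidate phase-plate pattern can be screened for this property numerically. The quadrant step pattern and π/2 depth below are illustrative assumptions; the test simply checks that the pattern is not mapped onto its own negative by a 180° rotation, which is the requirement made precise in relation (5) below:

```python
import numpy as np

N = 256
x = np.linspace(-1, 1, N)
X, Y = np.meshgrid(x, x)

# Candidate Walsh-type phase pattern: pi/2 step on two diagonal quadrants (illustrative).
candidate = (np.pi / 2) * np.where((X > 0) ^ (Y > 0), 1.0, 0.0)

# Usable if it is NOT 180-degree-rotation-flip symmetric, i.e. phi_w(x,y) != -phi_w(-x,-y).
print(not np.allclose(candidate, -np.rot90(candidate, 2)))   # True -> acceptable shape
```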

The theoretical basis for judging whether a unique corresponding far-field spot distribution is obtained after the near-field wavefront phase is modulated by the Walsh function is as follows:

For a pair of wavefronts φ(x, y) and φ′(x, y) that are 180°-rotation-flip symmetric to each other (complex conjugates rotated by 180°), their phases satisfy:

φ′(x, y) = −φ(−x, −y)    (1)

That is, ignoring constant coefficients, the complex amplitudes of the far fields corresponding to the two wavefronts can be written as:

U_far(x₀, y₀) = ∬ exp[iφ(x, y)]·exp[−i2π(ux + vy)] dx dy    (2)

U′_far(x₀, y₀) = ∬ exp[iφ′(x, y)]·exp[−i2π(ux + vy)] dx dy    (3)

where (x, y) and (x₀, y₀) are the two-dimensional Cartesian coordinates of the near field and the far field, respectively, and (u, v) are frequency-domain coordinates. Substituting relation (1) into (3) shows that U_far(x₀, y₀) and U′_far(x₀, y₀) have equal real parts and imaginary parts of opposite sign, so the corresponding light intensity distributions satisfy:

|U′_far(x₀, y₀)|² = |U_far(x₀, y₀)|²    (4)

This shows that the wavefronts φ(x, y) and φ′(x, y) correspond to the same far-field light intensity distribution.

Among the Walsh functions there exist shapes that are not 180°-rotation-flip symmetric, and this property is exploited to modulate the wavefront. Let the discrete aberration added by the phase plate be φ_w(x, y), with

φ_w(x, y) ≠ −φ_w(−x, −y)    (5)

Then

φ(x, y) + φ_w(x, y) ≠ −[φ(−x, −y) + φ_w(−x, −y)]    (6)

which in turn guarantees

|U′_far(x₀, y₀)|² ≠ |U_far(x₀, y₀)|²    (7)

where U_far and U′_far now denote the far fields of the modulated wavefronts φ(x, y) + φ_w(x, y) and φ′(x, y) + φ_w(x, y). That is, a one-to-one correspondence between far-field spots and near-field wavefronts is ensured, which guarantees the uniqueness of the wavefront solution inverted from the far field.

Therefore, to judge whether the solution is unique, it is only necessary to compare the light intensity distributions |U′_far(x₀, y₀)|² and |U_far(x₀, y₀)|² corresponding to near-field wavefronts that are 180°-rotation-flip symmetric to each other: if they are the same, the solution is not unique; if they are different, it is unique.
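Relations (4) and (7) can be checked numerically with a short sketch. The quadrant-type binary phase pattern and the π/2 step depth used below are illustrative assumptions chosen only to satisfy condition (5); they are not necessarily the W3 shape and step depth adopted in the embodiment:

```python
import numpy as np

# Pupil-plane grid (illustrative size)
N = 256
x = np.linspace(-1, 1, N)
X, Y = np.meshgrid(x, x)
pupil = (X**2 + Y**2) <= 1.0

# Example aberration (defocus + astigmatism) and its 180-degree-rotated conjugate
phi = 1.5 * (2*(X**2 + Y**2) - 1) + 0.8 * (X**2 - Y**2)
phi_conj = -np.rot90(phi, 2)                      # phi'(x, y) = -phi(-x, -y)

# Walsh-like binary phase plate: a quadrant step pattern that satisfies condition (5),
# i.e. phi_w(x, y) != -phi_w(-x, -y) wherever the step is applied.
# The shape and the pi/2 step depth are illustrative assumptions.
phi_w = (np.pi / 2) * np.where((X > 0) ^ (Y > 0), 1.0, 0.0)

def intensity(phase):
    """Far-field intensity of a unit-amplitude field with the given pupil phase."""
    field = pupil * np.exp(1j * phase)
    return np.abs(np.fft.fft2(field))**2

# Without modulation the two spots are identical, as in (4)
print(np.allclose(intensity(phi), intensity(phi_conj)))                   # True

# With the Walsh-like modulation they differ, as in (7)
print(np.allclose(intensity(phi + phi_w), intensity(phi_conj + phi_w)))   # False
```

Without the phase plate the two intensity patterns coincide, as in (4); with it they differ, as in (7), so the modulated focal-plane spot determines the near-field wavefront uniquely.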

The data set should contain more than ten thousand sample pairs, ensuring that the data samples are both sufficient and diverse.
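As an illustration of how such a paired data set might be assembled in simulation, the sketch below draws random Zernike-type coefficient vectors, applies a Walsh-like quadrant phase step, and stores each normalized focal-plane intensity together with its coefficients; the grid size, number of modes, coefficient range, and file name are all illustrative assumptions:

```python
import numpy as np

N, N_SAMPLES, N_MODES = 128, 10000, 3     # grid size, sample count, modelled modes (illustrative)
x = np.linspace(-1, 1, N)
X, Y = np.meshgrid(x, x)
pupil = (X**2 + Y**2) <= 1.0
r2 = X**2 + Y**2

# A small example basis: defocus and the two astigmatism terms.
basis = np.stack([2*r2 - 1, X**2 - Y**2, 2*X*Y])

# Walsh-like quadrant phase step applied to every sample (illustrative shape/depth).
phi_w = (np.pi / 2) * np.where((X > 0) ^ (Y > 0), 1.0, 0.0)

rng = np.random.default_rng(0)
coeffs = rng.uniform(-1.0, 1.0, size=(N_SAMPLES, N_MODES)).astype(np.float32)
images = np.empty((N_SAMPLES, N, N), dtype=np.float32)   # ~650 MB at these settings

for i, c in enumerate(coeffs):
    phase = np.tensordot(c, basis, axes=1) + phi_w        # modulated near-field phase
    field = pupil * np.exp(1j * phase)
    spot = np.abs(np.fft.fftshift(np.fft.fft2(field)))**2
    images[i] = spot / spot.max()                         # normalized focal-plane spot

# One-to-one pairs of far-field images and near-field (Zernike) coefficients.
np.savez("walsh_modulated_dataset.npz", images=images, coeffs=coeffs)
```

In practice the sampling of the focal plane (padding, pixel scale, noise) would be matched to the actual sensor; the point here is only the one-to-one pairing required by Steps 3 and 4.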

The learning network may be a convolutional neural network (CNN) or another suitable neural network.

The advantages of the present invention over the prior art are:

(1) Compared with traditional wavefront sensor technology, the wavefront sensor of the present invention has a simple structure and high light-energy utilization, and it overcomes the multi-solution problem of the traditional single-frame intensity phase-retrieval algorithm;

(2) Compared with the wavefront restoration method improved by Walsh-function phase modulation, restoring the wavefront phase directly from the mapping between the far-field spot distribution and the near-field wavefront avoids the iterative process and improves computational efficiency.

Brief Description of the Drawings

Fig. 1 is a flowchart of the deep learning wavefront restoration method based on a single-frame focal plane light intensity image according to the present invention;

Fig. 2 is a schematic diagram of the wavefront sensor based on W3 (third-order Walsh function) phase-plate modulation;

Fig. 3 shows a positive-defocus wavefront without W3 phase-plate modulation and the corresponding far-field spot;

Fig. 4 shows a negative-defocus wavefront without W3 phase-plate modulation and the corresponding far-field spot;

Fig. 5 shows the far-field spots of the positive-defocus and negative-defocus wavefronts under W3 phase-plate modulation;

Fig. 6 is a schematic diagram of wavefront restoration learned by the Xception convolutional neural network adopted in the present invention.

Detailed Description of the Embodiments

To make the objectives, technical solutions, and advantages of the present invention clearer, the present invention is described in further detail below with reference to specific embodiments and the accompanying drawings.

Fig. 1 is a flowchart of the deep learning wavefront restoration method based on a single-frame focal plane light intensity image; the specific implementation process is:

Step 1: design a wavefront sensor based on Walsh-function phase modulation. After comparing simulation results, the W3 function is selected as the phase plate; Fig. 2 shows the schematic diagram of the wavefront sensor based on W3 phase-plate modulation;

Step 2: using positive and negative defocus wavefronts represented by Zernike polynomials, verify whether the sensor designed in Step 1 guarantees that a far-field spot distribution corresponds to only one wavefront, i.e., that the solution is unique. Figs. 3 and 4 show, respectively, the positive and negative defocus wavefronts without W3 phase-plate modulation and the corresponding far-field spots; Fig. 5 shows the far-field spots of the positive and negative defocus wavefronts under W3 phase-plate modulation, with a fixed phase-step depth applied by the phase plate;

Step 3: in this experimental design, the far-field spot distributions corresponding to the positive and negative defocus wavefronts before W3 modulation (Figs. 3 and 4) are identical, whereas the comparison of the two far-field images in Fig. 5 shows that, after W3 phase-plate modulation, the far-field spot distributions of the positive and negative defocus wavefronts are no longer the same, indicating that the wavefront sensor designed in Step 1 satisfies the uniqueness property; therefore, 10000 pairs of near-field wavefronts and their corresponding far-field spots are collected from the simulation as the data set;

Step 4: configure the deep learning environment and build the Xception convolutional neural network;

Step 5: use 9000 sample pairs of the data set as the training set so that the network learns the correspondence between far-field spots and near-field wavefronts, and use the remaining 1000 sample pairs as the validation set to tune and verify the network.
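A possible realization of Steps 4 and 5 with the Xception backbone is sketched below in Keras. The 9000/1000 split follows the embodiment, while the input size, regression head, optimizer, loss, epoch count, and the dataset file name (taken from the generation sketch above) are assumptions not fixed by the disclosure:

```python
import numpy as np
import tensorflow as tf

# Paired far-field images and Zernike coefficients (file/keys follow the earlier sketch).
data = np.load("walsh_modulated_dataset.npz")
images = data["images"][..., np.newaxis]      # (n_samples, 128, 128, 1) single-channel spots
coeffs = data["coeffs"]                        # (n_samples, n_modes) Zernike coefficients

# 9000 training samples, 1000 validation samples, as in Step 5.
x_train, x_val = images[:9000], images[9000:]
y_train, y_val = coeffs[:9000], coeffs[9000:]

# Xception backbone trained from scratch, with a linear regression head
# that outputs the Zernike coefficient vector.
backbone = tf.keras.applications.Xception(
    include_top=False, weights=None, input_shape=images.shape[1:], pooling="avg")
outputs = tf.keras.layers.Dense(coeffs.shape[1])(backbone.output)
model = tf.keras.Model(backbone.input, outputs)

model.compile(optimizer="adam", loss="mse")
model.fit(x_train, y_train, validation_data=(x_val, y_val), epochs=50, batch_size=32)

# After convergence, a single focal-plane spot maps directly to wavefront coefficients.
predicted = model.predict(x_val[:1])
```

The network choice is not essential to the method; per the description, any CNN that can learn the spot-to-wavefront mapping could replace the Xception backbone.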

After the network training has converged, inputting a far-field spot distribution image to the network directly yields the corresponding near-field wavefront information; Fig. 6 is a schematic diagram of wavefront restoration learned by the Xception convolutional neural network adopted in the present invention. This process no longer involves iterative computation, which greatly increases the computation speed; the computation time of this process is currently expected to be at the millisecond level.

The above is only a specific embodiment of the present invention, but the protection scope of the present invention is not limited thereto; any transformation or replacement conceivable by a person skilled in the art within the technical scope disclosed by the present invention shall fall within the scope of the present invention.

Claims (1)

1. A deep learning wavefront restoration method based on a single-frame focal plane light intensity image, characterized by comprising the following steps:
step 1: designing a wavefront sensor based on Walsh-function phase modulation;
step 2: verifying whether the wavefront sensor designed in step 1 can ensure that a far-field spot distribution corresponds to only one wavefront, i.e., that the solution is unique;
step 3: if the solution is unique, obtaining far-field spots and near-field wavefront data that correspond to each other one-to-one according to the design of step 1 and using them as a data set; if the solution is not unique, repeating step 1 to design the sensor again until the solution is unique;
step 4: configuring a deep learning environment and building a learning network;
step 5: taking 90% of the data set as a training set so that the network learns the correspondence between far-field spots and near-field wavefronts, taking the remaining 10% of the samples as a validation set, and adjusting and verifying the correctness of the network;
wherein the shape of the Walsh function in step 1 is selected such that its modulation phase is not 180°-rotation-flip symmetric;
wherein the theoretical basis for judging, in step 2, whether a unique corresponding far-field spot distribution is obtained after the near-field wavefront phase is modulated by the Walsh function is as follows:
for a pair of wavefronts φ(x, y) and φ′(x, y) that are 180°-rotation-flip symmetric to each other, their phases satisfy:
φ′(x, y) = −φ(−x, −y)    (1)
that is, ignoring coefficients, the complex amplitudes of the far fields corresponding to the two wavefronts can be expressed as:
U_far(x₀, y₀) = ∬ exp[iφ(x, y)]·exp[−i2π(ux + vy)] dx dy    (2)
U′_far(x₀, y₀) = ∬ exp[iφ′(x, y)]·exp[−i2π(ux + vy)] dx dy    (3)
wherein (x, y) and (x₀, y₀) are the two-dimensional Cartesian coordinates of the near field and the far field, respectively, and (u, v) are frequency-domain coordinates; U_far(x₀, y₀) and U′_far(x₀, y₀) have equal real parts and imaginary parts of opposite sign, so the corresponding light intensity distributions satisfy:
|U′_far(x₀, y₀)|² = |U_far(x₀, y₀)|²    (4)
this shows that the wavefronts φ(x, y) and φ′(x, y) correspond to the same far-field light intensity distribution;
the Walsh functions include shapes that are not 180°-rotation-flip symmetric, and this property is used to modulate the wavefront; let the discrete aberration added by the phase plate be φ_w(x, y), with:
φ_w(x, y) ≠ −φ_w(−x, −y)    (5)
then,
φ(x, y) + φ_w(x, y) ≠ −[φ(−x, −y) + φ_w(−x, −y)]    (6)
thereby ensuring that
|U′_far(x₀, y₀)|² ≠ |U_far(x₀, y₀)|²    (7)
so that the one-to-one correspondence between far-field spots and near-field wavefronts is ensured, that is, the uniqueness of the wavefront solution inverted from the far field is ensured;
therefore, to judge whether the solution is unique, it is only necessary to compare the light intensity distributions |U′_far(x₀, y₀)|² and |U_far(x₀, y₀)|² corresponding to near-field wavefronts that are 180°-rotation-flip symmetric to each other: if they are the same, the solution is not unique; if they are different, the solution is unique;
the number of samples in the data set in step 3 is more than ten thousand, ensuring sufficient data samples while satisfying data diversity;
the learning network in step 4 is a convolutional neural network (CNN).
CN202010660807.3A 2020-07-10 2020-07-10 Deep learning wavefront restoration method based on single-frame focal plane light intensity image Active CN111854981B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010660807.3A CN111854981B (en) 2020-07-10 2020-07-10 Deep learning wavefront restoration method based on single-frame focal plane light intensity image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010660807.3A CN111854981B (en) 2020-07-10 2020-07-10 Deep learning wavefront restoration method based on single-frame focal plane light intensity image

Publications (2)

Publication Number Publication Date
CN111854981A CN111854981A (en) 2020-10-30
CN111854981B (en) 2022-09-20

Family

ID=73152080

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010660807.3A Active CN111854981B (en) 2020-07-10 2020-07-10 Deep learning wavefront restoration method based on single-frame focal plane light intensity image

Country Status (1)

Country Link
CN (1) CN111854981B (en)


Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102009027165A1 (en) * 2009-06-24 2011-01-20 Deutsch-Französisches Forschungsinstitut Saint-Louis Method and device for displaying a scanning function
US10477097B2 (en) * 2017-01-03 2019-11-12 University Of Connecticut Single-frame autofocusing using multi-LED illumination
CA2953984A1 (en) * 2017-01-09 2018-07-09 Oz Optics Ltd. Flexible low-cost mm-wave sfcw radar based imaging inspection system

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4933872A (en) * 1988-11-15 1990-06-12 Eastman Kodak Company Method and system for wavefront reconstruction
WO2007116365A2 (en) * 2006-04-07 2007-10-18 Ecole Polytechnique Federale De Lausanne (Epfl) Method and apparatus to measure and compute the amplitude point spread function and associated parameters of a coherent optical imaging system
CN101206763A (en) * 2007-11-16 2008-06-25 中国科学院光电技术研究所 High-Resolution Restoration Method for Multi-Frame Adaptive Optics Image Using Wavefront Data
CN104215339A (en) * 2014-09-14 2014-12-17 中国科学院光电技术研究所 Wavefront restoration system and method based on continuous far field
CN104596651A (en) * 2015-02-02 2015-05-06 中国科学院光电技术研究所 Phase inversion method based on four-quadrant binary phase modulation
CN106324854A (en) * 2016-10-27 2017-01-11 中国科学院光电技术研究所 Phase inversion method based on binary square diffraction element
CN106646867A (en) * 2016-12-16 2017-05-10 中国科学院光电研究院 Deep UV optical system confocal alignment device and method
CN109031654A (en) * 2018-09-11 2018-12-18 安徽农业大学 A kind of adaptive optics bearing calibration and system based on convolutional neural networks
CN110044498A (en) * 2019-04-18 2019-07-23 中国科学院光电技术研究所 A kind of Hartmann wave front sensor modal wavefront reconstruction method based on deep learning
CN111272299A (en) * 2020-01-22 2020-06-12 浙江大学 A Shack-Hartmann Wavefront Detector Based on Deep Learning

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Phase-retrieval stagnation problems and solutions; Fienup, J. R. et al.; Journal of the Optical Society of America A: Optics, Image Science, and Vision; 1986; No. 3; full text *
Wavefront sensing by phase retrieval; Gonsalves, R. A.; Proc. SPIE; 1979; No. 207; full text *
A wavefront restoration method based on deep learning of light intensity images; Ma Huimin; Laser & Optoelectronics Progress; 30 April 2020; Vol. 57, No. 8; pp. 081103-1 to 081103-10 *
Research on wavefront phase inversion method based on single-frame focal plane image; Kong Qingfeng; China Doctoral Dissertations Full-text Database, Basic Science Series; 15 April 2020; No. 4; p. A005-12 *

Also Published As

Publication number Publication date
CN111854981A (en) 2020-10-30


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant