Search Results (2)

Search Parameters:
Keywords = colored Gaussian noise, coherent sources

16 pages, 799 KiB  
Article
A-CRNN-Based Method for Coherent DOA Estimation with Unknown Source Number
by Yuanyuan Yao, Hong Lei and Wenjing He
Sensors 2020, 20(8), 2296; https://doi.org/10.3390/s20082296 - 17 Apr 2020
Cited by 29 | Viewed by 3352
Abstract
Estimating directions of arrival (DOA) without knowledge of the source number is regarded as a challenging task, particularly when coherence among sources exists. Researchers have trained deep learning (DL)-based models to tackle the problem of DOA estimation. However, existing DL-based methods for coherent sources either do not adapt to a variable source number or require signal independence. Herein, we put forward a new framework that combines parallel DOA estimators with Toeplitz matrix reconstruction to address the problem. Each estimator is constructed by connecting a multi-label classifier to a spatial filter based on convolutional-recurrent neural networks. The spatial filters divide the angle domain into several sectors, so that the subsequent classifiers can extract the arrival directions. Assisted by a Toeplitz-based method for source-number determination, spurious or missed angles reported by the estimators are reduced, and the spatial spectrum can be recovered more accurately. In addition, the proposed method is data-driven, so it is naturally immune to signal coherence. Simulation results demonstrate the superiority of the proposed method and show that the trained model is robust to imperfect conditions such as limited snapshots, colored Gaussian noise, and array imperfections.
(This article belongs to the Section Remote Sensors)
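The Toeplitz step that supplies the source number is not spelled out in the abstract. As a rough illustration of the general idea, the Python sketch below shows one classical variant of Toeplitz matrix reconstruction, in which correlations against the center sensor of a (2M+1)-element ULA are rearranged into a matrix whose signal rank survives full coherence, so the source number can be read off from the dominant singular values. The function name, the relative singular-value threshold, and the toy scenario are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def toeplitz_source_count(X, rel_threshold=0.1):
    """Sketch: estimate the number of (possibly coherent) sources from
    snapshots X of a (2M+1)-element ULA, shape (2M+1, N), reference
    (center) sensor at row M.  Correlations r(m) = <x_m, x_ref> are
    arranged into an (M+1)x(M+1) Toeplitz matrix whose signal part keeps
    rank K even for fully coherent sources; the relative threshold on the
    singular values is an illustrative choice, not the paper's rule."""
    num_sensors, N = X.shape
    M = (num_sensors - 1) // 2
    ref = X[M]                                   # center (reference) sensor
    r = X @ ref.conj() / N                       # r[m], m = 0..2M
    # T[i, j] = r(M + i - j): lags measured relative to the center sensor.
    T = np.array([[r[M + i - j] for j in range(M + 1)] for i in range(M + 1)])
    s = np.linalg.svd(T, compute_uv=False)       # singular values, descending
    return int(np.sum(s > rel_threshold * s[0]))

# Toy check: two fully coherent sources at -30 and 30 degrees, 9-element ULA.
rng = np.random.default_rng(0)
m = np.arange(9)[:, None]
A = np.exp(1j * np.pi * m * np.sin(np.deg2rad([-30.0, 30.0])))  # steering vectors
s = rng.standard_normal(200)
S = np.vstack([s, 0.8 * s])                      # same waveform -> coherent sources
noise = 0.1 * (rng.standard_normal((9, 200)) + 1j * rng.standard_normal((9, 200)))
X = A @ S + noise
print(toeplitz_source_count(X))                  # should typically print 2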
Figures:

Figure 1: Uniform plane waves received by a (2M+1)-element ULA.
Figure 2: Framework of the Toeplitz alternate convolutional-recurrent neural network (A-CRNN)-based coherent direction-of-arrival (DOA) estimation without knowing the source number.
Figure 3: Mean absolute error (MAE) response curve of the multi-label classifiers in different sectors at a signal-to-noise ratio (SNR) of 20 dB.
Figure 4: Recovered spatial spectra for coherent signals from three different angular sectors: (a) outputs of the spatial filter for signals from the 2nd, 3rd, and 6th sectors with directions (−28°, −5°, 44°); (b) outputs of the multi-label classifiers for signals from the 1st, 2nd, and 5th sectors with directions (−28°, −5°, 44°).
Figure 5: Recovered spatial spectra for coherent signals from the same angular sector: (a) outputs of the spatial filter for signals from the 2nd sector with directions (−38°, −36°, −22°); (b) outputs of the multi-label classifiers for signals from the 2nd sector with directions (−38°, −36°, −22°).
Figure 6: Source-number distribution within the same sector.
Figure 7: Receiver operating characteristic (ROC) curves of the autoencoder and the proposed A-CRNN spatial filter for independent two-source DOA estimation.
Figure 8: MAEs for direction combinations in the testing set at varying SNR levels and angle intervals: (a) angle interval = 3°; (b) angle interval = 5°; (c) angle interval = 8°; (d) angle interval = 12°.
Figure 9: DOA estimation performance under imperfect conditions with the SNR of the testing data fixed at 20 dB: (a) different numbers of snapshots; (b) sensor-gain inconsistency; (c) gain inconsistency combined with biased sensor positions; (d) coexisting gain inconsistency, position bias, and inter-sensor mutual coupling.
Figure 10: Testing of the proposed A-CRNN model on two-signal samples after training on the three-signal case: (a) the first signal; (b) the second signal.
Figure 11: Testing of the proposed A-CRNN model on four-signal samples after training on the three-signal case: (a) the first signal; (b) the second signal; (c) the third signal; (d) the fourth signal.
Figure 12: Performance of the Toeplitz A-CRNN models under colored noise after training on AWGN.
19 pages, 3928 KiB  
Article
Multiwavelength Absolute Phase Retrieval from Noisy Diffractive Patterns: Wavelength Multiplexing Algorithm
by Vladimir Katkovnik, Igor Shevkunov, Nikolay V. Petrov and Karen Eguiazarian
Appl. Sci. 2018, 8(5), 719; https://doi.org/10.3390/app8050719 - 4 May 2018
Cited by 15 | Viewed by 4195
Abstract
We study the problem of multiwavelength absolute phase retrieval from noisy diffraction patterns. The system is lensless, with multiwavelength coherent input light beams and random phase masks applied for wavefront modulation. The light beams are formed by light sources radiating all wavelengths simultaneously. A sensor equipped with a Color Filter Array (CFA) is used to register the spectral measurements. The developed algorithm, targeted at optimal phase retrieval from noisy observations, is based on the maximum likelihood technique and is specified for Poissonian and Gaussian noise distributions. One of its key elements is an original sparse modeling of the multiwavelength complex-valued wavefronts based on complex-domain block-matching 3D filtering. The presented numerical experiments are restricted to noisy Poissonian observations. They demonstrate that the developed algorithm leads to effective solutions that explicitly use sparsity for noise suppression and enable accurate reconstruction of high-dynamic-range absolute phase.
(This article belongs to the Special Issue Applications of Digital Holographic Microscopy)
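To make "noisy Poissonian observations" concrete, here is a minimal Python sketch of a photon-limited measurement model: the sensor records Poisson counts whose mean is a scaled diffraction-pattern intensity, with the scale playing the role the parameter χ plays in the figures below. That reading of χ, the plain FFT used as a stand-in for wavefront propagation, and the helper names are assumptions made for illustration only, not the paper's forward model.

```python
import numpy as np

def poisson_observation(u, chi=1.0, rng=None):
    """Illustrative photon-limited measurement of a complex wavefront u:
    the sensor records Poisson counts whose mean is chi * |u|^2.  A smaller
    chi means fewer photons per pixel and hence noisier data; treating chi
    this way is an assumption for illustration, not the paper's definition."""
    rng = np.random.default_rng(rng)
    intensity = np.abs(u) ** 2                 # noise-free diffraction intensity
    counts = rng.poisson(chi * intensity)      # photon counts at the sensor
    return counts / chi                        # back to intensity units

def observation_snr_db(clean, noisy):
    """Empirical SNR of a noisy observation, in dB."""
    return 10 * np.log10(np.sum(clean ** 2) / np.sum((noisy - clean) ** 2))

# Toy usage: a random phase object, one simulated observation per photon level.
rng = np.random.default_rng(0)
phase = rng.uniform(0, 2 * np.pi, (64, 64))
u_object = np.exp(1j * phase)
u_sensor = np.fft.fft2(u_object) / 64          # crude stand-in for propagation
for chi in (0.1, 1.0, 10.0):
    y = poisson_observation(u_sensor, chi=chi, rng=rng)
    print(chi, observation_snr_db(np.abs(u_sensor) ** 2, y))
```

In the same spirit, each color channel of the CFA would be simulated by one such call with the wavefront propagated at its own wavelength.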
Figures:

Figure 1: (a) Optical setup: λ1 = 417 nm, λ2 = 532 nm, and λ3 = 633 nm are the wavelengths of the blue, green, and red light sources; M are mirrors; BS are beam splitters; SLM stands for Spatial Light Modulator; O is the object; CMOS+CFA is the registration camera with a CFA. (b) Color filter array: BGGR CFA is a blue-green-green-red disposed CFA; Blue, Green, and Red are the separate color channels.
Figure 2: Wrapped and absolute phases of the investigated test objects (truncated Gaussian and TUT logo); the reference wavelength is λ′ = λ3.
Figure 3: RRMSE and SNR as functions of the parameter χ of the Poissonian distribution: the solid-triangle and dashed-diamond curves (blue in the color images) show the RRMSE for the truncated Gaussian and TUT logo objects, respectively (left y-axis); the solid-circle curve (orange) shows the SNR of the observations (right y-axis).
Figure 4: RRMSEs as functions of SNR and the number of experiments S (300 iterations); the left and right 3D images are for the TUT logo and the truncated Gaussian, respectively.
Figure 5: Truncated Gaussian absolute phase reconstructions: RRMSE = 0.056 (SNR = 6.3 dB, N_photon = 1.42); top row: 3D absolute phase surfaces; bottom row: the corresponding 2D absolute phase images. From left to right: the WM-APR algorithm, followed by phase reconstructions obtained separately for the λ1, λ2, and λ3 wavelengths with subsequent 2D phase unwrapping.
Figure 6: TUT logo phase reconstructions: RRMSE = 0.084 (SNR = 6.3 dB, N_photon = 1.42); top row: 3D absolute phase surfaces; bottom row: the corresponding 2D absolute phase images. From left to right: the WM-APR algorithm, followed by phase reconstructions obtained separately for the λ1, λ2, and λ3 wavelengths with subsequent 2D phase unwrapping.