Few-Shot User-Adaptable Radar-Based Breath Signal Sensing
"> Figure 1
<p>For each learning episode, a training subject is randomly sampled. For each training shot, the radar phase information is mapped to the reference belt signal (ref.) via a C-VAE. Through a dense layer, the ANN also tries to regress the extracted respiration <math display="inline"><semantics> <mrow> <mi>F</mi> <mi>c</mi> </mrow> </semantics></math>, learning from the ideal belt <math display="inline"><semantics> <mrow> <mi>F</mi> <mi>c</mi> </mrow> </semantics></math>. The latent space mapping is thus constrained to the <math display="inline"><semantics> <mrow> <mi>F</mi> <mi>c</mi> </mrow> </semantics></math>, whose estimate is also used in the prediction phase.</p> "> Figure 2
<p>The diagram shows the main steps of the implementation. For a chosen scenario (room and user), several data sessions with synchronized radar and a reference respiration belt are collected. For multi-output ANN, the labels consist of belt reference signals and the central breath frequencies, estimated from the pure belt reference. The data from fourteen users are then used to train an ANN episodically using Meta-L, while the data from the remaining ten users are solely used for testing.</p> "> Figure 3
<p>The <span class="html-italic">BGT60TR13</span> radar system (<b>a</b>) delivers filtered, mixed, and digitized information from each Rx channel. The <span class="html-italic">BGT60TR13C</span> radar (<b>b</b>) is mounted on top of the evaluation board.</p> "> Figure 4
<p>Recording Setup. A synchronized radar system and respiration belt are used to collect 10 30-second sessions per user and distance. The distance ranges used in data collection (up to 30 or 40 cm), refer to the distance between the chest and the radar board.</p> "> Figure 5
<p>Preprocessing pipeline. First, the phase information is unwrapped from the raw radar data. The respiration signal and <math display="inline"><semantics> <mrow> <mi>F</mi> <mi>c</mi> </mrow> </semantics></math> are then estimated by Meta-L, exploiting only in the training phase the data collected with the respiration belt.</p> "> Figure 6
<p>Lines in yellow indicate the defined range bin limits and, in red, the detected maximum bin per frame. Range plotting is generated after clutter removal. In (<b>a</b>), the subject did not move much during the session. In (<b>b</b>), the range limits vary according to the user’s distance from the radar board.</p> "> Figure 7
<p>Band-pass bi-quadratic filter. The diagram (<b>a</b>) depicts the linear flow of the biquad filter, where the output <math display="inline"><semantics> <mrow> <mi>O</mi> <mo>(</mo> <mi>n</mi> <mo>)</mo> </mrow> </semantics></math> at time instant <span class="html-italic">n</span> is determined by the two previous input <span class="html-italic">I</span> and output <span class="html-italic">O</span> values. Instead, a gain vs. frequency plot of a biquad band-pass filter obtained for a <span class="html-italic">Q</span> of <math display="inline"><semantics> <msqrt> <mn>2</mn> </msqrt> </semantics></math> and <math display="inline"><semantics> <mrow> <mi>f</mi> <mi>s</mi> </mrow> </semantics></math> of 20, over an <math display="inline"><semantics> <mrow> <mi>F</mi> <mi>c</mi> </mrow> </semantics></math> of 0.33 Hz, is shown as a reference in (<b>b</b>).</p> "> Figure 8
<p>Example of sliding window generation for instant bpm estimation on a recorded session The radar signal has been filtered using the ideal belt, <math display="inline"><semantics> <mrow> <mi>F</mi> <mi>c</mi> </mrow> </semantics></math>. The radar, as opposed to the belt, is not connected to the user during recordings, but to the desk. This results in the local shift of signal breathing peaks due to the millimeter movements of the user. The window (in purple in the plot) is shown paler on the two peaks closest to the calculated peaks’ mean distance. It is also possible to notice some slight corruption at the beginning of the session due to user motion.</p> "> Figure 9
<p>Comparison of instantaneous bpm between respiration belt and radar (with ideal <math display="inline"><semantics> <mrow> <mi>F</mi> <mi>c</mi> </mrow> </semantics></math>) for a recording session. The x-axis corresponds to the difference between the number of frames in the session and the sliding window length. The radar signal corruption flag variable is plotted in green. At the beginning of the session, the radar signal is motion-corrupted (as shown in <a href="#sensors-23-00804-f008" class="html-fig">Figure 8</a>) and thus does not lead to a reliable bpm. On the other hand, for the workplace use case, the reference belt signal is more robust to motion. In this case, the motion performed was the movement of the hands toward the desk.</p> "> Figure 10
<p>Two-component t-SNE representation of the <span class="html-italic">Breath Meta-Dataset</span> radar data. The circles represent the training users, while the crosses represent the testing users for the Meta-L. No user-specific feature clusters are visible under the t-SNE assumptions. The t-SNE was obtained with a perplexity of 20 and 7000 iterations [<a href="#B42-sensors-23-00804" class="html-bibr">42</a>].</p> "> Figure 11
<p>Graphical representation of single-episode learning with C-VAE. The unwrapped radar phase is mapped to the respiration belt signal using the signal reconstruction term. The regularization term makes the latent space closer to a standard multivariate normal distribution. <math display="inline"><semantics> <mrow> <mi>F</mi> <mi>c</mi> </mrow> </semantics></math> regression allows the parameterization to depend on the respiration signal.</p> "> Figure 12
<p>Chosen C-VAE topology. The latent space representation is constrained by both the reconstruction of <span class="html-italic">x</span> with respect to the <math display="inline"><semantics> <msub> <mi>x</mi> <mrow> <mi>b</mi> <mi>e</mi> <mi>l</mi> <mi>t</mi> </mrow> </msub> </semantics></math> reference and the ideal <math display="inline"><semantics> <mrow> <mi>F</mi> <mi>c</mi> </mrow> </semantics></math> of breathing <span class="html-italic">y</span>. The decoder layers are an up-sampled mirror version of the encoder layers.</p> "> Figure 13
<p>Examples of latent space generation. Examples of radar phase input (<b>a</b>) and generated latent spaces (<b>b</b>), size 32, are shown. The latent spaces are obtained after the model generalization training. Each <math display="inline"><semantics> <mrow> <mn>8</mn> <mspace width="4pt"/> <mi>x</mi> <mspace width="4pt"/> <mn>8</mn> </mrow> </semantics></math> representation consists of the mean values <math display="inline"><semantics> <mi>μ</mi> </semantics></math> and the standard deviations <math display="inline"><semantics> <mi>σ</mi> </semantics></math>. Starting from the top of the representations toward the right, the first 32 pixels represent <math display="inline"><semantics> <mi>μ</mi> </semantics></math> values, while the last 32 are those of <math display="inline"><semantics> <mi>σ</mi> </semantics></math>.</p> "> Figure 14
<p>MAML <math display="inline"><semantics> <mrow> <mspace width="4pt"/> <msup> <mn>2</mn> <mrow> <mi>n</mi> <mi>d</mi> </mrow> </msup> </mrow> </semantics></math> 1–shot experiment, Box Plots. Learning trends of Meta-L, box plots versus episodes (evaluation loop) for the <span class="html-italic">Breath Meta-Dataset</span>. The box in (<b>a</b>) depicts the trend for users in the training set (<math display="inline"><semantics> <msub> <mi mathvariant="script">T</mi> <mi>r</mi> </msub> </semantics></math> tasks). In (<b>b</b>), the trend for the users of the test set (<math display="inline"><semantics> <msub> <mi mathvariant="script">T</mi> <mi>v</mi> </msub> </semantics></math> tasks)is shown. The box’s mid-line represents the median value, while the little green triangle represents the mean.</p> "> Figure 15
<p>MAML <math display="inline"><semantics> <mrow> <mspace width="4pt"/> <msup> <mn>2</mn> <mrow> <mi>n</mi> <mi>d</mi> </mrow> </msup> </mrow> </semantics></math> 1–shot experiment histograms for the first (<b>a</b>) and last (<b>b</b>) set of 300 episodes. The box plots in the topmost plots also contain outliers as small circles outside the whiskers. The mid-plots show an approximation to the Gaussian distribution. The lower plots show the true histograms, which do not underlie a Gaussian distribution. The q1 and q3 represent the first and third quartiles, respectively.</p> "> Figure 16
<p>Loss (<math display="inline"><semantics> <msup> <mi>L</mi> <mo>*</mo> </msup> </semantics></math> ) as a function of the number of detected breathing spikes over the 30 s sessions for the 10 test users. The base of the box plots with non-uniform ranges was chosen so as to have at least 4 examples for the least common classes (1–4 and 12–14). The upper plot is obtained by fitting the 1–shot Meta-L model (<b>a</b>) to new users, while the middle and lower plots are obtained by 5– (<b>b</b>) and 10– (<b>c</b>) shots adaptation, respectively. For the first two plots, the circles that lie outside the box plots whiskers represent the outliers. Plot (<b>c</b>) shows no visible outliers.</p> "> Figure 17
<p>Standard prediction examples obtained post 1–shot test user-adaptation with MAML <math display="inline"><semantics> <mrow> <mspace width="4pt"/> <msup> <mn>2</mn> <mrow> <mi>n</mi> <mi>d</mi> </mrow> </msup> </mrow> </semantics></math>. The top plots show the prediction <math display="inline"><semantics> <msup> <mover accent="true"> <mi>x</mi> <mo>^</mo> </mover> <mo>*</mo> </msup> </semantics></math> versus the respiration belt reference, while the bottom plots display the estimated bpm and corruption flag. Legends, which also apply to the plots on the right, are placed in the plots on the left. An example of optimal prediction with radar information characterized by little motion corruption is shown in (<b>a</b>). The respiration signal is recovered even in the presence of some corruption, as in (<b>b</b>), thanks to the <math display="inline"><semantics> <msup> <mi>L</mi> <mo>*</mo> </msup> </semantics></math> formulation.</p> "> Figure 18
<p>Edge prediction examples obtained post 1–shot test user-adaptation with MAML <math display="inline"><semantics> <mrow> <mspace width="4pt"/> <msup> <mn>2</mn> <mrow> <mi>n</mi> <mi>d</mi> </mrow> </msup> </mrow> </semantics></math>. The top plots show the prediction <math display="inline"><semantics> <msup> <mover accent="true"> <mi>x</mi> <mo>^</mo> </mover> <mo>*</mo> </msup> </semantics></math> versus the respiration belt reference, while the bottom plots display the estimated bpm and corruption flag. Legends, which also apply to the plots on the right, are placed in the plots on the left. In (<b>a</b>), there are six visible peaks in the belt signal (blue), while in (<b>b</b>) there are thirteen peaks. In these examples, the algorithm performs less well than in standard cases. This is mainly due to the lack of edge data as prior knowledge during episodic learning. In the bpm estimation in the example (<b>a</b>), a shorter estimate can be seen for the belt than for radar. This is due to the computation of two distinct windows between radar and belt, as explained in <a href="#sec3dot6-sensors-23-00804" class="html-sec">Section 3.6</a>.</p> ">
Abstract
1. Introduction
1. Implementation, to the best of our knowledge, of the first few-shot user-adaptable radar-based breath signal sensing solution.
2. Development of a specialized radar data preprocessing pipeline that dynamically tracks the user’s position relative to the board.
3. Design of a cost function that constrains the generation of the latent space of a C-VAE to the respiration Fc in a multi-output ANN.
4. Development of a corruption-based sample weighting approach that guides the breathing signal estimation in the presence of user motion.
2. Related Works
3. System Description and Implementation
3.1. General Overview of the Proposed Framework
3.2. Radar Board and Configuration
3.3. Recording Setup
3.4. Radar Phase Signal Extraction
- Raw radar and respiration belt data are collected synchronously for a session. The chosen number of frames per session is 660, which is 10% higher than the theoretical count of 600 (20 fps × 30 s). Longer sessions from either sensor are interpolated, whereas shorter ones are zero-padded. The belt signal is used as the reference in the Meta-L training phase. Subsequent preprocessing steps involve the radar signal only.
- The IF signal is computed channel-wise, for the three Rx, for each radar frame. The information is organized in a 3D matrix, with the x-axis representing fast time (samples), the y-axis representing slow time (chirps), and the z-axis representing channels.
- The average value is subtracted from the sequence of 660 frames so that any potential direct current (DC) offset is removed.
- Over slow time and channels, the radar-sensed information derives from the same recorded event. Rather than using a single channel or single chirp, we therefore average over both axes for the next steps. Since the chirps and their respective channels carry equally relevant information, the averaged signal is intrinsically more robust to noise.
- A 1D FFT is performed along fast-time to retrieve the range information.
- From the range information, it is possible to estimate the user’s position frame-wise, select the set of meaningful range bins, and subtract the clutter in each (Section 3.5).
- The phase information is calculated for the selected bins. Frame-wise, only the range bin with the highest mean squared error (MSE) with respect to the estimated clutter is chosen (Section 3.5).
- The phase beyond ±π is then unwrapped using a phase discontinuity threshold approach.
- Because users had freedom of movement during the recordings, the Fc estimated from the radar phase by frequency analysis may not coincide with the central respiration Fc. For this reason, Meta-L is used to map the radar phase to the ideal Fc computed from the belt (Section 3.5.1).
- The comparison between the radar-estimated breath signal and the respiration belt signal is performed on signals normalized between zero and one, calculating the MSE and estimating the instantaneous bpm along the session (Section 3.6).
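The steps above can be sketched end to end in NumPy. This is a simplified illustration only: the array shapes, the clutter estimate (per-bin mean over frames), and the function name are our assumptions, not the exact implementation.

```python
import numpy as np

def radar_phase_pipeline(frames):
    """Hedged sketch of the preprocessing steps of Section 3.4.

    frames: raw ADC cube of shape (n_frames, n_chirps, n_channels, n_samples).
    Returns the unwrapped phase of the strongest moving range bin per frame.
    """
    # Subtract the average over the frame sequence to remove a potential DC offset
    data = frames - frames.mean(axis=0, keepdims=True)
    # Average over slow time (chirps) and Rx channels for noise robustness
    avg = data.mean(axis=(1, 2))                      # (n_frames, n_samples)
    # 1D FFT along fast time to retrieve the range information
    rng = np.fft.rfft(avg, axis=-1)                   # (n_frames, n_bins)
    # Simple clutter estimate: per-bin average over frames (static reflections)
    clutter = rng.mean(axis=0, keepdims=True)
    rng_clean = rng - clutter
    # Frame-wise, pick the range bin deviating most (MSE) from the clutter
    best = (np.abs(rng_clean) ** 2).argmax(axis=-1)   # (n_frames,)
    target = rng[np.arange(len(rng)), best]
    # Wrapped phase, then unwrap across the ±pi discontinuities
    return np.unwrap(np.angle(target))
```

In practice, the bin search would be restricted to the range limits tracked in Section 3.5 (Figure 6) rather than run over all bins as here.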
3.5. Range Bins Selection and Clutter Removal
3.5.1. Central Frequency Estimation and Labeling
3.6. Breaths per Minute Estimation and Corruption Detection
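The sliding-window estimation illustrated in Figures 8 and 9 can be sketched as follows. The naive peak detector and the window size of roughly two mean peak-to-peak distances are assumptions for illustration, not the paper's exact procedure.

```python
import numpy as np

def instantaneous_bpm(signal, fs=20.0):
    """Sketch of windowed instantaneous bpm estimation from a breathing signal."""
    # Normalize to [0, 1], as done before comparing radar and belt signals
    s = (signal - signal.min()) / (np.ptp(signal) + 1e-12)
    # Naive peak detection: local maxima above the signal mean
    peaks = [i for i in range(1, len(s) - 1)
             if s[i - 1] < s[i] >= s[i + 1] and s[i] > s.mean()]
    if len(peaks) < 2:
        return np.array([])
    mean_dist = np.mean(np.diff(peaks))   # mean peak-to-peak distance [frames]
    win = int(2 * mean_dist)              # window spanning roughly two breaths
    bpm = []
    for start in range(len(s) - win):
        n_peaks = sum(start <= p < start + win for p in peaks)
        bpm.append(60.0 * n_peaks * fs / win)  # peaks per window scaled to one minute
    return np.array(bpm)
```

For a clean 0.25 Hz sinusoidal breathing signal sampled at 20 fps, this yields a constant 15 bpm; the x-axis length of the resulting bpm trace is the session length minus the window length, as in Figure 9.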
3.7. Breath Meta-Dataset
4. Proposed Method
4.1. Episodic Breath Signal Estimation
4.2. Proposed C-VAE-Based Topology
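The single-episode objective sketched in Figures 11 and 12 combines three terms: reconstruction against the belt signal, KL regularization of the latent space toward a standard normal, and Fc regression. A minimal NumPy sketch of such a composite loss follows; the weighting factors `beta` and `gamma` and the function name are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def cvae_episode_loss(x_hat, x_belt, mu, log_var, fc_hat, fc_belt,
                      beta=1.0, gamma=1.0):
    """Composite C-VAE loss sketch: reconstruction + KL + Fc regression."""
    # Map the radar phase reconstruction onto the belt reference signal
    recon = np.mean((x_hat - x_belt) ** 2)
    # Pull the latent distribution toward a standard multivariate normal
    kl = -0.5 * np.mean(1.0 + log_var - mu ** 2 - np.exp(log_var))
    # Constrain the latent space to the central respiration frequency
    fc_reg = np.mean((fc_hat - fc_belt) ** 2)
    return recon + beta * kl + gamma * fc_reg
```

With a perfect reconstruction, a standard-normal latent (mu = 0, log_var = 0), and an exact Fc estimate, all three terms vanish, so the loss is zero.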
4.3. Corruption-Weighted Loss and Breathing Estimation Formulation
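One plausible reading of the corruption-based sample weighting mentioned in the contributions is to down-weight motion-corrupted samples in the reconstruction error. The sketch below is purely illustrative: the binary corruption flag, the weight floor, and the function name are our assumptions.

```python
import numpy as np

def corruption_weighted_mse(x_hat, x_ref, corrupt_flag, floor=0.1):
    """Reconstruction error with reduced influence of motion-corrupted samples."""
    # Corrupted samples (flag > 0) contribute with a small floor weight
    w = np.where(corrupt_flag > 0, floor, 1.0)
    return np.sum(w * (x_hat - x_ref) ** 2) / np.sum(w)
```

When no sample is flagged, this reduces to the plain MSE; when errors concentrate on flagged samples, the weighted loss is strictly smaller than the unweighted one, so the estimator is not penalized for unreliable radar segments.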
4.4. Information about Experiments
5. Results and Discussion
5.1. Results on MAML Second Order
5.2. Ablation Study
5.3. Results on Various Optimization-Based Algorithms
6. Conclusions
Author Contributions
Funding
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Sample Availability
Abbreviations
Abbreviation | Definition
---|---
ADC | Analog-to-Digital Converter
AI | Artificial Intelligence
AoA | Angle of Arrival
ANN | Artificial Neural Network
BCE | Binary Cross-Entropy
BPM | Breaths Per Minute
C-VAE | Convolutional Variational Autoencoder
CA | Cosine Annealing of Meta-Optimizer Learning Rate
CSI | Channel State Information
CS-OMP | Compressive Sensing based on Orthogonal Matching Pursuit
DA | Derivative-Order Annealing
DC | Direct Current
ECG | Electrocardiogram
FFT | Fast Fourier Transform
Fc | Central Frequency
fs | Sampling Frequency
FMCW | Frequency-Modulated Continuous Wave
FoV | Field of View
HR | Heart Rate
IF | Intermediate Frequency
IFD | Innovation Fund Denmark
IQR | Interquartile Range
IR-UWB | Impulse Radio Ultra-Wide-Band
KL | Kullback-Leibler
KNN | K-Nearest Neighbor
LSTM | Long Short-Term Memory
MAML | Model-Agnostic Meta-Learning
MC-SVM | Multi-Class Support Vector Machine
Meta-L | Meta-Learning
MIMO | Multiple-Input Multiple-Output
ML | Machine Learning
MSE | Mean Squared Error
MSL | Multi-Step Loss Optimization
MTI | Moving Target Indication
Q | Quality Factor
RA-DWT | Rigrsure Adaptive soft threshold noise reduction based on Discrete Wavelet Transform
Rvo | Rijksdienst voor Ondernemend Nederland (Netherlands Enterprise Agency)
RR | Respiration Rate
Rx | Receiver
SNR | Signal-to-Noise Ratio
SVM | Support Vector Machine
STFT | Short-Time Fourier Transform
t-SNE | t-distributed Stochastic Neighbor Embedding
Tx | Transmitter
UPSIM | Unleash Potentials in Simulation
Appendix A. Biquad Filter Parameters Computation
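The band-pass biquad of Figure 7 can be parameterized with the standard audio-EQ-cookbook formulas. The sketch below (the function names are ours, and the constant-0 dB-peak-gain variant is an assumption about the exact design) computes coefficients for the caption's example values (Q = √2, fs = 20 Hz, Fc = 0.33 Hz) and applies the Figure 7a difference equation, in which O(n) depends on the current input plus the two previous inputs and outputs.

```python
import math

def bandpass_biquad(fc, fs, q):
    """Band-pass biquad coefficients (audio-EQ-cookbook, constant 0 dB peak gain)."""
    w0 = 2.0 * math.pi * fc / fs
    alpha = math.sin(w0) / (2.0 * q)
    a0 = 1.0 + alpha
    b = [alpha / a0, 0.0, -alpha / a0]                        # feed-forward (input) taps
    a = [1.0, -2.0 * math.cos(w0) / a0, (1.0 - alpha) / a0]   # feedback (output) taps
    return b, a

def biquad_filter(x, b, a):
    """Direct-form I biquad: O(n) = b0*I(n) + b1*I(n-1) + b2*I(n-2) - a1*O(n-1) - a2*O(n-2)."""
    y = []
    for n, xn in enumerate(x):
        acc = b[0] * xn
        if n >= 1:
            acc += b[1] * x[n - 1] - a[1] * y[n - 1]
        if n >= 2:
            acc += b[2] * x[n - 2] - a[2] * y[n - 2]
        y.append(acc)
    return y
```

By construction this variant has unity gain at Fc and zero gain at DC, matching the shape of the reference response in Figure 7b.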
References
- Sidikova, M.; Martinek, R.; Kawala-Sterniuk, A.; Ladrova, M.; Jaros, R.; Danys, L.; Simonik, P. Vital sign monitoring in car seats based on electrocardiography, ballistocardiography and seismocardiography: A review. Sensors 2020, 20, 5699.
- Shimazaki, T.; Anzai, D.; Watanabe, K.; Nakajima, A.; Fukuda, M.; Ata, S. Heat stroke prevention in hot specific occupational environment enhanced by supervised machine learning with personalized vital signs. Sensors 2022, 22, 395.
- Loughlin, P.C.; Sebat, F.; Kellett, J.G. Respiratory rate: The forgotten vital sign—Make it count! Jt. Comm. J. Qual. Patient Saf. 2018, 44, 494–499.
- Brekke, I.J.; Puntervoll, L.H.; Pedersen, P.B.; Kellett, J.; Brabrand, M. The value of vital sign trends in predicting and monitoring clinical deterioration: A systematic review. PLoS ONE 2019, 14, e0210875.
- World Health Organization. Cardiovascular Diseases (CVDs). 2021. Available online: https://www.who.int/en/news-room/fact-sheets/detail/cardiovascular-diseases-(cvds) (accessed on 25 October 2022).
- Dias, D.; Paulo Silva Cunha, J. Wearable health devices—Vital sign monitoring, systems and technologies. Sensors 2018, 18, 2414.
- Wang, Z.; Yang, Z.; Dong, T. A review of wearable technologies for elderly care that can accurately track indoor position, recognize physical activities and monitor vital signs in real time. Sensors 2017, 17, 341.
- Taylor, W.; Abbasi, Q.H.; Dashtipour, K.; Ansari, S.; Shah, S.A.; Khalid, A.; Imran, M.A. A Review of the State of the Art in Non-Contact Sensing for COVID-19. Sensors 2020, 20, 5665.
- Sinhal, R.; Singh, K.; Shankar, A. Estimating vital signs through non-contact video-based approaches: A survey. In Proceedings of the 2017 International Conference on Recent Innovations in Signal Processing and Embedded Systems (RISE), Bhopal, India, 27–29 October 2017; pp. 139–141.
- Villarroel, M.; Jorge, J.; Pugh, C.; Tarassenko, L. Non-contact vital sign monitoring in the clinic. In Proceedings of the 2017 12th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2017), Washington, DC, USA, 30 May–3 June 2017; pp. 278–285.
- Garbey, M.; Sun, N.; Merla, A.; Pavlidis, I. Contact-free measurement of cardiac pulse based on the analysis of thermal imagery. IEEE Trans. Biomed. Eng. 2007, 54, 1418–1426.
- Negishi, T.; Abe, S.; Matsui, T.; Liu, H.; Kurosawa, M.; Kirimoto, T.; Sun, G. Contactless vital signs measurement system using RGB-thermal image sensors and its clinical screening test on patients with seasonal influenza. Sensors 2020, 20, 2171.
- Ambrosanio, M.; Franceschini, S.; Grassini, G.; Baselice, F. A multi-channel ultrasound system for non-contact heart rate monitoring. IEEE Sens. J. 2019, 20, 2064–2074.
- Kebe, M.; Gadhafi, R.; Mohammad, B.; Sanduleanu, M.; Saleh, H.; Al-Qutayri, M. Human vital signs detection methods and potential using radars: A review. Sensors 2020, 20, 1454.
- Singh, A.; Rehman, S.U.; Yongchareon, S.; Chong, P.H.J. Multi-resident non-contact vital sign monitoring using radar: A review. IEEE Sens. J. 2020, 21, 4061–4084.
- Liu, J.; Liu, H.; Chen, Y.; Wang, Y.; Wang, C. Wireless sensing for human activity: A survey. IEEE Commun. Surv. Tutor. 2019, 22, 1629–1645.
- Kanda, T.; Sato, T.; Awano, H.; Kondo, S.; Yamamoto, K. Respiratory rate estimation based on WiFi frame capture. In Proceedings of the 2022 IEEE 19th Annual Consumer Communications & Networking Conference (CCNC), Online, 8–11 January 2022; pp. 881–884.
- Wang, F.; Zhang, F.; Wu, C.; Wang, B.; Liu, K.R. ViMo: Multiperson vital sign monitoring using commodity millimeter-wave radio. IEEE Internet Things J. 2020, 8, 1294–1307.
- Brooker, G.M. Understanding millimetre wave FMCW radars. In Proceedings of the 1st International Conference on Sensing Technology, Sitges, Spain, 22–25 May 2005; Volume 1.
- Maier, M.; Stapelfeldt, F.N.; Issakov, V. Design Approach of a K-Band FMCW Radar for Breast Cancer Detection using a Full System-Level EM Simulation. In Proceedings of the 2022 IEEE MTT-S International Microwave Biomedical Conference (IMBioC), Suzhou, China, 16–18 May 2022; pp. 251–253.
- Chen, V.C. The Micro-Doppler Effect in Radar; Artech House: Boston, MA, USA, 2019.
- Santra, A.; Ulaganathan, R.V.; Finke, T.; Baheti, A.; Noppeney, D.; Wolfgang, J.R.; Trotta, S. Short-range multi-mode continuous-wave radar for vital sign measurement and imaging. In Proceedings of the 2018 IEEE Radar Conference (RadarConf18), Oklahoma City, OK, USA, 23–27 April 2018; pp. 946–950.
- Arsalan, M.; Santra, A.; Will, C. Improved contactless heartbeat estimation in FMCW radar via Kalman filter tracking. IEEE Sens. Lett. 2020, 4, 1–4.
- Khan, F.; Cho, S.H. A detailed algorithm for vital sign monitoring of a stationary/non-stationary human through IR-UWB radar. Sensors 2017, 17, 290.
- Wu, Q.; Mei, Z.; Lai, Z.; Li, D.; Zhao, D. A non-contact vital signs detection in a multi-channel 77 GHz LFMCW radar system. IEEE Access 2021, 9, 49614–49628.
- Saluja, J.; Casanova, J.; Lin, J. A supervised machine learning algorithm for heart-rate detection using Doppler motion-sensing radar. IEEE J. Electromagn. Microw. Med. Biol. 2019, 4, 45–51.
- Iyer, S.; Zhao, L.; Mohan, M.P.; Jimeno, J.; Siyal, M.Y.; Alphones, A.; Karim, M.F. mm-Wave Radar-Based Vital Signs Monitoring and Arrhythmia Detection Using Machine Learning. Sensors 2022, 22, 3106.
- Malešević, N.; Petrović, V.; Belić, M.; Antfolk, C.; Mihajlović, V.; Janković, M. Contactless real-time heartbeat detection via 24 GHz continuous-wave Doppler radar using artificial neural networks. Sensors 2020, 20, 2351.
- Hospedales, T.; Antoniou, A.; Micaelli, P.; Storkey, A. Meta-learning in neural networks: A survey. IEEE Trans. Pattern Anal. Mach. Intell. 2021, 44, 5149–5169.
- Finn, C.; Abbeel, P.; Levine, S. Model-agnostic meta-learning for fast adaptation of deep networks. In Proceedings of the International Conference on Machine Learning, PMLR, Sydney, Australia, 6–11 August 2017; pp. 1126–1135.
- Alizadeh, M.; Shaker, G.; De Almeida, J.C.M.; Morita, P.P.; Safavi-Naeini, S. Remote monitoring of human vital signs using mm-wave FMCW radar. IEEE Access 2019, 7, 54958–54968.
- Wang, Y.; Wang, W.; Zhou, M.; Ren, A.; Tian, Z. Remote monitoring of human vital signs based on 77-GHz mm-wave FMCW radar. Sensors 2020, 20, 2999.
- Lee, H.; Kim, B.H.; Park, J.K.; Yook, J.G. A novel vital-sign sensing algorithm for multiple subjects based on 24-GHz FMCW Doppler radar. Remote Sens. 2019, 11, 1237.
- Lv, W.; He, W.; Lin, X.; Miao, J. Non-contact monitoring of human vital signs using FMCW millimeter wave radar in the 120 GHz band. Sensors 2021, 21, 2732.
- Gong, J.; Zhang, X.; Lin, K.; Ren, J.; Zhang, Y.; Qiu, W. RF Vital Sign Sensing Under Free Body Movement. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2021, 5, 1–22.
- Wang, D.; Yoo, S.; Cho, S.H. Experimental comparison of IR-UWB radar and FMCW radar for vital signs. Sensors 2020, 20, 6695.
- Rana, S.; Dey, M.; Brown, R.; Siddiqui, H.; Dudley, S. Remote Vital Sign Recognition through Machine Learning augmented UWB. In Proceedings of the 12th European Conference on Antennas and Propagation (EuCAP 2018), Institution of Engineering and Technology, London, UK, 9–13 April 2018.
- Khan, M.I.; Jan, M.A.; Muhammad, Y.; Do, D.T.; Mavromoustakis, C.X.; Pallis, E. Tracking vital signs of a patient using channel state information and machine learning for a smart healthcare system. In Neural Computing and Applications; Springer: Berlin/Heidelberg, Germany, 2021; pp. 1–15.
- Liu, X.; Jiang, Z.; Fromm, J.; Xu, X.; Patel, S.; McDuff, D. MetaPhys: Few-shot adaptation for non-contact physiological measurement. In Proceedings of the Conference on Health, Inference, and Learning, Online, 8–10 April 2021; pp. 154–163.
- Infineon Technologies AG. XENSIV™ 60GHz Radar Sensor for Advanced Sensing. 2021. Available online: https://www.infineon.com/cms/en/product/sensor/radar-sensors/radar-sensors-for-iot/60ghz-radar/bgt60tr13c/ (accessed on 28 October 2022).
- Vernier. Go Direct® Respiration Belt. 2020. Available online: https://www.vernier.com/product/go-direct-respiration-belt/ (accessed on 8 November 2022).
- Van der Maaten, L.; Hinton, G. Visualizing data using t-SNE. J. Mach. Learn. Res. 2008, 9, 2579–2605.
- Joyce, J.M. Kullback-Leibler divergence. In International Encyclopedia of Statistical Science; Springer: Berlin, Germany, 2011; pp. 720–722.
- Nichol, A.; Achiam, J.; Schulman, J. On first-order meta-learning algorithms. arXiv 2018, arXiv:1803.02999.
- Antoniou, A.; Edwards, H.; Storkey, A. How to train your MAML. arXiv 2018, arXiv:1810.09502.
Symbol | Quantity | Value
---|---|---
 | number of transmitters | 1
 | number of receivers | 3
 | number of chirps | 2
 | samples per chirp | 200
 | center freq. | 60 GHz
 | sampling freq. ADC | 2 MHz
 | frames per second | 20 Hz
 | chirp time duration | 150 µs
 | bandwidth | [58, 62] → 4 GHz
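As a quick sanity check on the configuration above, the 4 GHz sweep bandwidth fixes the theoretical FMCW range resolution via ΔR = c / (2B):

```python
# Theoretical range resolution implied by the chirp bandwidth in the table above
C = 299_792_458.0        # speed of light [m/s]
B = 4e9                  # sweep bandwidth: 62 GHz - 58 GHz, in Hz

delta_r = C / (2 * B)    # range resolution [m], roughly 3.75 cm per range bin
```

This is why chest displacements (millimeters) are resolved through the phase of a range bin rather than through range migration across bins.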
Loss / N–Shots | 1–Shot | 5–Shots | 10–Shots
---|---|---|---
L* | 84.11 ± 6 | 83.92 ± 1 | 83.39 ± 1
Time / N–Shots | 1–Shot | 5–Shots | 10–Shots
---|---|---|---
Adaptation Time [ms] | 797 | 2,614 | 5,877
Loss / N–Shots | 1–Shot | 5–Shots | 10–Shots
---|---|---|---
L* (No Corrupt.) | 226.30 ± 5 | 224.53 ± 5 | 221.97 ± 5
L* (Corrupt.) | 84.11 ± 6 | 83.92 ± 1 | 83.39 ± 1
Parameters / Latent Dim. | 16 | 32 | 64 | 128
---|---|---|---|---
L* | 86.75 ± 5 | 84.11 ± 6 | 84.31 ± 14 | 85.19 ± 34
Trainable Params. | 382,658 | 739,074 | 1,451,906 | 2,877,570
Algorithm / N–Shots | 1–Shot | 5–Shots | 10–Shots
---|---|---|---
Reptile | 100.02 ± 2 | 90.78 ± 2 | 86.95 ± 1
MAML | 86.52 ± 5 | 83.68 ± 1 | 83.45 ± 1
MAML | 85.86 ± 10.7 | 82.9 ± 3 | 88.16 ± 15
MAML | 84.11 ± 6 | 83.92 ± 1 | 83.39 ± 1
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Mauro, G.; De Carlos Diez, M.; Ott, J.; Servadei, L.; Cuellar, M.P.; Morales-Santos, D.P. Few-Shot User-Adaptable Radar-Based Breath Signal Sensing. Sensors 2023, 23, 804. https://doi.org/10.3390/s23020804