MS Seminar


Name of the Speaker: Mr. Kapil Singh Rathore (EE19S040)
Guide: Dr. Mohanasankar S
Online meeting link: https://meet.google.com/iyv-zcic-epf
Date/Time: April 10th 2023 (Monday), at 3.00 PM
Title: A Multifunctional Network to Address Practical Challenges in Respiration Rate Estimation

Abstract

Respiration rate is a crucial parameter for measuring health and well-being. In sports science, accurate respiration rate measurement can help athletes with performance assessment. In this direction, we first present a study on the estimation of the ventilatory threshold during highly dynamic ambulatory activities, which establishes the importance of accurate respiration rate estimation.

As classical measurement modes are limited to rest or slow movements, respiration rate is commonly estimated from physiological signals such as ECG and PPG. Recently, deep learning algorithms have gained traction for accurate respiration rate estimation; however, previous studies have been tested only on clinical data, not ambulatory data. In the second part, we propose a multitasking network that simultaneously estimates both the respiration rate and the respiration signal. Through thorough experimentation, we show the effectiveness of the multitasking model across various activities.
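The abstract does not detail the architecture of the multitasking network; the following Python (PyTorch) sketch only illustrates the general idea of a shared encoder with two heads, one reconstructing the respiration waveform and one regressing the rate. All layer sizes, the class name MultitaskRespNet, and the loss weighting alpha are illustrative assumptions, not the speaker's actual model.

```python
import torch
import torch.nn as nn

class MultitaskRespNet(nn.Module):
    """Illustrative multitask model: a shared 1-D CNN encoder with two heads,
    one reconstructing the respiration signal and one regressing the
    respiration rate. Layer sizes are placeholders, not the presented model."""

    def __init__(self, in_channels: int = 1, hidden: int = 32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv1d(in_channels, hidden, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=7, padding=3),
            nn.ReLU(),
        )
        # Head 1: reconstruct the respiration waveform from the shared features.
        self.signal_head = nn.Conv1d(hidden, 1, kernel_size=1)
        # Head 2: regress a single respiration-rate value per window.
        self.rate_head = nn.Sequential(
            nn.AdaptiveAvgPool1d(1),
            nn.Flatten(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x: torch.Tensor):
        feats = self.encoder(x)           # (batch, hidden, time)
        signal = self.signal_head(feats)  # (batch, 1, time)
        rate = self.rate_head(feats)      # (batch, 1)
        return signal, rate

def multitask_loss(pred_signal, pred_rate, true_signal, true_rate, alpha=0.5):
    """Joint objective: weighted sum of waveform reconstruction and rate regression errors."""
    mse = nn.functional.mse_loss
    return alpha * mse(pred_signal, true_signal) + (1 - alpha) * mse(pred_rate, true_rate)
```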

However, deep learning methods pose challenges, including model interpretability, uncertainty estimation in the context of respiration rate estimation, and model compactness for deployment on wearable platforms. To address these, we propose a multifunctional framework that combines an attention mechanism, an uncertainty estimation functionality, and a knowledge distillation framework. We evaluated the performance of our framework on two datasets containing ambulatory movement. The attention mechanism visually and quantitatively improved instantaneous respiration rate estimation. Embedding the network with inferential uncertainty estimation using Monte Carlo dropout resulted in the rejection of 3.7% of windows with high uncertainty, which in turn reduced the overall mean absolute error by 7.99%. The attention-aware knowledge distillation mechanism reduced the model’s parameter count and inference time by 49.5% and 38.09%, respectively, without any increase in error rates.
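The abstract mentions Monte Carlo dropout for uncertainty estimation and the rejection of high-uncertainty windows. The sketch below shows the general recipe under the assumption of a rate-regression model that contains dropout layers and returns a (batch, 1) tensor; the sample count, threshold, and helper names are illustrative assumptions, not values from the study.

```python
import torch
import torch.nn as nn

def enable_mc_dropout(model: nn.Module) -> None:
    """Put the model in eval mode but keep dropout layers stochastic,
    the standard way to draw Monte Carlo dropout samples at inference."""
    model.eval()
    for module in model.modules():
        if isinstance(module, nn.Dropout):
            module.train()

def mc_dropout_rate_estimate(model: nn.Module, x: torch.Tensor, n_samples: int = 30):
    """Return per-window mean rate predictions and their predictive standard
    deviation from n_samples stochastic forward passes."""
    enable_mc_dropout(model)
    with torch.no_grad():
        samples = torch.stack([model(x) for _ in range(n_samples)])  # (n_samples, batch, 1)
    return samples.mean(dim=0), samples.std(dim=0)

def reject_uncertain_windows(mean_rate: torch.Tensor, std_rate: torch.Tensor, threshold: float):
    """Discard windows whose predictive standard deviation exceeds the threshold;
    only the surviving estimates would be scored against the reference."""
    keep = std_rate.squeeze(-1) < threshold
    return mean_rate[keep], keep
```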