A Hybrid Vision Transformer and Residual Neural Network Model for Fall Detection using UWB Radars
- Authority: Applied Intelligence
- Category: Journal Publication
Detecting falls is a significant challenge for researchers, given the risk of serious injuries such as femoral neck fractures, brain hemorrhages, or burns, which cause significant pain and, in some cases, worsen over time, leading to end-of-life complications or even fatalities. One approach to this challenge is to promptly alert caregivers, such as nurses, when a fall is detected. In our work, we present a technique for detecting falls within a 40-square-meter apartment by collecting data from three ultra-wideband radars. The proposed technique combines a vision transformer and a residual neural network for fall identification, a binary classification task distinguishing fall from non-fall events. To train and test the technique, we use data reflecting various fall types simulated by 10 participants at three locations in the apartment. We evaluate its performance against several baseline models using the leave-one-subject-out strategy, demonstrating that the experimental results generalize to practical scenarios with new subjects. We also report results obtained by applying cross-validation to select a validation set, which highlights the effectiveness of the technique during the training phase and supports the confidence of the results in the testing phase. The results consistently show that the proposed technique outperforms the baseline models. Encouragingly, it achieves nearly 99% accuracy in fall detection, demonstrating promising potential for real-world application.
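The leave-one-subject-out evaluation mentioned above can be sketched in plain Python: each of the 10 participants is held out in turn as the test subject, with the remaining subjects used for training. This is a minimal illustration of the splitting protocol only; the `(subject_id, features, label)` tuple layout is an assumption for the example, not the paper's actual data format.

```python
from collections import defaultdict

def leave_one_subject_out(samples):
    """Yield (held_out_subject, train, test) splits in which the test
    set contains every sample from exactly one subject.

    `samples` is a list of (subject_id, features, label) tuples;
    this layout is hypothetical, chosen only for illustration."""
    by_subject = defaultdict(list)
    for sample in samples:
        by_subject[sample[0]].append(sample)
    for held_out in sorted(by_subject):
        test = by_subject[held_out]
        train = [s for sid in sorted(by_subject) if sid != held_out
                 for s in by_subject[sid]]
        yield held_out, train, test

# Toy data: 10 subjects, 2 samples each, with a dummy fall/non-fall label.
data = [(sid, None, sid % 2) for sid in range(10) for _ in range(2)]
splits = list(leave_one_subject_out(data))
```

Because every split's test subject never appears in its training set, accuracy measured this way estimates performance on entirely new subjects, which is the generalization claim the protocol supports.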