Development Of An Intelligent Shoe System To Minimise Tripping Risk During Walking

JOSEPH, Anna - THESIS_no signature.pdf - Submitted Version (6MB)
Restricted to Repository staff only until 5 June 2027

Joseph, Anna M (ORCID: https://orcid.org/0000-0002-6529-974X) (2025) Development Of An Intelligent Shoe System To Minimise Tripping Risk During Walking. PhD thesis, Victoria University.

Abstract

Physical injuries cost the Australian healthcare system more than $3.6 billion annually, with falls during locomotion being the primary cause. This highlights the need for effective systems for the early detection and prevention of tripping risks. While advances in artificial intelligence and sensor technology have improved gait analysis and wearable assistive devices, current systems are limited in their ability to anticipate corrective actions or predict instability before balance is lost. Detecting obstacles on the floor in advance and alerting the user can significantly reduce the risk of tripping. A shoe-mounted sensor system is well suited to this task: it can detect obstacles on the floor, assess the tripping risk, and provide timely alerts to help prevent falls. Existing instrumented shoes, however, focus mainly on gait analysis and lack integrated obstacle detection. There is therefore a need for a wearable system that detects obstacles in real time, provides accurate alerts, and combines gait analysis with obstacle detection for effective tripping-risk management.

This project presents an intelligent shoe system that detects obstacles and predicts fall risks in real time using a combination of advanced sensors and a custom YOLO (You Only Look Once) model optimised for edge applications. The system integrates ultrasound and camera data to detect obstacles and estimate their height and distance. Ultrasound is used for distance measurement because the time-of-flight technique provides accurate, real-time distance data: the sensor measures the time an ultrasound pulse takes to travel to an obstacle and back, allowing precise distance estimation across various surfaces and lighting conditions. The camera is used for height measurement because its high resolution captures the detailed visual information needed for precise height estimation. Sensor fusion combines the strengths of both sensors, ultrasound for reliable distance measurement and the camera for detailed height analysis, enhancing the system's overall accuracy and robustness.

The obstacle detection algorithm employs a custom YOLOv8n model designed for computational efficiency, making it suitable for deployment on embedded devices. The algorithm fuses ultrasound and camera data to detect obstacles as small as 2 cm from up to 3 metres away, giving users sufficient time to take countermeasures. The custom YOLOv8n-based model achieved a mean Average Precision (mAP) of 81.9%, a recall of 80.5%, and a precision of 78.1%. For size estimation, the system first measures obstacle dimensions in pixels and then applies a multivariate linear regression model to predict the size in centimetres, delivering reliable measurements in real-world coordinates. Distance estimation reached an accuracy of 98.57%, and size estimation 96.5%. The system reliably detected 2 cm obstacles from up to 3 metres away, even under different lighting conditions, and successfully distinguished obstacles from the surrounding environment, demonstrating its reliability and precision for real-world applications.
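As a rough illustration of the two estimation steps just described, the Python sketch below pairs a time-of-flight distance calculation with a multivariate linear regression from pixel height to centimetres. Everything in it is an assumption for illustration (the function names, the 343 m/s speed of sound, the placeholder coefficients); it is not the thesis's implementation.

    # Illustrative sketch only; names and coefficients are assumed.
    SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

    def ultrasound_distance_m(echo_time_s: float) -> float:
        """Time-of-flight: the pulse travels to the obstacle and back,
        so the one-way distance is half the round-trip path."""
        return SPEED_OF_SOUND * echo_time_s / 2.0

    def obstacle_height_cm(height_px: float, distance_m: float,
                           coeffs=(0.05, 1.2, -0.3)) -> float:
        """Multivariate linear regression from the obstacle's height in
        pixels and its ultrasound distance to a height in centimetres.
        The coefficients are placeholders for values fitted on
        calibration data."""
        b_px, b_dist, intercept = coeffs
        return b_px * height_px + b_dist * distance_m + intercept

    # Example: a 12.25 ms round-trip echo corresponds to about 2.10 m.
    print(ultrasound_distance_m(0.01225))

Including distance as a regressor reflects the sensor-fusion idea above: the same physical height spans fewer pixels the farther away the obstacle is, so pixel height alone cannot be mapped to centimetres.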
In addition to obstacle detection, the system uses Inertial Measurement Units (IMUs) for gait phase segmentation, so that obstacle detection runs specifically during the stance phase. A threshold-based approach using angular velocity was employed for stance-phase detection, ensuring that obstacle detection occurs while the foot is in contact with the ground (a sketch of this gating follows the abstract). By integrating the outputs of the YOLO-based obstacle detection model with stance-phase information from the IMUs, the sensor-guided shoe system provides a comprehensive assessment of fall risks and delivers timely alerts to users. The combined system performed strongly, achieving 97.64% accuracy for distance estimation within 1 to 3 metres, 99.08% for obstacles within 1 metre, and 95.90% for height measurement.

Experimental trials with human participants evaluated the shoe system under real-time walking conditions. The system demonstrated 97.52% accuracy for distance estimation of obstacles between 1 and 3 metres, 98.71% within 1 metre, indicating precise measurement at close range, and 93.43% accuracy for height estimation. The trials revealed significant changes in foot clearance following obstacle alerts, highlighting the system's effectiveness in predicting tripping risks. These findings show that the intelligent shoe system is effective at detecting obstacles and predicting tripping risks, offering a practical solution for fall-risk prediction and enhanced mobility safety for the elderly.
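The threshold-based stance-phase gating referred to above might look like the following sketch. The 50 deg/s threshold and the function name are assumptions for illustration; the thesis determines its own threshold from angular velocity data.

    import numpy as np

    STANCE_THRESHOLD_DPS = 50.0  # assumed illustrative threshold

    def stance_mask(gyro_dps: np.ndarray) -> np.ndarray:
        """Mark samples as stance phase when the foot-mounted gyroscope's
        angular velocity magnitude stays below the threshold, i.e. the
        foot is roughly stationary on the ground.
        gyro_dps: array of shape (n_samples, 3) in degrees per second."""
        magnitude = np.linalg.norm(gyro_dps, axis=1)
        return magnitude < STANCE_THRESHOLD_DPS

    # Obstacle detection then runs only on camera/ultrasound frames
    # captured while stance_mask is True, when the shoe is stable.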

Additional Information

DOCTOR OF PHILOSOPHY

Item type: Thesis (PhD thesis)
URI: https://vuir.vu.edu.au/id/eprint/49839
Subjects: Current > FOR (2020) Classification > 4007 Control engineering, mechatronics and robotics
Current > FOR (2020) Classification > 4602 Artificial intelligence
Current > Division/Research > Institute for Health and Sport
Keywords: obstacle detection, shoe-mounted sensor, smart shoes, assistive devices, gait biomechanics, fall prevention, You Only Look Once (YOLO) model, fall risk prediction