Proceedings of the 6th International Conference on Deep Learning, Artificial Intelligence and Robotics (ICDLAIR 2024), December 6, 2024, pp. 215-224 (Full Text Paper)
In modern fitness environments, accurately tracking user activity and identity poses significant challenges due to the dynamic nature of gym settings. This project introduces a system that uses RFID for entry/exit tracking and OSNet-based deep learning for robust, real-time person re-identification. The system incorporates a custom-designed wearable device with IMU sensors and machine learning algorithms to monitor and analyze exercises with high accuracy and minimal user intervention. A key design principle is seamless usability: the system minimizes disruption to the user's workout flow. Once checked in via RFID, users are passively tracked across stations without requiring further action, allowing them to focus solely on their exercise routines. Each station operates independently, running a pipeline of motion detection, re-identification, and exercise-specific analysis, while multi-threading enables simultaneous monitoring of multiple stations. The system employs a dual-modality approach, combining data from IMU sensors on a wrist-worn device with video-based pose estimation to track user movements and exercise form. This complementary setup ensures robustness, with each modality compensating for the other's limitations: video occlusions are covered by the IMU, and stationary poses the IMU cannot detect are captured by the camera. Together, these data streams enable precise repetition counting and detailed movement analysis, making the system adaptable to diverse exercise types and enhancing the accuracy and reliability of real-time exercise monitoring.
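The per-station pipeline described above (motion detection, then re-identification, then exercise-specific analysis, with one thread per station) can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual implementation: the `Station` class, its method names, and the dictionary-based "frames" are all hypothetical stand-ins for the real motion detector, OSNet re-identifier, and IMU/pose fusion logic.

```python
import threading
import queue

class Station(threading.Thread):
    """One independent monitoring thread per gym station (illustrative)."""

    def __init__(self, station_id, frame_source, results):
        super().__init__(daemon=True)
        self.station_id = station_id
        self.frame_source = frame_source   # queue of incoming frames for this station
        self.results = results             # shared queue of analyzed events

    def detect_motion(self, frame):
        # Placeholder: a real system might use frame differencing here.
        return frame.get("motion", False)

    def reidentify(self, frame):
        # Placeholder for OSNet-based person re-identification.
        return frame.get("user_id")

    def analyze_exercise(self, frame, user_id):
        # Placeholder: the real system fuses IMU features with pose
        # keypoints to count repetitions and assess form.
        return {"station": self.station_id,
                "user": user_id,
                "rep": frame.get("rep", 0)}

    def run(self):
        while True:
            frame = self.frame_source.get()
            if frame is None:              # sentinel: shut this station down
                break
            if not self.detect_motion(frame):
                continue                   # skip idle frames cheaply
            user_id = self.reidentify(frame)
            if user_id is not None:
                self.results.put(self.analyze_exercise(frame, user_id))

# Spin up three independent stations, each with its own input queue.
results = queue.Queue()
stations = []
for sid in range(3):
    src = queue.Queue()
    st = Station(sid, src, results)
    st.start()
    stations.append((st, src))

# Feed synthetic "frames" and shut everything down cleanly.
stations[0][1].put({"motion": True, "user_id": "A", "rep": 1})
stations[1][1].put({"motion": False})      # no motion: filtered out early
for _, src in stations:
    src.put(None)
for st, _ in stations:
    st.join()

events = []
while not results.empty():
    events.append(results.get())
```

Gating the expensive re-identification and analysis steps behind a cheap motion check, as the abstract describes, keeps each station thread lightweight when no one is exercising there.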