Title:
DESIGNING A MULTIMODAL AI FRAMEWORK FOR SWIMMING ANALYTICS: ARCHITECTURE, DATA INFRASTRUCTURE, AND HUMAN-CENTRED OBJECTIVES

Author(s):
Vanessa Camilleri, Reno Yuri Camilleri, Mark Fialovszky, Daniel Pace, Dylan Seychell and Matthew Montebello

ISBN:
978-989-8704-71-9

Editors:
Paula Miranda and Pedro Isaías

Year:
2025

Edition:
Single

Keywords:
Multimodal AI, Explainable AI, Applied Computing, Sports Analysis, Performance Analytics

Type:
Short Paper

First Page:
265

Last Page:
269

Language:
English
Paper Abstract:

This paper presents SWIM-360, a modular AI framework for multimodal performance analysis in competitive swimming. The system integrates synchronised underwater video and physiological sensor data to generate interpretable, context-aware feedback for coaches and athletes. It employs a hybrid explainability strategy: SHAP values clarify sensor-derived features (e.g., stroke rate, SmO₂), while visual overlays and heuristics highlight biomechanical insights from video. Designed for adaptability, the architecture supports modular upgrades of pose models, fusion strategies, and feedback components. Early-stage validation focuses on verifying data synchronisation, signal consistency, and the feasibility of feature extraction across modalities. While full deployment and interface integration are ongoing, SWIM-360 lays the foundation for human-centred AI in sport by aligning analytics with domain practices. This paper outlines the system's architecture, design rationale, and preliminary results, contributing a blueprint for explainable, multimodal performance analytics in applied settings.
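The abstract states that SHAP values are used to explain sensor-derived features such as stroke rate and SmO₂. As a rough illustration of that idea only (not the paper's implementation), the sketch below computes exact Shapley attributions for a toy linear "race time" model over invented sensor readings; every feature name, weight, and value here is hypothetical, and absent features are fixed at a baseline (dataset-mean) value.

```python
from itertools import combinations
from math import factorial

# Hypothetical sensor features, baseline (dataset means), and model
# weights -- illustrative only, not taken from the SWIM-360 paper.
FEATURES = ["stroke_rate", "smo2", "split_velocity"]
BASELINE = {"stroke_rate": 34.0, "smo2": 55.0, "split_velocity": 1.6}
WEIGHTS = {"stroke_rate": -0.12, "smo2": -0.05, "split_velocity": -8.0}
BIAS = 75.0  # hypothetical intercept: predicted 100 m time in seconds


def predict(sample: dict) -> float:
    """Toy linear model standing in for a sensor-based performance model."""
    return BIAS + sum(WEIGHTS[f] * sample[f] for f in FEATURES)


def shapley_values(sample: dict) -> dict:
    """Exact Shapley attribution per feature, enumerating all coalitions.

    A feature absent from a coalition is replaced by its baseline value.
    """
    n = len(FEATURES)
    phi = {}
    for f in FEATURES:
        others = [g for g in FEATURES if g != f]
        total = 0.0
        for k in range(n):
            for coalition in combinations(others, k):
                present = set(coalition)
                with_f = {g: sample[g] if g in present or g == f
                          else BASELINE[g] for g in FEATURES}
                without_f = {g: sample[g] if g in present
                             else BASELINE[g] for g in FEATURES}
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                total += weight * (predict(with_f) - predict(without_f))
        phi[f] = total
    return phi


# One hypothetical lap reading for an athlete.
lap = {"stroke_rate": 38.0, "smo2": 48.0, "split_velocity": 1.75}
phi = shapley_values(lap)

# Efficiency property: attributions sum to the prediction's deviation
# from the baseline prediction.
assert abs(sum(phi.values()) - (predict(lap) - predict(BASELINE))) < 1e-9
```

For a linear model, each attribution reduces to `WEIGHTS[f] * (lap[f] - BASELINE[f])`, so the coach-facing interpretation is direct: a negative attribution for `split_velocity` here means the faster-than-baseline split lowered the predicted time. Real SHAP tooling (e.g., the `shap` Python package) approximates the same quantity efficiently for non-linear models.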
|
|
|
|
|
|