Our idea was to use the Movesense trackers to gather data as people walk. Many of us walk with some sort of limp or bad form, whether we know it or not, and it would be very useful to detect this as it happens. The sensors are attached to the user's feet, and a neural network trained on the resulting movement data detects when the user is not walking properly.
A neural network analyses the movement tracked by the sensors. The analysis can run both in a mobile app and in a desktop program; however, due to some issues, the data is not currently fed live to the network.
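To give an idea of the preprocessing involved, here is a minimal sketch of how a raw stream of accelerometer samples might be sliced into fixed-length, overlapping windows before being handed to the network. The window length, step size, and 52 Hz sample rate are our own illustrative assumptions, not values fixed by the Movesense API.

```python
import numpy as np

def make_windows(samples, window_len=128, step=64):
    """Slice a stream of (ax, ay, az) samples into half-overlapping
    windows of shape (window_len, 3), one per network input.
    window_len and step are illustrative assumptions."""
    windows = []
    for start in range(0, len(samples) - window_len + 1, step):
        windows.append(samples[start:start + window_len])
    return np.array(windows)

# ~10 s of fake 52 Hz accelerometer data -> (n_windows, 128, 3)
stream = np.random.randn(520, 3)
batches = make_windows(stream)
print(batches.shape)  # (7, 128, 3)
```

Overlapping windows mean a gait event near a window boundary still appears whole in the neighbouring window, which tends to make classification more robust.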
Blood, sweat and tears.
The API provided for interacting with the Movesense sensors proved unreliable and cumbersome. We spent a significant amount of time just figuring out the API, time we would rather have spent on other, more interesting parts of the project.
We are proud of the neural network that analyses the user's gait and can tell whether the user is walking up stairs, in a circle, or in a straight line; it can also tell whether the user has a walking issue. With this small project we hoped to show that the sensors can be used to detect good or bad form while exercising. For instance, they could also be used to check whether you are keeping your lower back straight while doing deadlifts at the gym.
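A gait classifier of this kind could be sketched in TensorFlow as a small 1D convolutional network that maps a window of accelerometer data to one of the three walking modes. The layer sizes and input shape below are illustrative assumptions, not the project's actual architecture.

```python
import tensorflow as tf

# Hypothetical sketch: label one accelerometer window as
# stairs / circle / straight. Layer sizes are illustrative.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(128, 3)),           # 128 samples x 3 axes
    tf.keras.layers.Conv1D(16, 5, activation="relu"),
    tf.keras.layers.MaxPooling1D(2),
    tf.keras.layers.Conv1D(32, 5, activation="relu"),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(3, activation="softmax"),  # stairs, circle, straight
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# One fake window in -> per-class probabilities out
probs = model(tf.random.normal((1, 128, 3)))
print(probs.shape)  # (1, 3)
```

A separate binary output (or a fourth class) trained on known-bad gait examples could cover the "walking issue" detection in the same way.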
A lot about Android app development, and strategies for future hackathons.
Java, Python, TensorFlow, Android Studio