A virtual personal trainer mobile application that automatically detects exercises: sets, reps, and cardio activities.
Workout is the centerpiece of Khaylo's technology. For Workout, the team evolved the algorithms Khaylo had created for the "Let's Go Play" app and bolstered them with more structure and logic, making the app not only easy to use but helpful for many different kinds of users. During my time at Khaylo, I focused on creating a more human experience for fitness professionals, helping them understand their clients' exercise data and make better decisions for their business.
• Traditionally, personal trainers have to curate workouts and track progress for each client, leading to paperwork and the juggling of various data points
• There is no tool on the market that can act as the personal trainer, creating workouts and tracking progress
• There is no way to keep detailed exercise completion progress
• There is no way to ensure correct exercise form is followed
• New technology, only using the native sensors of the phone
• User needs to wear an armband to be able to track the exercises
• While the user performs an activity, they can’t see the UI
• Visualize a lot of complex data
There are existing applications that allow a person to digitally create a workout and send it to another user, but there is no way to keep detailed completion progress or to ensure correct form is followed. There are apps that use a device's GPS to detect cardio activities like walking and running but don't offer enough feedback. Wearables are just glorified pedometers.
The first thing we experimented with was scheduling and tracking workouts, something that would help Tim and Stephanie manage their clients and gym. We looked at current applications delivering this type of service (e.g. Beyond the White Board, MindBody) to identify areas where they were functional and areas where they were lacking. We then created prototypes and tested different scenarios.
The user can choose pre-defined workouts or workouts they have already created
Assign, schedule and charge
Personal trainers can assign workouts to their students and set the price
Creating a workout
Premium users and Personal Trainer users have the option to create workouts
For the main dashboard, I introduced a card-based UI for each event. Each card has a different state and a different purpose. It took multiple rounds of user testing and many iterations for the dashboard to evolve.
I worked with the core motion team to identify which data we could visualize and how far we could simplify it to make the graphs more human. The final result was a simple, clean, and visually driven single-page dashboard.
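The simplification step can be sketched in plain Swift. This is a hypothetical illustration (the function name and windowing approach are my own, not Khaylo's actual pipeline): a dense stream of motion samples is reduced to a handful of window averages so a dashboard graph stays readable instead of plotting every raw data point.

```swift
// Hypothetical sketch: collapse a dense stream of motion samples
// into `windows` averaged buckets for a simpler, more human graph.
func downsample(_ samples: [Double], windows: Int) -> [Double] {
    guard windows > 0, !samples.isEmpty else { return [] }
    // Size of each averaging window (at least one sample per window).
    let size = max(1, samples.count / windows)
    return stride(from: 0, to: samples.count, by: size).map { start in
        let slice = samples[start..<min(start + size, samples.count)]
        // Average the samples that fall inside this window.
        return slice.reduce(0, +) / Double(slice.count)
    }
}
```

For example, `downsample([1, 2, 3, 4], windows: 2)` yields `[1.5, 3.5]`: each pair of raw samples becomes a single point on the chart.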
Making activity detection more intuitive
Detecting an exercise in the "Let's Go Play" app wasn't intuitive. There was no visual feedback to indicate when an activity was being tracked. At the end of the activity, a bubble would show up in the feed with the name of the activity and the data collected. The user could tap the bubble to see more details or view the map, but that wasn't possible while performing the exercise, because the phone sits in an armband and the arms and hands are most likely busy with the activity.
Through our research, we found that users were more interested in pre-assembled workouts. We built the workout detection experience around the different states (warm-up, cardio, cardio machine, exercise, cool-down). Each state required UI and voice feedback to guide the athlete through the workout. The UI was packed with different modes and activities, so I used color, illustration, and subtle variations in typography to make it less intimidating.
UI & VOICE FEEDBACK
To create a system that provided feedback at every stage (before, during, and after the workout), we came up with a UI approach and a non-UI approach. This new UX framework required exploring not only live-detection feedback in the UI, visualizing the state of the exercise, but also non-UI feedback using voice and phone vibration.
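The phase-plus-voice pairing described above can be sketched as a small Swift enum. The phase names come from the workout states mentioned earlier; the cue strings and the `voiceCue` property are illustrative assumptions, not the app's actual copy. In the real app, each cue string would be handed to a speech synthesizer (e.g. `AVSpeechSynthesizer`) and paired with a vibration, so the athlete gets feedback without looking at the screen.

```swift
// Hypothetical sketch: the workout phases and the voice cue spoken
// when each phase begins. Cue wording here is invented for illustration.
enum WorkoutPhase: CaseIterable {
    case warmUp, cardio, cardioMachine, exercise, coolDown

    // Non-UI feedback: the line read aloud when this phase starts.
    var voiceCue: String {
        switch self {
        case .warmUp:        return "Let's warm up."
        case .cardio:        return "Cardio time. Keep your pace."
        case .cardioMachine: return "Hop on the machine."
        case .exercise:      return "Next exercise. Watch your form."
        case .coolDown:      return "Great work. Cool down."
        }
    }
}
```

Modeling the phases as an enum keeps the UI state, the voice feedback, and the vibration triggers driven from one source of truth, which matches the state-driven approach described above.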
UI & MICRO-INTERACTIONS
The engineers behind this core motion piece are geniuses; what they were able to compute from the iPhone's native sensors was unbelievable. Our team lacked a UI engineer who could build all the different UI elements and interactions, so I jumped in and learned Objective-C and Swift to make the UI possible.
In the end, I created an extensive library of all the UI elements, so future building and iterating would be more modular and easier for the developers.
Designing for something that has not been created before requires thoroughness, a dedication to learning and experimentation, and many rounds of testing. Every design decision was thoroughly reviewed by the board members, and validating product decisions through testing was essential.
© johnatanuribe.com 2017. All rights reserved