Beyond Steps: How Breeze Puts Context Around Your Movement
By Nicholas Arcolano and Goss Nuzzo-Jones
Here at RunKeeper, we’ve been hard at work on the launch of our latest app—Breeze—a fitness companion that highlights the healthy decisions you’re already making for today, and helps you set smart goals to improve and maintain your fitness for tomorrow. Breeze offers insight and encouragement by uncovering the context around your activity choices, letting you know how small actions are adding up and how your habits are changing over time.
Breeze by RunKeeper uses Apple’s M7 technology to help you find opportunities for active living throughout your day.
Along the way we’ve learned a lot about how best to utilize the Apple M7 motion coprocessor found in each iPhone 5s—a core enabling technology for Breeze that collects, processes, and stores motion data such as step counts and estimates of activity type (e.g. “walking” or “running”). It operates continuously and at low power, retaining a 7-day cache of its measurements. For developers interested in what it looks like to tap into the M7, we’re going to focus on the activity detection feature, which provides some fascinating data and interesting challenges.
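As a rough sketch of what tapping into the M7 looks like, here is how one might query the chip’s seven-day history through CoreMotion’s CMMotionActivityManager. This is an illustration rather than Breeze’s actual code, and the CoreMotion calls only run on a device with an M7:

```swift
import Foundation
#if canImport(CoreMotion)
import CoreMotion
#endif

// The M7 retains roughly seven days of history, so that's the widest
// useful query window.
let now = Date()
let windowStart = now.addingTimeInterval(-7 * 24 * 60 * 60)

#if canImport(CoreMotion)
let manager = CMMotionActivityManager()
if CMMotionActivityManager.isActivityAvailable() {
    manager.queryActivityStarting(from: windowStart, to: now, to: .main) { activities, error in
        // Each sample carries boolean type flags (walking, running,
        // automotive, stationary) plus a reported confidence level.
        for activity in activities ?? [] {
            print(activity.startDate, activity.walking, activity.running,
                  activity.confidence.rawValue)
        }
    }
}
#endif
```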
Using activity detection to provide context
In Breeze, much of the experience revolves around contextualizing your movement as a user so that it’s insightful and digestible. Knowing how many steps you’ve taken throughout the day is a strong indication of your level of activity, but due to the limited context of step counts, these numbers become less helpful as days and weeks pass. By identifying the specific type of movement (most commonly “walking,” “running,” or “stationary,” i.e. staying still) and by aggregating this movement into discrete activities, we can communicate your motion data in a way that’s easy to recognize. (“Hey, that long walk was my trip to get lunch!”) We also tie in auxiliary data sources such as location and geographic data, though this must be done judiciously to protect battery life. By presenting a complete picture of each activity throughout a user’s day, Breeze can encourage positive behaviors as they happen.
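The aggregation step above can be sketched as follows: coalesce consecutive samples of the same type into discrete segments, so that a long walk becomes one recognizable activity instead of many raw readings. The type and function names here are our own simplified stand-ins, not Breeze’s actual implementation:

```swift
import Foundation

// A simplified activity sample, mirroring the kind of data the M7 reports.
enum ActivityType { case stationary, walking, running, automotive, unknown }

struct Sample {
    let time: TimeInterval   // seconds since some epoch
    let type: ActivityType
}

struct Segment {
    let type: ActivityType
    let start: TimeInterval
    let end: TimeInterval
    var duration: TimeInterval { end - start }
}

// Coalesce consecutive samples of the same type into discrete segments;
// the "long walk to lunch" becomes one segment instead of many samples.
func segments(from samples: [Sample]) -> [Segment] {
    var result: [Segment] = []
    for s in samples {
        if let last = result.last, last.type == s.type {
            // Extend the current segment to cover this sample.
            result[result.count - 1] = Segment(type: last.type,
                                               start: last.start, end: s.time)
        } else {
            result.append(Segment(type: s.type, start: s.time, end: s.time))
        }
    }
    return result
}
```

For example, two consecutive walking samples followed by a running sample collapse into one walking segment and one running segment.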
By breaking up your movement into discrete activities, Breeze helps you understand how individual choices contribute to your overall activity level.
Augmenting your motion data with GPS helps paint an even clearer picture of your day’s activity.
Once we have processed a user’s activity data, there are some natural places where it can be used to tackle other challenges. For instance, Breeze delivers helpful notifications throughout the day, and it’s important that they arrive at an appropriate time. We use activity detection to make sure that you’ve concluded a particular activity before we congratulate you on a job well done. This is especially important in cases where you might move through several activities (such as going for a run followed by a cooldown walk), and we want to make sure you’ve finished before we send a notification. We also use the activity data in our background tracking engine to dynamically adjust measurement accuracy to maximize battery life.
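One simple way to implement this kind of gating is to require a minimum stationary dwell after the last detected movement before firing the notification, so a run followed by a cooldown walk isn’t interrupted mid-stream. A minimal sketch, with invented names and a 5-minute threshold chosen for illustration:

```swift
import Foundation

enum ActivityType { case stationary, walking, running, automotive }

struct Sample {
    let time: TimeInterval
    let type: ActivityType
}

// Only notify once the user has been stationary for a minimum dwell time.
// (The function name and 5-minute default are illustrative assumptions.)
func activityConcluded(_ samples: [Sample], now: TimeInterval,
                       dwell: TimeInterval = 5 * 60) -> Bool {
    // Find the most recent non-stationary sample.
    guard let lastMoving = samples.last(where: { $0.type != .stationary }) else {
        return true  // no movement at all in the window
    }
    return now - lastMoving.time >= dwell
}
```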
The M7 also offers opportunities for user experience decisions outside of fitness tracking applications. Making push notifications or other user interactions contingent upon the activity type is certainly not unique to a fitness app: there are a variety of settings where one might want to limit interaction during periods of inactivity, such as when the phone has been stationary for a long time (indicating that the user is away from their phone or asleep). Similarly, motion data can be used to encourage a user to keep their phone on them when it’s important, and to alter the experience when we discover that it isn’t.
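A deferral check like that can be a one-liner over the time of the last detected movement; this sketch uses an invented name and a 2-hour threshold chosen purely for illustration:

```swift
import Foundation

// Sketch of gating pushes on recent motion: if the phone hasn't moved in
// hours, the user is likely asleep or away from it, so defer delivery.
// (Name and 2-hour threshold are illustrative assumptions.)
func shouldDeferPush(lastMovement: Date, now: Date,
                     idleThreshold: TimeInterval = 2 * 60 * 60) -> Bool {
    return now.timeIntervalSince(lastMovement) >= idleThreshold
}
```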
Limitations and challenges of M7 motion data
In our investigation of the data that comes out of the M7 activity API, we ran into a number of interesting challenges. The API identifies activities including “walking,” “running,” “automotive,” “stationary,” and “unknown,” and these can occur in any combination (e.g. “stationary” and “automotive” when driving in a car). In our experience the most common activities by far are stationary and walking, followed by running. “Automotive” is also common, but seems to be a catch-all for any high-acceleration or non-human motion that is not clearly understood. Unfortunately, this limitation means that the M7 currently can’t be used to reliably detect cycling, at least not without the addition of auxiliary data sources.
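Because those activity types arrive as boolean flags that can co-occur, an app usually needs to collapse them into a single label for display. A minimal sketch, where the precedence order is our own assumption rather than anything the API prescribes:

```swift
// Boolean activity flags that can co-occur, mirroring what the API reports
// (e.g. stationary and automotive at the same time while stopped at a light).
struct ActivityFlags {
    var walking = false, running = false
    var automotive = false, stationary = false
}

// Collapse co-occurring flags into one display label.
// The precedence order here is an assumption for illustration.
func label(for flags: ActivityFlags) -> String {
    if flags.running { return "running" }
    if flags.walking { return "walking" }
    if flags.automotive { return "automotive" }  // catch-all for non-human motion
    if flags.stationary { return "stationary" }
    return "unknown"
}
```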
From our studies we found that the “unknown” activity rarely occurred, and thus it is not a particularly useful sign of uncertainty in the activity measurements. Instead, alternative indicators of measurement quality must be used, such as outliers in the observations or rapid switching between activity types.
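One such indicator can be sketched as a transition rate: count how often the reported type changes per minute of elapsed time, and treat a high rate as a sign of noisy data. The function name and the idea of a fixed threshold are our own illustration, not a documented API feature:

```swift
import Foundation

enum ActivityType { case stationary, walking, running, automotive, unknown }

struct Sample {
    let time: TimeInterval
    let type: ActivityType
}

// A possible quality heuristic: type transitions per minute over a window.
// A high rate suggests noisy measurements even when no sample is marked
// "unknown". (Function name is our own.)
func transitionRate(_ samples: [Sample]) -> Double {
    guard samples.count > 1,
          let first = samples.first, let last = samples.last,
          last.time > first.time else { return 0 }
    var transitions = 0
    for (a, b) in zip(samples, samples.dropFirst()) where a.type != b.type {
        transitions += 1
    }
    return Double(transitions) / ((last.time - first.time) / 60)
}
```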
The API collects (and provides) data at a variable rate, with each measurement having a reported confidence. We found some significant variance in the sampling rate, depending on activity (and possibly depending on iOS version as well). We implemented a custom algorithm on top of the motion API to handle variances in sampling rates, as well as to smooth the confidence of individual measurements. Ultimately, our goal is to accumulate the individual activity measurements into high-confidence time boxes, allowing us to create a cohesive timeline of your motion and activity.
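A simplified stand-in for that accumulation step: bucket the variable-rate samples into fixed-width time boxes and give each box the activity with the largest confidence-weighted vote. The names and the 5-minute box width are invented for illustration, not Breeze’s actual parameters:

```swift
import Foundation

enum ActivityType: Hashable { case stationary, walking, running, automotive, unknown }

struct Sample {
    let time: TimeInterval
    let type: ActivityType
    let confidence: Double  // 0...1
}

// Accumulate variable-rate samples into fixed-width time boxes, assigning
// each box the activity with the largest confidence-weighted vote.
func timeBoxes(_ samples: [Sample], width: TimeInterval = 300) -> [Int: ActivityType] {
    // Tally confidence per activity type within each box.
    var votes: [Int: [ActivityType: Double]] = [:]
    for s in samples {
        let box = Int(s.time / width)
        votes[box, default: [:]][s.type, default: 0] += s.confidence
    }
    // Pick the winning type for each box.
    var result: [Int: ActivityType] = [:]
    for (box, tally) in votes {
        result[box] = tally.max(by: { $0.value < $1.value })!.key
    }
    return result
}
```

Even when individual samples disagree, the weighted vote lets a box settle on the dominant activity, which is what makes the final timeline cohesive.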
The M7 chip is a tremendous new tool, enabling lots of new fitness applications and enhancing others (check out auto-pause and cadence graphs in RunKeeper!). As apps have started to take advantage of the new APIs, much of that innovation has focused on step counting (not undeservedly, it’s great!). However, the M7 is not just a step counter. The rich data accessible through activity detection, while not without its challenges, is extraordinary. We’re very excited to continue building Breeze on the M7 and look forward to using its capabilities in new ways.
Learn more and get Breeze for your iPhone 5s at http://breezeapp.com!