Yesterday, Apple announced its new M7 motion coprocessor in the iPhone 5S, exposed to developers through the CoreMotion API. This new technology allows any developer to create apps that track your physical activity – similar in function to the Fitbit or Jawbone Up – using just the phone’s onboard sensors.
The announcement included a bit of technical jargon, a demo of a new activity-tracking app from Nike, and the intriguing promise of a new generation of health and fitness apps. Here’s a rundown of what we think is important, in Q&A format:
What makes this new? Doesn’t my phone already have an accelerometer?
While older iPhones contain motion-tracking sensors, those sensors are optimized for use while an app is actively running (e.g. to control a car in a racing game). Apps could not reliably access data while in the background, and continuous data acquisition drained the battery quickly – making detailed, passive activity and fitness tracking nearly impossible.
Why might Apple be doing this?
We can imagine a few reasons for Apple to do this:
- Experience: Beyond enabling new fitness tracking applications, making apps “motion aware” allows developers to create more compelling experiences. Chat apps could hold notifications if you are driving or running. Transportation apps could learn how fast you walk, giving more accurate suggestions. Game apps might tie real-world movement into gameplay.
- Pushing innovation in digital health: It’s no secret Apple is interested in health. By opening up this health data to the world of developers (and not just the few developers who have also developed a hardware activity tracker), Apple is asking the global talent pool to innovate on activity tracking and its applications.
- iWatch: Let’s say you wanted to release a wrist-based display that provides context-aware information to the user. Wouldn’t it be helpful to have a whole bunch of developers already trying to build context-aware apps?
What exactly do the API and M7 coprocessor do?
The API gives developers simple access to functions that:
- Update an app whenever a person starts walking, running or riding in a car, enabling the developer to change that app’s experience.
- Provide a history of the steps and movement taken by the user.
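To make the first capability concrete, here is a rough sketch of the “activity update” pattern such an API exposes – a handler that fires only when the detected activity changes. The class names and variance thresholds below are our own illustrative assumptions (written in Python for readability, not the actual iOS API):

```python
from statistics import pvariance

# Illustrative only: thresholds and labels are our guesses, not Apple's values.
def classify_activity(accel_magnitudes):
    """Guess a coarse activity from the variance of accelerometer magnitudes (in g)."""
    v = pvariance(accel_magnitudes)
    if v < 0.01:
        return "stationary"
    elif v < 0.5:
        return "walking"
    else:
        return "running"

class ActivityMonitor:
    """Invokes a handler only when the detected activity changes,
    mimicking an event-driven activity-update API."""
    def __init__(self, handler):
        self.handler = handler
        self.current = None

    def feed(self, accel_window):
        activity = classify_activity(accel_window)
        if activity != self.current:
            self.current = activity
            self.handler(activity)

# Usage: an app could, say, hold notifications while the user is moving.
events = []
monitor = ActivityMonitor(events.append)
monitor.feed([1.0, 1.0, 1.0, 1.0])        # near-constant ~1 g: stationary
monitor.feed([0.8, 1.3, 0.7, 1.4, 0.9])   # moderate variance: walking
monitor.feed([0.8, 1.2, 0.9, 1.3])        # still walking: no new event fired
```

The point of the pattern is that the app never polls the sensors itself; it just reacts to state changes.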
The M7 coprocessor is the hardware component that enables this API functionality. It works behind the scenes to intelligently manage the various sensors (accelerometer, gyroscope, magnetometer) in the background – saving power and providing a clean API to developers. While Apple has not released exactly how this works, based on our experience with sensors, we would guess it does things like:
- Throttle how fast the accelerometer and gyroscope sample data. When you are running, for example, these sensors need to sample much faster (and use much more battery) to know what’s happening than when you are sitting still.
- Perform energy-optimized, on-chip step counting.
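To illustrate what on-chip step counting might look like, here is a minimal sketch using simple peak detection on accelerometer magnitude. The threshold and refractory window are our guesses – the real coprocessor firmware is undisclosed and almost certainly more sophisticated:

```python
# Illustrative step counter: a "step" is a spike in accelerometer magnitude
# above a threshold, with a short refractory window so one footfall
# is not counted twice. All parameter values here are assumptions.
def count_steps(accel_magnitudes, threshold=1.2, refractory=3):
    steps = 0
    cooldown = 0  # samples remaining before another step may be counted
    for m in accel_magnitudes:
        if cooldown > 0:
            cooldown -= 1
        elif m > threshold:
            steps += 1
            cooldown = refractory
    return steps

# Four synthetic footfalls: each spike above 1.2 g counts exactly once.
samples = [1.0, 1.5, 1.4, 1.0, 1.0, 1.6, 1.0, 1.0, 1.0, 1.3, 1.0, 1.0, 1.0, 1.5, 1.0]
print(count_steps(samples))  # → 4
```

Running this kind of loop continuously on a tiny dedicated core, rather than waking the main CPU for every sample, is plausibly where the power savings come from.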
Does Android do this?
The standard Android API does not support this; however, specific device vendors are shipping their own in-phone health-tracking features. Samsung’s Galaxy S4, for example, includes the S Health app, which provides activity tracking using the phone’s on-board sensors.
Are existing wearable devices obsolete?
Yes and no. For people who don’t always carry their phones with them, a pure activity tracker can still provide value. For everyone else, the only advantage of a wrist-based device like the Nike FuelBand is that it gives you an at-a-glance indicator of how close you are to reaching your goal.
Is this enough value to justify buying, charging and wearing a separate activity tracker device? For most people, the answer is probably no.
We believe the next generation of wearables will need to create value not just when someone is moving, but also during all of the other moments in life. Similarly, they will have to look at physiological signals beyond movement to create valuable health and wellness outcomes.