Apple’s recent Worldwide Developers Conference (WWDC) keynote was packed with new features for iPhones, Macs and iPads — and like it has done pretty consistently since the debut of its original ‘Health’ app in 2014, those included updates focused on personal health and wellness. Often, it’s impossible to assess the impact of the work Apple is doing in these areas in the moment, and health-related feature announcements aren’t generally as splashy as user interface overhauls for Apple’s device software, for instance. But viewed as a whole, Apple has built probably the most powerful and accessible suite of personal health tools available to an individual, and it shows no signs of slowing down.
I spoke with Apple Vice President of Technology Kevin Lynch, who demonstrated the Apple Watch for the first time on the world stage during Apple’s September 2014 keynote event. Lynch has seen Apple Watch grow considerably during his time at the company, and he’s also been integral to the evolution of its health initiatives. He explained how it became what it is today and provided some hints as to where it might go in the future.
“It’s been amazing how much it’s evolved over time,” Lynch said, referring to the original Health app. “It actually started from Apple Watch, where we were capturing heart rate data for calorimetry activity, and [Activity] ring closure, and we needed a place to put the heart rate data. So we created the Health app as a place to store the data.”
From there, Lynch says, Apple realized that once it had this centralized location, it could develop a system to store other data types as well, and create an API and architecture that allowed developers to store related data there in a privacy-respecting way. In the early days, the Health app was still essentially a passive storehouse, providing users one touchpoint for various health-related information, but the company soon began thinking more about what else it could offer, and inspiration came from users.
User-guided evolution
A key turning point for Apple’s approach to health came when the company saw that users were doing more with features available via the Apple Watch than the company ever intended, Lynch said.
“We were showing people their heart rate, and you could look at it — we were using it for calorimetry,” he told me. “But some users actually were looking at their heart rate when they weren’t working out, and noticed it was high. […] They would go talk to their doctor, and the doctor would find a heart issue, and we would start getting letters about this. We still get letters today about our work in the space, which is amazing. But some of those early letters were clueing us into, ‘Wait, we could actually look for that ourselves in the background.’”
Apple then developed its high heart rate alert notifications, which can tell users when Apple Watch detects an unusually high heart rate while they aren’t moving around very much. High resting heart rates are good indicators of potential issues, and Apple later added notifications for unusually low heart rates as well. This was all data that was already available to the user, but Apple saw that it could surface it proactively, delivering the benefits already enjoyed by the most vigilant of Apple Watch owners.
From there, Apple started investing more heavily in finding other areas where it could glean similar insights. Rather than waiting for user behavior to identify new areas to explore (though Lynch says that’s still important to the team), the company started hiring more clinicians and medical researchers to chart the path forward for Health.
One example of where that led was announced at WWDC: Walking steadiness, a new metric that provides a simple score of how stable a Watch wearer’s average gait is.
“Walking steadiness […] actually came from fall detection,” Lynch said. “We were working on fall detection, and that’s been really awesome, but as we’re working on it, we’re brainstorming about how we can actually help people not fall, rather than just detecting that they fell. It’s pretty tricky to do that in the moment — there’s not much you could do once that’s actually happening.”
Lynch is referring to the fall detection feature that Apple introduced in 2018, which can use motion sensor data to detect what is likely a sudden and severe fall, and trigger emergency alerts to help get aid to the wearer. Apple was able to look at fall detection data for users in its 100,000-participant Heart and Movement study and combine that with walking metrics gathered from the iPhone in the same study.
“[The Heart and Movement study data] has been super helpful in some of the work here on machine learning,” Lynch said. “And then we did a focused study particularly around falls with walking steadiness, where we used [as] the source of truth a set of traditional measurements of walking steadiness: questionnaires, clinical observation, people meeting with doctors who observed them walking. Then, over the period of a year or two, as people in that study happened to fall, we were able to look at all their metrics ahead of that and understand, ‘What are the real predictors here of a potential fall?’ Then we were able to build a model around that.”
Apple actually accomplished something with its Walking Steadiness feature that is very rare in the health and fitness industry: It created a clinically validated, meaningful new metric around individual health. The Health app assigns a score of OK, Low, or Very Low, based on motion-sensing data passively gathered through the iPhone’s sensors (the phone is better able to detect these metrics since it’s positioned at your hip, Lynch says). Perhaps best of all, according to Lynch, the data is actually something people can use to make real improvements.
“The other compelling thing is that it’s actionable,” he said. “Some of these things can be harder to change. But with walking steadiness, there are exercises you can do to improve your walking steadiness. And so we built those into the Health app. You can watch the videos and do the exercises and work to improve your steadiness ahead of falling.”
Walking steadiness is perhaps the best expression yet of an area of increased focus for Apple when it comes to health: Turning the devices you carry with you into an ambient protector of sorts.
‘The Intelligent Guardian’
Apple’s Health app provides a good overview of the metrics that you might want to keep track of, and the company has steadily built out a library of vetted contextual information to make it easier to understand what you’re seeing (including updated lab result displays in iOS 15 that translate results into plain language). But one of the areas where it’s in a unique position to innovate is in proactive or preventative health. Lynch pointed out that the walking steadiness feature is a progression of those efforts.
“The walking steadiness work is in this category that we think of as ‘Intelligent Guardian’; it’s ‘How can we help watch out for people with data that they may not otherwise be even looking at or aware of, and let them know of potential changes,’” he said.
Lynch admits that the ‘Intelligent Guardian’ category wasn’t initially part of the plan for Apple Watch and health.
“In the early days, we weren’t as onto this line of thinking about ‘Intelligent Guardian’ as we are now,” he said. “Those early letters were really inspiring in terms of [pointing out that] we could actually let people know these things that are really meaningful.”
Those letters still inspire the team working on health features, helping to motivate them and validate their work. Lynch cites one Apple received from a man who had purchased an Apple Watch for his father. While out biking, the father was thrown from the bike and fell into a gully. Apple Watch detected the fall, and also that he was unconscious; it was luckily set up to notify both emergency contacts and 911, and did both. It also gave the son his father’s location on a map, and he rushed to the spot, only to find paramedics already on scene, loading his father (who ended up being okay) into an ambulance.
“Now, there’s a lot of thought that we put into ‘What are the other things that we could maybe sense about someone and let them know about?’” he said. “Our work on health very much involves this ongoing discussion of, from a clinical perspective, what is really interesting to know about somebody? Then from a science perspective, what do we think that we can sense about somebody? It’s this intersection of what might we be able to know and extract from the data that we gather, or are there new sensors that we might be able to build to get some data that could answer the questions that, clinically, we think would be really valid.”
A community approach to individual health
Another big change coming in iOS 15 for Apple Health is sharing. Apple will allow private, secure sharing of health data from users to loved ones and caregivers, including doctors. Users can choose exactly what health data to share, and can revoke access at any time. Apple itself never sees that data, and it’s encrypted locally on your device and then decrypted in local memory on the receiving device.
Health sharing is a natural extension of Apple’s ‘Intelligent Guardian’ work, since it returns personal health care to what it has always been — something managed by a network of connected individuals — now augmented by modern technology and sensing capabilities.
“The other person you’re watching out for can see that information, and be notified of changes, and you can see a little dashboard of the data,” Lynch said. “That’s going to be super helpful, we hope, for people, especially as you’re caring for an older adult, or caring for a partner of yours — that’s going to basically enable people to do that kind of mutual support on their health journey.”
Lynch points out that it’s not just about surfacing data that people might not otherwise find; it’s about opening the door for more communication around health between families and personal networks that might otherwise never happen.
“It enables conversations, where maybe people wouldn’t naturally talk about how much they’ve been walking lately or how their sleep’s been going,” he said. “If you’re up for sharing that, then it can be a conversation that maybe you otherwise wouldn’t have had. And it’s the same with doctor interactions; when you’re interacting with a doctor, they may not have a great view of your daily health. They have these little siloed views, like blood pressure at the time of a visit, so how can we help you tell your whole story when you’re talking with your doctor, and make that conversation even richer than it would have been otherwise, in a very quick period of time?”
Sharing with a doctor relies on integrating with a healthcare provider’s electronic health records (EHR) system, but Lynch notes that Apple is using interoperable standards to make this work, and a range of providers with large footprints in the U.S. are already lined up to participate at launch. Healthcare professionals using the feature will be able to see data users share with them in a web view within their EHR system, and while that data is only shared ephemerally, they can easily annotate and store specific readings in a patient’s permanent EHR should they need them to back up a diagnosis or course of treatment.
I asked Lynch about the state of EHR, which has had a tricky history in terms of adoption and interoperability. He said it’s true that Apple started working on this years ago, at a time when making it work would have required a much more massive technical undertaking on Apple’s side. Luckily, the industry in general has been trending toward adopting more open standards.
“There really has been a change in how you can connect to the EHRs in a more standardized way,” he said. “And certainly, we’ve been working with and across all of them to help get this mature.”
Benefits of a long-term user relationship
One of the biggest potential benefits for both users and their doctors of Apple Health is just how much data they can gain access to over time. Apple users who have stuck with the platform and used Health have now been tracking at least heart rate data for around seven years. That’s why another iOS 15 feature, Health Trends, has even more potential future impact.
“Trends is looking at longer-term changes, and starting to identify what may be statistically significant changes in those areas,” Lynch explained. “There are about 20 areas that we’re starting with, and if we start seeing those notable trends, then we can highlight those to you and show you how, for instance, your resting heart rate has changed now versus a year ago.”
This is once again the result of the Apple Heart and Movement study and the insights that the company continues to derive from that work. During the study, Apple focused a lot on fine-tuning insight delivery, to ensure that it was providing users with information they could use while avoiding overload or added confusion.
“When you work on something like trends, we don’t want to overwhelm people with insights, if you will, but if there’s something relevant to show, we don’t want to suppress that either. How do we tune that? So we did a lot of that tuning with the data that we have in the Heart and Movement study, and we’re excited to see how it goes with the public launch, and we’ll keep iterating on it. We think this is going to be a really powerful way for people to understand long-term changes.”
The future is fusion
Apple’s health story to date is largely one made up of realizations that the sensors the iPhone and Apple Watch carry, originally for other purposes, can provide tremendous insights into our health on a continuous basis — something that previously just hasn’t been possible or practical. That evolved into an intentional strategy of seeking out new sensor technologies to integrate into Watch and other Apple devices to address even more daily health concerns, and Apple continues to figure out new ways to use the sensors that are there already — the addition of respiratory rate measurement during sleep in iOS 15 is a prime example — while working on what new hardware comes next to do even more.
Perhaps one place to look for even more potential in terms of future health capabilities lies in sensor fusion, however. Walking steadiness is the result of not just the iPhone or the Apple Watch acting independently, but of what’s possible when the company can use them in combination. It’s another place where Apple’s tight integration of software and hardware gives it an edge, and that edge multiplies as Apple’s ecosystem of devices, and the sensors they carry, continues to grow.
I ended our interview by asking Lynch about what kind of possibilities might open up when you consider that AirPods, too, contain their own sensors and gather different data that could complement, in terms of health, the data monitored by the iPhone and Apple Watch.
“We already do sensor fusion across some devices today, and I think there’s all kinds of potential here,” he said.