Apple’s Rumored AI Health Coach Is a Privacy Minefield
The company’s obsession with health continues.
Apple is working on an AI-based wellness “coach” that could push its obsession with health to the next level, Bloomberg reports. The new service would leverage data the Apple Watch already collects to tailor coaching programs to individual customers.
Alongside this year’s software updates — and, assuming it happens, a VR headset launch — Apple is continuing to chip away at what CEO Tim Cook has described as the company’s greatest “contribution”: health services. Until now, that’s mostly been expressed in the Apple Watch’s variety of proactive health features and compact sensors, but this proposed service would change how Apple uses customers’ private data, and maybe even challenge the company’s own rigorous privacy principles.
“Quartz”
The new service is “designed to keep users motivated to exercise” and “improve eating habits,” among other aims, Bloomberg writes. Internally, Apple is referring to the service as “Quartz,” and it plans on opening up subscriptions for the program — because, yes, there’s a services angle to all of this — as early as next year.
Outside of Asia, it would be the first service from Apple of its kind, but as Bloomberg notes, the company did partner with Singapore’s government to launch a somewhat similar program called LumiHealth in 2020. LumiHealth integrates with Apple Watches and provides challenges, like walking for a certain amount of time or logging a certain number of mindfulness minutes, with rewards if you complete them successfully.
Apple’s existing health offerings, like the Health app, Fitness+, and the Apple Watch’s ability to monitor heart health, are largely passive, focused on noticing trends and storing and displaying health information clearly. Taking a more active role in a customer’s life by actually offering advice and guidance, even if it’s AI-written and generic, would be a big step up.
“AI” Doesn’t Happen Without People
Amid the current rush to incorporate generative AI like ChatGPT, and basically anything that can be described as “powered by artificial intelligence,” into existing apps and services, we often forget how much human input these systems depend on. Training data includes entire websites, even people’s personal blogs, frequently used without their knowledge. Swaths of gig workers label images for recognition systems for low pay. AIs don’t just work out of the gate, or keep working without maintenance.
Even Apple’s own Siri, which is limited in what it can actually do, relied on workers occasionally reviewing voice recordings to make sure requests were being understood properly. The revelation in a Guardian report that these workers sometimes heard confidential information in the process could be part of the reason Apple shifted to processing all audio on-device in iOS 15.
The complexity of health programs, and how much more personal health data is than a request to set a timer or perform what amounts to a web search, obviously introduces some risk, and likely the need for humans to touch the service at some point in the process. The health data Apple collects is encrypted both in transit and when stored in the Health app, and terms can change when other apps request access to Health data, but one assumes Apple would try to keep its own policies consistent.
It’s not clear if Apple will ultimately roll out this service, and its risks obviously pale in comparison to what other companies are currently willing to expose you to as they explore use cases for AI. But as the company embraces more of the AI-powered features its peers are rapidly iterating on, maintaining a privacy-first position is only going to get more complicated.