What My Hearing Aid Taught Me About the Future of Wearables
Posted by CENTURY HEARING
Article from The Atlantic
http://www.theatlantic.com/technology/archive/2015/02/what-my-hearing-aid-taught-me-about-the-future-of-wearables/385145/
As human-enhancing technology becomes tinier and more advanced, the price of progress is complexity.
I was into wearables before there was Google Glass, Apple Watch, or the Moto 360. I was into them before cheap devices told you how much you had walked, run, slept, or eaten. In fact, I’ve been into them for so long now that I’m not quite sure when it started. I think it was around when I was 5, in 1986.
The wearables I started wearing as a kid and still wear today are hearing aids—or, as my audiologist euphemistically calls them, "amplification devices." Although many people will never need hearing aids, today’s tech firms are making it likely that, someday soon, tiny computers will become extensions of your body, just as they have been part of mine for nearly 30 years. Thanks to that experience, I feel as though I’ve had a sneak peek into our wearable future—and I can make some predictions about what it will look like.
To be fair, hearing aids are quite different from the current array of consumer wearables. Hearing aids are medical devices designed to make up for a physical impairment. By contrast, consumer wearables like the Apple Watch are luxury items that let us read text messages and measure our fitness. This distinction has legal significance: The FDA tightly regulates any device that tries to either diagnose or treat a medical condition. That means certain features are unlikely to ever exist in a consumer wearable, unless Tim Cook wants to sell watches that require a doctor’s prescription.
But despite initial appearances, both medical and consumer wearables share a few important goals.
Broadly speaking, both types of wearables aim to fill gaps in human capacity. As Sara Hendren aptly put it, "all technology is assistive technology." While medical devices fill gaps created by disability or illness, consumer wearables fill gaps created by being human. For example, evolution hasn’t given us brain wi-fi, yet.
Both kinds of wearables also need to justify being attached to our bodies. This seems pretty obvious for hearing aids, but it is just as true for consumer devices. A wearable that serves as only a slightly more convenient screen for your phone is hardly reason for the average person to spend hundreds of dollars. Instead, wearables need to offer a feature that works best when in close contact with your body, like measuring heart rate or offering haptic feedback.
Also, both types of wearables need to embed themselves seamlessly into our experiences. If a wearable obstructs your experience of the real world, or is a distraction, it’s likely to end up on a shelf instead of your wrist. That’s not to say that they don’t take getting used to—even after a lifetime of wearing hearing aids, it still takes me several weeks to adjust to a new pair. But after that period, a well-made wearable should seem like a seamless extension of our bodies.
In my current role at the Berkman Center for Internet & Society at Harvard University, I’m lucky to be able to study something I care deeply about: technology’s impact on our lives. I’m sure my interest partly arises from how I’ve depended on technology for as long as I can remember. I don’t know with certainty how consumer wearables will develop, but what I do know is how much hearing aids have changed over the last 30 years. And I have some insight into what sensory-enhancing wearables—like hearing aids, and unlike data-recording wearables like pedometers—could someday become. Over the next few years, I expect that we will see four trends, rich in both opportunity and peril, shape the evolution of these wearables from toys into tools.
* * *
1. Wearables will create substitute realities.
In order to justify being part of our bodies, wearables need to offer something beyond an additional screen or input device. This means that sensory-enhancing wearables will need to mediate between reality and our experiences, altering our perception of the world around us.
For hearing aids, that role is enhancing sound, replacing the too-soft sounds of the real world with louder, more comprehensible ones. But modern hearing aids don’t simply make everything louder; instead, they provide a substitute soundscape tailored to my needs and my environment. When I go into a loud restaurant, the devices can identify the clatter of glasses and the din of conversation, and tune out those sounds, while tuning into the sound of a nearby voice. The result is an audio experience that is substantially different from the objective reality; the device replaces a reality that would be challenging with a substitute that is easier to understand and utilize.
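To make the idea concrete, here is a deliberately simplified sketch, in Python, of what environment-dependent amplification might look like. The environment labels, band layout, and gain values are all invented for illustration and bear no relation to any manufacturer’s actual algorithms:

    import numpy as np

    # Hypothetical per-band gain profiles (in dB), keyed by detected
    # environment. Real hearing aids use many more bands and proprietary
    # tuning; these numbers are invented for the example.
    GAIN_PROFILES_DB = {
        "quiet_room":      [20, 20, 20, 20],    # amplify everything evenly
        "loud_restaurant": [-10, 25, 25, -15],  # cut rumble and clatter,
                                                # boost the speech bands
    }

    BAND_EDGES_HZ = [125, 500, 2000, 4000, 8000]  # four coarse bands

    def apply_profile(frame, sample_rate, environment):
        """Apply environment-dependent gain to one audio frame, band by band."""
        spectrum = np.fft.rfft(frame)
        freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)
        for (lo, hi), gain_db in zip(
            zip(BAND_EDGES_HZ, BAND_EDGES_HZ[1:]),
            GAIN_PROFILES_DB[environment],
        ):
            # Convert dB to linear amplitude and scale the band.
            spectrum[(freqs >= lo) & (freqs < hi)] *= 10 ** (gain_db / 20)
        return np.fft.irfft(spectrum, n=len(frame))

The essential point survives even in this toy: the output is not the input made louder, but a reconstruction shaped by whichever profile the device believes matches the moment.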
Just as hearing aids replace one soundscape with another, future wearables will be able to alter the way we experience the world. Microsoft’s recently announced HoloLens, for example, will be able to help a homeowner perform their own electrical repairs by projecting instructions, visual aids, and even expert advice right onto an exposed electrical outlet. In that way, future wearables will replace traditional sensory or communications experiences with ones that are richer and deeper.
2. Wearables will be ruled by algorithms.
The process of substituting realities means that our perceptions of the world around us will become increasingly mediated by algorithms that we do not control or even understand. The world I hear through my hearing aids is a world interpreted and translated through millions of calculations a second. Algorithms determine whether a sound in the distance is the whir of a refrigerator compressor or the whisper of a friend. If they work correctly, I may not hear the compressor at all. But if they work incorrectly, I may not hear my friend at all.
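A toy version of that judgment call, hedged heavily: one crude cue separating a compressor’s steady hum from a voice is how much the loudness envelope fluctuates over time. The window size and threshold below are arbitrary stand-ins for the far more sophisticated classifiers real devices use:

    import numpy as np

    def classify_sound(frame, window=256, threshold=0.5):
        """Crudely label audio as fluctuating speech or steady machine noise.

        Speech loudness rises and falls syllable by syllable; a compressor's
        hum is nearly constant. We measure the variability of short-term
        energy and compare it against an arbitrary threshold.
        """
        n = len(frame) // window
        chunks = frame[: n * window].reshape(n, window)
        energies = np.sqrt((chunks ** 2).mean(axis=1))  # short-term RMS
        variability = energies.std() / (energies.mean() + 1e-12)
        return "speech" if variability > threshold else "steady_noise"

A misclassification here is exactly the failure described above: label the whisper "steady_noise" and the device will dutifully suppress your friend.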
Medical devices are already increasingly ruled by complex algorithms. Just as hearing aid algorithms determine what sounds are amplified and what sounds are muted, pacemaker algorithms determine when to deliver an electrical pulse to the heart, and bionic pancreas algorithms determine when to deliver additional insulin. In these examples, the algorithms don’t just shape the perception of reality—they make life-saving decisions.
The influence of algorithms is nothing new; they already shape much of what we perceive online. But when they are embedded in wearable devices that mediate the physical world around us, their impact becomes far more intimate. For example, a device that reads facial expressions to assess moods could affect how you approach your boss, or whether you think your significant other is mad at you. Or a device that hides stressful visual stimuli could remove an annoying ad on your subway commute, but it could just as easily remove a helpful PSA. As wearables do more to reshape our realities, the way we perceive the world will become increasingly shaped by the algorithms that govern those devices.
Not only does that make our perceptions increasingly dependent on algorithms, but it makes our perceptions increasingly dependent on the people who make those algorithms. My hearing aids use increasingly complex algorithms. Although a trained audiologist can adjust the devices to my unique needs, as the algorithms become more complex, the opportunities for customization become paradoxically more limited. My hearing aids, for example, have 20 independently adjustable audio channels, and while my audiologist can adjust each one, he usually adjusts them in groups of 6 or 7 channels at a time. If consumer wearables don’t offer significant opportunities for customization (or access to an expert who can help customize the experience), users will be left even more dependent on the default algorithms.
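As a minimal sketch of what that group-wise fitting might look like (the group boundaries and gain values here are invented; real fitting software is far richer):

    # Twenty per-channel gains, adjusted together in coarse groups,
    # mirroring the fitting process described above.
    channel_gains_db = [15.0] * 20

    GROUPS = {                  # hypothetical group boundaries
        "low":  range(0, 7),
        "mid":  range(7, 14),
        "high": range(14, 20),
    }

    def adjust_group(group, delta_db):
        """Raise or lower every channel in a named group by delta_db."""
        for ch in GROUPS[group]:
            channel_gains_db[ch] += delta_db

    adjust_group("mid", 3.0)    # e.g., boost the speech-frequency group

The trade-off is visible even here: coarse groups make adjustment tractable, but the finer-grained behavior within each group stays wherever the defaults put it.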
3. Wearables will fail invisibly.
The more we rely on wearables to interpret the outside world for us, the more critical it becomes for those devices to communicate their failures. And the more seamless the experience becomes, the harder it is to know when a device isn’t working as intended.
In some cases failures are obvious: If my hearing aid doesn’t turn on, then I can take steps to address the issue. However, in other cases failure is less obvious. At a meeting a few months ago, I was sitting near a loud air conditioner that made it difficult to hear the people across the table. I knew my hearing aids should reduce the background noise, but because the aids produce sounds using complex, personalized algorithms, I had no way of knowing whether the hearing aids were malfunctioning or whether the air conditioner was just too loud. The more personalized the device and the subjective experience it creates, the harder it is to know when things are going wrong.
Future wearables will likely do incredibly complex things, and when the results are unexpected we may simply trust that the device knows best, privy to some secret knowledge or power. But sometimes it will just be wrong. Telling whether what we see or hear is a device working properly, the output of an inscrutable algorithm, or an outright failure may be quite challenging.
4. Wearables will record everything.
If failures are hard to detect, the solution is just as challenging: pervasive recording. The more the behavior of wearables is dependent on context and inputs, the more that troubleshooting requires data collection. After a plane crash, one of the first things that investigators look for is the "black box" flight data recorder, because it is often impossible to reconstruct what went wrong without also knowing things like the airspeed, the throttle, and the position of the flaps and gears. Troubleshooting wearables presents many of the same challenges.
When I go to my audiologist, I can tell him that I didn’t think my hearing aids worked correctly at a noisy restaurant a few weeks ago. But without a record of the noisy environment and the sound I heard from the aids, he can only guess about what happened. For the user, this trial-and-error form of troubleshooting can be frustrating, especially when it involves multiple trips to the audiologist for readjustments.
Up until recently, the idea of storing gigabytes of data on a hearing aid would have been absurd. The devices didn’t have sufficient storage and persistent recording would sap the already-limited battery life. But the newest hearing aids now record certain types of data for diagnostic purposes. If I raise or lower the volume on the aids, the device records information about the new setting and lets my audiologist download the data at a later date. As data storage becomes cheaper and power efficiency improves, the collection of additional data could help the device be better fitted to my needs and enable better troubleshooting.
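A minimal sketch of that kind of on-device diagnostic logging, assuming a simple event record (the field names are invented for the example):

    import json
    import time

    event_log = []

    def log_volume_change(old_db, new_db):
        """Record a user volume adjustment with a timestamp."""
        event_log.append({
            "time": time.time(),
            "event": "volume_change",
            "from_db": old_db,
            "to_db": new_db,
        })

    def export_for_audiologist():
        """Serialize the log for download at the next fitting appointment."""
        return json.dumps(event_log)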
The same drive toward additional data collection will happen in consumer wearables as well. How do you know if your mood-identifying glasses are working correctly? That requires knowing both the input (the image of someone’s face or their voice) and the output (the identified mood). It would be easy to store still images of faces for diagnostic purposes and troubleshooting, and just as easy to upload them to the device manufacturer to help improve their algorithms.
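In code, the diagnostic record such glasses would need might look like the hypothetical structure below; note that making failures reconstructable means, by construction, retaining other people’s faces:

    from dataclasses import dataclass

    @dataclass
    class MoodReading:
        face_image: bytes      # the raw input frame
        identified_mood: str   # the algorithm's output, e.g. "annoyed"
        confidence: float      # model confidence in [0, 1]

    diagnostic_buffer: list[MoodReading] = []

    def record(face_image, mood, confidence):
        # Store the input alongside the output so a bad call can be
        # replayed later -- or, as noted above, uploaded to the
        # manufacturer to retrain its models.
        diagnostic_buffer.append(MoodReading(face_image, mood, confidence))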
In some cases, storage may not even be necessary as consumer wearables might transmit everything in real time to centralized servers for processing. With limited processing power and battery life, wearables might offload computationally intensive processing to centralized computers. This is what Apple does (or used to do) with Siri, where at least some analysis of your voice request is processed on remote Nuance servers. Although this enables more complex analysis than small wearables might be able to do otherwise, it also creates greater privacy concerns as more data is transmitted to, and retained by, others.
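The offloading pattern itself is simple; a hypothetical wearable client might do little more than the following (the endpoint URL and response field are invented for illustration, not Apple’s or Nuance’s actual interface):

    import requests

    def process_remotely(audio_bytes):
        """Ship raw audio to a remote server and return its analysis."""
        response = requests.post(
            "https://example.com/api/analyze",   # hypothetical service
            data=audio_bytes,
            headers={"Content-Type": "application/octet-stream"},
        )
        response.raise_for_status()
        return response.json()["transcript"]     # hypothetical response field

Every such call is also a disclosure: the audio leaves the device, and whatever happens to it afterward is governed by the server’s policies, not the wearer’s.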
* * *
When I got my first pair of hearing aids, they were large and analog, and my audiologist made adjustments to the sound outputs using a small screwdriver. My hearing aids today are so small they can fit invisibly in the ear canal, and my audiologist adjusts them wirelessly from a computer. The pace of progress has been astounding, and I have no doubt that progress has changed my life for the better in significant and concrete ways.
The price of progress, however, is complexity. Older hearing aids had limited customization, altered sounds in very basic and predictable ways, failed in obvious ways, and didn’t collect data. Now things are different. The endless customization available in new aids creates more opportunities for mistakes. The complex algorithms make it harder to diagnose problems. The total substitution of experience stifles attempts to identify errors. And increasing data collection means hearing aids may soon have to grapple with thorny issues of privacy.
The same holds true for consumer wearables. If they follow the path of hearing aids, future generations of wearables will be more immersive, more complex, more difficult to troubleshoot, and more pervasive in their data collection. As long as we see wearables as toys or luxury goods, it is easy to write off these challenges. But there is a real opportunity for wearables to improve the lives of many in substantial ways just as they’ve improved my life since 1986. To realize those improvements, we cannot ignore these trends, and we must take wearables seriously as the indispensable tools they will soon become.