March 03, 2019
In November 2018, an enraptured audience sat inside Dolby Laboratories’ experimental cinema in San Francisco. The film on view was Free Solo, the nail-biting documentary about rock climber Alex Honnold, who attempts to scale the sheer, roughly 900-metre-high granite face of El Capitan in Yosemite National Park without ropes or safety equipment.
On the floor of the cinema, an array of tiny sensors was tracking the audience’s breath. “When we experience stress, joy, or heightened emotions, the chemical composition of our breath changes,” explains Dolby’s chief scientist, Poppy Crum, who was in charge of the experiment. “Certain things, like if my muscles tense, or my heart speeds up, will effect changes in how we breathe, and what we breathe.” The sensors were there to detect exhaled CO2, a classic biomarker of the bodily processes associated with heightened emotion.
Crum wasn’t in the cinema at the time, but when she received the results of the experiment it was obvious to her where the most nerve-wracking points in Honnold’s climb had been. “You could see where he had decided to abandon a climb and where he continued. All these things show up as peaks and troughs in the CO2,” Crum says. “I’m a rock climber and it was very clear: it was as if I’d been there in the audience and we’d all done the climb with him.”
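The piece doesn’t describe Dolby’s analysis pipeline, but the basic idea – picking out moments when exhaled CO2 climbs well above its slow-moving baseline – can be sketched in a few lines. The sampling rate, window sizes and the three-sigma threshold below are illustrative assumptions, not details from the experiment.

```python
# Minimal sketch: locating "peaks and troughs" in an exhaled-CO2 time series.
# Sampling rate, baseline window and threshold are assumed for illustration.
import numpy as np
from scipy.ndimage import uniform_filter1d
from scipy.signal import find_peaks

def emotional_peaks(co2_ppm, fs_hz=1.0, baseline_s=600.0, min_separation_s=60.0):
    """Indices where exhaled CO2 rises well above its slow-moving baseline."""
    co2_ppm = np.asarray(co2_ppm, dtype=float)
    # Slow baseline: a ~10-minute moving average absorbs drift in ambient CO2.
    baseline = uniform_filter1d(co2_ppm, size=max(1, int(baseline_s * fs_hz)), mode="nearest")
    excursion = co2_ppm - baseline
    # Keep peaks that clearly stand out from the residual variability (3-sigma rule),
    # spaced at least a minute apart so one tense scene registers once.
    peaks, _ = find_peaks(excursion,
                          height=3 * np.std(excursion),
                          distance=int(min_separation_s * fs_hz))
    return peaks

# Synthetic two-hour recording at 1 Hz: a slow drift plus two tense moments.
t = np.arange(7200)
co2 = 800 + 0.01 * t \
      + 60 * np.exp(-((t - 1800) / 90) ** 2) \
      + 80 * np.exp(-((t - 5400) / 90) ** 2)
print(emotional_peaks(co2) / 60)   # roughly [30, 90] minutes into the film
```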
This experiment was just one of many that Crum carries out at Dolby Laboratories, where she leads scientific development, integrating new sensing and data science to develop technologies that enhance human sensory experiences. Part of her team’s work is to understand how human physiology interacts with technology and the ways this will change in the future. She measures that through biofeedback from the body – everything from exhaled CO2, to speech patterns, tiny expressions on the face, the size of pupils in the eye and skin temperature at precise points on the body.
Crum is adamant that mining this biological data can markedly improve how we diagnose and treat disease. Despite the current climate of wariness around sharing personal data, she thinks that our ability to detect and amass information on these subtle physiological cues could revolutionise personalised medicine. “I believe we’re living in an unprecedented time where technology is going to know more about our mental and physical wellness than most clinical visits,” Crum says.
She’s joined by a growing number of scientists, companies and investors who are grasping the latent power of the biological personal data that each of us is unintentionally emitting all the time. Our patterns of speech, our walking gait and the compounds detectable in our breath carry cues about our likelihood of developing a disease in later life. The way our pupils dilate can reveal elements of our emotional state, as well as how hard we’re trying to understand something. The hormones released onto the surface of our skin can indicate our stress levels. Even the way we tap and swipe at our smartphone screens can be read as signatures of our cognitive state.
Sensors are increasingly ubiquitous in our lives – from wearables to the tracking technologies in our general environments – and they are the portals through which this physical data can be gathered to build up profiles of our personal health and wellbeing. Crum has a name for this emerging period in our digital history. She calls it “the era of the empath”, a time when the preponderance of sensors and the biological data they harvest will inform a growing awareness that individuals experience the world very differently – not just from an emotional or perceptual standpoint, but crucially in terms of our health, too.
Crum herself has had a longstanding interest in how variably humans perceive and experience reality. “Everything I have always thought about has been the uniqueness and the malleability of each of our experiences of the world, and our senses, and how those come to rise through our physiology,” she explains. A background in experimental psychology and neurophysiology brought her to Stanford University where, as an adjunct professor, she advises on research that uses biofeedback to create personalised virtual and augmented reality games.
Often, these are used for rehabilitation – like a recent one where a computer scientist and a basketballer teamed up to develop a VR balance training game aimed at Parkinson’s patients. In their project, biological cues like gait and cognitive load were measured via sensors, then looped back into the game so it could be tailored to the individual’s needs.
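The article gives only the outline of that feedback loop – sensors measure gait and cognitive load, and the game adapts – so the following is a minimal sketch under assumed inputs: two normalised readings nudge a couple of hypothetical difficulty settings up or down.

```python
# Sketch of a biofeedback loop for an adaptive balance game. The target bands
# and adjustment rule are illustrative assumptions, not the project's design.
from dataclasses import dataclass

@dataclass
class GameSettings:
    platform_sway: float = 0.2   # how much the virtual platform wobbles (0..1)
    distractors: int = 1         # number of on-screen distractions

def adapt(settings: GameSettings, gait_stability: float, cognitive_load: float) -> GameSettings:
    """Nudge difficulty so the player stays challenged but not overwhelmed.

    gait_stability and cognitive_load are assumed to be normalised to 0..1
    by upstream sensor processing (e.g. a pressure plate and pupillometry).
    """
    if cognitive_load > 0.8 or gait_stability < 0.3:
        # Player is struggling: ease off.
        settings.platform_sway = max(0.0, settings.platform_sway - 0.05)
        settings.distractors = max(0, settings.distractors - 1)
    elif cognitive_load < 0.4 and gait_stability > 0.7:
        # Player is coasting: raise the challenge.
        settings.platform_sway = min(1.0, settings.platform_sway + 0.05)
        settings.distractors += 1
    return settings
```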
For the past eight-and-a-half years at Dolby, Crum has channelled her passion into exploring how individuals experience technology, and how new inventions might take these discrete experiences into account. “Currently, technology behaves in a one-size-fits-all way,” she says. “If you want to build successful technology, you want to have democratised benefits. You do actually have to take the internal experience into [account in] how a device [works].”
Crum’s work leverages the full depth and richness of that biological data. For instance, pupillometry – the study of what causes our pupils to dilate and contract – can reveal several different streams of information about the user’s state. “The pupil obviously has a light reflex. But it also responds to engagement, and it responds to cognitive load,” Crum says. (Pupils expand when we’re working harder to understand information.)
“So you have three different interpretations from just one signature.” The capacity to track pupils will soon be commonplace in every pair of smart glasses on the market; pair that with ubiquitous microphones on our devices and in our general environment, plus AI-driven audio analysis, and you’re able to grasp the context that underlies changes in the eye – whether it’s an information-rich conversation or a stressful interaction, Crum says.
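One way to read “three interpretations from one signature” is as a signal-separation problem: if you can model the part of pupil diameter explained by ambient light, whatever is left over is a candidate marker of engagement and cognitive load. The simple log-luminance regression below is an assumed model for illustration, not a description of any product’s method.

```python
# Sketch: strip the predictable light reflex out of a pupil trace, leaving a
# residual that can track effort and engagement. The linear model is assumed.
import numpy as np

def cognitive_load_proxy(pupil_mm, luminance):
    """Residual pupil diameter after regressing out ambient luminance."""
    pupil_mm = np.asarray(pupil_mm, float)
    luminance = np.asarray(luminance, float)
    # Fit pupil ~ a + b*log(luminance): brighter scenes predictably shrink the pupil.
    X = np.column_stack([np.ones_like(luminance), np.log(luminance + 1e-6)])
    coeffs, *_ = np.linalg.lstsq(X, pupil_mm, rcond=None)
    light_reflex = X @ coeffs
    # What remains rises with effort and engagement (the cognitive component).
    return pupil_mm - light_reflex
```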
Combining these rich data streams to get the best read on someone’s internal state is a principle that also applies to disease diagnostics. “There’s an opportunity for our technology to better enhance our cognitive intent, what we’re trying to achieve in our spaces, in our rehabilitation, in our mental efficacy and workplace – all through a better amalgamated signature of our personal data,” Crum says. For those hesitant about the prospects of relinquishing their biological information to support this new vision of healthcare, Crum has a salient reminder. “We are not changing as humans. What’s changing is the ubiquity of sensors in our environment.”
Owlstone Medical is just one example of a company that’s bringing more sensors into the healthcare space. Based in Cambridge, Owlstone has developed breathalysers that capture biological information from the air we exhale. Their invention comprises a plastic disposable mask, attached at the base to four metal tubes lined with sorbents that latch onto volatile organic compounds that we breathe out.
These compounds arise as byproducts of the body’s metabolic processes, the activities of the gut microbiome, and the breakdown of foods and medicines we consume. They can also come from chemicals we’re absorbing from the surrounding environment. These enter the blood, then transition from there into the airways (once every minute, all the body’s blood passes through the lungs). A single, minute-long breath sample can therefore be taken as a direct read on the body’s processes and wellbeing.
“Breath as a phenotype can actually give you a very broad snapshot on the health of an individual,” says Chris Claxton, head of investor relations at Owlstone Medical. Owlstone analyses the contents of the metal tubes separately in its Cambridge laboratory – currently the world’s only commercial breath biopsy lab. But the device’s portable nature and non-invasive technique mean it can be used anywhere, whether it’s a doctor’s office, a hospital or at home.
Owlstone’s diagnostic focus is to identify precise biomarkers of disease in breath. “We’re in that phase of linking specific chemicals to specific conditions, underlying disease states, or underlying biology,” Claxton says. Currently, they’re running the world’s largest-ever breath biomarker trial – funded by the NHS and made up of 4,000 patients – to identify markers for the early detection of lung cancer.
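Computationally, “linking specific chemicals to specific conditions” often boils down to supervised learning over per-sample compound concentrations, followed by validation of the compounds the model leans on. The sketch below uses synthetic data and a plain logistic regression purely to illustrate the shape of that workflow; it is not Owlstone’s pipeline, and the trial’s actual methods aren’t described here.

```python
# Illustrative biomarker-discovery workflow on synthetic breath data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_patients, n_compounds = 400, 50

# Synthetic breath profiles: log-concentrations of 50 volatile organic compounds.
X = rng.normal(size=(n_patients, n_compounds))
y = rng.integers(0, 2, size=n_patients)          # 1 = disease, 0 = healthy
X[y == 1, :3] += 0.8                             # pretend three compounds are real markers

model = LogisticRegression(max_iter=1000)
print("cross-validated AUC:", cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean())

# The largest coefficients point at candidate biomarker compounds to validate.
model.fit(X, y)
print("top candidate compounds:", np.argsort(-np.abs(model.coef_[0]))[:3])
```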
They’re also identifying breath-based biomarkers that can reveal how drugs work in the body, to help pharmaceutical companies create more effective, targeted treatments. On top of this, the company is building tailored breath tests, such as one for the detection of chemical exposures – potentially useful for people with jobs that involve hazardous chemicals, Claxton explains. This year, the company intends to open new laboratories in China and the US. “Our goal as an organisation is to enable breath as a diagnostic sample type that would sit alongside blood or urine,” Claxton says.
Other companies are turning to more unconventional sources of information to diagnose disease. Research shows that voice, in particular, can be a powerful predictive tool for early signs of Alzheimer’s, because purpose-built technologies can detect altered speech patterns that are a precursor to the disease. Similarly, IBM has paired up with university researchers to develop artificially intelligent programmes that can analyse syntax and semantics in speech, to predict the onset of psychosis in patients. Meanwhile, companies like Israel-based VoiceSense are building analytical technologies to identify unique signatures in the pace, intonation, and breathing patterns of our speech, to draw up a profile of an individual’s emotional state. The company says its approach could help to predict the likelihood that someone is experiencing depression, post-traumatic stress disorder or substance abuse.
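The raw material for these systems is a set of measurable speech signatures – how much of the time someone is talking, how long their pauses run, how their pitch moves. As a rough illustration of the simplest of those, the sketch below derives voicing and pause statistics from short-time energy; the frame size and threshold are assumptions, and real products use far richer acoustic and linguistic features.

```python
# Sketch: pace and pausing features from short-time energy of an audio signal.
import numpy as np

def speech_pace_features(audio, sr=16000, frame_ms=25):
    """Fraction of time spent speaking and average pause length, in seconds."""
    audio = np.asarray(audio, float)
    frame = int(sr * frame_ms / 1000)
    n = len(audio) // frame
    # Short-time energy per frame; frames above a relative threshold count as speech.
    energy = np.square(audio[: n * frame].reshape(n, frame)).mean(axis=1)
    voiced = energy > 0.1 * energy.max()
    # Lengths of silent runs between voiced stretches approximate pauses.
    pause_lengths, run = [], 0
    for v in voiced:
        if not v:
            run += 1
        elif run:
            pause_lengths.append(run * frame_ms / 1000)
            run = 0
    return {
        "voiced_fraction": float(voiced.mean()),
        "mean_pause_s": float(np.mean(pause_lengths)) if pause_lengths else 0.0,
    }
```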
Crum adds that some researchers are exploring how certain types of disruptions in breathing and speech may be linked to heart disease risk. Such subtle cues wouldn’t necessarily be obvious during the course of a regular clinical visit. “These are things that may be detectable by a machine much more than a human. The thing that’s important here goes back to longitudinal data, the technology having an understanding of the user,” Crum says. “These technologies can help us see these things, and potentially intervene, much earlier.”
It’s something that, since 2012, Dina Katabi, a professor of computer science at the Massachusetts Institute of Technology, has been working to achieve. Using a sensor encapsulated in a white, rectangular, wall-mounted box, Katabi monitors patients’ long-term health by remotely tracking their breathing, motion, gait, sleeping patterns and even their heartbeats. “We thought, wouldn’t it be cool if you could just have something like your Wi-Fi box at home to collect health information continuously, as you’re living your life, without having to interfere or ask you to do anything special?” Katabi says.
Her invention, commercialised by her start-up Emerald, is targeted at Alzheimer’s and Parkinson’s patients. Emerald works by beaming low-intensity electromagnetic waves into a space and then sensing how these signals change as people move in a room. Then, machine learning is applied to extract different types of physiological signals embedded in that data. Measured against the patients’ digital profiles, these readings can reveal whether there has been a degradation or an improvement in gait, a change in breathing, heartbeat or sleeping patterns, Katabi explains. That’s particularly useful for tracking early signs of decline caused by disease, or, on the other hand, the potentially positive effects of new treatments on patients.
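Katabi’s description hides a lot of machinery – turning radio reflections into motion traces, then layering machine learning on top – but one fragment is easy to illustrate: once you have a chest-displacement-like trace, breathing rate is the dominant slow oscillation in it. Everything in the sketch below (sampling rate, band limits, the synthetic trace) is an assumption for illustration, not Emerald’s implementation.

```python
# Sketch: estimate breathing rate from a displacement-like trace via its spectrum.
import numpy as np

def breathing_rate_bpm(displacement, fs_hz=10.0):
    """Dominant frequency in the 0.1-0.5 Hz band (6-30 breaths per minute)."""
    x = np.asarray(displacement, float)
    x = x - x.mean()
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs_hz)
    band = (freqs >= 0.1) & (freqs <= 0.5)
    return float(freqs[band][np.argmax(spectrum[band])] * 60)

# Synthetic one-minute trace: 15 breaths/min (0.25 Hz) plus body-motion noise.
t = np.arange(0, 60, 0.1)
trace = np.sin(2 * np.pi * 0.25 * t) + 0.3 * np.random.default_rng(1).normal(size=t.size)
print(breathing_rate_bpm(trace))   # approximately 15
```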
Emerald is currently working with pharmaceutical and biotech companies in clinical trials to test the efficacy of certain drugs. Katabi says she’s also hoping to enhance the detection system to pick up on even subtler physiological cues, such as changes in blood pressure. And she wants to develop the machine learning facet of the technology to make it react smartly to the signals it detects. “[It] can be used to detect and understand higher level information, not just monitoring the signs and the measurements but really being able to understand the meaning of those measurements.” That could lead to the tech intuiting from a patient’s gait and sleeping patterns, for instance, that their drug dosage needs to be increased, or changed – and then alerting a caregiver to do so.
Katabi also sees the potential of this technology to bridge the gap between hospital and home care for ageing or very ill patients, by enabling more people to live independently at home while their health is being reliably tracked. “This convergence is really where the future is,” Katabi says.
For many people, the idea of filling our spaces with technologies that will incessantly mine our biological data is a fraught one – especially at a time when data privacy is increasingly paramount. And despite knowing the benefits that abundant streams of physiological data can bring, Crum is cautious, too. “I don’t try to convince people it’s a good idea. I want people to have a richer dialogue about it,” she says.
But for Crum, the notion that ubiquitous technology can improve care and quality of life resonates on a personal level. Within a relatively short space of time a few years ago, three of her closest relatives passed away, she says. “Technology was very relevant to their end of life, relevant to how I interacted with them, and my success at trying to be a better caretaker,” she explains. With the benefit of hindsight, and considering our growing capacity to tailor healthcare to patients’ biological cues, Crum recognises the advantages this expanding field of research could bring.
She believes that if anything is going to push us to grapple with the practical and philosophical problems that inevitably arise when we talk about personal data, it’s the chance for it to revolutionise healthcare. “This is where what’s to be gained is so worth the challenge, that people will invest in solving these problems,” Crum says.
This article was originally published by WIRED: https://www.wired.co.uk/article/breath-analysis-dolby-owlstone