Free apps marketed to people with depression or who want to quit smoking are hemorrhaging user data to third parties such as Facebook and Google, but often do not disclose this in their privacy policies, a recent study reports. The study is the latest to highlight the potential risks of entrusting sensitive health information to our phones.
Though most of the easy-to-find depression and smoking cessation apps in the Android and iOS stores share data, only a fraction of them actually disclose it. The findings add to a growing list of troubling revelations about what apps are doing with the health information we entrust to them. A Wall Street Journal investigation, for example, recently revealed that the period-tracking app Flo shared users' period dates and pregnancy plans with Facebook. And earlier studies have reported health apps with security flaws or that shared data with advertisers and analytics companies.
In the new study, published on Friday in the journal JAMA Network Open, researchers searched for apps using the keywords "depression" and "quit smoking." They then downloaded the apps and checked whether the data they entered was being shared by intercepting each app's traffic. Much of the data the apps shared did not immediately identify users and was not even strictly medical. Still, 33 of the 36 apps shared information that could give advertisers or data analytics companies insight into people's digital behavior. And a few shared very sensitive information, such as health diary entries, self-reports about substance use, and usernames.
Those kinds of details, combined with an app's name or purpose, could tell third parties something about a person's mental health that they might want to keep private. "Even knowing that a user has a mental health or quit-smoking app downloaded on their phone is valuable health-related information," Quinn Grundy, an assistant professor at the University of Toronto who studies corporate influences on health and was not involved in the study, tells The Verge in an email.
The fact that people do not know how their apps share their data worried John Torous, director of digital psychiatry at Beth Israel Deaconess Medical Center and co-author of the new study. "It's really hard to make an informed decision about using an app if you don't even know who will have access to certain information about you," he says. That's why he and a team from the University of New South Wales in Sydney conducted the study. "It's important to trust, but verify, when it comes to where your health care data goes," Torous says.
The researchers do not know what these third-party sites were doing with the user information. "We live in an era where, with enough breadcrumbs, it's possible to re-identify people," Torous says. It's also possible that nothing is being done with the breadcrumbs, he says, but for now, they simply do not know. "What happens to these digital data is a mystery." Still, Chan worries about the potential, invisible risks. "Potential advertisers could use this to compromise someone's privacy and influence their treatment decisions," he says. For example, what happens if an advertiser discovers that someone is trying to quit smoking? "Maybe if someone's interested in quitting smoking, would they be interested in e-cigarettes?" Chan says. "Or could you introduce them to other, similar products, like alcohol?"
Part of the problem is the business model for free apps, the study authors write: since insurance might not pay for an app that helps users quit smoking, for example, the only way a free app's developer can stay afloat is to sell subscriptions or sell data. And if the app qualifies as a wellness tool, developers can sidestep laws designed to keep medical information private.
In the long term, one way to protect people who want to use health and wellness apps could be to form a group that gives a seal of approval to responsible mental health apps, Chan says. "Something like having FDA approval for things, or the FAA certifying a particular aircraft for safety," he says. For now, though, it's app user beware. "When there are no such institutions, or the institutions themselves aren't doing a good job, it means we should invest more in this as a public good."