Most of the free apps for depression and smoking cessation that are easy to find on the Android and iOS app stores share user data, but only a few disclose that they do. The finding adds to worrying revelations about what apps do with the health information we entrust to them. A Wall Street Journal investigation, for example, recently found that the period-tracking app Flo shared users' pregnancy plans with Facebook. Where previous research looked for security flaws in individual apps, the new study, published in JAMA Network Open, took a broader approach: the researchers downloaded the apps surfaced by the keywords 'depression' and 'smoking cessation,' then intercepted each app's network traffic to see what data it transmitted. Much of the data the apps shared did not immediately identify the user and was not strictly medical. Still, 33 of the 36 apps shared information that could give advertisers or data analytics companies insight into people's digital behavior. A few shared very sensitive information, such as health diary entries, self-reports of substance use, and usernames.
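The interception step works roughly like this: each app's outbound requests are routed through a proxy, and the hosts it contacts are checked against known advertising and analytics domains. The sketch below is a simplified, hypothetical illustration of that kind of check; the domain list, log format, and app name are made up for the example and are not the researchers' actual tooling or data.

```python
# Hypothetical sketch: flag third-party analytics/advertising hosts in an
# intercepted traffic log, the kind of check done by routing an app's
# traffic through an interception proxy. Illustrative domains only.

KNOWN_TRACKERS = {
    "graph.facebook.com",     # Facebook Graph API
    "app-measurement.com",    # Google Analytics for Firebase
    "api.mixpanel.com",       # Mixpanel analytics
}

def flag_third_party_hosts(requests):
    """Return the contacted hosts that match known tracker domains."""
    return {host for host, _path in requests if host in KNOWN_TRACKERS}

# Requests captured while exercising a fictional wellness app:
captured = [
    ("api.example-wellness.app", "/v1/mood-entry"),
    ("graph.facebook.com", "/v3.2/activities"),
    ("app-measurement.com", "/a"),
]

print(sorted(flag_third_party_hosts(captured)))
# -> ['app-measurement.com', 'graph.facebook.com']
```

In practice the researchers' analysis was more involved (inspecting request payloads, not just hostnames), but the core question is the same: which third parties does the app talk to?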
Details like these, combined with an app's name or type, could give third parties insight into someone's mental health. "Even knowing that a user has a mental health or smoking cessation app downloaded on their phone is valuable information," says Quinn Grundy of the University of Toronto, who studies corporate influence on health, in an email to The Verge.
John Torous, director of digital psychiatry at Beth Israel Deaconess Medical Center and a co-author of the new study, worries that app users may not realize their data is being shared at all. "It's really hard to make informed decisions about using apps if you do not even know who will access information about you," he says. That is why he teamed up with researchers at the University of New South Wales in Sydney to conduct the study. "It's important to trust, but verify where your health care data is going," Torous says.
The researchers do not know what these third parties were doing with the user data. "We live in an era where these digital breadcrumbs can be reassembled to re-identify people," Torous says. "The breadcrumbs may just be sitting there, but we do not know right now. It's a mystery how this digital data will be used." But Steven Chan, a physician who has collaborated with Torous but was not involved in this study, worries about potential invisible risks. "Potentially, an advertiser could use it to compromise someone's privacy and sway their treatment decisions," he says. For example, what if an advertiser learns that someone is trying to quit smoking? "Say they're interested in quitting smoking; would they be interested in electronic cigarettes?" Chan says.
Part of the problem is the business model of free apps. Since insurance may not pay for an app that helps users quit smoking, the only ways for a free app's developers to stay afloat are to sell subscriptions or to sell data. And if an app is branded as a wellness tool rather than a medical one, its developers can sidestep the laws that keep medical information private.
One way to protect people who want to use health and wellness apps in the long term would be to create a group that can give responsible mental health apps a stamp of approval, Chan says. "We need something like the FAA certifying that a particular aircraft is safe to fly, or some kind of FDA approval," he says. But for now, it is buyer beware for app users. "If there is no such agency, or if the agency itself does not do a good job, customers have to do the investigating themselves."