Health Apps and Privacy: Why Should You Be Aware?

People use healthcare applications for a variety of purposes, including calorie tracking, step counting, and connecting to medical devices. But since the COVID-19 pandemic and the rise of telemedicine as a medical alternative, these apps have multiplied at a rapid rate. 

There are two primary categories of healthcare apps. 

  • The first category consists of software used by your healthcare practitioner. These apps might be paired with a medical device for tracking or monitoring purposes, such as an electrocardiography device that records heart activity, or used to store your lab or radiology results. They might also be used to plan your medical care. 
  • The second category consists of personal or private healthcare apps, which anyone can download from an app store to monitor and manage their eating habits, exercise routines, or specific medical conditions. Examples include apps for managing diabetes, COVID-19, and mental health. Many of these apps are free.  

Data Privacy Concerns 

With the sheer amount of data, including sensitive data, that is stored and processed, data privacy concerns are bound to arise.  

Data is shared with third parties for marketing, sales, and targeted advertising. 

Even data that has been purportedly “de-identified” may still contain enough details to identify users, and it may also contain very private data, such as details on a person’s mental health or drug usage. 

Medical identity theft occurs when someone uses your information to obtain medical care or submit fraudulent claims. This makes healthcare data valuable, which is why hackers frequently target it. 

Other parties, like Facebook or Google, might obtain the data and create user profiles. Once the information is exposed in that setting, it becomes impossible to keep track of who is accessing it and how. 

Why do these concerns arise?  

  • Employee Mistakes: Human error is always possible. It can happen when a worker discusses a patient’s health information in a public place, clicks on a link that exposes systems to a virus or ransomware attack, or accidentally enters the wrong phone number and sends information to the wrong person. This extends beyond technology issues to privacy concerns in general. 
  • Poor Access Management: A reliable, strictly followed process is needed to ensure that only those who need access are granted it, and that former employees or business partners are promptly removed once they no longer need access. 
  • Failure to Store Data Safely: In addition to being securely stored, data must also be routinely deleted or destroyed after the necessary retention periods have passed. 

  • Insufficient Encryption: Sensitive data should be encrypted both at rest and in transit (see the sketch after this list). 
  • Incomplete Risk Assessment: Failure to conduct a comprehensive risk assessment covering the various apps and networked devices where protected health information (PHI) is stored or transmitted. 
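
For illustration only (this sketch is not part of the original list), here is one way an app developer might encrypt a health record before storing it, using Python’s widely used cryptography package. The record, field names, and key handling below are hypothetical; a real deployment would fetch keys from a key-management service rather than generating them next to the data.

```python
# Minimal sketch: encrypting a (hypothetical) health record at rest with the
# "cryptography" package. Names and data here are illustrative only.
import json
from cryptography.fernet import Fernet

# In production the key would come from a key-management service, not be
# generated and kept alongside the data.
key = Fernet.generate_key()
cipher = Fernet(key)

record = {"patient_id": "12345", "condition": "hypertension"}  # hypothetical PHI

# Encrypt before writing to disk or a database...
token = cipher.encrypt(json.dumps(record).encode("utf-8"))

# ...and decrypt only when an authorised process needs the plaintext.
restored = json.loads(cipher.decrypt(token).decode("utf-8"))
assert restored == record
```

Fernet provides authenticated encryption, so tampered ciphertext fails to decrypt; the same principle applies whatever language or library an app actually uses.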

Conclusion  

Before using an app, it’s critical to be aware of any possible risks. Carefully consider what information you are comfortable disclosing and review the app’s privacy and security policies. If the app is from your service provider or insurance provider, ask about the security measures in place and how your data is being protected.

Major Privacy Updates of the Week

Upcoming US Senate bill to set a minimum age for access to social media:

A bipartisan group of U.S. Senators is expected to introduce legislation regulating children’s access to social media. 

The bill would prohibit children under the age of 13 from accessing social media, while children aged 13-17 would be allowed access with parental consent. How children’s ages would be verified remains unclear. 

Read more.

Ukrainian cyber police arrested a man for selling data to Russian buyers:

A 36-year-old man was arrested by the Ukrainian cyber police for selling the data of Ukrainian and EU citizens. 

The police stated that the stolen data was sold based on volume. The databases discovered by officers contained information such as passport details, taxpayer numbers, birth certificates, and bank account data. 

Read more.

Data protection inquiry over ChatGPT launched by Germany:

Germany has launched an inquiry into ChatGPT over data privacy concerns.

German authorities want to verify whether, as required by EU law, OpenAI informs the people whose data has been used by ChatGPT, and they have demanded answers from the US maker OpenAI. 

Read more.

Double supply chain attack – 3CX compromised:

Cybersecurity firm Mandiant has reported that the breach of 3CX was caused by an earlier compromise of the futures trading platform Trading Technologies – a supply chain attack caused by another supply chain attack.

The source of the breach was reportedly an employee downloading a piece of outdated trading software. 

Read more.

IMF paper states the absence of a data protection law in India poses a privacy risk:

According to the IMF paper, 80 million Indian users were affected by data breach incidents in 2021.

The IMF notes that India still lacks comprehensive data protection legislation, leaving the privacy and digital rights of users at risk. 

Read more.

Curated by: Prajwala D Dinesh, Ritwik Tiwari, Ayush Sahay
