4. Consent Collection and Management


Consent is a cornerstone of data privacy in healthcare contexts. Health tech organisations often handle data that by law or ethics requires consent – especially when it involves health information or when personal data is used for secondary purposes like marketing. However, obtaining and managing user consent is trickier than it sounds, and many companies get it wrong.

Under the GDPR (and many similar laws), the default lawful basis for processing sensitive personal data (which includes health data) is the explicit consent of the data subject, unless another exception applies. This means users must knowingly agree to the specific use of their health data. Likewise, if a health app wants to show personalised ads or share data with third-party partners for research or product improvement, consent is often required (unless anonymisation is in play, which has its own strict criteria).

A challenge arises because health tech apps often serve dual purposes: one primary (e.g. wellness tracking) and others secondary (e.g. monetising via ads or partnering with insurance for discounts). We see some startups attempt to cover all these uses in one go – burying consent in a general Terms and Conditions or assuming that a user’s acceptance of terms implies consent to data processing. This is a mistake. In jurisdictions like the EU, bundled consent (forcing users to agree to data processing as a condition of using the service, when not strictly necessary) is not valid. Consent must be freely given, specific, informed, and unambiguous. Hiding a consent clause in legal fine print or pre-ticking a box for the user violates these principles.

Our research found some health tech apps that failed to expressly ask for user consent before processing health information, or mentioned their data use only in a privacy policy that users rarely read. Others take an “all-in-one” consent approach during sign-up – essentially an ultimatum: “By creating an account, you agree we can use your data for anything in our privacy policy.” This leaves users in the dark about what they are really agreeing to, eroding their sense of transparency and control (and breeding mistrust or backlash). It also falls short of regulatory standards in many jurisdictions, exposing the organisation to legal challenges or enforcement action.

Health data adds another layer of complexity: sometimes consent is hard to obtain in a user-friendly way. If an app continuously collects sensor data (heart rate, sleep patterns), asking a user to tap “I consent” every time would be obnoxious. Yet one-time blanket consent isn’t good either. The solution often lies in good UX: obtaining consent at onboarding for necessary processing, then using just-in-time notices and granular controls for additional uses. For example, a fitness app might get initial consent to process health metrics for core functionality, but later, if it wants to share a user’s step count with a wellness brand for a reward programme, it should pop up a clear opt-in choice for that specific sharing. Granular consent options (separate toggles for, say, “Allow my data to be used to personalise ads” and “Allow my anonymised data to be used for medical research”) both help users feel in control and consulted, and help keep you compliant.
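To make the granular approach concrete, here is a minimal sketch of how a consent-preferences model might look in code. All names (`Purpose`, `ConsentPreferences`, the individual purposes) are hypothetical illustrations, not a reference implementation: the key design points are that each optional purpose is a separate toggle and that every toggle defaults to off, so there is no pre-ticked box.

```python
from dataclasses import dataclass, field
from enum import Enum


class Purpose(Enum):
    """Illustrative processing purposes a health app might distinguish."""
    CORE_FUNCTIONALITY = "core_functionality"    # covered by onboarding consent
    AD_PERSONALISATION = "ad_personalisation"    # separate opt-in toggle
    ANONYMISED_RESEARCH = "anonymised_research"  # separate opt-in toggle
    PARTNER_SHARING = "partner_sharing"          # just-in-time opt-in


@dataclass
class ConsentPreferences:
    # Every optional purpose defaults to False: no pre-ticked boxes.
    choices: dict = field(default_factory=lambda: {
        p: False for p in Purpose if p is not Purpose.CORE_FUNCTIONALITY
    })

    def grant(self, purpose: Purpose) -> None:
        self.choices[purpose] = True

    def withdraw(self, purpose: Purpose) -> None:
        # Withdrawing must be as easy as granting.
        self.choices[purpose] = False

    def allows(self, purpose: Purpose) -> bool:
        if purpose is Purpose.CORE_FUNCTIONALITY:
            return True  # agreed at onboarding as necessary processing
        return self.choices.get(purpose, False)
```

Keeping each purpose as its own toggle means declining ad personalisation never blocks, say, the research-sharing choice – mirroring the “freely given, specific” requirement in the text above.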

Different jurisdictions have varying rules around consent. Europe and many other countries require explicit opt-in for sensitive data and direct marketing. The United States (outside of HIPAA) has historically been more “opt-out” based for general personal data use, but things are changing – e.g. several states now mandate opt-in consent to sell sensitive personal data, and Washington’s My Health My Data Act requires opt-in consent for certain entities even to collect consumer health data in the first place. India’s DPDP Act emphasises consent (with a requirement for clear, plain-language consent requests and an easy way to withdraw consent). Brazil’s LGPD similarly demands specific consent for the collection and use of sensitive data.

Another area that needs attention is children’s data – if your health tech product might be used by minors (e.g. a smart fitness band for kids or a mental health app used by teenagers), note that the GDPR requires parental consent for children under 16 by default, though member states may lower this threshold to as young as 13. The UK’s Age-Appropriate Design Code and similar “children’s codes” in other countries also impose stricter requirements on transparency and consent for younger users.

To handle consent properly:

- Design a consent interface that is clear and not misleading. Avoid pre-ticked boxes or vague language. Use simple descriptions for what the user is agreeing to.

- Document the consents you obtain – which user gave what consent and when – because you may need to prove this, if challenged. Many privacy laws put the onus on the company to demonstrate that valid consent was obtained.

- Provide a mechanism to withdraw consent easily. The GDPR and other laws explicitly say users should be able to change their mind as easily as they gave consent. If a user turns off data sharing, your systems should honour that and stop processing the data for that purpose.

- Review your user journey through a regulatory lens: Are you asking for too much upfront? Often, it’s better to only ask for what’s needed for core service at sign-up, and defer other optional consents to later when you can better explain the value to the user (and they have a context for the request).
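The record-keeping and withdrawal points above can be sketched as an append-only consent log. This is an illustrative design, not a prescribed implementation – the names (`ConsentEvent`, `ConsentLog`, `notice_version`) are assumptions. The idea is that every grant and every withdrawal is stored with a timestamp and the version of the notice the user actually saw, so you can later demonstrate what was consented to and when, and the latest event always wins.

```python
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass(frozen=True)
class ConsentEvent:
    """One immutable record: who consented (or withdrew), to what, and when."""
    user_id: str
    purpose: str
    granted: bool        # True = opt-in, False = withdrawal
    timestamp: datetime
    notice_version: str  # which consent wording the user was shown


class ConsentLog:
    def __init__(self) -> None:
        # Append-only: events are never edited or deleted, only superseded.
        self._events: list[ConsentEvent] = []

    def record(self, user_id: str, purpose: str,
               granted: bool, notice_version: str) -> None:
        self._events.append(ConsentEvent(
            user_id, purpose, granted,
            datetime.now(timezone.utc), notice_version,
        ))

    def current_consent(self, user_id: str, purpose: str) -> bool:
        # Scan from most recent: a withdrawal overrides any earlier grant,
        # and absence of any record means no consent.
        for event in reversed(self._events):
            if event.user_id == user_id and event.purpose == purpose:
                return event.granted
        return False
```

In production this would sit behind a database, but the principles carry over: default to “no consent” when no record exists, and check `current_consent` before each processing run so a withdrawal takes effect immediately.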

Getting consent right pays dividends in user trust. When users feel in control – e.g. they know “I can use this app even if I decline marketing cookies or data sharing, I just won’t get personalised ads” – they are more comfortable engaging with your product. On the flip side, if they discover you were doing something with their health data that they weren’t clearly told about, it can lead to upset and disappointment, the speedy deletion of the app – and maybe even a complaint to the regulator or a lawsuit.

Worried that you may not have fully appreciated your consent management needs? Contact us to find out how we can help.

 

Act now and speak to us about your privacy requirements

Start a conversation about how Privacy Made Practical® can benefit your business.

Click here to contact us.
