
37% of consumers claimed they “only share their data because they have no other option.” While the “other option” would be to simply not patronize that particular site or service, it is clearly a choice consumers do not want to make.
And that should come as wonderful news for digital organizations. It should also come as a significant wake-up call.
The Ultimatum: No Data, No Access
A delicate balance must be observed between an organization’s need to gather data and a consumer’s need to protect what is theirs. Since data became the new oil, there is hardly an entity that hasn’t seen the value of mining for their own. With so many users frequenting sites daily, the answer became clear: give us your information if you want to use our services. Anxious to access critical resources (from banking to healthcare to higher education), many handed over their data. And now they regret it.
Nearly 40% of consumers felt coerced, which captures the current mood of digital users: they feel used. This is not a good look for digital entities, and it erodes trust. Consumer trust declined in almost every sector, with the exceptions of banking, insurance, and government. Perhaps providing legally required information strikes a different chord than being asked for details that could be used in advertising.
Whatever the reason, consumers do not like the high (perceived) cost of doing digital business these days, and the risk is that they will start voting with their feet. Or their clicks.
A False Dilemma
However, the dichotomy – no data, or no access to services – is a false one. While pop-ups may make it seem like data is the cost of doing business, anyone familiar with data privacy laws knows it is not. But the fact that sharing data is optional is not clearly communicated: over a third of consumers feel they are being forced to part with it.
Consumers can opt out of all personal data collection. However, the wording on pop-up consent forms may not be transparent enough for a large percentage to understand. A clearer design in UX and more upfront, straightforward verbiage can help.
What about the forms required to create accounts to access essential services like healthcare billing portals or telehealth visits? Even when the information is required, consumers still retain the right and ability to determine how much of their data is kept and how it is used. Without clear-cut wording that delivers this information at the outset, however, consumers can be unaware of those facts. The result is resentment and suspicion; 31% left a brand because they felt it required too much personal information – the top reason overall.
An additional 25% left because they did not know how their data was being used. Couple that with the feeling that they have "no choice" in the matter, and it explains why consumer digital trust ranks low across the board.
In a world where consent is the norm, what happens if you don’t consent to share your personal data (be it financial, tax, or identity) with the organization requesting it? Is there a way to opt out, or would you cease to exist as a (digital) citizen? By being more transparent about data usage, regional organizations can begin to claw back a measure of trust.
Transparency, Consent, Progressive Profiling, and Ethical AI
These four principles can help companies look like the "good guys" again. When a consumer loses trust in your brand, you lose business. And when you fail to do your part to earn that trust, you deserve to lose it.
Here is how implementing these four principles can help organizations stay out of their own way:
Transparency:
This includes transparency in language. Use clear, reader-friendly verbiage ("layman's terms") to show your consumers you have nothing to hide. If they are free to use your resources without giving you their data, tell them so. At minimum, tell them they are free to revoke consent at any time, and provide a link for doing so.
Consent:
Make sure users know the ball is in their court. They do not enjoy feeling that a distant corporate entity holds all the cards and is forcing them to part with the little that is theirs. That is never the right dynamic for starting a lasting, mutually beneficial relationship. Make sure they know you will not use any of their data without their consent. Being straightforward about how that data is used (and how it is not) also builds trust and keeps anxious consumers from catastrophizing.
Progressive Profiling:
Only gather what you need. Collect user data gradually, requesting only essential information at each interaction. This reduces friction during sign-up, enhances consumer experience, and encourages more people to engage. By building trust over time, providers can request additional details as needed—respecting people’s boundaries while still gaining valuable insights.
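In code, progressive profiling can be as simple as mapping each stage of the relationship to the fields it genuinely requires, then asking only for what is missing. The sketch below is a minimal illustration; the stage names and fields are hypothetical examples, not a prescribed schema.

```python
# Minimal sketch of progressive profiling: each interaction stage
# requests only the fields it actually needs, and only if we do not
# already have them. Stage names and fields are illustrative assumptions.

STAGE_FIELDS = {
    "signup": ["email"],                     # bare minimum to create an account
    "first_purchase": ["shipping_address"],  # requested only when actually needed
    "loyalty_enrollment": ["birthday"],      # optional, once trust is established
}

def fields_to_request(stage: str, profile: dict) -> list:
    """Return only the fields this stage needs that the profile lacks."""
    return [f for f in STAGE_FIELDS.get(stage, []) if f not in profile]

profile = {"email": "user@example.com"}
print(fields_to_request("signup", profile))          # nothing more to ask for
print(fields_to_request("first_purchase", profile))  # ask for shipping address only
```

The point of the pattern is that the sign-up form stays short, and each later request arrives in a context where its purpose is obvious to the user.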
Ethical AI:
AI is a mysterious black box to many. It can seem as dangerous as it is difficult to understand. If consumers feel your site will feed their PII into an AI engine for the company's own gain, and they know the AI field is not yet fully regulated, it will quickly put you on the wrong side of the line.
A Fair Trade
As we noted in the report, “Providing ‘all or nothing’ [data sharing] options will hurt the bottom line.” Instead, provide incremental, granular ways of sharing some, but not all. Allow the users to do what they feel comfortable with.
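One way to make "some, but not all" concrete is per-purpose consent flags that default to off, rather than a single accept-everything toggle. The sketch below assumes illustrative purpose names; it is not a standard taxonomy.

```python
# Sketch of granular, per-purpose consent instead of an all-or-nothing
# toggle. Purpose names here are illustrative assumptions.

CONSENT_PURPOSES = ["essential", "analytics", "marketing", "third_party_sharing"]

def default_consent() -> dict:
    """Start with everything off except what the service strictly needs."""
    return {purpose: (purpose == "essential") for purpose in CONSENT_PURPOSES}

def may_use(consent: dict, purpose: str) -> bool:
    """Deny by default: an unknown or unset purpose is never consented."""
    return consent.get(purpose, False)

consent = default_consent()
consent["analytics"] = True  # the user opts in to one purpose, not all of them
print(may_use(consent, "analytics"))  # True
print(may_use(consent, "marketing"))  # False
```

Gating every downstream use of personal data through a check like `may_use` is what turns the consent form from a formality into an enforced boundary.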
It makes sense that more data means better data-driven decisions, which is "better for business." But as consumers turn a collective leery eye toward the use of their personal information, coming down firmly on the side of trust will ultimately do more for your image and longevity than a few quick data wins.