Dark Patterns?
Experience designers design experiences.
In the digital marketing world they map out customer journeys designed to deliver specific marketing objectives. For example, to get mobile app users to engage with a piece of content. Or to put something in their shopping basket and check out.
Experience designers work with UX researchers and data scientists who help them simplify journeys and maximize performance. Together they look for hidden patterns that reveal what works and why, so those patterns can be replicated.
In a nefarious twist, some designers use these methods to inflate their KPIs. For example, forcing users to engage with a piece of content by removing the options to avoid it. Or preventing users from closing their account by burying that choice, or not offering it at all.
Design deceptions that force or fool consumers into doing things they don’t want to do have a name. They are called “dark patterns”.
There is a website, https://darkpatterns.org, that has been cataloging these bad practices for quite some time, with the aim of raising consumer awareness on the web and shaming the companies that try to sneak them into the market.
It appears to have resonated with a third group. Policymakers.
Earlier this month, new legislation called the Deceptive Experiences To Online Users Reduction (DETOUR) Act was introduced in the US. It would hold companies accountable for these bad practices.
The purpose of DETOUR is to penalize large digital service providers - those with over 100M monthly active users, like Google, Facebook and LinkedIn - if they engage in deceptive design practices.
Some illuminating language from the bill …
Under DETOUR it would be illegal to “design, modify, or manipulate a user interface with the purpose or effect of obscuring, subverting, or impairing user autonomy, decision-making, or choice to obtain consent or user data.”
It would not allow segmentation of users into groups for “the purposes of behavioral or psychological experiments” unless consumers first give informed consent.
And it would punish targeting children under 13 years old “with the purpose or substantial effect of cultivating compulsive usage, including video auto-play functions initiated without the consent of a user.”
Deceptive marketing practices are not strictly a digital thing, but they take on unique characteristics online, as the passages above show: tricking consent out of people, running experiments on them without their knowledge, and trying to cultivate addictive behaviors in children.
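To make that last provision concrete, here is a minimal sketch, assuming a browser video player, of what consent-gated auto-play could look like. The getStoredConsent helper and the "autoplayConsent" preference key are my own hypothetical names, not anything defined by the bill or by any real platform's API.

```typescript
// Minimal sketch, assuming a browser context: gate video auto-play behind an
// explicit, stored opt-in. The "autoplayConsent" key and getStoredConsent helper
// are hypothetical, not part of the bill or any real platform API.

function getStoredConsent(key: string): boolean {
  // Anything other than an explicit opt-in counts as "no consent".
  return window.localStorage.getItem(key) === "granted";
}

function maybeAutoplay(video: HTMLVideoElement): void {
  if (getStoredConsent("autoplayConsent")) {
    // The user has deliberately opted in, so auto-play is acceptable.
    void video.play();
  } else {
    // Default to the non-compulsive behavior: show controls and wait for a click.
    video.controls = true;
  }
}
```

The point of the design is simply that the compulsive behavior is opt-in rather than opt-out, which is the opposite of how a dark pattern would arrange it.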
It would be great if the biggest platforms like Google, Facebook, LinkedIn and their peers could better align their ambitions with those of their users. Which, to be fair, they usually do, and that alignment is a big part of why they have been successful.
But in situations where some data collectors have completely lost the empathy signal, I think it is OK to have a regulatory forcing mechanism that steers them away from dodgy practices.