Easy to use but hard to understand: moving beyond the pitfalls of frictionless digital design
This blog post is the first in a series I’m writing as a Citizens Advice Policy Associate, exploring the social impacts of online choice architecture for consumers.
Modern digital design tends to optimise for speed, simplicity and stickiness.
When we use an app or visit a website where we might spend money, every click and every cue on the page will have been refined in the hope of converting us into valued (and valuable) customers. Most checkout journeys and one-click shopping experiences are accompanied by a stack of metrics that are monitored to make sure that we, the users, fulfil as many business goals as possible. Our bounce rates are minimised, our cart abandonment rates reduced, and we are kept whooshing through the conversion funnel — distracted only by cross-sells and upsells — until we’ve completed our transaction(s) and consented to receive emails about related products and services. The convenience generated by this optimisation is often regarded as a feature, not a bug.
In December 2022, Citizens Advice published “Tricks of the trade: How online consumer journeys create harm and what to do about it”. This report shared the results of a large-scale experiment on the social impacts of frictionless consumer journeys. The team found three high-risk areas — Buy Now, Pay Later options at checkout; online gambling; and auto-renewing subscriptions — and uncovered some stark truths about purchase paths that encourage us to make decisions on the fly. In particular, the research revealed that:
- 1 in 4 people have made online purchasing decisions they regret
- UK consumers spend £0.5bn a year on subscriptions they don’t realise will auto-renew
- 41% of those surveyed think websites make it too easy to “make the wrong choice”
More generally, the research also showed that some online consumer journeys contain design patterns that actively encourage disinhibition, resulting in more spontaneous, less cautious behaviour, often by people in vulnerable circumstances.
Off the back of these findings, Citizens Advice are calling for an outcomes-based framework that requires businesses to put consumers at the heart of design. Many digital business teams and specialists will claim that consumers are already at the heart of their design — but not every piece of user-centred design is created with our wellbeing in mind. This blog post explores some of the industry norms that make customer journeys extremely easy to use and intentionally hard to understand, and asks whether repurposing the goal-setting technique Objectives and Key Results (or OKRs) might be an effective lever for improving online consumer journeys.
Don’t Make Me Think
Frictionlessness is by no means a new trend in digital design.
Back in the early 2000s when I worked at BBC Online, almost everyone in our office had a copy of Don’t Make Me Think by Steve Krug on their desk. Published in 2000, long before any of us had a smartphone, the book was an early touchstone for people creating Web sites. Full of good lessons about buttons, hyperlinks, navigation, and search boxes, the title of the book is also Krug’s “first law of usability” which, he explained:
means that, as far as humanly possible, when I look at a Web page it should be self-evident. Obvious. Self-explanatory.
In this context, “self-evident” and “self-explanatory” don’t mean transparent and understandable, but something faster and less distracting, more akin to pre-attentive processing, a subconscious process that allows us to do things like glance at the car dashboard or recognise road signs while driving. Krug’s aim in Don’t Make Me Think was to reduce the “question marks” that popped up over users’ heads as they navigated a site, because:
when we’re using the Web every question mark adds to our cognitive workload, distracting our attention from the task at hand. The distractions may be slight but they add up, and sometimes it doesn’t take much to throw us.
And as a rule, people don’t like to puzzle over how to do things. The fact that the people who built the site didn’t care enough to make things obvious — and easy — can erode our confidence in the site and its publishers. (p. 15)
Krug’s aim was for Web pages to “work most of their magic at a glance” (p. 19). This glanceability is really useful in things like map interfaces, checking your bank balance, or seeing which song is playing on Spotify, but it’s not always what’s needed to support a complex decision, like making a purchase or signing up for a service. As designer Matt Jones said over a decade later, this kind of “magic is a power relationship”, and it needs to be used sparingly.
The first edition of Don’t Make Me Think was written when ecommerce was in its early days, Web site navigation could be labyrinthine, and digital literacy could not be assumed. And Krug leans into the value of simplicity. Chapter 4 — “Animal, Vegetable, or Mineral? Why users like mindless choices” — explains why it’s important to create clear, certain choices, even in ambiguous situations. He goes on to justify this by referencing the party game Twenty Questions, citing how everyone accepts that
anything that is not a plant or an animal — including things as diverse as pianos, limericks and encyclopedias, for instance — falls under ‘mineral’ (pp. 41–2)
The aim of this plea for simplicity was to stop Web teams showing the internal complexity of their organisations to their users. This was surprisingly common in the early days of the Web, and the conventions championed by usability pioneers such as Krug, Jakob Nielsen, and Jesse James Garrett have made surfing the Web much easier and more accessible for everyone. But this approach also meant that it quickly became commonplace for designers to impose choices and architectures that were easy to use but difficult to understand.
Two decades later, this kind of simplified framing doesn’t just mask complicated org charts and upselling models, it also often sits atop algorithmic recommendation systems. And while ease and convenience are still sought after, many financial transactions are much more complex than a game of Twenty Questions.
Do the Hard Work to Make it Simple
One reason this search for digital simplicity has persisted is because few digital experiences come with an instruction manual. As users, we are all expected to make the best use of the available prompts to find our own way.
This universalism is one reason digital products and services are often thought of as democratising and accessible; certainly, most smartphone users now expect that what designer Don Norman termed the “frustrations of everyday objects” will be smoothed away so that most interactions are as easy as possible to complete.
There are all kinds of benefits to this. Certainly, no one needs it to be harder to pay their taxes, read a map, find accurate health information or complete a regular supermarket shop, but that doesn’t mean there aren’t other situations where it might be useful to have more space to deliberate before making a final choice — or that the norms of frictionless design don’t sometimes become deceptive or manipulative.
Designer Harry Brignull has been collating examples of deceptive design* for well over a decade, and the Deceptive Design Hall of Shame documents some of the many tricks deployed by well-known businesses to lock us into subscriptions, extract unnecessary data, and persuade us to sign up for services we don’t need. These patterns often come across as sleight of hand — a button or a checkbox at the end of another journey, an opt-out rather than an opt-in — and are calculated to appear when we’re least expecting them.
Digital products and services cut across many traditional regulatory verticals, which means these design patterns are not confined to a single market segment or type of interface. They have been spotted in settings as diverse as software and app subscriptions, financial products, social media platforms, and all kinds of ecommerce; their consequences can also differ from context to context, and so can take a long time to come to light. A recent US Federal Trade Commission paper, “Bringing Dark Patterns to Light”, offers this context:
Because dark patterns are covert or otherwise deceptive, many consumers don’t realize they are being manipulated or misled. Workshop participants theorized that even when consumers do realize they have been deceived, many don’t report their experiences, some out of an unnecessary feeling of embarrassment at being tricked.
To understand why dark patterns exist, however, we need to look beyond design, and get to grips with the business model.
Business model-centred design?
Much of the language of digital design is calibrated to make it sound as if we, the users, are the ultimate beneficiaries. But ultimately the design patterns that nudge and encourage us along the purchase path aren’t there to meet our wider “user needs” such as health and wellbeing; they exist to anticipate our immediate needs as people who might be about to spend money. In some contexts, what is often called user- or customer-centred design has actually been created to meet the needs of the business model. It’s business-model centred design, with a side of consumer ease.
For instance, Amazon’s famous “Customer Obsession” is not a humanitarian mission, but a business strategy to ensure the company delivers seamless customer experiences around the things they identify as being “most valuable” — or, in other words, easiest to monetise.
If being “user-centred” or “customer-centric” actually means “encouraging us to the end of the purchase path as quickly as possible, and upselling some additional items along the way”, then it’s arguable whose needs it really prioritises.
In 2021 journalist Karen Hao’s investigation into Facebook uncovered a devastating example of user behaviour being optimised to service a business goal with harmful consequences. Although this is not a user journey that necessarily ends in a purchase, it’s illustrative of how a focus on meeting metrics can have a distorting effect on design decisions:
Their goal, among other things, was to increase a metric called L6/7, the fraction of people who logged in to Facebook six of the previous seven days. L6/7 is just one of myriad ways in which Facebook has measured “engagement”… Now every user interaction once analyzed by engineers was being analyzed by algorithms. Those algorithms were creating much faster, more personalized feedback loops for tweaking and tailoring each user’s news feed to keep nudging up engagement numbers…
But this approach soon caused issues. The models that maximize engagement also favor controversy, misinformation, and extremism: put simply, people just like outrageous stuff. Sometimes this inflames existing political tensions. The most devastating example to date is the case of Myanmar, where viral fake news and hate speech about the Rohingya Muslim minority escalated the country’s religious conflict into a full-blown genocide.
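Hao’s description makes L6/7 concrete enough to compute. The sketch below shows one way such a metric might be derived from raw login records; the function name, data shapes, and toy data are illustrative assumptions on my part, not Facebook’s actual pipeline:

```python
from datetime import date, timedelta

def l6_over_7(login_days_by_user, as_of):
    """Fraction of users who logged in on at least six of the
    seven days ending at `as_of` (inclusive)."""
    window = {as_of - timedelta(days=i) for i in range(7)}
    if not login_days_by_user:
        return 0.0
    qualifying = sum(
        1 for days in login_days_by_user.values()
        if len(window & days) >= 6
    )
    return qualifying / len(login_days_by_user)

# Toy data: one user active six of the last seven days, one barely active.
logins = {
    "user_a": {date(2023, 1, d) for d in range(1, 7)},  # 6 days
    "user_b": {date(2023, 1, 1)},                       # 1 day
}
print(l6_over_7(logins, date(2023, 1, 7)))  # 0.5
```

A single, easily computed fraction like this is exactly the kind of number a team can be asked to “nudge up”, which is what makes it so influential over design decisions.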
Hao’s aside that “people just like outrageous stuff’ cuts to the quick of the potential regulatory dilemma: a design pattern that seems to make things more fun or convenient for one person may well create unacceptable risk for another, meaning it can take a little while for a harm to emerge at scale, and even longer for sufficient evidence to be gathered to prove its impact. Meanwhile digital design practice is constantly changing, and any effective regulatory intervention would need to be systemic, anticipatory, and extremely flexible — or else risk going out of date before even being implemented.
So what would be an effective regulatory lever?
The business model is queen
Although digital interfaces might dominate our online experiences, it’s important to remember interface design is only one part of a digital consumer journey. The business model is by far the most influential component, and the one with the most direct influence over what we see on the screen.
In 2023, it is not news that digital business models can have shocking effects. Digital businesses — particularly ones that don’t have to contend with physical logistics — often make a virtue of their ambitious approach to growth and scale, and this appetite for expansion can lead directly to extractive, and sometimes deceptive, interface design and marketing campaigns.
Objectives and Key Results (OKRs) are one technique used to turn business models into shared, executable sets of goals. When used well, OKRs are an incredibly effective way of getting teams to deliver against stretch targets, and they have become synonymous with what John Doerr, author of Measure What Matters: OKRs — The Simple Idea That Drives 10x Growth, calls “exponentially aggressive goals”.
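To make the mechanics concrete, an OKR pairs a qualitative objective with a handful of measurable key results, each scored at the end of a cycle. The sketch below follows the common 0.0 to 1.0 scoring convention described in the OKR literature; the class names, fields, and example targets are illustrative assumptions rather than any particular company’s system:

```python
from dataclasses import dataclass, field

@dataclass
class KeyResult:
    description: str
    target: float       # the stretch target for the cycle
    actual: float = 0.0

    def score(self) -> float:
        # Conventional OKR scoring: progress towards target, capped at 1.0.
        return min(self.actual / self.target, 1.0) if self.target else 0.0

@dataclass
class Objective:
    description: str
    key_results: list = field(default_factory=list)

    def score(self) -> float:
        # An objective's score is the mean of its key results' scores.
        if not self.key_results:
            return 0.0
        return sum(kr.score() for kr in self.key_results) / len(self.key_results)

growth = Objective(
    "Grow engagement this quarter",
    [KeyResult("Repeat visits per week", target=3.0, actual=2.4),
     KeyResult("Checkout completion rate", target=0.9, actual=0.81)],
)
print(round(growth.score(), 2))  # 0.85
```

Note how naturally engagement metrics slot in as key results: once a number like “repeat visits per week” becomes a scored target, the design decisions that move it tend to follow.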
This boundaryless approach to growth and delivery has been one of the enablers of Big Tech’s hyperscale. Doerr’s book contains many examples of what he calls the “radical, high-risk action” tech teams have taken to achieve “Big Hairy Audacious Goals”. This kind of expansive, pro-growth thinking means that high-risk, high-reward choices can quickly become normalised in digital product development, sometimes at the expense of abiding by social norms.
Over the years, Netflix has, unusually, set and shared these kinds of hyperbolic goals with the public. In 2017, CEO Reed Hastings claimed sleep was one of the streaming platform’s major competitors; the screengrab above is from the company’s 2019 investment statement, setting out their aim to compete with all of a viewer’s leisure activities, including “going to dinner with friends or enjoying a glass of wine with their partner”. At the time of writing, in January 2023, this has a slightly milder but still “hairily ambitious” formulation:
We compete for a share of members’ time and spending for relaxation and stimulation, against linear networks, pay-per-view content, DVD watching, other internet networks, video gaming, web browsing, magazine reading, video piracy, and much more. Over the coming years, most of these forms of entertainment will improve.
Not all digital businesses operate with this level of hyperbole, but this kind of goal setting shifts the Overton window across the industry, normalising inflated goals that are unmoored from social consequences, so that everyone is fighting to gain more repeat visits, more engagement, more time in app or on site, and more completed purchases.
In my previous job, at responsible tech think tank Doteveryone, we addressed this by creating a tool called Consequence Scanning — a way for product teams to build a reality check into their delivery schedule and understand how their “key results” might have much broader implications for other people and the planet. Digital metrics tend to be very insular: they don’t show the people making the product or feature much about real-world impact or social consequences; they just show what people are doing on the site or in the app, and whether they are achieving their engagement targets and realising their business goals. There is no real excuse for this kind of introspective approach, but leaving it behind will require a shift in culture across the industry. Introducing new regulatory scrutiny would be one way to expedite this.
An Outcomes-Based Framework
Regulation that encourages good outcomes, regardless of the technological or social context, will be much more sustainable than an approach that prioritises banning specific practices. This may seem like a complicated approach that relies too much on anticipation, but it’s not so different from setting business goals, such as the objectives and key results described above. Making regulatory standards a natural part of the product development process, not just a compliance problem, will also make them easier to adopt and adapt to, and is likely to lead to genuine service improvements rather than incremental adaptations. An outcomes-based approach is more durable in the face of technical change than banning specific interface components: it makes it easier to identify what has gone wrong when standards are not met, and allows the same standards to be applied across a wider range of technologies.
The recent progress of the UK Online Safety Bill has shown some of the limitations of naming and banning specific harms. Digital harms often take different forms in different contexts and can also take a while to fully emerge and become widely recognised and understood; meanwhile, changing technical capabilities can mean that the same kind of harm can quite quickly evolve and manifest in different ways. For instance, the Online Safety Bill refers to both “upskirting” and “downblousing”, but these are not the only forms of non-consensual and invasive image gathering, and the specificity of the current wording may turn out to preclude other similar types of photography that emerge over time.
Setting expectations around outcomes also has the potential to define what good looks like, not simply to name the bad. Banning specific design patterns makes it possible for businesses to implement similar, alternative patterns that achieve the same outcomes, whereas defining better outcomes might, over time, deliver an overall uplift in the quality of experience and raise consumers’ expectations. The Web Content Accessibility Guidelines (WCAG) already work in this way: the rating system — created by experts — sets an aspirational ceiling and a clear quality threshold. For businesses, the duty to comply with this is enshrined in the Equality Act. A similar model could be adapted to support online consumer journeys.
Vulnerable customers should not have to risk financial exploitation online at any time — let alone during a cost of living crisis. The sharp practices that lock us into unwanted subscriptions, unnecessary purchases, and make online gambling too easy should be named and shamed, and alternatives found, so that more of our own money stays in our own pockets.
Digital consumer journeys do not need to be extractive. Making them so is a deliberate choice, and it’s one we need businesses to stop making.
*This blog post from designer and ICO Fellow Caroline Sinders discusses the importance of replacing the term “dark patterns”, which has both racist and colonial connotations, with a more inclusive term, and asks whether “deceptive design” is a suitable replacement.