May 2025

Kahneman subverted: Behavioural economics weaponised as dark patterns pump ecom, platform profits – prepare for legal change, warns Consumer Policy Research Centre

What you need to know:

  • Dark patterns are “entrenched” across the digital economy – with companies “reverse engineering” the “nudge” principles of Daniel Kahneman’s behavioural economics to serve profit rather than help people make better choices, says Gupta.
  • Already, the likes of LinkedIn, Amazon, TikTok, Meta and Epic Games have run afoul of regulators, while ticketing platform StubHub’s A/B testing has shown the double-digit profit impact of manipulating consumer choice via hidden costs. Its subsequent pricing practices have landed the firm in court.
  • Gupta, back from a global tour of regulators, lawmakers and enforcement bodies, and armed with a fresh report on her findings, says the practice is so widespread across the digital economy that most young adults have probably never lived in a world where they are not being manipulated. AI risks “supercharging” the practice – and making dark patterns darker still.
  • But Gupta warns businesses to prepare for regulation, enforcement and redress, with the Australian government committed to a ban on unfair business practices – and a strong overlap between dark patterns and the Privacy Act overhaul now gearing up for its second act.
  • She sees profit upside for those that overhaul UX design now “to put the person and their wellbeing at the centre” rather than “waiting to be caught”.
  • There’s more nuance, context and examples in the podcast. Get the full download here.

Originally, ‘nudge’ – and I think Daniel Kahneman also called it ‘sludge’ – was introduced to be able to help people almost make better decisions … [But] some businesses are reverse engineering it [to become] ‘how can we profit off a particular design pathway?’ But those profitable reasons will probably not exist in the coming years, given the kind of regulatory change that we're starting to see in other countries.

Chandni Gupta, Deputy CEO, Consumer Policy Research Centre

When, not if

“Dark patterns are being codified around the world into legislation,” says Consumer Policy Research Centre deputy CEO and Winston Churchill Trust Fellow, Chandni Gupta. “It’s only a matter of time [before] those kind of regulations are going to start coming here in Australia.”

In basic terms, dark patterns subtly nudge consumers towards choices that suit businesses better than they suit the consumers themselves. Some would argue that is a fundamental tenet of advertising and marketing – persuade people to buy stuff they probably don’t need, and do it more effectively and efficiently than the next vendor – and it spawned an entire behavioural economics industry.

But dark patterns also involve making it hard for people to unsubscribe from a service, tricking them into handing over more data, or misleading them through simple techniques like displaying a price in the colours a discount offer would usually appear in – and even some normalised tactics, like heightening fear of missing out through scarcity pricing.

Basically, the things that any web user and most physical store shoppers (i.e. everyone) see every day, everywhere.

Manipulation techniques are at least as old as capitalism. But the platform age has pushed choice architecture deeper into deception territory across all aspects of digital commerce – behavioural economics has been subverted in service of profit, and lawmakers around the world are mobilising.

Gupta recently spoke with dozens of regulators, enforcement agencies, consumer advocacy groups and choice architecture experts in the US, UK, EU, Singapore and India about their tightening of rules and the overlap – and pincer movement – between dark patterns and data privacy legislation.

Her subsequent report, Made to Manipulate, unpacks the scale of dark patterns being deployed by businesses; the common ways businesses across the economy are manipulating customers for their own ends; and how, maybe, lawmakers and lawyers can stem the tide.

She thinks Australia will be one of the next to move.

Clever marketing or deception?

One of the first questions posed by the report is whether ‘dark patterns’ are clever marketing or deception. The report’s title gives a clue to the experts’ view: Gupta says the lines are increasingly being “blurred”, and many businesses are intentionally operating “in the grey” to circumvent existing laws and definitions. Those businesses would probably argue that to do otherwise is to cede advantage to rivals. Which is likely a major factor in making dark patterns business as usual.

Gupta acknowledges that fact.

“It pays really well at the moment. But I think that tide is going to shift soon,” she says.

“The Australian government committed at the end of last year to introduce a ban on unfair business practices, and as part of that, they’ve already called out dark patterns being one of the things that they’ve got their eye on to include very specifically. So that’s coming anyway.

“We also have the next stage of the Privacy Act, and we’re really keen to see how tranche two is actually implemented. So there is change coming in the Australian environment.”

Younger consumers in particular, like Gen Z and Gen Alpha, have probably never had an online experience devoid of dark patterns.

Chandni Gupta, Deputy CEO, Consumer Policy Research Centre

Dark versus grey areas

“Given just how prevalent dark patterns are … a lot of dark patterns are now almost being normalised,” says Gupta.

“Some of them are so entrenched that they are starting to appear as the norm of being online, and it’s the cost of doing business online. Younger consumers in particular, like Gen Z and Gen Alpha, have probably never had an online experience devoid of dark patterns.”

While the lines are becoming ever more blurred, “I like to think of it through the lens of persuasion versus coercion, and that’s what a lot of the experts I spoke to touched on,” per Gupta.

“It is about, what was the intention of the customer when they actually came to your website or your online platform, and what did they actually end up doing?

“So someone’s intention was that they were going in to cancel their subscription because, due to cost of living, they couldn’t afford it. Or they saw, ‘actually, I don’t even use it, can I leave?’”

But they actually “end up signing up to two more months, or they’ve paused it for it to be automatically renewed in three months’ time, or they’ve given up and just pay for it,” says Gupta.

“I think this is where that distinction is: What is it that the customer wanted to do versus what they ended up doing? And that gap is where dark patterns truly sit.”

Big ticket profiteering examples

The report calls out other in-market examples – and proof that deploying dark patterns can massively ramp profit.

One comes from StubHub – America’s equivalent to Ticketek – which A/B tested pricing techniques and messaging across millions of customers.

Half were shown the actual full price of a ticket, add-ons included; the other half were shown just the base cost, with the extras added on through the sales process.

The upshot: The half shown the ‘cheaper’ price without add-ons spent circa 21 per cent more on tickets and were 14 per cent more likely to buy.  

Subsequently, the firm is being sued in the US by the District of Columbia, which alleges its residents have paid more than $118m in hidden fees as a result.

StubHub made US$1.77bn in revenue in 2024. Applying dark patterns across its business – should that be proven to be the case – would therefore cost its customers circa $372m extra a year on the basis of a 21 per cent hidden-fee gain, or boost its revenues to a similar tune.
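For the number-crunchers, here is that extrapolation as a short, purely illustrative Python sketch – the 21 per cent uplift and US$1.77bn revenue figures come from the report and StubHub’s reported results; treating the uplift as applying across the entire business is the simplifying assumption:

```python
# Back-of-envelope extrapolation of StubHub's A/B test result (illustrative only).
# Simplifying assumption: the ~21% hidden-fee uplift applies across all 2024 revenue.

revenue_2024 = 1.77e9   # StubHub's reported 2024 revenue, USD
uplift = 0.21           # extra spend when fees were hidden, per the A/B test

extra_paid_per_year = revenue_2024 * uplift
print(f"Extra paid by customers: ~US${extra_paid_per_year / 1e6:.0f}m a year")
# -> Extra paid by customers: ~US$372m a year
```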

Applied across platforms and ecommerce alone, the annual impact of choice manipulation likely runs to hundreds of billions if not trillions of dollars globally – and Gupta says it is endemic across markets.

“It can go beyond, and it will go beyond [apps and websites]. I think any online experience that we are currently engaging in has the potential to have dark patterns exploited through it.”

Some, such as the ACCC, suggest it is common practice within physical retail too.

Hence governments and regulators around the world are starting to act – but approaches differ.

EU-US-Aus tightening

Some governments are explicitly targeting dark patterns by classifying them as deceptive; some are wrapping deceptive conduct into broader unfair commercial practices laws and updating existing legislation; and some are taking a mixture of both approaches.

Classifying dark patterns and manipulation as ‘deceptive’ brings them into the scope of current regulations like the EU’s Unfair Commercial Practices Directive and the Data Act, as well as updated GDPR rules. Meanwhile, the EU’s Digital Markets Act, which applies almost squarely to big tech, also attempts to cover dark patterns, as does the EU Artificial Intelligence Act – though the latter doesn’t mention them explicitly.

But the EU’s Digital Services Act (DSA), which came into force last year, does call out dark patterns. Gupta sees it as “the shape-shifter”. That’s because it specifically requires companies to assess the risk of manipulation within their choice architectures and user experience (UX) design – and provide authorities with direct access to those architectures, or risk fines of a percentage of global revenue.

In other words, ‘prove you haven’t tried to manipulate consumers, give us the evidence and we’ll be the judge’.

Under the DSA, the European Commission has instigated proceedings against, or is seeking further information from, the likes of Meta, Google, Microsoft, Apple, AliExpress, LinkedIn, Temu, Shein, X, Amazon, Snap, Pinterest, and others including Booking.com and adult content platforms Pornhub and XNXX.

European Commission researchers scouring the information submitted last year prompted targeted ad data changes at LinkedIn – and it appears TikTok may be next to face enforcement. Microsoft is also being probed over generative AI risks stemming from integrating Copilot into Bing, with suspicion of “automated manipulation” specifically cited.

The US, UK, Singapore and India are reliant on blanket laws on unfair trading – but are starting to make specific callouts on dark patterns in one guise or another.

The US now has a ‘Click to Cancel’ rule stipulating that businesses can’t make cancelling a subscription or service harder than signing up for it.

Crucially, US regulators are taking action. The FTC has taken Epic Games to task – ordering it to pay $245m back to consumers the Commission said were “tricked” into in-game purchases, including kids racking up bills on their parents’ cards, with the first $72m in payments already distributed.

Per the FTC: “Epic deployed a variety of design tricks known as dark patterns aimed at getting consumers of all ages to make unintended in-game purchases. Fortnite’s counterintuitive, inconsistent, and confusing button configuration led players to incur unwanted charges based on the press of a single button.”

The FTC is also taking on Amazon and Uber specifically in relation to dark patterns – for allegedly making their subscriptions hard to cancel and/or “tricking” consumers into signing up for auto-renewing subscriptions.

Dark pattern-privacy pincer

There’s considerable overlap between dark patterns and data privacy laws, with consumers being manipulated wholesale into handing over more of their data than is necessary to access services, per the report. This has arguably increased since efforts to rein in data collection and beef up privacy controls.

The CPRC calls that kind of dark pattern a “data grab”, and experts interviewed for Gupta’s report suggest consent management platforms – which power the pop-up windows on websites and apps telling people what’s being collected and presenting opt-outs in various guises – themselves “manipulate data collection”. Despite being captured by regulations like GDPR, those platforms aren’t being punished.

Which means companies are effectively incentivised to indirectly profit from dark patterns by nudging customers/audiences into agreeing to have their data monetised by hundreds, if not thousands, of third parties.

AI opens up the space for dark patterns to be exploited and very much hyper-targeted … My concern is how much of it will be supercharged and will become even more difficult to identify and then actually take action on.

Chandni Gupta, Deputy CEO, Consumer Policy Research Centre

AI – dark supercharger

With behavioural economics subverted – per Gupta, “reverse engineered [to become] ‘how can we profit off a particular design pathway?’” – consumer bodies and data protection authorities around the world fear that “pinpointing consumers at their weakest point through AI” will soon become widespread.

It’s a fear Coke CEO James Quincey recently highlighted – forecasting a dystopia in which AI personalises comms “until people click yes.”

Meanwhile, generative AI summaries within search draw on a different set of sources and do away with the obvious cues that tell people something is an ad or an untrustworthy source, per the report. Likewise, dark patterns within non-visual architectures – like voice-enabled, bot-curated products and services – will get harder to spot and enforce.

More broadly, warned experts, if dark patterns are baked into AI model training, they will be embedded into standard responses. Which means they could be impossible to unpick without starting over – and much harder to spot.

“Much of the regulatory conversation has been about the dark patterns we can see, things like a scarcity cue or an activity notification, they’re visible, they’re static,” says Gupta.

“But when you start thinking about virtual and dynamic interfaces, it does open up the space for dark patterns to be exploited and very much hyper-targeted.

“There’s also far more data points that an immersive or an AI enabled tech may be able to gather, like eye tracking and immersive tech or how you pose questions to a generative AI bot, and that cumulative effect of dark patterns, which is already of concern, is likely to exacerbate even further,” she says.

“So my concern is how much of it will be supercharged and will become even more difficult to identify and then actually take action on. Which is why I think regulators around the world are thinking about how to broaden their toolkit and their resources to be able to really look under the hood of what companies are doing and say, ‘actually, this is unfair’.”

What Canberra does next

The Federal government is likely heading towards umbrella legislation similar to that being applied in the US and UK, under Australian Consumer Law – but with specific provisions on manipulation even if it is not deemed misleading or deceptive per se (which has historically been hard to prove, and therefore prosecute, under current consumer law).

In its most recent Consumer Law consultation (summarised by law firm Clifford Chance here), it has called out practices that create a false sense of urgency; making it difficult for people to cancel subscriptions; drip pricing, dynamic pricing and hidden fees; and specific post-sale practices, like making it hard for people to access customer support.

The ACCC is fully behind the proposals – and wants them to apply equally to businesses, i.e. a B2B unfair trading practices ban.

Plus, the ACCC specifically calls out non-digital manipulation and dark patterns, such as supermarket pricing labels that look much the same as special offer labels. Which could be interpreted as a warning.

Dark disruptors: Government action list

Gupta urges government to go further still and introduce an economy-wide ban on unfair business practices – with specific legislation on dark patterns, and much sharper enforcement powers, including making companies pay compensation and forcing firms to delete data acquired deceptively.

She thinks Australia could lift the templates set by other governments – such as making businesses hand over the results of A/B testing in UX design to prove they have not intentionally engaged in dark patterns, or face significant fines.

Gupta also suggests government set a levy on large digital businesses – or “babysitting fees” – that would in effect fund oversight and compliance on dark patterns, and give enforcement agencies the resources they need to go after large repeat offenders that have usually priced in the cost of fines … if they get caught.

But regulators and lawmakers must also show businesses what ‘good’ looks like. Gupta’s report nods to work by London-based Projects by IF and Fair Patterns by Amurabi – which has developed an AI solution to find and fix dark patterns at scale – as examples that firms could be pointed towards.

Likewise, given the sluggish pace of regulatory change, and subsequent intervention, the report urges enforcement agencies “to think like marketers”, i.e. tell the stories of investigation – during and after – to maintain momentum and keep businesses on their toes without waiting years for a result.

Meanwhile, the report points to proactive steps some regulators have already undertaken: The Dutch equivalent to Australia’s ACCC, the Authority for Consumers and Markets (ACM), has developed an automated ‘dark pattern detector’ specifically for misleading scarcity countdown timers, and subsequently crawled 30,000 websites.

That served to put businesses ‘playing in the grey’ on high alert. Combined with placing an obligation on firms to open up UX design for scrutiny, it could go some way to driving wholesale change before laws are updated and enforced.
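For a sense of the mechanics, here is a minimal sketch of how a scarcity-timer check like the ACM’s might work – to be clear, this is not the ACM’s actual tool; the target URL, the `data-deadline` markup and the tolerance threshold are all hypothetical assumptions:

```python
"""Minimal sketch of a fake-scarcity-timer check (illustrative only -- not the
ACM's actual detector). Assumes the page embeds its countdown deadline as a
unix timestamp in a `data-deadline` attribute, a hypothetical markup pattern."""

import re
import time

import requests

DEADLINE_RE = re.compile(r'data-deadline="(\d+)"')  # assumed attribute format


def fetch_deadline(url: str) -> int | None:
    """Fetch the page with a fresh, cookie-less request and extract the deadline."""
    html = requests.get(url, timeout=10).text
    match = DEADLINE_RE.search(html)
    return int(match.group(1)) if match else None


def timer_looks_fake(url: str, wait_seconds: int = 60) -> bool:
    """A genuine deadline is a fixed point in wall-clock time, so two visits
    should see the same value. A timer that restarts for every fresh visitor
    drifts forward between visits -- a classic misleading scarcity cue."""
    first = fetch_deadline(url)
    time.sleep(wait_seconds)
    second = fetch_deadline(url)
    if first is None or second is None:
        return False  # no countdown found; nothing to flag
    return second - first > 5  # small tolerance for clock jitter


if __name__ == "__main__":
    url = "https://example.com/flash-sale"  # hypothetical target page
    if timer_looks_fake(url):
        print(f"{url}: suspicious – countdown appears to reset per visitor")
    else:
        print(f"{url}: deadline consistent across visits")
```

Run across tens of thousands of sites, even a crude check along these lines gives a regulator a defensible shortlist of pages ‘playing in the grey’.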

For Australian businesses, designing or testing what your UX looks like by putting the person and their wellbeing at the centre of it – that is what's going to make a real difference in the long-term.

Chandni Gupta, Deputy CEO, Consumer Policy Research Centre

How businesses can prepare

As ever, there is pushback from industry on the proposed Australian Consumer Law updates. Some marketing industry bodies have urged the government to narrow the legislation to ban only practices that cause “significant economic detriment”. But Gupta and the CPRC suggest focusing on economics alone misses the point: per Gupta, the focus should be on wellbeing over pure monetary impacts.

In other words, customer-centricity for good.

“For businesses, I’d say you’re in the actual best position to make change now to really think about the experience that you’re giving to Australian consumers when you’re designing systems,” she says.

That is, “designing or testing what your UX looks like by putting the person and their wellbeing at the centre of it. That is what’s going to make a real difference in the long-term”.

While businesses will fear losing revenue to less scrupulous rivals by redesigning CX for wellbeing over profit, Gupta suggests there is also upside. She points to previous CPRC research into subscription traps that found “90 per cent of Australians would willingly come back to a system or to a platform or a business if it was easy to cancel”.

“There is real business sense to make it a really great experience for your customers, as opposed to holding on to them against their will,” per Gupta.

“So that is one aspect I’d really like to see where businesses really shape-shift.”

Gupta’s report also moots a new job function for businesses that do grasp the nettle: Design Accountability Officer.

Whether or not companies start hiring DAOs remains to be seen (a skim of LinkedIn job postings currently draws a blank). But Gupta and the CPRC say the basic principle for overcoming dark pattern manipulation, and subsequent regulatory problems, is simple:

“What we don’t want to see is businesses waiting for things to go wrong or waiting to be caught to then actually act,” she says. “That’s not how you’re going to garner the trust of your customers. Trust has to be a mindset.”

There’s more nuance, context and examples in the podcast. Get the full download here.