The Digital Fairness Act of the EU



What is the Digital Fairness Act?

17 September 2024 - In the mission letter addressed to Commissioner-designate for Democracy, Justice and the Rule of Law, President von der Leyen refers to the need to develop “a Digital Fairness Act to tackle unethical techniques and commercial practices related to dark patterns, marketing by social media influencers, the addictive design of digital products and online profiling especially when consumer vulnerabilities are exploited for commercial purposes”.



In October 2024, the European Commission published the findings of the Digital Fairness Fitness Check, which evaluates whether the current EU consumer protection laws are fit for purpose to ensure a high level of protection in the digital environment.

The Fitness Check covered three core Directives:

- the Unfair Commercial Practices Directive,
- the Consumer Rights Directive, and
- the Unfair Contract Terms Directive.

The results show that these rules remain both relevant and necessary to ensure a high level of consumer protection and the effective functioning of the Digital Single Market. However, the Fitness Check also shows that consumers behave differently online than offline.

Moreover, technological developments and increased tracking of online behaviour enable businesses to more effectively persuade consumers online. This highlights the need for rules that are better adapted to the specific harmful practices and challenges that consumers face online.

According to the European Commission, the effectiveness of EU consumer protection is undermined by:

- insufficient enforcement,
- legal uncertainty,
- increasing risk of regulatory fragmentation across Member States' national approaches, and
- the lack of incentives for businesses to aim for the highest standard of protection.

Increased legal certainty could prevent regulatory fragmentation and promote fair growth. There is scope for simplifying existing rules, without compromising the level of protection. It is also fundamental to ensure the coherent application and effective enforcement of EU consumer law and the EU digital rulebook, including the Digital Services Act, which prohibits several unfair practices on online platforms.

According to the findings of the Digital Fairness Fitness Check, consumers do not always feel fully in control of their online experience due to practices such as:

1. Dark patterns in online interfaces that can unfairly influence their decisions, for example, by putting unnecessary pressure on consumers through false urgency claims.

2. Addictive design of digital services that pushes consumers to keep using the service or spending more money, such as gambling-like features in video games.

3. Personalised targeting that takes advantage of consumers' vulnerabilities, such as showing targeted advertising that exploits personal problems, financial challenges or negative mental states.

4. Difficulties with managing digital subscriptions, for example, when companies make it excessively hard to unsubscribe.

5. Problematic commercial practices of social media influencers. Some of these practices may already go against existing EU consumer law and other EU law, for example, the Digital Services Act and the Audiovisual Media Services Directive.


Article 25 of the Digital Services Act (DSA) is the first step; the Digital Fairness Act (DFA) is the next step

We will start with the DSA.

Article 25, Online interface design and organisation - the Digital Services Act (DSA)

1. Providers of online platforms shall not design, organise or operate their online interfaces in a way that deceives or manipulates the recipients of their service or in a way that otherwise materially distorts or impairs the ability of the recipients of their service to make free and informed decisions.

2. The prohibition in paragraph 1 shall not apply to practices covered by Directive 2005/29/EC or Regulation (EU) 2016/679.

3. The Commission may issue guidelines on how paragraph 1 applies to specific practices, notably:

(a) giving more prominence to certain choices when asking the recipient of the service for a decision;

(b) repeatedly requesting that the recipient of the service make a choice where that choice has already been made, especially by presenting pop-ups that interfere with the user experience;

(c) making the procedure for terminating a service more difficult than subscribing to it.


While Article 25 of the DSA is a strong first step, it does not fully address all manipulative practices. This is where the Digital Fairness Act (DFA) comes in. The DFA is expected to address additional areas of concern.

The Digital Fairness Act will apply to a wider range of businesses, including e-commerce sites, app developers, digital advertising firms, and online services. It will also rely on a broader definition of manipulative digital design and will introduce stronger rules to prevent ads that take advantage of psychological vulnerabilities.

One of the biggest complaints from consumers is that subscription services make it difficult to cancel (e.g., requiring a phone call to cancel but allowing one-click sign-ups). Under the Digital Fairness Act, the cancellation process will have to be as easy as the subscription process.

Many online influencers promote products without proper disclosure, leading to misleading marketing. The DFA will introduce stricter transparency rules for influencer marketing and algorithm-driven promotions. Companies also increasingly use AI and behavioural profiling to exploit consumer weaknesses, so the DFA will introduce rules to prevent AI-driven deceptive marketing tactics.


Understanding dark patterns in online interfaces, from Cyber Risk GmbH.

Dark patterns are deceptive design techniques used in online interfaces to manipulate users into making decisions that benefit a business, often at the expense of the user’s true intent or interest. These patterns exploit cognitive biases and aim to influence behavior, nudging users toward actions they might not otherwise take. Dark patterns can result in unfair practices, such as unintended purchases, unwanted subscriptions, or sharing personal data without fully informed consent.

Cognitive biases are deviations from rationality in judgment or decision-making that arise because of the brain's tendency to simplify information processing. While cognitive biases can help in making quick decisions, they often lead to faulty reasoning and poor judgments.

For example, anchoring bias occurs when people rely too heavily on the first piece of information they encounter (the “anchor”) when making decisions. Even if subsequent information is more relevant, the initial information heavily influences their judgment.

Another example is the bandwagon effect that reflects the tendency to adopt beliefs or behaviors because many other people do, rather than basing choices on individual analysis.


Common types of dark patterns.

1. Sneak into Basket. This involves adding items to a user’s shopping cart without their explicit consent. For example, when a user is making an online purchase, additional products (like a warranty) might be pre-selected, leading users to accidentally purchase them if they don’t carefully review their order.

Here is a simple example:

A buyer adds a laptop to their cart. Without noticing, an extended warranty is pre-selected, increasing the total price. The user must manually opt out.
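The mechanism can be shown with a minimal TypeScript sketch (the product, prices and the pre-selection flag are hypothetical):

interface CartItem {
  name: string;
  price: number;         // price in euros (hypothetical)
  preselected: boolean;  // true = added without an explicit user action
}

const cart: CartItem[] = [
  { name: "Laptop", price: 899.0, preselected: false },
  { name: "Extended warranty", price: 129.0, preselected: true }, // the dark pattern
];

// The checkout total silently includes every pre-selected add-on.
const total = cart.reduce((sum, item) => sum + item.price, 0);
console.log(`Total charged: ${total.toFixed(2)} EUR`); // 1028.00, not 899.00

// A fair default would only charge what the user explicitly chose.
const fairTotal = cart
  .filter((item) => !item.preselected)
  .reduce((sum, item) => sum + item.price, 0);
console.log(`Total without pre-selected add-ons: ${fairTotal.toFixed(2)} EUR`);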


2. Confirmshaming. This tactic involves using guilt or emotional manipulation to persuade users to take a particular action. For instance, when a user tries to decline a newsletter subscription, they may see a button labeled “No, I don’t want to stay informed,” attempting to guilt-trip them into accepting the offer.

Here is a simple example:

"Would you like to subscribe to our newsletter?"

[Yes] → “Sign me up!”

[No] → “No, I hate learning new things.”


3. Forced Continuity. Users may sign up for a free trial of a service only to find that canceling is difficult, or they are automatically charged once the trial ends without clear reminders. This pattern locks users into ongoing payments without their agreement.

Here is a simple example:

A user signs up for a 7-day free trial of a streaming service. A credit card is required upfront. The company does not send a reminder before charging the user when the trial ends. The user only realizes they were charged after checking their bank statement.
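A minimal TypeScript sketch of such billing logic, assuming a hypothetical subscription record and dates:

interface TrialSubscription {
  trialEndsAt: Date;
  monthlyPrice: number;   // euros (hypothetical)
  reminderSent: boolean;  // never set to true anywhere in this flow
}

function processBilling(sub: TrialSubscription, today: Date): void {
  // Dark pattern: the card stored at sign-up is charged the moment the trial
  // ends, even though no reminder was ever sent beforehand.
  if (today >= sub.trialEndsAt) {
    console.log(`Charging ${sub.monthlyPrice.toFixed(2)} EUR to the stored card.`);
  }
  // A fairer flow would send a reminder a few days before trialEndsAt
  // and require an explicit confirmation to convert the trial into a paid plan.
}

const sub: TrialSubscription = {
  trialEndsAt: new Date("2025-03-08"),
  monthlyPrice: 12.99,
  reminderSent: false,
};

processBilling(sub, new Date("2025-03-08"));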


4. Roach Motel. This design makes it easy for users to get into a situation (such as subscribing to a service) but difficult to get out (canceling the subscription). Complex cancellation processes, hidden settings, or no easy way to opt out are characteristic of this pattern.

Here is a simple example:

A user subscribes to a €1 trial of an online magazine or service. The cancellation process is hidden deep in the terms. When trying to cancel, the system requires calling customer service or navigating multiple confusing steps. After the trial, the user is billed the full monthly price without an easy way to cancel.


5. Hidden Costs. Additional fees or charges appear late in the purchase process, such as during checkout. This could include taxes, shipping fees, or surcharges that were not disclosed upfront, misleading the user about the true cost of a product or service.
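A minimal TypeScript sketch of this drip-pricing mechanism, with hypothetical amounts:

const advertisedPrice = 49.99;  // price shown on the product page

// Fees that only appear at the final checkout step.
const lateFees = {
  serviceFee: 4.5,
  shipping: 6.99,
  paymentSurcharge: 1.5,
};

const finalPrice =
  advertisedPrice + Object.values(lateFees).reduce((sum, fee) => sum + fee, 0);

console.log(`Advertised: ${advertisedPrice.toFixed(2)} EUR`);      // 49.99
console.log(`Charged at checkout: ${finalPrice.toFixed(2)} EUR`);  // 62.98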


6. Unknowingly Granting Permissions. This pattern tricks users into sharing more personal information than they intended. It often involves unclear privacy settings or opt-out options that are hard to find, leading users to unknowingly grant permissions for data collection.


7. Trick Questions. This technique uses confusing wording or misleading options in forms to trick users into making decisions they didn’t intend. For example, a checkbox might be worded in a double negative, making it unclear whether checking or unchecking it opts in or out of something.


8. Bait and Switch. Users expect one outcome from a certain action but are given a completely different result. For example, clicking a button that appears to close a pop-up might instead take the user to an unrelated page or service.

Here is a simple example:

A website advertises a high-end laptop for €899 (normal price €1,299). The user clicks to buy, but at checkout, the discount disappears, and the price returns to €1,299. The company claims the deal expired (even though it was still being advertised).

Here is another simple example:

An online job listing advertises “€80,000 salary, remote work, full benefits”. After applying and going through interviews, the company offers €50,000, no remote work, and minimal benefits. The original offer was never real; it was just a tactic to attract applicants.


9. Misdirection. This involves designing an interface where the user’s attention is directed toward one thing, often something beneficial to the company, while hiding or downplaying other important information or options. An example might be highlighting a premium service while the option for a free version is hard to locate.


10. Disguised Ads. Ads that appear to be part of the interface, such as editorial content or buttons, are another form of dark patterns. Users may click these ads thinking they’re interacting with the platform, only to be directed to external content or products.

Here is a simple example:

A news website publishes an article titled "Why Experts Recommend This New Weight Loss Pill." The article looks like real journalism, but it is actually a paid promotion from the pill manufacturer.

Here is another simple example:

A user visits a website to download free software. There are multiple "Download Now" buttons, but only one leads to the advertised download. The other buttons are disguised ads that lead to third-party sites.


11. Shadow Banning. In some social platforms, users can be shadow banned, where their content is still visible to them but is hidden from everyone else. This prevents users from realizing they have been banned, encouraging continued engagement while preventing their content from reaching others. This pattern exploits the psychological need for validation and feedback.

Here is a simple example:

An online seller offers a product that competes with the platform’s own branded item. Their listing is silently moved to the bottom of other users' search results, making it practically invisible to buyers. The seller never learns about this.

Here is another simple example:

A regular (non-"premium") user contributes frequently to a large discussion forum. Moderators place a silent restriction on their account, making their posts visible only to them and a few others. The user never receives a notification.
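A minimal TypeScript sketch of such a visibility filter (account names and posts are hypothetical):

interface Post {
  author: string;
  text: string;
}

const restrictedAuthors = new Set(["seller42"]); // silently flagged accounts

// The restricted author's posts are hidden from everyone except the author,
// so the author never notices the restriction.
function visiblePosts(posts: Post[], viewer: string): Post[] {
  return posts.filter(
    (post) => !restrictedAuthors.has(post.author) || post.author === viewer,
  );
}

const feed: Post[] = [
  { author: "seller42", text: "My product is cheaper than the platform brand." },
  { author: "alice", text: "Good morning, everyone." },
];

console.log(visiblePosts(feed, "seller42")); // the author still sees both posts
console.log(visiblePosts(feed, "bob"));      // other users never see seller42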


12. Obfuscation Through Legalese. Although providing terms and conditions is necessary, some companies intentionally design legal agreements to be excessively long, filled with complex jargon and legal terms. This legal obfuscation prevents users from fully understanding the consequences of their consent, exploiting their lack of legal expertise.

Here is a simple example:

A user signs up for a social media account and must agree to the Terms of Service (ToS). The privacy policy is 30+ pages long, filled with dense legal jargon. Buried in the text, the platform reserves the right to share personal data with third-party advertisers. The user unknowingly agrees to extensive data tracking because they don’t understand the complex wording.

Here is another simple example:

A weather app requires the user to accept the privacy policy before use. The policy is written in vague legal terms like: “We may process certain personally identifiable information for the purposes of enhancing user engagement through third-party partnerships.” In reality, this means the app sells location data to advertisers, but the user cannot understand this from the wording.

Here is one more simple example:

A user purchases a product from an e-commerce store. The return policy states: “All purchases are subject to an evaluation process based on the conditions outlined in Section 2.2 of the user agreement.” The buyer later discovers that returns are only allowed within 3 days, only for defective items, and require a written request via postal mail, or a telephone call to the customer service, where they must wait for hours.


13. Buried data-sharing. This pattern involves data-sharing settings that are buried deep within nested menus or toggles, requiring the user to actively discover and disable them. For example, an app might track location data by default, and the setting to disable it is hidden several layers deep in the privacy settings menu, making it difficult for users to protect their privacy.
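A minimal TypeScript sketch of such a settings tree (the structure and the default value are hypothetical):

const settings = {
  account: { language: "en", notifications: true },
  privacy: {
    advanced: {
      partners: {
        // The consequential toggle sits several levels down and defaults to on.
        shareLocationWithPartners: true,
      },
    },
  },
};

// Opting out requires the user to discover this exact path on their own:
settings.privacy.advanced.partners.shareLocationWithPartners = false;

console.log(settings.privacy.advanced.partners.shareLocationWithPartners); // false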


14. Disguised Feedback Collection. In some cases, platforms collect user feedback disguised as something else, such as a simple satisfaction survey. However, the feedback might also be used for purposes unrelated to the survey, like product promotion or profile building for personalized marketing. Users are often unaware that their input is being monetized.

Here is a simple example:

A user completes an online purchase and is asked: “How was your shopping experience?” They click “Great!”. Later, they find their name and rating displayed on the website as a customer testimonial, without their explicit permission.

Here is another simple example:

A mobile app asks users to rate their experience with: “Are you enjoying this app?” If they select 5 stars, they are immediately redirected to the App Store to post a review. If they select 3 stars or lower, they are sent to a private feedback form instead. This manipulates public ratings by only allowing positive feedback to be published.
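The routing logic behind such a rating gate can be sketched in a few lines of TypeScript (the URLs are hypothetical):

// Only high ratings are routed to the public store page; lower ratings are
// diverted to a private form and never become visible to other consumers.
function routeFeedback(stars: number): string {
  if (stars >= 4) {
    return "redirect: https://appstore.example/write-a-review";
  }
  return "redirect: https://example.com/private-feedback-form";
}

console.log(routeFeedback(5)); // public review page
console.log(routeFeedback(2)); // private form, invisible to other users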


15. Friend Spam. Some services, especially social media platforms, request access to a user’s contacts under the guise of helping them connect with friends. However, these platforms may send unsolicited messages or invitations to the user’s contacts, often without explicit consent. This form of social pressure leverages personal connections to drive engagement or sign-ups.


16. Scroll Jacking. In this advanced technique, websites hijack the user’s scrolling behavior, making it difficult or impossible to scroll freely. For instance, scrolling may trigger pop-ups, slide-ins, or automatic loading of additional content (infinite scroll), preventing the user from easily navigating the page as they intended.


17. Subscription with Dynamic Pricing. Some companies use dynamic pricing algorithms in subscription models where the cost of a service varies based on user behavior, such as how often they use it or their purchasing history. When paired with a subscription trap (making it difficult to cancel), users can unknowingly pay fluctuating prices that are higher than expected.

Here is a simple example:

A user frequently reads a particular online newspaper. The website tracks their visit history and engagement level. When they finally decide to subscribe, they are offered a €15/month plan. Another new visitor sees a €9/month offer for the same subscription. The pricing algorithm detects interest and charges loyal readers more.
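A minimal TypeScript sketch of engagement-based pricing (the thresholds and prices are hypothetical):

interface VisitorProfile {
  monthlyVisits: number;
  articlesRead: number;
}

// The detected interest is used against the consumer: the more engaged the
// visitor, the higher the quoted subscription price.
function quoteMonthlyPrice(profile: VisitorProfile): number {
  const basePrice = 9; // EUR, shown to new visitors
  const engaged = profile.monthlyVisits > 20 || profile.articlesRead > 50;
  return engaged ? basePrice + 6 : basePrice;
}

console.log(quoteMonthlyPrice({ monthlyVisits: 40, articlesRead: 120 })); // 15
console.log(quoteMonthlyPrice({ monthlyVisits: 2, articlesRead: 3 }));    // 9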


18. Dark Algorithmic Amplification. This involves platforms using algorithms to amplify content or suggestions that increase user engagement, even if the content is harmful, inflammatory, or addictive. For example, social media algorithms may prioritize divisive or emotionally charged posts because they are more likely to generate interactions, even if they reduce user well-being.

Here is a simple example:

A shopping platform uses an algorithm that artificially inflates product demand. Even for widely available items, the site displays messages like “Only 3 left! 20 people are looking at this now!” to pressure users into buying. In reality, stock levels are not changing, and the urgency is fabricated to manipulate purchases.
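A minimal TypeScript sketch of fabricated urgency messages (the ranges are hypothetical and unrelated to any real inventory):

// The banner is generated at random on every page load, with no connection to
// real stock levels or real viewer counts.
function fakeUrgencyBanner(): string {
  const unitsLeft = 2 + Math.floor(Math.random() * 3);    // always "2-4 left"
  const viewersNow = 15 + Math.floor(Math.random() * 10); // always "15-24 viewing"
  return `Only ${unitsLeft} left! ${viewersNow} people are looking at this right now!`;
}

console.log(fakeUrgencyBanner());
console.log(fakeUrgencyBanner()); // a different "urgent" message on every call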


19. Decision Fatigue. This refers to the deteriorating quality of decisions after a long session of decision-making. Dark patterns exploit this by creating overly complex choices or difficult-to-navigate systems.

Here is a simple example:

A user wants to opt out of data tracking on a website. Instead of a simple "Accept All" or "Reject All" button, the settings require manually disabling 30+ different tracking options (e.g., analytics, personalisation, advertising, specific third-party cookies, etc.). Frustrated, the user gives up and clicks "Accept All" just to move on.

Here is another simple example:

A member wants to cancel an online membership. In order to cancel, the website makes them confirm the cancellation in the account settings, enter personal details, fill out a "feedback survey", chat with a representative who offers discounts to stay, and then click an extra confirmation link sent by email.
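The asymmetry in the first example (the tracking settings) can be sketched in TypeScript (the purpose list is hypothetical):

const purposes = [
  "analytics", "personalisation", "advertising", "ad-measurement",
  "social-media-embeds", "audience-research", // ...often 30+ entries in practice
];

// One-click path, kept easy for the trader:
function acceptAll(): Record<string, boolean> {
  const choices: Record<string, boolean> = {};
  for (const p of purposes) choices[p] = true;
  return choices;
}

// Opt-out path: the dialog opens with everything enabled and the user must
// flip each toggle individually, inviting decision fatigue.
function rejectAllManually(): Record<string, boolean> {
  const choices = acceptAll();
  for (const p of purposes) choices[p] = false; // one or more clicks per purpose
  return choices;
}

console.log(acceptAll());         // everything enabled after a single click
console.log(rejectAllManually()); // the same result in reverse takes many steps

// A symmetric "Reject all" button, as easy as "Accept all", avoids the pattern.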



3 October 2024 - COMMISSION STAFF WORKING DOCUMENT, FITNESS CHECK of EU consumer law on digital fairness.

Dark patterns are commercial practices deployed through the structure, design or functionalities of digital interfaces or system architecture that can influence consumers to take decisions they would not have taken otherwise, e.g. presenting choices in a non-neutral manner, using fake countdown timers to create urgency, using emotional manipulation to make consumers second-guess their indicated choice, phrasing questions using double negatives, misleading consent options in cookie banners.

Although traders’ attempts to influence consumer decision-making are not a new phenomenon, concerns have intensified about the increased effectiveness and scale of such practices as well as the potential for personalised persuasion based on behavioural data.

While there are no existing baseline figures from 2017 for each type of dark pattern, there have been numerous enforcement actions in the past five years against various misleading online practices (e.g. drip pricing, subscription traps, hidden information), which were not previously labelled as ‘dark patterns’ but simply as consumer law breaches.

The problem with the prevalence of dark patterns has arguably become worse, as illustrated by the sharp increase of policy attention and regulatory or enforcement action from European and other authorities globally in the last three years (e.g. US, UK, South Korea, India; OECD Committee on Consumer Policy).

In the targeted survey, 61% of respondents perceived an increase of the deployment of dark patterns during the evaluation period. The Commission’s 2022 dark patterns study showed that 97% of the most popular websites and apps used by EU consumers deployed at least one dark pattern, with the most common ones involving hiding information, creating false hierarchies in choice architectures, repeatedly making the same request, difficult cancellations and forced registrations.

The 2022 CPC sweep by EU consumer authorities found that nearly 40% of online retail shops contained at least one dark pattern, specifically fake countdown timers, hidden information and false hierarchies in choice architectures.

The 2024 International Consumer Protection and Enforcement Network (ICPEN) and Global Privacy Enforcement Network (GPEN) sweep of the websites/apps of 642 traders found that 75.7% of them deployed at least one dark pattern, and 66.8% of them employed two or more dark patterns.

Sneaking practices (e.g. inability of the consumer to turn off auto-renewal of subscription service) and interface interference (e.g. making a subscription that is advantageous to the trader more prominent) were encountered especially frequently.

Concerning evidence of current problems, in the representative consumer survey, 40% reported experiencing a situation where the design or language used on a website/app was confusing, which made the consumer uncertain about what they were signing up for, or about which rights and obligations they had. 66% saw claims that a product was low in stock or high in demand (e.g. that many other consumers are currently looking at the same product) and 61% saw claims that a product was available only for a limited time, without the ability to know if these claims are truthful.

32% reported paying more than they planned to because during the purchasing process the final price changed to a price higher than the one advertised initially. 48% of consumers, especially the young, were pressured with repeated requests to make a decision, e.g. prompts to get a premium account, offers of special discounts, or requests to buy a recommended product. After indicating their choice or declining a choice offered, 42% received messages that made them doubt their decision, e.g. questions like ‘are you really sure you do not want a discount?’. 37% recognised a situation where important information was visually obscured or ordered in a way to promote an option that did not seem to be in their interest. 37% encountered preselected options that were in favour of the company but changing those options was difficult.

42% experienced a situation where making a choice led to a different result than they would normally expect, e.g. clicking an unsubscribe button led to a page describing the benefits of the service they would lose. Dark patterns can affect a wide range of transactional decisions, and many of them have been empirically proven to appreciably impair consumers’ ability to take an informed decision.

In the public consultation, 89% of consumers reported being confused by dark patterns in website/app design and 76% felt pressured to buy something due to the language or design that was used.

The behavioural experiments in the Commission’s 2022 dark patterns study showed that, when exposed to dark patterns, the probability of making a choice that was inconsistent with the consumers’ preferences increased – the average share of inconsistent choices rose to 51% for vulnerable consumers and 47% for average consumers, with older consumers and those with lower education levels being more impacted.


Addictive design and gaming.

As consumers navigate the ‘attention economy’, concerns have increased regarding specific interface designs and functionalities that could induce digital addiction. It is generally in the traders’ economic interest to design their products in a manner that increases the amount of time, money and engagement that consumers spend, especially those traders whose business model relies on the processing of consumer data.

However, the addictive use of digital products and services carries the risk of economic, physical and mental harm, including, but not confined to, vulnerable consumers such as children.

While most of the addictive design features already existed in 2017, both the market size and consumer use of products like social media and video games in the EU have increased over the evaluation period.

Furthermore, algorithmic recommendations and other data-driven practices improved in their efficacy and persuasiveness as more consumer data was gathered through the years. The EP’s 2023 resolution on addictive design of online services highlighted the negative impacts that addictive design could have on consumers, including mental health problems, especially for younger consumers.

In the representative consumer survey, 31% of consumers reported spending more time or money than they intended because of specific features such as the autoplay of videos, receiving rewards for continuous use or being penalised for inactivity. In the public consultation, 33% of consumers reported spending too much time or money on certain websites or apps.

Concerns have also arisen with specific products such as video games that increasingly involve the sale of virtual items, including uncertainty-based rewards (e.g. loot boxes), and the use of intermediate in-app virtual currencies, which could distort the real value of the transaction for consumers and encourage them to spend more than they intended.

Furthermore, these practices are often accompanied by opaque offer and pricing techniques. The proliferation of commercial communications in gaming environments raises different concerns that are currently not expressly addressed by any EU law. Over the evaluation period, there has been an increase in the use of in-game purchases and virtual items like loot boxes. In 2018, 74% of the video game turnover came from app and online revenues, compared to 83% in 2022.

Loot boxes were much less widespread in 2017, compared to 2023. Concerns have also been amplified due to the widely increased accessibility of such apps to minors, given the ubiquitous availability of smartphones and tablets. The targeted stakeholder survey showed that 68% of respondents considered the use of loot boxes and addiction-inducing features to have increased over the evaluation period. In the representative consumer survey, 29% of consumers had experienced a situation where the real price of a virtual item was not clear because it was only indicated in the app’s virtual currency.


Personalisation.

As a cross-cutting issue, concerns about the use of consumers’ personal data have increasingly undermined consumer trust over the evaluation period.

Personalisation practices in the B2C context can take the form of behavioural advertising, search result ranking, recommendations, prices etc., which can offer many benefits for consumers. However, the 2023 CCS found that 70% of consumers are concerned about how their personal data is used and shared, which amounts to a 21 percentage point increase compared to 2018.

Targeted advertising was already prevalent in 2017 and continues to be used extensively, as digital advertising has become the largest advertising channel globally. The targeted stakeholder survey showed that 53% of respondents perceived personalised pricing to have increased in frequency over the evaluation period, although these practices are difficult to detect. Data collection practices have also received more attention in policy discourse and research since the GDPR entered into application in 2018.

Furthermore, the 2023 CCS also found that consumers continue to be concerned about the processes concerning the collection of personal data and profiling (66%), installation of cookies (57%), negative effects on their trust in e-commerce (38%), seeing only a limited selection of ads and not the best offers (38%), inability to opt-out/refuse (37%) and inability to distinguish between information and advertising (35%).

The representative consumer survey found that 41% of consumers had experienced a situation where the design or language of the website/app made it difficult to understand how their personal data would be used, and 37% of consumers had the impression that the company had knowledge about their vulnerabilities and used it for commercial purposes.

In the public consultation, 74% of consumers thought their personal data was misused or used unfairly to personalise commercial offers in the preceding 12 months. BEUC’s representative 2023 survey showed that the majority of consumers do not consider personal data analysis and monetisation to be fair (60%) and they do not feel fully in control of the decisions they make or the content they are shown online – consumers reported feeling unsafe (60%), manipulated (55%) or suspected that their rights were violated (46%), yet less than half of consumers considered filing a complaint and only 22% felt satisfied by how authorities are protecting them against unfair practices. These concerns were heightened in case of personal data about vulnerable consumers who are more at risk, in particular children.


Digital contracts.

In the context of the exponential growth of the digital subscription economy and the trend towards ‘freemium’ business models, consumers have increasingly encountered problems with their digital contracts.

While there are no existing baseline figures from 2017 for each issue related to digital contracts, the subscription economy market has tripled since 2017 and figures from previous studies show that problems with difficult cancellations and subscription traps have increased.

For example, in 2017, 7% of consumers reported experiencing problems with subscriptions, compared to 14% in 2020 and much larger figures identified in this Fitness Check (see granular survey data below; up to 60 percentage point increase from 2017 to 2023).

In the representative consumer survey, 40% considered that the design of the website/app made cancelling the subscription very difficult.

The sweep in the framework of the supporting study showed that traders provided clear information about the 14-day right of withdrawal in only 54% of cases, and the procedure for cancellations beyond 14 days was fairly clear in only 34.7% of cases.

In the representative 2023 CCS, 23% of consumers reported difficulties with cancelling a contract that they had concluded online. In the public consultation, 69% of consumers found it technically difficult to cancel their contracts, 55% experienced deliberate avoidance of contract cancellation by the trader and 34% were only able to cancel their subscriptions after a longer time period (e.g. a year), despite being charged monthly.

Auto-renewals can be convenient and beneficial for consumers, provided the consumers are aware of them. In the representative consumer survey, 29% of consumers reported often having their free trial automatically extended into a paid subscription.

Consumers also indicated that they continued paying for a digital subscription that they had stopped using some time ago but forgot to cancel (18% encountered this often, 19% sometimes). 62% of consumers in the public consultation experienced automatic renewals of inactive subscriptions without reminders.

Consumers have limited bargaining power when entering into contracts in the digital environment – in general, they can either take it or leave it. The detection of unfair contract terms presumes that consumers are able to familiarise themselves with the contract terms in the first place, but most consumers never choose to do so.

In the representative consumer survey, only 36% of consumers indicated that they read the Terms and Conditions always or often, with a further 23% indicating they do this sometimes. The prevalence of unfair contract terms has increased over the evaluation period. According to the CCS, in 2017, 9.8% of consumers encountered unfair contract terms, compared to 22% in 2023, without however distinguishing between contract terms in offline vs online environments.


