Regulating ‘Big Tech’: EU’s DSA and DMA initiatives

A. Introduction

Online platforms are the backbone of modern digital economies. Online platforms intermediate between users (commercial businesses, advertisers, consumers) to create mutually beneficial ‘matches’: social networks bring together individual users with common interests (and those users together with advertisers), search engines match consumer queries with relevant search results (products and services), and digital (audio and video) content platforms match users with the artistic content they may find entertaining. From a societal perspective, these online intermediaries play a highly beneficial role by reducing information costs for consumers and businesses alike: they make it easier for consumers to find products, services and content that suit their interests, and for businesses to find willing buyers of their products and services.

Compared to more traditional industries, online platforms are particularly characterised by the following features: (a) operation in two-sided markets: online platforms intermediate between users in two (or even more than two) distinct sides of the market, (b) strong network effects: online platforms become more valuable to their users as the number of those users increases, (c) intensive use of data (both personal and non-personal): online platforms use advanced analytics to process information about consumer behaviour and adjust their services to current customer demands, (d) economies of scope in the use of data: online platforms combine user-generated data from multiple business settings to gain a more holistic view of consumer behaviour, and (e) economies of scale: following a large upfront investment to set up the business, online platforms incur only a minimal cost for serving additional customers.1

These characteristics combined make for highly dynamic, innovative, and responsive digital markets, but at the cost of relatively high market concentration, which has raised concerns among regulators in many jurisdictions worldwide. These concerns are exacerbated by the fact that online platforms and their services have become indispensable for economic growth (especially the growth of smaller businesses), job creation and innovation. In Europe alone, more than one million businesses trade through online platforms. Sixty percent of private consumption in the EU relates to products and services supplied by or through online platforms, and eighty-two percent of European small and medium-sized enterprises (SMEs) rely on search engines to promote their products and services.

At the same time, however, the most successful online platforms have come to occupy a strong market position that enables them to set terms with smaller businesses and end-users that may appear disadvantageous or even unfair. This is largely because several online platforms possess a strong bargaining position vis-à-vis their users, even though they may fall short of being dominant in the antitrust law sense (i.e., possessing market power that enables them to restrict output and raise prices). Complaints against online platforms include unfair commercial terms, lack of transparency in their operation (for instance, alleged lack of transparency regarding how search results are presented by search engines), and contribution to the proliferation of illegal content online. In particular, concerns have been expressed regarding the role of online platforms in the spread of content that infringes upon third parties’ intellectual property rights (IPRs), for instance through the online sale of counterfeit goods that infringe trademark rights, or the unauthorised making available online of copyright-protected content.

To address issues around the digital economy and secure Europe’s leading role in this emerging environment, the authorities of the European Union (the Commission, the Council, and the Parliament) have embarked upon an ambitious digital strategy that includes: (i) enforcement of Articles 101 and 102 of the Treaty on the Functioning of the European Union (TFEU) against allegedly anti-competitive practices in the digital economy, (ii) soft-law instruments, such as communications, proposals, guidelines, and resolutions that express the current position of the Union policy-making bodies on issues around the digital economy, and (iii) hard-law instruments: regulations and directives that set the rules for online platforms and other businesses in the digital sphere. Central among these initiatives is the EU’s Digital Single Market (DSM) strategy, which aspires to create an integrated EU market of 500 million consumers for digital services.2 The DSM strategy aims at breaking down barriers to cross-border trade in digital services between EU Member States, promoting the European standardisation ecosystem, and enhancing the competitiveness of European digital industries (especially SMEs).

 

B. Context

An important part of Union policy regarding online platforms has been elaborated in the context of EU competition law enforcement. Specifically, the European Commission has enforced Article 102 TFEU against abuses of dominant positions by adopting three decisions against Google (in the Google Shopping, Google AdSense, and Google Android cases) and by issuing a Statement of Objections under Article 102 to Amazon. In its Google Shopping decision, the Commission held that the search engine abused its dominant position by leveraging its dominance to expand its product search business at the expense of specialist price comparison search engines and apps.3 Among the practices found by the Commission to produce exclusionary anticompetitive effects were the alleged demotion of competitors’ services in the Google Search listings and the granting of preferential treatment (and listings) to Google’s own Google Shopping service. In Google AdSense, the Commission found that Google abused its dominant position by agreeing with third-party websites on a number of restrictive contractual clauses that, according to the Commission, effectively prevented Google's competitors from placing their search adverts on these websites.4 In Google Android, the Commission held that Google abused its dominant position in the online search market by imposing a range of restrictive contractual clauses on smartphone manufacturers that allegedly deprived rival search engines of the possibility to compete ‘on the merits’ with Google. For the infringement of Article 102 in Google Android, the European Commission imposed on Google the heaviest fine ever for an EU competition law infringement, amounting to €4.3 billion.5 In its latest antitrust enforcement initiative, the Commission issued a Statement of Objections to the e-commerce platform Amazon.6 The Commission reached the preliminary conclusion that Amazon’s practices, such as taking advantage of data generated by consumers on its platform (e.g., product reviews) to offer products competing with the platform’s business users, may violate EU competition rules (Article 102 TFEU).

Beyond antitrust law enforcement, the EU has enacted a range of regulatory measures that set the rules for the digital economy and the operation of online platforms in Europe. In contrast to antitrust enforcement, which typically penalises anticompetitive conduct after it has actually materialised in the market (‘ex post’), regulation aims at preventing conduct that is deemed socially harmful from occurring in the first place (‘ex ante’). Regulatory measures that affect online platforms can be found in general-purpose EU law, such as the General Data Protection Regulation (GDPR), which lays down the rules for the processing and sharing of personal data in the EU, including by online platforms. In the past decade, however, the EU has also introduced legislation that deals specifically with online platforms. These more specific and targeted regulatory initiatives aim, according to the Commission, at creating ‘the right framework conditions and the right environment […] to retain, grow and foster the emergence of new online platforms in Europe.’7 Moreover, in its regulatory initiatives regarding online platforms the European Commission pursues, in principle, the following priorities: (a) the need for ‘a level playing field’ for digital services, (b) the need to ensure ‘responsible behaviour’ by online platforms that protects the Union’s ‘core values,’ (c) transparency that ensures consumer trust and confidence, and (d) open markets for data.

One of the first issues regulated by the EU legislature was the liability of online platforms for illegal content they may host online. The E-Commerce Directive created a safe harbour shielding online platforms from liability for illegal content (Article 14), provided that they (i) have no knowledge of such content, and (ii) remove unlawful content upon being notified of its existence.8 The E-Commerce Directive thus created a so-called ‘notice-and-action’ system for removing illegal online content which, on the one hand, limited the liability of online platforms for content over which they did not exercise control and, on the other hand, allowed third parties (especially holders of IPRs) to request removal of content that infringed their rights.

More recently, the EU enacted Regulation 2019/1150 ‘on promoting fairness and transparency for business users of online intermediation services.’9 The Regulation addresses the relationship between online platforms and business users with the aim of ensuring ‘that business users […] are granted appropriate transparency, fairness and effective redress possibilities’ (Article 1). The Regulation applies not only to EU-based online platforms, but also to platforms based in overseas jurisdictions, provided that they offer products and services to consumers established within the Union. The Regulation mandates that online platforms provide transparent terms and conditions that (a) are drafted in plain and simple language, (b) are easily available to business users, (c) set out the grounds for termination of service to business users, (d) include information on any additional distribution channels, and (e) inform business users of the impact of the terms and conditions on their IPRs (Article 3). If online platforms modify their terms and conditions, they must notify business users of the modification at least 15 days in advance. Suspension or termination of services offered to business users must be justified in a ‘statement of reasons’ submitted to the user concerned prior to, or at the time of, the suspension or termination (Article 4). Moreover, the Regulation deals with the ranking of search results by online platforms. According to Article 5, online platforms are to set out in their terms and conditions the ‘main parameters determining ranking and the reasons for the relative importance of those main parameters as opposed to other parameters.’ The Regulation further addresses so-called ‘self-preferencing,’ i.e., the practice whereby online platforms give preferential treatment to their own services over those of competitors. In Article 7, the Regulation stipulates that online platforms are to describe clearly, in their terms and conditions, any preferential treatment of their own products and services, as well as the ‘main economic, commercial or legal considerations’ that call for such treatment.

In 2020, the European Commission embarked on its most ambitious regulatory initiative to date regarding online platforms and the digital economy. In its proposals for two new Regulations—a ‘Digital Services Act’ (DSA) and a ‘Digital Markets Act’ (DMA)—the Commission lays down a far-reaching, detailed, and comprehensive regulatory framework for online platforms active in the EU.

 

C. The Proposal for a Digital Services Act (DSA)

In its Proposal for a Digital Services Act (DSA Proposal), the European Commission builds on the E-Commerce Directive and its liability and ‘notice-and-action’ system, and expands it further by proposing a broad spectrum of rules that aim to enhance platforms’ responsibility and accountability for the content they provide to users.10 According to the Commission, the main goal of the DSA Proposal is to ‘foster responsible and diligent behaviour’ by online platforms. A particular and novel feature of the DSA Proposal is that it intends to create an ‘asymmetric’ system of due diligence obligations for online platforms. These due diligence obligations vary depending on platform size (‘very large’ platforms are burdened with stricter due diligence obligations) and on the particular services provided by platforms.

More specifically, the Proposal for a DSA Regulation:

1. limits the liability of online platforms for illegal content where those platforms exercise no effective control over that content. The DSA Proposal envisages three types of activities that suggest an absence of effective control over unlawful content: (a) platforms acting as ‘mere conduits’ (Article 3), where the online platform does not initiate the transmission, does not select the recipient of the transmission, and does not select or modify the information transmitted, (b) platforms merely providing ‘caching’ services (Article 4), i.e., the ‘automatic, intermediate and temporary storage’ of information, provided that the platform ‘expeditiously’ removes unlawful cached content upon notification, and (c) platforms merely ‘hosting’ unlawful content (that is, merely storing it), provided again that the platform ‘expeditiously’ removes such content upon notification (Article 5).

2. lays down specific rules and due diligence obligations for online platforms. Specifically, the DSA Proposal: (a) calls for online platforms to appoint a ‘single point of contact’ with the authorities (national and Union) competent to enforce the DSA Regulation (Article 10), (b) requires online platforms that are not established in the EU (yet offer services to consumers in the EU) to ‘designate, in writing, a legal or natural person as their legal representative’ (Article 11), (c) mandates that online platforms include in their terms and conditions any restrictions they impose upon users regarding content, ‘in clear and unambiguous language and […] in an easily accessible format’ (Article 12), (d) imposes on platforms a range of transparency and reporting obligations as regards their handling of unlawful content and ‘notice-and-action’ requests (Article 13), (e) institutes a ‘notice-and-action’ mechanism (similar to the one in the E-Commerce Directive) under Article 14, (f) requires online platforms to provide a ‘statement of reasons’ where they take down content (Article 15), and (g) calls for platforms to create an internal complaint-handling system (Article 17) to process users’ complaints regarding decisions to remove access to content, decisions to suspend the provision of platform services, and decisions to suspend or terminate specific user accounts. In processing ‘notice-and-action’ requests, online platforms are to prioritise notices from ‘trusted flaggers,’ i.e., entities that have ‘particular expertise and competence for the purposes of detecting, identifying and notifying illegal content,’ are independent from the platform and represent ‘collective interests,’ and carry out their activities ‘in a timely, diligent and objective manner’ (Article 19).

3. creates a system of varying obligations depending on platform size: (a) micro and small enterprises are exempted from the above transparency obligations (Article 16), while (b) ‘very large’ platforms are burdened with additional obligations (Articles 25 et seq.). ‘Very large’ platforms, according to the DSA Proposal, are those that provide their services to at least 45 million monthly active users (on average) in the EU (Article 25). Platforms that qualify as ‘very large’ must undertake, on a yearly basis, a risk assessment (Article 26). This assessment must account for the following ‘systemic risks’: (i) the dissemination of illegal content, (ii) negative effects on the ‘exercise of the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination and the rights of the child,’ and (iii) ‘intentional manipulation’ of their service. To safeguard against these ‘systemic’ risks, ‘very large’ online platforms must put in place ‘reasonable, proportionate and effective mitigation measures’ (Article 27). Moreover, ‘very large’ platforms must subject themselves to independent external audits that assess their compliance with the DSA Regulation (Article 28). Finally, ‘very large’ online platforms must appoint one or more compliance officers ‘responsible for monitoring their compliance with this Regulation’ (Article 32).

4. establishes an enforcement mechanism: (a) under Article 38, EU Member States are to appoint the authorities competent for the enforcement of the DSA Regulation in their respective jurisdictions (‘competent authorities’), (b) each Member State shall designate one of those ‘competent authorities’ as its ‘Digital Services Coordinator,’ responsible ‘for all matters relating to application and enforcement of this Regulation,’ (c) ‘competent authorities’ and ‘Digital Services Coordinators’ can request the assistance of the Commission in their investigatory and enforcement functions (Article 46), (d) ‘competent authorities’ and ‘Digital Services Coordinators’ will be assisted, in the performance of their duties, by an advisory ‘European Board for Digital Services’ (Article 47), and finally (e) EU Member States shall provide for ‘effective, proportionate and dissuasive’ penalties in case of infringement of the DSA Regulation (Article 42). Those may include injunctive remedies (cease-and-desist orders) and monetary fines. Fines may not ‘exceed 6 % of the annual income or turnover’ of the infringer, while periodic penalty payments may not exceed ‘5 % of the average daily turnover.’ For the enforcement of the DSA Regulation the proposal envisages broad investigatory powers for the competent authorities, including requests for information, on-site inspections, etc. (Articles 52 et seq.). Moreover, online platforms subject to proceedings under the DSA Regulation may voluntarily offer commitments to address the conduct at issue (Article 56).

 

D. The Proposal for a Digital Markets Act (DMA)

The second major legislative initiative by the European Commission in 2020 was its Proposal for a Digital Markets Act (DMA Proposal).11 The DMA Proposal complements the proposal for the DSA by pursuing the objective of maintaining ‘open and contestable’ digital markets. The main concern behind the DMA Proposal is that, in view of the specific characteristics of digital markets (strong network effects, economies of scope in the use of data, market concentration), online platforms have emerged that function as ‘gatekeepers,’ i.e., they ‘control’ access to digital markets. The emergence of these ‘gatekeepers’ weakens, in the view of the Commission, contestability in digital markets and allows ‘unfair commercial practices’ by gatekeepers that harm businesses and, ultimately, consumers. Industries in which the Commission has identified such ‘gatekeepers’ include online intermediation services (e-commerce apps and services, mobility apps, etc.), search engines, social networks, video sharing, operating systems, cloud services, and online advertising.

For an online platform to be deemed a ‘gatekeeper,’ the following characteristics must, according to Article 3 of the DMA Proposal, be present: (a) a significant impact on the EU internal market, (b) the operation of one (or more) ‘core platform service’ that functions as a ‘gateway’ to customers, and (c) an ‘entrenched and durable position’ in its operations (current or foreseeable in the near future). The DMA Proposal includes quantitative and qualitative criteria for designating an online platform as a ‘gatekeeper.’ In particular, a ‘significant impact on the internal market’ can be presumed where the online platform achieves an ‘annual EEA turnover equal to or above EUR 6.5 billion in the last three financial years, or where the average market capitalisation or the equivalent fair market value of the undertaking to which it belongs amounted to at least EUR 65 billion in the last financial year.’ Moreover, a ‘core platform service’ serving as an important gateway can be established where the online platform in question has more than 45 million monthly active end users. An entrenched and durable position can be presumed where the above user-number threshold was met in each of the last three financial years. Undertakings that satisfy the quantitative criteria are to notify the Commission, which must, within 60 days of the notification, designate the platform as a ‘gatekeeper’ for the purposes of the DMA Regulation, unless the platform concerned presents ‘sufficiently substantiated arguments to demonstrate that […] the provider does not satisfy the requirements’ of Article 3(2). However, the Commission may designate a platform as a gatekeeper even if it does not satisfy the quantitative criteria of Article 3(2). More specifically, according to Article 3(6), the Commission will take the following qualitative criteria into account: (i) ‘the size, including turnover and market capitalisation,’ of the platform concerned, (ii) the number of its business users, (iii) entry barriers arising from ‘network effects and data driven advantages,’ (iv) economies of scale and scope, (v) lock-in effects for customers of the platform, and (vi) ‘other structural market characteristics.’
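The quantitative limbs of Article 3(2) lend themselves to a mechanical check. The following is a minimal illustrative sketch in Python of that presumption test as described above; the class and field names are hypothetical, and the sketch simplifies the Proposal (for instance, it leaves out the separate business-user count and the rebuttal procedure):

```python
from dataclasses import dataclass


@dataclass
class PlatformFinancials:
    """Hypothetical inputs for the Article 3(2) DMA quantitative test.

    Field names are illustrative, not taken from the DMA Proposal itself.
    """
    eea_turnover_last_3y: list[float]            # annual EEA turnover (EUR), last three financial years
    market_cap_last_year: float                  # average market capitalisation or fair value (EUR)
    monthly_active_end_users_last_3y: list[int]  # EU monthly active end users, per year


TURNOVER_THRESHOLD = 6.5e9       # EUR 6.5 billion (Art. 3(2)(a))
MARKET_CAP_THRESHOLD = 65e9      # EUR 65 billion (Art. 3(2)(a))
END_USER_THRESHOLD = 45_000_000  # 45 million monthly active end users (Art. 3(2)(b))


def meets_quantitative_presumption(p: PlatformFinancials) -> bool:
    """Return True if the gatekeeper presumption thresholds appear met."""
    # (a) significant impact on the internal market: the turnover threshold
    # in each of the last three years, OR the market-capitalisation threshold
    # in the last financial year.
    significant_impact = (
        all(t >= TURNOVER_THRESHOLD for t in p.eea_turnover_last_3y)
        or p.market_cap_last_year >= MARKET_CAP_THRESHOLD
    )
    # (b) important gateway: more than 45 million monthly active end users.
    gateway = p.monthly_active_end_users_last_3y[-1] > END_USER_THRESHOLD
    # (c) entrenched and durable position: the user-number threshold met in
    # each of the last three financial years.
    entrenched = all(u > END_USER_THRESHOLD for u in p.monthly_active_end_users_last_3y)
    return significant_impact and gateway and entrenched
```

Even where such a check comes out positive, designation is not automatic: as noted above, the platform may rebut the presumption with substantiated arguments, and the Commission may conversely designate a platform on qualitative grounds under Article 3(6).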

Online platforms that have been designated as gatekeepers are subject to a broad range of obligations that address the Commission’s concerns over contestability and fairness in digital markets. In particular, Article 5 of the proposed DMA Regulation provides, among other things, for the following:

a. the gatekeeper is to refrain from combining (personal and non-personal) data from other services offered by the gatekeeper (economies of scope) unless users have given their specific consent (opt-in).

b. business users of the platform shall be allowed to offer their products/services through third-party online platforms at prices or conditions different from those offered through the gatekeeper (a prohibition of so-called most-favoured-nation (MFN) clauses).

c. the gatekeeper must refrain from ‘requiring business users to use, offer or interoperate with an identification service of the gatekeeper in the context of services offered by the business users using the core platform services of that gatekeeper.’

d. the gatekeeper shall provide advertisers and publishers to which it supplies advertising services, upon their request, with information concerning the price paid by the advertiser and publisher.

Beyond the above, the European Commission proposes, under Article 6, additional obligations for gatekeepers, including the following:

a. gatekeepers must refrain from ‘using, in competition with business users, any data not publicly available, which is generated through activities by those business users’ in the platform in question.

b. gatekeepers shall allow end-users to uninstall any pre-installed software apps on their core platform service. However, restrictions on uninstalling pre-installed apps may be allowed to the extent they are necessary to maintain the technical integrity and functionality of the operating system.

c. gatekeepers shall allow the installation and use of third-party applications or app stores using, or interoperating with, operating systems of that gatekeeper.

d. gatekeepers must refrain from affording preferential treatment in rankings to their own services. Conditions determining ranking should be fair and non-discriminatory.

e. gatekeepers shall refrain from technically restricting the ability of end users to switch between and subscribe to different software applications and services to be accessed using the operating system of the gatekeeper.

f. gatekeepers must provide effective data portability to end-users within the meaning of the GDPR.

g. gatekeepers shall provide third-party search engines, upon their request, with access on fair, reasonable and non-discriminatory (FRAND) terms to ranking, query, click and view data.

Moreover, Articles 5 and 6 of the proposed DMA Regulation do not contain an exhaustive list of obligations. The Commission reserves the right to update the obligations imposed on gatekeepers to include further prohibitions of ‘unfair practices’ (Article 10). ‘Unfair practices’ within the meaning of Article 10 are those that are either characterised by ‘an imbalance of rights and obligations on business users and the gatekeeper’ from which the gatekeeper in question derives a ‘disproportionate advantage,’ or those that weaken the contestability of digital markets. In addition, the DMA Proposal envisages a system of notification of concentrations by gatekeepers, similar to but distinct from that of the EU Merger Control Regulation (Article 12). Gatekeepers are to inform the Commission of a planned concentration ahead of its implementation.

The DMA Proposal also affords the European Commission broad investigatory and enforcement powers. More specifically, the Commission may initiate a market investigation to establish whether a particular online platform meets the criteria of Article 3 and is to be designated a gatekeeper (Article 15). The Commission may further initiate a market investigation in cases of ‘systematic non-compliance’ with the obligations under the DMA (Article 16). Systematic non-compliance can be found where the Commission has issued at least three non-compliance or fining decisions against a gatekeeper within a period of five years. The Commission is additionally entitled to initiate market investigation proceedings with a view to updating the list of gatekeeper obligations under Articles 5 and 6 (Article 17). In enforcing the DMA, the Commission enjoys broad powers to request information from gatekeepers, to interview persons and take statements, and to carry out on-site inspections to gather evidence (Articles 19-21). Gatekeepers that, following a market investigation, have been found to infringe the provisions of the proposed DMA Regulation may offer commitments to the Commission, which the latter can make binding by issuing a formal decision (Article 23).

In case of non-compliance with the proposed DMA Regulation, the Commission may, under Articles 25 and 26, issue an infringement decision, order injunctive remedies (cease-and-desist orders), and impose monetary penalties (fines). Fines may not exceed 10 percent of the gatekeeper’s total turnover in the preceding financial year when they are imposed for failure to comply with the substantive obligations under Articles 5 and 6 of the proposed DMA Regulation, and 1 percent of total turnover in the preceding financial year for failure to comply with information requests by the Commission (Article 26).
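To put these ceilings in concrete terms, here is a minimal arithmetic sketch of the two Article 26 caps; the function name and the turnover figure are purely hypothetical, used only to illustrate how the caps scale:

```python
def max_dma_fine(total_turnover_eur: float, substantive_breach: bool) -> float:
    """Illustrative ceiling on a fine under Article 26 of the DMA Proposal.

    total_turnover_eur: the gatekeeper's total turnover in the preceding
    financial year (hypothetical input).
    substantive_breach: True for non-compliance with the Article 5/6
    obligations (10% cap); False for failure to comply with an
    information request (1% cap).
    """
    cap_rate = 0.10 if substantive_breach else 0.01
    return cap_rate * total_turnover_eur


# A hypothetical gatekeeper with EUR 50 billion annual turnover:
print(max_dma_fine(50e9, substantive_breach=True))   # 5000000000.0 -> up to EUR 5 billion
print(max_dma_fine(50e9, substantive_breach=False))  # 500000000.0  -> up to EUR 500 million
```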

 

E. Conclusion

The two proposals by the European Commission for a Digital Services Act and a Digital Markets Act mark a major step in the regulation of digital markets by the EU. Although the two proposals have entered the legislative process, which involves the European Parliament and the Council, and some changes should be expected in their final texts, the direction is clear. The two proposed Regulations will, when enacted, introduce a range of far-reaching obligations for online platforms. The overarching aim is, according to the Commission, to create a fairer, more contestable and more competitive digital sphere. Whether the two Regulations will fulfil this aim remains to be seen in practice. The DSA and DMA Regulations will, however, effect an important change for digital businesses in Europe and beyond.


1 For a comprehensive survey of the economics literature on online platforms, see Bertin Martens, ‘An Economic Policy Perspective on Online Platforms’ (2016) JRC Technical Reports, Institute for Prospective Technological Studies Digital Economy Working Paper 2016/05 <https://ec.europa.eu/jrc/sites/jrcsh/files/JRC101501.pdf> accessed 18 February 2021.

2 European Commission, Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions: A Digital Single Market Strategy for Europe [2015] COM(2015) 192 final.

3 Commission Decision, Case AT.39740 - Google Search (Shopping) [2017] C(2017) 4444 final.

4 European Commission, Press Release: Commission fines Google €1.49 billion for abusive practices in online advertising (20 March 2019) <https://ec.europa.eu/commission/presscorner/detail/en/IP_19_1770> accessed 18 February 2021.

5 Commission Decision, Case AT.40099 - Google Android [2018] C(2018) 4761 final.

6 European Commission, Press Release: Commission sends Statement of Objections to Amazon for the use of non-public independent seller data and opens second investigation into its e-commerce business practices (10 November 2020) <https://ec.europa.eu/commission/presscorner/detail/en/ip_20_2077> accessed 18 February 2021.

7 European Commission, Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions: Online Platforms and the Digital Single Market: Opportunities and Challenges for Europe [2016] COM(2016) 288/2, 3.

8 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (‘Directive on electronic commerce’) [2000] OJ L 178.

9 Regulation (EU) 2019/1150 of the European Parliament and of the Council of 20 June 2019 on promoting fairness and transparency for business users of online intermediation services [2019] OJ L 186/57.

10 European Commission, Proposal for a Regulation of the European Parliament and of the Council on a Single Market for Digital Services (Digital Services Act) and amending Directive 2000/31/EC [2020] COM(2020) 825 final.

11 European Commission, Proposal for a Regulation of the European Parliament and of the Council on contestable and fair markets in the digital sector (Digital Markets Act) [2020] COM(2020) 842 final.


© 2021 China IP Magazine