A Practical Guide for Startups and Digital Businesses: The Hidden Legal Obligations of Online Platforms

When advising digital businesses, one question arises again and again: at what point does a digital service become a regulated platform? Companies rarely realise when they have crossed that line.

From a legal standpoint, that threshold is often crossed earlier than companies expect.

Many services that founders internally describe as marketplaces, directories, booking tools, or community platforms fall within the regulatory framework governing online intermediation services in the European Union. The classification rarely depends on how the service is described commercially, but rather on how regulators view what the platform actually enables users to do.

Where a service allows users to publish listings, interact with other users, and offer goods or services on the platform, several regulatory frameworks may apply. In particular, platform operators need to consider the implications of the Digital Services Act and the Platform-to-Business Regulation, alongside broader obligations arising under the General Data Protection Regulation and, in some contexts, the European Accessibility Act.

Each of these regulations addresses a different dimension of platform activity: content governance, fairness in the relationship with business users, data protection, and the accessibility of digital services.

Taken together, they impose a set of operational expectations that many platforms only begin to understand once they scale.

Here are the main obligations platforms most often overlook.

  • Platforms should operate a structured system for reporting illegal content

Platforms hosting user-generated content must provide users with the ability to report illegal content through a notice-and-action mechanism.

Under the Digital Services Act, platforms are expected to implement procedures to receive notices, assess them without undue delay, and take appropriate action where necessary.

For platforms hosting listings or reviews, these notices frequently concern issues such as counterfeit goods, misleading offers, or intellectual property violations.

Platforms must demonstrate that reports are received, assessed, and resolved through identifiable internal processes. Authorities increasingly examine not only the existence of reporting channels but also the internal moderation framework behind them.
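In engineering terms, "identifiable internal processes" usually means that every notice leaves an auditable record. The sketch below is one illustrative way a product team might model such a record; all class and field names are hypothetical, not prescribed by the Digital Services Act.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum


class NoticeStatus(Enum):
    RECEIVED = "received"
    UNDER_REVIEW = "under_review"
    ACTION_TAKEN = "action_taken"
    REJECTED = "rejected"


@dataclass
class ContentNotice:
    """One illegal-content report, kept as an auditable record."""
    notice_id: str
    reporter_contact: str
    content_url: str
    alleged_ground: str          # e.g. "counterfeit", "IP infringement"
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    status: NoticeStatus = NoticeStatus.RECEIVED
    decision_reason: str = ""    # recorded when the notice is resolved

    def resolve(self, action_taken: bool, reason: str) -> None:
        """Close the notice, preserving the reasoning for later scrutiny."""
        self.status = NoticeStatus.ACTION_TAKEN if action_taken else NoticeStatus.REJECTED
        self.decision_reason = reason
```

The point of the structure is not any particular field, but that the platform can later show when a notice arrived, what it alleged, and why it was resolved the way it was.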

  • Platforms should disclose how ranking works

Ranking systems are central to the functioning of many platforms. Search results, product listings, and service providers are often displayed according to criteria determined by algorithms.

Where ranking affects the visibility of business users, the Platform-to-Business Regulation requires platforms to disclose the main parameters determining ranking.

The obligation does not extend to revealing the algorithm itself. Rather, platforms must explain the key factors that influence visibility, such as relevance, reviews, price, and sponsored placement.

In practice, problems arise when ranking systems evolve over time without internal documentation. When sellers challenge changes in visibility, the platform must be able to explain the logic behind those decisions.
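One practical mitigation is to version the main ranking parameters so that past visibility decisions remain explainable. The following is a minimal sketch under assumed parameter names and weights, not a description of any real platform's ranking system.

```python
from dataclasses import dataclass
from datetime import date


@dataclass(frozen=True)
class RankingParameters:
    """A dated snapshot of the main ranking parameters and their weights."""
    effective_from: date
    weights: dict[str, float]   # e.g. {"relevance": 0.5, "reviews": 0.3, ...}
    notes: str                  # why this version differs from the last


# Keeping every revision lets the platform explain past visibility decisions.
ranking_history = [
    RankingParameters(date(2024, 1, 1),
                      {"relevance": 0.6, "reviews": 0.2, "price": 0.2},
                      "initial parameter set"),
    RankingParameters(date(2024, 6, 1),
                      {"relevance": 0.5, "reviews": 0.3, "price": 0.2},
                      "reviews weighted higher"),
]


def parameters_on(day: date) -> RankingParameters:
    """Return the parameter set that governed ranking on a given day."""
    applicable = [p for p in ranking_history if p.effective_from <= day]
    return max(applicable, key=lambda p: p.effective_from)
```

With a record like this, a seller's question about a drop in visibility can be answered by comparing the snapshots in force before and after the change.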

  • Platforms should verify the identity of business sellers

Under the Digital Services Act, platforms allowing businesses to sell goods or services must implement trader traceability measures.

Platforms are expected to collect and verify certain information about business users before allowing them to operate on the platform. This typically includes identifying information and contact details sufficient to ensure that consumers know who they are dealing with.

For platforms designed for rapid onboarding, this can necessitate significant adjustments to the onboarding flow once regulatory expectations become clearer.
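In product terms, trader traceability often translates into a gating check at onboarding. The sketch below is purely illustrative: the field list is an assumption, not the statutory list, which should be taken from the Digital Services Act itself.

```python
# Illustrative set of details collected before a business seller goes live.
REQUIRED_TRADER_FIELDS = {
    "legal_name",
    "address",
    "email",
    "trade_register_number",   # or an equivalent official identifier
}


def missing_trader_details(submitted: dict[str, str]) -> set[str]:
    """Return which required fields are absent or blank, gating onboarding."""
    return {f for f in REQUIRED_TRADER_FIELDS if not submitted.get(f, "").strip()}
```

A seller whose submission returns a non-empty set simply cannot be activated until the gaps are filled, which is easier to retrofit as a single gate than as checks scattered across the product.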

  • Platforms should disclose if they compete with their own sellers

A less frequently discussed obligation arises where a platform offers its own products or services alongside those of third-party sellers.

Under the Platform-to-Business Regulation, platforms must disclose whether they compete with business users on the platform and whether their own offerings receive preferential treatment, particularly in ranking.

This issue often emerges when marketplaces begin introducing private-label products or proprietary services. Regulators are particularly attentive to situations in which platforms appear to use sellers' transactional data to compete with them.

  • Platforms should explain how transaction data is used

Platforms generate substantial amounts of data through user interactions. Under the Platform-to-Business Regulation, platforms must disclose what data is generated through their use and whether business users have access to it.

Platforms must also clarify whether they use such data commercially.

This becomes particularly sensitive where platforms operate competing services alongside third-party sellers.

  • Platforms should provide internal complaint mechanisms

Platforms that operate as intermediaries must provide internal complaint-handling systems for business users.

These mechanisms allow sellers to challenge decisions that affect their platform activity, including account suspensions, listing removals, and ranking changes.

In practice, many platforms recognize the importance of these mechanisms only after disputes with sellers arise.

  • Suspension decisions must be justified

Platforms cannot freely suspend or terminate business users without explanation.

The Platform-to-Business Regulation generally requires platforms to provide a statement of reasons when restricting or suspending a business user’s access to the platform. In many cases, advance notice must also be provided before termination.

Where suspensions are triggered automatically by fraud detection systems or moderation tools, platforms must still be able to explain the reasoning behind the decision.
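One way to keep automated suspensions explainable is to make the decision object carry its own statement of reasons, so no restriction can be issued without one. This is a hypothetical sketch; the thresholds, field names, and wording are assumptions for illustration.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional


@dataclass(frozen=True)
class SuspensionDecision:
    """A suspension paired with the statement of reasons sent to the seller."""
    seller_id: str
    triggered_by: str   # e.g. "fraud_score", "manual_review"
    facts: str          # what was observed
    ground: str         # which term or rule was breached
    issued_at: datetime


def suspend_for_fraud_score(seller_id: str, score: float,
                            threshold: float = 0.9) -> Optional[SuspensionDecision]:
    """Suspend only when a human-readable reason is produced alongside."""
    if score < threshold:
        return None
    return SuspensionDecision(
        seller_id=seller_id,
        triggered_by="fraud_score",
        facts=f"Automated fraud score {score:.2f} exceeded threshold {threshold:.2f}.",
        ground="Prohibition of fraudulent activity in the seller terms.",
        issued_at=datetime.now(timezone.utc),
    )
```

Because the reasons are created at the moment of the decision rather than reconstructed later, the platform can answer a challenge with the facts and the rule it actually relied on.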

  • Review systems must be credible

Review systems play an increasingly important role in platform ecosystems. At the same time, regulators expect platforms to ensure that reviews are not misleading.

Platforms displaying reviews or ratings should explain whether reviews are verified and what measures they take to detect or prevent manipulation.

Where reviews significantly influence consumer decisions, regulators may examine whether the platform has implemented reasonable safeguards.

  • Platforms may need to publish transparency reports

Certain platforms may be required to publish transparency reports describing their content moderation practices.

These reports typically include information about notices received, actions taken in response to those notices, restrictions applied to user accounts, and the use of automated moderation tools.

Even where reporting obligations are limited, platforms are increasingly expected to maintain internal records capable of supporting such reporting.
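Maintaining such records is easiest when moderation events are logged in a form that aggregates directly into report figures. Below is a minimal sketch assuming a simple event-log shape; the event and field names are illustrative, not a prescribed reporting format.

```python
from collections import Counter


def transparency_summary(moderation_log: list[dict]) -> dict:
    """Aggregate raw moderation events into headline report figures."""
    return {
        "notices_received": sum(1 for e in moderation_log
                                if e["event"] == "notice"),
        "actions_by_type": dict(Counter(e["action"] for e in moderation_log
                                        if e["event"] == "action")),
        "automated_actions": sum(1 for e in moderation_log
                                 if e["event"] == "action" and e.get("automated")),
    }
```

If the log is append-only and each event records whether a human or an automated tool acted, the same data supports both periodic reporting and individual dispute responses.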

  • Compliance increasingly means platform governance

One important development in platform regulation is the shift from documentation to governance.

Historically, compliance was often associated with publishing terms of service and privacy policies. Today, regulators are increasingly interested in how platforms actually operate: how moderation decisions are made, how ranking systems function, how sellers are verified, and how disputes are resolved.

In that sense, platform regulation increasingly concerns the governance of digital ecosystems. This means compliance is becoming less about drafting policies and more about ensuring the platform's governance can withstand regulatory scrutiny. It requires coordination between legal, product, and engineering teams.

A practical starting point: how to determine whether these obligations apply to your platform

A recurring challenge for many companies is determining whether their services fall within the regulatory frameworks discussed above. In practice, regulators typically examine the functional role of the service within the digital ecosystem, and we have prepared some questions relevant to this assessment:

1. Does the service enable users to offer goods, services, or content to other users through the platform? Where a platform facilitates such interactions, it may fall within the scope of the Digital Services Act as a hosting service or online platform.

2. Does the platform allow business users to reach consumers through the interface? If the answer is yes, the relationship between the platform and those business users may fall within the scope of the Platform-to-Business Regulation.

3. Does the platform influence visibility or access through ranking, recommendation systems, or search results? Where the platform controls visibility through algorithmic systems, additional transparency obligations may apply.

4. Does the platform collect and use data generated through interactions between users, particularly where business users rely on the platform to reach customers? In such cases, both data protection rules and platform transparency obligations may become relevant.

5. Is the service offered to consumers within the European Union? If so, consumer protection and accessibility requirements may also apply, including those introduced under the European Accessibility Act.

In many cases, platforms discover that the regulatory framework applies not because of a single feature, but because of the combination of functionalities that the service provides.

For that reason, assessing whether platform regulation applies typically requires examining the platform's architecture: how users interact, how listings are displayed, how sellers are onboarded, and how transactions take place.
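The five screening questions above can be expressed as a rough triage aid. The mapping below is a sketch for orientation only: it flags frameworks worth reviewing, and real scoping turns on the facts of each service and the legal texts themselves.

```python
def frameworks_potentially_in_scope(
    users_offer_to_others: bool,
    business_users_reach_consumers: bool,
    algorithmic_ranking: bool,
    uses_interaction_data: bool,
    offered_to_eu_consumers: bool,
) -> set[str]:
    """Map the five screening questions to frameworks worth reviewing.

    A triage aid, not a legal determination.
    """
    frameworks: set[str] = set()
    if users_offer_to_others:
        frameworks.add("Digital Services Act")
    if business_users_reach_consumers:
        frameworks.add("Platform-to-Business Regulation")
    if algorithmic_ranking:
        frameworks.add("ranking transparency (DSA / P2B)")
    if uses_interaction_data:
        frameworks.update({"GDPR", "Platform-to-Business Regulation"})
    if offered_to_eu_consumers:
        frameworks.add("European Accessibility Act")
    return frameworks
```

As the article notes, it is usually the combination of answers, not any single one, that brings a service into scope.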

For many digital businesses, the regulatory framework surrounding online platforms becomes visible only after the platform has already begun to scale. By that stage, features such as ranking systems, seller onboarding processes, review mechanisms, or moderation tools are often deeply embedded in the product architecture.

This is why we recommend that regulatory considerations be addressed early, while the product architecture is still evolving. Understanding how different regulatory frameworks intersect with the platform's functioning can help operators anticipate obligations before they become operational constraints.

The content of this article is general information with a strictly informative purpose; it does not constitute legal advice tailored to your specific situation.

Every business is different. For personalised consultancy, schedule a consultation call or write to us directly at 📧 anamaria@legallyremote.online.
