AmCham Romania

Online Majority Law – digital maturity at 16 and fines of up to 0.4% of national turnover for online service providers

by BACIU PARTNERS November 3, 2025

By Adela Nuță and Cristina Stoica

The protection of minors from exposure to harmful content accessible through online services is the main objective of a legislative initiative recently adopted by the Romanian Senate – the ‘Online Majority Law’. The draft law aims to mark a significant shift in how minors may access and use online services. Inspired by European trends, the initiative also seeks to align national legislation with the European legal framework.[1]

According to the draft law, the concept of ‘online majority’ means acquiring full legal capacity in the online environment upon reaching the age of 16. Below that age threshold, access to online services in one’s own name would be possible only with the express and verifiable consent of a parent or legal representative.

To cover the full spectrum of digital activities, the legislator defines online services as any ‘service provided free of charge or for payment, at a distance, by electronic means and at the individual request of the recipient’, including, without being limited to ‘social media platforms, streaming services, e-commerce, online games, banking services, mobile applications, and other online interfaces accessible to users’.

At the same time, the law introduces the notion of ‘harmful content’ defined as any written, audio, video, or image content that promotes alcohol, energy drinks, nicotine-based products, medicines, violence, hatred, pornography, self-harm, behaviors harmful to health, addictions, fraud, hate speech, including on grounds of race, sex, religion, or sexual orientation, the exploitation of minors, or any other forms of content that present a significant risk to a minor’s physical or emotional development.

The legislative initiative strengthens the role of legal representatives in protecting minors under 16 from the dangers they face when accessing online services and gives them the right to decide: (i) whether to request the suspension or deletion of existing accounts, (ii) whether to express parental consent for accounts already created or to be created after the law enters into force, and (iii) whether to restrict minors’ access to harmful content.

According to the draft, these requests may be addressed to the National Authority for Management and Regulation in Communications (ANCOM), starting from the date the law enters into force, either in physical format or electronically, with a certified digital signature.

However, from a practical perspective, these provisions raise obvious difficulties in the implementation process.

First, legal representatives could, in theory, make these requests from the date the law enters into force. However, in reality, this will be possible only after the publication of the technical rules by ANCOM, namely within 180 days from the entry into force of the law.

Second, the proposed administrative procedure is based on submitting requests in written form or by certified electronic means. This could become excessively bureaucratic and discouraging, limiting the real involvement of parents in the digital supervision of minors and, implicitly, the effectiveness of the protective measures.

Third, the lack of clarifications regarding how legal representatives will be able to prove which accounts belong to minors, the objective methods for identifying the content they consider harmful, as well as what documents they must submit to demonstrate legal standing as representatives, creates uncertainty for both parents and online service providers. Under these conditions, the effective application of the law could generate additional risks of excessive processing of personal data, contrary to the principle of data minimization established by Regulation (EU) 2016/679 (GDPR).

Beyond these procedural difficulties, it must also be clarified how the extended notion of ‘harmful content’ will be applied in practice. The absence of an automatic prohibition and the exclusive reliance on the initiative of legal representatives could turn the protection of minors into a formal mechanism lacking real effectiveness.

To balance the responsibilities placed on legal representatives, the law also addresses the activities of online service providers and establishes explicit obligations for them to design and operate platforms in a way that protects the privacy and safety of minors.

Within 180 days from the law’s publication, online service providers would be obliged to label content according to users’ age categories and to obtain parental consent for all existing minor accounts in Romania. Breach of these obligations will expose non-compliant providers to administrative fines ranging from 0.1% to 0.2% of national turnover.

Additionally, within the same period, online service providers should implement filters and electronic mechanisms for verifying users’ identity and age, block accounts of minors for whom parental consent has not been obtained and delete those accounts within 120 days from blocking. For failure to comply with these obligations, the sanctions increase, with administrative fines ranging from 0.2% to 0.4% of national turnover.
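The two sanction brackets described above can be illustrated with a minimal arithmetic sketch. This is purely illustrative: the function name, the breach categories, and the assumption that the fine is computed as a flat percentage of national turnover are our own simplifications, since the draft law does not yet specify calculation rules.

```python
# Illustrative sketch only. Assumptions (not from the draft law): fines are a
# flat percentage of national turnover, and the two breach categories map
# directly to the two brackets described in the article.

def fine_range(national_turnover: float, breach: str) -> tuple[float, float]:
    """Return (minimum, maximum) fine for a breach category,
    in the same currency unit as the turnover figure."""
    # Brackets expressed in basis points (1 bp = 0.01%) to keep the
    # arithmetic exact for round turnover figures.
    brackets = {
        # labeling content by age category / obtaining parental consent
        "labeling_and_consent": (10, 20),        # 0.1% - 0.2%
        # age verification, blocking and deleting non-consented accounts
        "verification_and_blocking": (20, 40),   # 0.2% - 0.4%
    }
    low_bp, high_bp = brackets[breach]
    return national_turnover * low_bp / 10_000, national_turnover * high_bp / 10_000

# Example: a provider with a national turnover of RON 500 million
print(fine_range(500_000_000, "verification_and_blocking"))
# (1000000.0, 2000000.0) -> a fine between RON 1 million and RON 2 million
```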

It is noteworthy that, according to the legislative proposal, providers are not obliged to process additional personal data to determine whether users are minors. However, they may do so, as it is not prohibited. This provision opens sensitive discussions in the area of minors’ privacy, since any additional age verification can lead to the collection of data which, although justified by the objective of protecting minors, may exceed the principle of proportionality. In this context, the Privacy by Design principle becomes relevant, recently reaffirmed by the European Commission in the Guidelines on the protection of minors in the digital environment, which recommend integrating age-verification mechanisms directly into the technological architecture of platforms, so that they ensure data protection without collecting additional or sensitive information.

We also consider that the ‘probabilistic estimation’ proposed for determining age should be explained carefully and cautiously by the legislator: estimating age based on user interactions, or using artificial intelligence tools to analyze facial features, for example, can lead to the mass collection of sensitive data, a practice that raises major compliance risks and should be avoided for minor users.

Even though, in the explanatory memorandum of the draft, the legislator suggested that artificial intelligence could support online service providers in the compliance process, the 180-day period seems insufficient for the technical adjustment of platforms and for alignment with the Methodological Rules and the Technical Rules that ANCOM is expected to adopt.

The application and monitoring of compliance with the law will fall to ANCOM, which will have the power to act ex officio or upon request from interested persons and which, to ensure integrated protection of minors, will collaborate with the National Audiovisual Council (CNA), the National Supervisory Authority for Personal Data Processing (ANSPDCP), and the National Authority for the Protection of Children’s Rights and Adoption (ANDPDCA).

The institution will present Parliament with an annual report on the application of the law, including typologies of breaches, the number of sanctions, and the value of fines. In situations where acts of a criminal nature are identified, ANCOM will be obliged to notify the competent authorities, and in the case of repeated violations it may order the suspension of a provider’s activity on the territory of Romania until full remediation of non-compliance.

It remains to be clarified to what extent and through what mechanisms the sanctions provided by the legislative initiative could be applied to large online platforms, such as social networks (TikTok, Instagram, Facebook, etc.), given that these sanctions are calculated by reference to national turnover.

The legislative proposal ‘Online Majority Law’ is undoubtedly a necessary and timely step, in line with the European trend of regulating minors’ access to the digital environment.[2]

However, the version adopted by the Senate continues to raise numerous practical and legal challenges of implementation: cumbersome procedures for legal representatives, short deadlines for implementation for online service providers, the lack of objective criteria regarding harmful content, and the risk of excessive processing of personal data. In addition, effective application to large international platforms and the effectiveness of sanctions remain open issues, which will depend essentially on the clarifications and adjustments that can be made during the legislative procedure before the law is promulgated.

 


[1] Article 28 of Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market for Digital Services and amending Directive 2000/31/EC (the ‘Digital Services Act’), and Article 8 of Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (the General Data Protection Regulation – ‘GDPR’).

[2] In 2023, France adopted the Online Majority and Combating Online Hatred Law (Law No. 2023-566 of 7 July 2023 aimed at establishing a digital majority and combating online hatred), which introduced the requirement to obtain parental consent for minors under the age of 15. Other countries, such as Spain, Greece, Ireland, and Denmark, are working on legislative projects to protect minors in the online environment.
