On 7 January 2026, Turkish lawmakers submitted the Draft Law on the Protection of Children and Young People in the Digital Environment to the Presidency of the Grand National Assembly of Türkiye.
The Draft Law addresses Türkiye’s previously fragmented digital regulatory framework and aligns with international standards for child protection, as established by the UN Convention on the Rights of the Child and the Council of Europe.
Purpose and scope of the Draft Law
The primary purpose of the Draft Law is to introduce preventive and protective measures against risks such as digital addiction, virtual gambling, obscenity, cyberbullying, harmful content, misuse of personal data, disinformation and terrorist propaganda that threaten the healthy development of children and young people in digital environments.
The Draft Law applies to content providers, intermediary service providers, social media companies, gaming platforms, educational institutions, public institutions, adult users, and the parents or legal guardians of users under the age of 18, as well as to gambling operators authorised under Law No. 5602 on the Regulation of Taxes, Funds and Shares from the Revenues of Games of Chance.
For the purposes of the Draft Law, a “child” is an individual under 18 years of age, and “young people” are individuals between 18 and 24 years of age. Several core restrictions are specifically designed for children under 16 years of age.
Fundamental obligations
The Draft Law introduces a child-centred compliance framework grounded in the protection of the best interests of the child. In this respect, digital actors must protect children’s data, implement age-appropriate safeguards, prevent manipulative practices, and support parental oversight. At the same time, they must ensure non-discrimination and provide child-friendly digital content.
Obligations of social network providers
Social network providers must establish and maintain age verification systems and ensure their continued effectiveness. Any biometric data processed for age verification purposes may be used solely for that purpose and may not be further processed once the verification process is completed.
Social network providers are also required to take necessary measures in relation to the content, advertisements and services these networks offer to users on an age-group basis. In this context, network providers must:
- Protect the physical, psychological and emotional development of users;
- Prevent risks of sexual abuse and commercial exploitation;
- Prepare user agreements, user settings and data policies in a clear, simple and understandable manner appropriate to children and young people;
- Refrain from using design features or recommendation algorithms that promote popularity or virality or provide unrestricted access in a manner that may negatively affect children;
- Avoid displaying targeted or personalised advertisements to children;
- Implement parent-controlled safe search, profile creation and communication settings;
- Prevent the use of content in a manner that may cause or contribute to violations of children’s rights, including privacy and protection rights;
- Provide rapid and effective solutions for users, parents and other responsible parties.
Additional obligations include implementing enhanced privacy safeguards and data minimisation for user groups deemed insufficiently aware of data-processing risks, preventing transfer of their data to third parties, and establishing dedicated internal units responsible for the digital protection of children and young people.
Failure to comply with these obligations may result in administrative fines ranging from TRY 1 million to TRY 5 million. In the event of recurrence within one year from the date of the violation, the administrative fine will be increased by half upon the second recurrence, and the activities of the provider will be terminated upon the third recurrence.
Age classification and time restrictions
Digital content providers must classify all digital content into age categories (i.e. 6+, 12+, 16+, 18+) and activate age-appropriateness and time-limit mechanisms. Compliance will be supervised by an independent Digital Content Supervision Centre to be established within Türkiye’s Information and Communication Technologies Authority (BTK).
Children under the age of 16 may spend no more than 55 minutes per day on digital platforms and online games and are prohibited from accessing digital environments between 22:30 and 09:30, except for educational purposes. During permitted usage periods, digital content must be age-appropriate and positively support children’s developmental and educational needs.
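In practical terms, platforms would need to enforce both a daily quota and an overnight curfew for under-16 users. The following is a minimal, purely illustrative sketch of that access check; the function name and structure are hypothetical, and the Draft Law does not prescribe any particular implementation (for example, it leaves open whether educational use also counts against the daily quota).

```python
from datetime import datetime, time

# Limits described in the Draft Law for users under 16 (illustrative only).
DAILY_LIMIT_MINUTES = 55
CURFEW_START = time(22, 30)   # access prohibited from 22:30...
CURFEW_END = time(9, 30)      # ...until 09:30 the next morning

def access_permitted(now: datetime, minutes_used_today: int,
                     educational: bool = False) -> bool:
    """Hypothetical check: may a user under 16 access the platform now?"""
    # The overnight window spans midnight, so the curfew applies when the
    # current time is at or after 22:30 OR before 09:30.
    in_curfew = now.time() >= CURFEW_START or now.time() < CURFEW_END
    if in_curfew and not educational:
        return False  # curfew applies; only educational use is exempt
    # Outside the curfew (or for educational use), the daily quota applies.
    return minutes_used_today < DAILY_LIMIT_MINUTES
```

For instance, access at 23:00 would be refused for ordinary use but permitted for educational use, and any further access would be refused once the 55-minute quota is exhausted.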
Platforms and gaming companies must implement age verification and parental consent mechanisms, assume algorithmic responsibility for content shown to children, and ensure transparency of child-related algorithms. They must conduct risk analyses on potential harm to children, publicly disclose the results, and submit the data and reports to the Digital Security Board, which will be established alongside the Digital Security Institution under the Presidency to oversee and coordinate compliance. In addition, personal data and location data of children under 16 years may not be processed for commercial purposes.
Non-compliance may result in administrative fines ranging from TRY 1 million to TRY 5 million. In case of recurrence within one year, fines are increased by half upon the second breach, and activities may be terminated upon the third recurrence.
Prevention of virtual gambling
Entities authorised under Law No. 5602 must implement age verification systems to prevent access to games of chance by individuals under 18 years of age. For users over 18, authorised operators must not encourage participation in gambling activities and must issue appropriate warnings. Internet websites operating virtual gambling activities in violation of Law No. 5602 will be blocked by BTK within 24 hours. In the event of recurrence within one year, the activities of the relevant institution or entity may be terminated.
Duty to inform parents
Social media and gaming platforms with users under the age of 18 must provide parents with weekly usage reports. In addition, if minors access content deemed risky or harmful, platforms must immediately notify the parents. At the same time, such reporting and notification mechanisms must be implemented in a manner that does not violate the child’s right to privacy.
Criminal sanctions
Where certain offences are committed through digital means against children or young people, the Draft Law introduces enhanced penalties under the Turkish Criminal Code.
Pursuant to Article 12(1), if the following offences are committed digitally and at least one victim is a child, the penalty imposed will be increased by half:
- incitement to suicide (Article 84);
- sexual abuse of children (Article 103);
- violation of privacy (Article 134);
- facilitation of the use of narcotic or stimulant substances (Article 190);
- supply of substances dangerous to health (Article 194);
- incitement to commit a crime (Article 214);
- obscenity (Article 226); and
- prostitution (Article 227).
Persons who produce and disseminate harmful digital content targeting children will be punished by imprisonment of three to eight years. Convicted individuals will not be eligible for sentence reductions or amnesty provisions.
Transitional compliance period
Pursuant to Provisional Article 1 of the Draft Law, digital content providers are granted a six-month transitional period to ensure compliance with the new regulatory framework following the entry into force of the Draft Law. This transition period is intended to facilitate technical and operational adjustments.
Conclusion
The Draft Law establishes Türkiye’s first comprehensive child-focused regulatory framework for the digital environment. For social media platforms, it introduces enforceable obligations including age verification, algorithmic safeguards, parental controls, and dedicated child protection units. Digital content providers must ensure age-appropriate classification, exposure limits, and developmentally supportive content. If enacted, the Law will mark a significant step toward structured and accountable digital governance, requiring platforms and content providers to align operations, design, and data practices with legally binding child protection standards.
For more information on this Draft Law, contact the experts who contributed to this article: alican.babalioglu@ybk-av.com, melis.celik@ybk-av.com, and ezgi.bahar@ybk-av.com.