Key Points:
- The European Commission preliminarily found that Meta broke EU law by allowing children under 13 to use Facebook and Instagram.
- Investigators found that children easily bypass age limits using fake birthdays, and the platform ignores user reports about underage accounts.
- Regulators could fine the tech giant up to 6% of its global annual revenue if the company fails to fix these issues.
- This warning arrives as countries like Australia, the UK, and France push for strict social media bans for teenagers under 16.
The European Commission has preliminarily found that Meta is breaking major European Union safety laws. Regulators announced on Wednesday that the technology giant failed to keep children under 13 off its massive social media platforms. The commission's preliminary investigation focused specifically on how Instagram and Facebook handle child safety. It concluded that Meta violates the Digital Services Act because the company fails to enforce its own minimum age requirement.
Children easily bypass the basic age gates on these platforms. When a new user creates an account, the system asks for a date of birth. The commission pointed out that minors simply type in a fake birth year to pretend they are older, and Meta currently has no active controls in place to verify that the person behind the screen is actually the age they claimed.
Regulators also blasted Meta for making its safety reporting tools overly complicated. If a concerned parent or user spots a young child on the platform, reporting that account takes real effort: the commission found that users must click up to seven times just to reach the correct reporting form. This frustrating process discourages people from flagging dangerous or underage accounts.
Even when someone successfully reports an underage user, the system rarely works. Investigators discovered that Meta often ignores these reports entirely. The company fails to take adequate follow-up measures, leaving underage children active on the platform. The commission demanded that Instagram and Facebook completely change how they assess and manage risks for young users living inside the European Union.
Meta quickly fired back against the European regulators. A company spokesperson told reporters that they strongly disagree with the preliminary findings. The representative stated that the company clearly intends for Instagram and Facebook to serve only people aged 13 and older. The spokesperson also claimed that Meta already uses advanced measures to detect and remove underage accounts from its network.
The technology giant promised to roll out even more safety features soon. The spokesperson said the company continues to invest in new technologies to detect underage users. Meta plans to share more details about these upcoming safety measures next week. However, the company also deflected some of the blame, calling age verification a massive challenge that requires the entire technology industry to work together to solve.
Meta now has the opportunity to review the preliminary findings and submit a written response to the commission. The stakes are incredibly high for the social media giant. If the European Commission confirms these violations in its final investigation, regulators have the power to impose severe penalties on the company. They can hit Meta with a massive fine totaling up to 6% of its total worldwide annual revenue.
This European crackdown follows intense legal trouble for Meta inside the United States. In March, two high-profile American court rulings struck hard blows against the company. One court found that the actual design of Meta platforms directly contributes to extreme addiction and mental health harms among teenagers. The second court concluded that the company intentionally misled parents and users about the safety of children on its platforms.
Governments around the world are losing patience with social media companies, and lawmakers want strict bans to protect young teenagers from online harm. Australia recently made history by becoming the first country to legally ban children under 16 from social media platforms. Other major nations are watching closely: lawmakers in the United Kingdom, Spain, and France are actively considering similar legislation to bar teenagers under 16 from social media.
British regulators also refuse to accept the current safety standards. In March, the United Kingdom's Information Commissioner's Office ordered social media giants such as YouTube, TikTok, Snapchat, Instagram, and Facebook to enforce much stricter rules. The regulator demanded that these companies implement real age verification technologies, arguing that letting kids self-declare their age fails because children easily trick the basic systems.
The British regulators suggested several modern technological solutions. They want companies to use facial age-estimation software, digital ID cards, or one-time photo matching to verify a user’s real age. Paul Arnold, the chief executive officer of the Information Commissioner’s Office, wrote a stern letter to the technology industry. He warned that public concern grows every single day, and the current system simply does not work. He demanded that companies act immediately to implement real technology that actually stops children from accessing dangerous services.