May tech companies like Meta use public user profiles from Facebook and Instagram to train their artificial intelligence (AI)? In a widely noted preliminary injunction proceeding, the Higher Regional Court (OLG) of Cologne has provisionally answered yes and rejected an application by the North Rhine-Westphalia Consumer Protection Association. The ruling of May 23, 2025 (case no. 15 UKl 2/25) sends an important—albeit provisional—signal for the entire tech industry.
Consumer Association vs. Meta: Dispute Over AI Training and User Data
After Meta announced in April 2025 that, from the end of May, it would use public user data for AI training purposes, the NRW consumer association went to court. By way of an urgent application, it sought to prohibit Meta from processing the data of users who had not lodged an objection. At issue is all publicly viewable information on platforms such as Facebook and Instagram.
OLG Cologne in Preliminary Proceedings: Meta’s “Legitimate Interest” Prevails
The court dismissed the application. On a preliminary review, there is no violation of the GDPR or the Digital Markets Act (DMA). The core reasoning relies on Meta’s “legitimate interest” under Article 6(1)(f) GDPR.
The judges justified their decision as follows:
Legitimate purpose: The development and training of AI systems is a legitimate purpose.
Necessity: Large data sets that cannot be fully anonymized are necessary for training; less intrusive but equally effective means do not exist.
Balancing of interests: Meta’s interest in the processing outweighs users’ interests in this case.
Mitigating measures: Meta has taken effective measures to mitigate the interference with users’ rights. These include:
• Only publicly posted data are processed—data that could also be found by search engines.
• Users were informed and can either object via a simple mechanism or set their profiles to “non‑public.”
• No direct identifiers such as names or email addresses are used.
This assessment aligns with the evaluation of the competent Irish data protection authority and also that of the Hamburg data protection commissioner.
Data Use for AI: Implications for Companies
“Legitimate interest” as a legal basis for AI: The ruling strengthens the position of companies that wish to conduct AI training based on legitimate interest (Art. 6(1)(f) GDPR). However, this is not a carte blanche.
Transparency and the right to object are decisive: The OLG’s decision hinges largely on Meta informing users and providing an easy opt‑out. This is the key takeaway for all companies: proactive transparency and user control are essential.
Public data are not fair game, but usable: The ruling provides more clarity on how data that users themselves have made public may be used. The degree of publicity plays a decisive role in the balancing of interests.
Provisional nature of the ruling: It is crucial to understand that this was a preliminary proceeding. In the main proceedings, with a more in‑depth examination, the outcome could be different. The legal situation remains dynamic.
FAQ: AI Training & GDPR – What Companies Need to Know
What exactly is “legitimate interest” under the GDPR? It is an interest of the company (e.g., product development, security) that justifies processing, provided it is not overridden by the fundamental rights and freedoms of the data subject. It always requires a careful balancing exercise.
Can any company now use public social‑media data for AI training? No. The ruling is specific to Meta’s case. Each company must conduct its own balancing test and implement mitigating measures such as transparency and rights to object.
What is the difference between preliminary proceedings and main proceedings? A preliminary proceeding serves a quick, provisional decision. Main proceedings involve full evidence taking and lead to a final judgment at that instance.
Conclusion: An Important Stage Victory for Meta with Clear Ground Rules for the Industry
The OLG Cologne’s ruling is a significant signal. It shows that the use of public data for AI training is possible under the GDPR—but only under strict conditions. Transparency toward users and granting them control are not only good practice, but legally decisive for justifying processing based on legitimate interest.