Can a waste disposal fee notice or an invoice be generated and sent fully automatically without a human ever reviewing the process? The Administrative Court (VG) of Bremen has issued a ruling with significant practical implications: A purely automated notice violates the prohibition on automated individual decision-making (Art. 22 GDPR). At the same time, the ruling demonstrates how this error can be “cured” after the fact (VG Bremen, Judgment of July 14, 2025 – Case No.: 2 K 763/23).

The Case: An automated waste disposal fee notice ends up in court

A property owner received a waste disposal fee notice that had been generated entirely by an automated system. He filed an objection, arguing that the notice violated Article 22(1) of the GDPR, which prohibits decisions based solely on automated processing that produce legal effects for the person concerned. The relevant authority rejected the objection; crucially, however, this rejection (the “objection notice”) was issued by a human official. The case eventually went to court.

The Decision: GDPR violation confirmed, but “cured” by human review

The court reached a nuanced conclusion that is relevant to any organization using automated processes:

  • The original notice was unlawful: The court confirmed that the purely automated fee notice did indeed violate Art. 22(1) of the GDPR.
  • “Cured” through the objection process: The pivotal point of the ruling is that this defect was “cured” (rectified) by the subsequent human review during the objection proceedings. The court views the entire process (the initial notice + the objection notice) as a single unit. Since a human made the final decision and assumed full responsibility at the end of the process, it was no longer considered a prohibited automated decision.
  • The result: The fee notice itself remained valid. Only the administrative fee for the objection proceedings was set aside, because the plaintiff had essentially been forced to file an objection just to obtain the human review he was legally entitled to.

Implications for Businesses & Authorities: Lessons from the ruling

  • Purely automated decisions are a risk: The authority’s first step, the fully automated notice, was unlawful. If a customer does not object, the unlawful state persists, which carries a high risk of damages claims or regulatory action by data protection authorities.
  • “Human-in-the-loop” is the safety net: It is crucial to have a clearly defined and easily accessible process through which a human can review and correct an automated decision; this human intervention can make an otherwise unlawful process legally sound (see the sketch after this list).
  • Cost risks for flawed processes: The authority had to bear the costs of the objection proceedings. For businesses, this means that if your automated process is flawed (e.g., an incorrect invoice or an unjustified late fee), you may have to cover the costs of the necessary manual follow-up and customer communication.
  • It’s about final responsibility: The court clarified that a calculation performed by a computer is not, on its own, a “decision.” A decision under the GDPR only occurs when a human ultimately takes responsibility for the result.
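
To make the “human-in-the-loop” idea concrete for teams building such systems, here is a minimal, hypothetical sketch in Python. The names (FeeNotice, calculate_fee, handle_objection, and so on) are purely illustrative and are not taken from the ruling or from any real system; the point is only that the workflow explicitly records whether a human has reviewed and taken responsibility for the decision.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional


@dataclass
class FeeNotice:
    """A fee notice produced by the automated calculation step."""
    recipient: str
    amount_eur: float
    issued_on: date
    # Records whether a human has reviewed and taken responsibility for
    # the decision -- the point the Bremen ruling turns on.
    reviewed_by: Optional[str] = None


def calculate_fee(property_size_sqm: float, rate_per_sqm: float) -> float:
    """Purely technical step: a calculation alone is not yet a 'decision'."""
    return round(property_size_sqm * rate_per_sqm, 2)


def issue_notice(recipient: str, amount_eur: float,
                 reviewer: Optional[str] = None) -> FeeNotice:
    """Issue the notice. Without a reviewer, this is a solely automated
    decision with legal effect -- the constellation the court objected to."""
    return FeeNotice(recipient, amount_eur, date.today(), reviewed_by=reviewer)


def handle_objection(notice: FeeNotice, official: str) -> FeeNotice:
    """Human-in-the-loop safety net: an official re-examines the notice and
    takes responsibility for the final result, 'curing' the original defect."""
    # In a real system the official could also correct the amount here.
    notice.reviewed_by = official
    return notice


if __name__ == "__main__":
    amount = calculate_fee(property_size_sqm=450, rate_per_sqm=0.80)
    # Risky path: the notice takes legal effect without any human review.
    notice = issue_notice("property owner", amount)
    print("Solely automated?", notice.reviewed_by is None)   # True
    # The objection triggers the human review the owner is entitled to.
    notice = handle_objection(notice, official="case officer")
    print("Solely automated?", notice.reviewed_by is None)   # False
```

A practical note on this sketch: the reviewed_by field must reflect a genuine, documented review, not a rubber-stamp click. A purely token sign-off would not amount to meaningful human intervention and would leave the process in essentially the same legal position as a fully automated one.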

FAQ: Automated Decisions & GDPR – What you need to know

What is an “automated decision” under Art. 22 GDPR?
This is a decision made without any meaningful human intervention that has legal effects or similarly significant consequences for the person concerned (e.g., entering into a contract, termination of services, or a fee assessment).

Are all automated processes prohibited?
No. Art. 22 GDPR allows automated decisions if they are necessary for the conclusion or performance of a contract, if they are authorized by law, or if the individual has given their explicit consent.

What does the “curing” of a flawed notice mean in practice?
It means that a subsequent review process conducted by a human can fix the original legal defect and make the decision valid. However, the initial violation still occurred.

What do I risk if my automated invoices or notices are legally flawed?
You risk the decision being declared invalid (if not “cured”), claims for damages from the affected parties, damage to your reputation, and fines from supervisory authorities.

How can I make my automated processes legally compliant?
By carefully auditing your workflows. Experts can help you design your automated procedures to be GDPR-compliant.

Conclusion: Automation needs a human anchor

The Bremen ruling is a classic “yes and no” decision. It confirms the legal dangers of purely automated decision-making while simultaneously providing a pragmatic path to legal compliance through downstream human oversight. The message for all companies and authorities is clear: anyone who automates processes must plan for a functional and easily accessible “emergency exit” in the form of a human review instance. Without this human anchor, you are sailing in legally treacherous waters.
