In a case that raises serious questions about the responsibilities of AI companies under European law, a Norwegian man has filed a formal privacy complaint after ChatGPT falsely described him as a convicted murderer. The incident is not only disturbing in its content — blending fictitious criminal accusations with real personal details — but also significant in its legal implications under the General Data Protection Regulation (GDPR).

What Happened?
Arve Hjalmar Holmen, a private individual from Norway, reportedly queried ChatGPT to see what the AI might say about him. The response he received was shocking: the chatbot alleged that Holmen had murdered two of his children, attempted to kill a third, and was currently serving a 21-year sentence in a Norwegian prison.
What made the fabrication more troubling was that the response also included accurate personal details, such as Holmen’s hometown and the number and gender of his children, blending fact with fiction in a way that could appear credible to the average user.
GDPR: The Legal Backbone of the Complaint
On Holmen’s behalf, the Austrian data protection advocacy group NOYB (None of Your Business) filed a formal complaint with the Norwegian Datatilsynet (Data Protection Authority). The core of the complaint rests on GDPR’s Article 5(1)(d), which states that personal data must be:
“Accurate and, where necessary, kept up to date. Every reasonable step must be taken to ensure that personal data that are inaccurate… are erased or rectified without delay.”
This principle is not optional. GDPR requires that data controllers (in this case, OpenAI) ensure the accuracy of personal data and provide individuals with the ability to correct or delete false or misleading information.
AI Models and the “Black Box” Challenge
NOYB’s complaint also highlights a critical challenge in the AI age: while AI models like ChatGPT can generate seemingly intelligent and informed responses, they are not databases of verified facts. Instead, they predict text based on patterns in training data — which can lead to plausible-sounding but completely inaccurate statements, especially about people.
OpenAI’s defense has often included a general disclaimer that “ChatGPT may produce incorrect or misleading information” — but according to NOYB’s legal experts, a disclaimer does not absolve a company from GDPR responsibilities.
“You can’t just spread false information and in the end add a small disclaimer saying that everything you said may just not be true,” said Joakim Söderberg, data protection lawyer at NOYB.
Correction vs. Blocking: A Legal Distinction
This complaint marks NOYB’s second formal GDPR case against OpenAI. The first, in April 2024, involved a public figure whose date of birth was incorrectly reported by ChatGPT. At the time, OpenAI claimed it could not “correct” the error, only block responses to specific queries.
But under GDPR, users have a right to rectification, not just redaction or suppression. If data is wrong, it must be corrected — not just hidden.
Why This Matters
This case serves as a powerful reminder of why GDPR exists, and how critical it is in the age of AI. When AI tools generate content about individuals, especially content containing defamatory or damaging claims, the line between harmless fiction and a violation of personal rights becomes dangerously thin.
The implications are clear:
- False personal data is a data protection issue
- AI-generated errors can have real-world consequences
- Compliance frameworks like GDPR aren’t optional for AI developers operating in Europe
What Comes Next?
As of now, the timeline of Holmen’s original ChatGPT query is redacted in the public version of the complaint, but NOYB has confirmed it occurred before the AI tool included live web browsing capabilities. When the same query is entered today, the results point only to the legal complaint itself — a stark illustration of how digital footprints are not easily erased.
The case is likely to add pressure on regulators and developers alike to take AI governance seriously, and to ensure that safeguards like those in the GDPR are enforced with the same speed and scale as the technology itself.