ChatGPT reported for using fabricated personal data

Published on 08 Jul 2024 | 2 minute read
Takeaways from NOYB's report: "ChatGPT provides false information about people, and OpenAI can't correct it"

In a nutshell 

In April 2024, the data protection organisation NOYB (short for "None of Your Business") reported OpenAI, the company behind ChatGPT, for failing to guarantee the accuracy of personal data and for not correcting inaccurate information, as required by EU data protection regulation. The overall concern highlighted by this complaint is whether AI systems such as ChatGPT actually comply with EU data protection regulations when processing personal data.

The background 

It is not unusual for AI to generate incorrect information in response to a user's request. But when the request concerns an individual, it becomes legally serious if the AI fabricates false personal data.

The complaint derives from a request filed with OpenAI by a person in the public eye whose date of birth ChatGPT repeatedly generated incorrectly. When the individual requested that their personal data be corrected, OpenAI refused. According to OpenAI, it was technically impossible to amend or block the AI's response without blocking it from answering any questions about this person entirely.

According to NOYB, the company thereby failed to comply with EU data protection regulations, as OpenAI did not appropriately address this person's request to access, correct or delete their personal data.

The takeaways

  • Before developing and implementing an AI system within an organisation, one must consider, during the planning process and model design, how the rights of data subjects can be safeguarded.
  • Organisations should carefully assess which data is truly needed to train the model and should not process more data than necessary. It all comes down to controlling the data that is stored and used, and knowing how to comply with the rights of the data subjects.
  • Organisations should be aware that this type of complaint is likely to become more common as the use of AI becomes more regulated and the interaction between AI and EU data protection regulations becomes clearer. As this example shows, entities should not blame deficiencies on technical obstacles; that will not be an acceptable excuse for failing to comply with the requirements of EU data protection regulations.

Read more: Noyb anmäler ChatGPT för påhittade personuppgifter ("Noyb reports ChatGPT for fabricated personal data") - Forum för Dataskydd (dpforum.se)

Questions? 

For any questions about this case or data protection queries generally, please contact My Mattson or Frida Holmér.

My Mattson
Senior Associate
+46 (0) 70 233 62 62

Frida Holmér (née Siverall)

Associate, Legal Counsel
+46 076 0107192