ChatGPT sued

Context of the ChatGPT lawsuit

ChatGPT has been sued by the Vienna-based association Noyb (None of Your Business) for providing false information in response to users' questions.

Noyb claims that ChatGPT “invents false answers” when asked about matters on which it lacks information. This, Noyb argues, amounts to a breach of the General Data Protection Regulation, specifically of the principle of accuracy laid down in Article 5 of the GDPR.

Noyb contends that ChatGPT gives incorrect answers about specific individuals: it “regularly makes up answers,” “systematically responds with false information,” and “rambles,” conduct that falls outside the law.

According to a study reported by The New York Times, ChatGPT’s invention rate ranges from 3% to 27%.

Faced with this situation, Noyb requested that the incorrect data be corrected in the data from which ChatGPT works; OpenAI’s response was that this is impossible, since it cannot prevent ChatGPT from producing those answers.

In other words, ChatGPT cannot be asked to correct the data it holds, so data subjects could not exercise the right of rectification regulated in Article 14 of the LOPDGDD and Article 16 of the GDPR. Nor could OpenAI comply with the principle of proactive responsibility (accountability), which requires it both to comply with data protection rules and to be able to demonstrate that compliance.

What is the possible cause of these incorrect ChatGPT responses?

ChatGPT can “hallucinate”, producing answers that are not correct. According to Google, this may be due to insufficient training data, incorrect assumptions made by the model, or distortions in the training data, i.e. what is known as bias.

Filing of the complaint with Austrian authorities

Noyb has therefore initiated a complaint procedure before the Austrian authorities, seeking a fine against OpenAI for failing to respect the obligations imposed by the Digital Services Regulation and the Data Protection Regulation, specifically as regards the guarantees adopted on the accuracy of the data processed.

Maartje de Graaf, data protection lawyer at Noyb, said, “If a system cannot produce accurate and transparent results, it cannot be used to generate data on individuals. The technology must follow the legal requirements, not the other way around.”

Recall that the Digital Services Regulation was adopted in 2022 and has applied in full since February 17, 2024. Large companies offering online intermediary services that connect users with content, products and services in the European market must comply with its provisions, regardless of whether they are established inside or outside the territory of the Union.

Conclusion

AI, whether ChatGPT or any other, must at all times respect personal data and data protection rights. This obligation binds natural and legal persons alike, and we must not forget that the rules make no exceptions when it comes to compliance.

Therefore, if your company uses or plans to implement any AI tool, you should ensure that it complies with the Data Protection Regulation.

Business Adapter® at your service

If you need help, contact us by email at info@businessadapter.es, call 96 131 88 04, or leave your message in this form:

Contact us, we will be pleased to help you: https://businessadapter.es/contacto
