OpenAI gets ultimatum from Italy to lift ChatGPT ban

In brief:

On March 30, the Italian data protection authority demanded a ban on the use of ChatGPT, the popular application from U.S. company OpenAI, for violating the GDPR.

Italy is the first European country to take clear steps toward banning the application. It is not the only country looking suspiciously at ChatGPT, though: investigations have also begun in Germany, France and Ireland into the additional privacy measures that are needed. In Belgium, there is no talk of a possible ban on ChatGPT for the time being.

What is ChatGPT?

ChatGPT is an artificial intelligence (AI) chatbot developed by OpenAI in San Francisco.

The application is capable of generating remarkably human-like responses to text prompts and provides detailed answers to every question posed to it.

It is built on the generative pre-trained transformer (GPT), an advanced language model that combines supervised learning with reinforcement learning from human trainers to improve the model’s performance.
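In practice, applications talk to such a model through an API. The snippet below is a minimal, purely illustrative sketch using OpenAI’s current Python SDK; the model name and the prompt are placeholders chosen for this example, not details taken from the investigation.

```python
# Illustrative sketch only: sending a text prompt to a GPT-style model
# via the OpenAI Python SDK. Model name and prompt are placeholders.
from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # assumed model name, for illustration
    messages=[
        {"role": "user", "content": "Explain the GDPR in two sentences."},
    ],
)

# The model returns a detailed, human-like answer to the prompt.
print(response.choices[0].message.content)
```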

Training such a large language model requires enormous amounts of data. Text is scraped from across the Internet, from posts on social networks, books and other sources, to build the generative text system.

Part of the data collected is the personal information you share about yourself online.
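The sketch below illustrates why that matters. It is not OpenAI’s actual pipeline; the URL, file name and helper function are hypothetical. It simply shows how text scraped from a public page can carry personal data, such as e-mail addresses, straight into a training corpus.

```python
# Hypothetical sketch: scraping a public page and appending its text to a
# local "training corpus", flagging any e-mail addresses swept up along the way.
import re
import urllib.request

EMAIL_PATTERN = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def collect_page(url: str, corpus_path: str = "corpus.txt") -> list[str]:
    """Download a public page, append its raw text to the corpus file,
    and return any e-mail addresses (personal data) found in it."""
    with urllib.request.urlopen(url) as resp:
        text = resp.read().decode("utf-8", errors="ignore")

    personal_data = EMAIL_PATTERN.findall(text)

    with open(corpus_path, "a", encoding="utf-8") as corpus:
        corpus.write(text + "\n")

    return personal_data

# Hypothetical example call:
# found = collect_page("https://example.org/public-profile")
# print(f"{len(found)} e-mail addresses ended up in the corpus")
```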


Why does ChatGPT violate the provisions of the GDPR, according to the Italian data protection authority?

The Italian Data Protection Authority has opened an investigation into ChatGPT, focusing on three main areas:

1. First, the controller failed to provide appropriate information about the processing to data subjects whose personal data was collected over the Internet.

The Italian data protection authority ruled that the data controller did not comply with its obligation to provide data subjects with a privacy policy under Article 13 GDPR.

2. Second, the data protection authority found that the output of ChatGPT – the “answers” – often contained personal data.

The authority held that there was no legal basis for the mass collection and storage of this personal data to train the algorithms. This violates Articles 5 and 6 GDPR.

3. Finally, the investigation found that OpenAI has not taken measures to check whether users are older than 13. The absence of a mechanism to verify the age of users constitutes a violation of Article 8 GDPR.

The future of ChatGPT in Italy

Italy’s data protection authority has left the door ajar for OpenAI to bring ChatGPT back to the Italian public after all. In light of the above, and as part of an emergency procedure, the authority has imposed a temporary restriction on processing on OpenAI under Article 58(2)(f) GDPR.

Italy’s privacy watchdog has therefore issued an ultimatum to OpenAI: ChatGPT must meet at least the following conditions.

  • OpenAI must provide information about how ChatGPT works and about the rights of data subjects; that information must be visible before anyone creates an account;
  • OpenAI must clarify which legal basis it relies on to process personal data;
  • individuals must be given the opportunity to have erroneous or harmful information about themselves corrected or removed;
  • users must confirm that they are eighteen years of age or older, and by Sept. 30 OpenAI must also implement an age-verification system to check the ages of users;
  • by May 15, OpenAI must launch a Garante-approved information campaign to educate users about the risks of entering personal data that is used to train algorithms.

What are the big challenges for OpenAI in the future?

Unlike in the US, publicly available personal information in Europe is still considered personal data belonging to the individual. It is precisely this data that is now causing problems for OpenAI.

As mentioned earlier, investigations have also been launched in Germany, France and Ireland into the additional privacy measures that are needed.

The Belgian Data Protection Authority also said ChatGPT’s possible breaches “should be discussed at [European] level.”

French data protection authority CNIL, meanwhile, received at least two complaints against ChatGPT, alleging privacy violations.

In response to growing issues surrounding ChatGPT, the European Data Protection Board (EDPB) decided to establish a special task force to foster cooperation and exchange information on possible enforcement actions by data protection authorities.
