Privacy watchdog asks ChatGPT maker for clarification about data handling


ChatGPT’s iPhone app

In a letter, the Dutch Data Protection Authority has asked OpenAI, the maker of ChatGPT, for clarification about how it uses personal data to train the underlying language model. The regulator is concerned about how companies such as OpenAI handle such sensitive data.

The privacy watchdog also wants to know whether the questions people ask the OpenAI chatbot are used to further train the system.

The authority also says it is concerned about the information that the system shares. According to the regulator, it can be “inaccurate, outdated, incorrect, inappropriate, or offensive” and can take on a life of its own. “Whether and if so how OpenAI can rectify or delete that data is unclear,” the authority said.

Next steps unclear

Earlier, the Italian privacy watchdog attracted a lot of attention with a temporary block of ChatGPT, which lasted about a month. A spokesperson for the Dutch Data Protection Authority would not comment on possible follow-up steps, but says that nothing has been ruled out, so a ban is not off the table either.

OpenAI has not yet responded to the letter; the deadline for a response is June 23. The Dutch Data Protection Authority says that further actions will follow, but what those will be is unclear.

According to the watchdog, 1.5 million people in the Netherlands used the chatbot in the first four months of the year.
