- ChatGPT and personal data
In the Netherlands, the DDPA monitors the processing of personal data. It has a specific policy for monitoring algorithms and AI where these process personal data.
The DDPA wants to strengthen supervision of the use of algorithms and AI in order to prevent discrimination and promote transparency.
Many people have been using ChatGPT since it became available in November 2022. You can ask the chatbot all sorts of questions, such as how to solve a maths problem. However, people also use it for more personal matters, such as relationship issues and even medical questions.
The DDPA wants clarification from ChatGPT on whether it uses this personal data to train the algorithm and, if so, in what way. The DDPA also has concerns about the generated answers, since this information could be inaccurate, outdated or offensive.
The DDPA wants clarification on whether it is possible to delete information where it is inaccurate, outdated or offensive.
Other privacy authorities in Europe have similar concerns, prompted by the case in Italy in which the data protection authority banned ChatGPT. The operator could not demonstrate a working age-verification mechanism, and the chatbot had been trained on data from Italian citizens without their knowledge.
The European Data Protection Board (EDPB) therefore wants to create more transparency around the use of AI and has decided to join forces in a ChatGPT task force. The EDPB said the goal of the task force is to "foster cooperation and to exchange information on possible enforcement actions conducted by data protection authorities."
Supervision of algorithms is crucial for various industries in the Netherlands. The DDPA has therefore launched a separate algorithm supervisory department, which will coordinate collaboration between the different supervisory authorities in the Netherlands.
By coordinating these supervisory authorities, it will be easier to redirect complaints and to recognise certain risks. Overall, this will improve the protection of fundamental rights such as privacy.
- Focus points for the DDPA
AI and algorithms form one of the three focus points the DDPA has established in its four-year plan. The developments it has observed are the continued growth of the data society, an increase in digital injustice and an increase in privacy awareness.
Based on these three developments, the DDPA has established three focus points:
- Data trade
- The increase in the use of data has led not only to improved products and services, but also to unauthorised sales of personal data to third parties. The DDPA wants to prevent this by monitoring data sales.
- Digital government
- Various government organisations, such as the police, possess sensitive information. By working together with these organisations, the DDPA wants to improve data security.
- AI and algorithms
- Companies and organisations are using AI and algorithms more and more often. As explained above, the DDPA wants to monitor how AI programs such as ChatGPT process personal data.
- New fines policy for GDPR (AVG) violations
New rules are in place for calculating fines for companies that violate the General Data Protection Regulation (GDPR, known in the Netherlands as the AVG).
These new rules were drafted by the European Data Protection Board (EDPB), the alliance of European privacy regulators. With the new rules, all privacy regulators in the European Union (EU) will now calculate the amount of fines in the same way.
Until now, each privacy supervisor in the EU had its own rules. By standardising the calculation of fines within the EU, the regulators ensure that companies know where they stand: fines are calculated in the same way in each country. Supervisors can also scrutinise a fellow supervisor's investigation more effectively.
The new rules, the EDPB's "fining guidelines," are different in three important ways from the fining policies the DDPA has used until now.
A. Greater role for company turnover
A company's size is given a greater role in determining the amount of the fine. Under the old fining policy, the DDPA only took the size of the company into account at the end of the fine calculation. Under the new rules, this happens at the beginning.
Companies can see in the guidelines what amount is used as the starting point for calculating the fine for a given violation for a company of their size. The turnover of the violator's parent company also counts.
B. Offence severity categories
Under the new rules, there are three categories of violation severity: low, medium and high. Until now, the DDPA also looked at the severity of the violation when determining the fine level, but without attaching a category to it.
With the new rules, a different starting amount for the fine applies for each category.
C. Bandwidth for starting amount
As in the old rules, the new rules use a range (bandwidth) of fine amounts for different types of violations. Under the old DDPA fining policies, the final fine amount was in principle determined within that bandwidth.
In the new rules, however, the bandwidth is intended to determine the starting amount of the fine. That amount can then be increased or decreased.
Supervisors start the calculation with that starting amount and then assess whether there are reasons to adjust the fine. They may, for example, increase the fine if the company has previously committed a similar violation, or lower it if the company did everything possible to limit the impact on the victims of the violation.
Fines under the new rules, as under the old, can reach up to 20 million euros or 4 percent of a company's global annual turnover, whichever is higher.
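The calculation flow described above (a starting amount per severity category, adjusted up or down, subject to the GDPR maximum) can be sketched in code. This is a purely illustrative sketch: the starting amounts, the turnover threshold and the adjustment factors below are hypothetical placeholders, not the EDPB's actual figures.

```python
# Illustrative sketch of the fining-guidelines flow described above.
# All specific numbers except the GDPR cap are hypothetical assumptions.

# Hypothetical starting amounts per severity category (in euros)
STARTING_AMOUNTS = {"low": 100_000, "medium": 500_000, "high": 2_000_000}

def calculate_fine(severity: str, turnover_eur: float,
                   repeat_offender: bool = False,
                   mitigated_impact: bool = False) -> float:
    """Compute an illustrative fine following the described steps."""
    # Step 1: starting amount based on severity category
    amount = STARTING_AMOUNTS[severity]

    # Step 2: company size considered at the start of the calculation
    # (hypothetical rule: halve the starting amount for smaller companies)
    if turnover_eur < 50_000_000:
        amount *= 0.5

    # Step 3: adjust up or down for circumstances
    if repeat_offender:       # similar prior violation
        amount *= 1.5
    if mitigated_impact:      # company limited the impact on victims
        amount *= 0.75

    # Step 4: cap at EUR 20 million or 4% of global turnover,
    # whichever is higher (GDPR maximum)
    cap = max(20_000_000, 0.04 * turnover_eur)
    return min(amount, cap)

# Example: a high-severity violation by a small company
print(calculate_fine("high", turnover_eur=10_000_000))
```

The point of the sketch is the order of operations: unlike the old DDPA policy, company size enters at the beginning rather than at the end, and the bandwidth only fixes the starting amount, not the final fine.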
The new rules take effect immediately, and they also apply to ongoing cases.
The new rules will apply only to businesses. This is because not all privacy regulators in the EU are allowed to impose fines on governments. The DDPA, however, is allowed to do so. For now, the DDPA's old fining policies will continue to apply to government agencies. The DDPA is still investigating in a European context what rules it wants to use in the future to calculate fine levels for government organizations.
This blog is written by Ady van Nieuwenhuizen, IP |Tech & Privacy Partner and Julia Welgraven (intern) at Fieldfisher Netherlands.