Key takeaways this March include:
- Fairness in AI: Businesses utilising AI may want to assess fairness principles in accordance with the latest UK ICO guidance, which includes clarification around AI design and use;
- Notification timelines: Businesses may want to revisit their incident response plans to ensure they envisage breach notifications being made even when investigations remain ongoing;
- Data Processing Agreements: Businesses may want to review their vendor contracts following the Garante’s fines against a data controller (€50,000) and a data processor (€30,000), including for not having a GDPR-compliant data processing agreement in place, among other violations;
- Operational resilience: Businesses should consider reviewing mechanisms to ensure that policies and procedures to manage operational resilience are followed in practice, after the Swedish financial services regulator fined Swedbank c.£66.8 million for, among other things, not following its own governance framework when making a change to a business-critical IT system; and
- Cookies: Businesses with touchpoints in France may wish to keep in mind the French CNIL’s investigation priorities for 2023, which specifically call out concerns around controlling user tracking by mobile applications.
These developments, and more, are covered below.
UK ICO updates guidance to clarify requirements for fairness in AI
What happened: The UK ICO has updated its existing Guidance on AI and data protection following requests from industry to clarify requirements for fairness in AI. The Guidance outlines at a high level how the UK ICO is thinking about fairness, and is explicit on a number of useful points, including:
- the availability of consent as a lawful basis for AI data processing in certain circumstances;
- the importance of being open with individuals about how AI systems make decisions about them and how personal data is used to train and test the system;
- the need to ensure that AI systems are sufficiently statistically accurate and avoid discrimination if the system is used to infer data about people; and
- the need to deal specifically with variations in margins of error in AI systems as part of an organisation's data protection impact assessments (DPIAs).
What to do: While regulatory guidance on AI use remains relatively thin on the ground, businesses should take what they can from the guidance that does exist, especially where, as in the case of the UK ICO’s guidance, it has concrete applications. More specifically, businesses may wish to review existing policies, processes and DPIAs against the UK ICO’s updated guidance where they have touchpoints in the UK. See also our posts on whether to adopt a ChatGPT policy and implementing other AI governance measures.
Swedbank fined SEK 850 million over IT outage
What happened: Finansinspektionen (“FI”), the Swedish Financial Supervisory Authority, issued Swedbank with a “remark” (anmärkning) and an administrative fine of SEK 850 million (c.£66.8 million) for failing to follow internal controls following a change to a business-critical IT system.
The fine follows what FI described as a “major IT incident”, in which almost a million customers were left with incorrect balances in their accounts as a result of transactions being stopped in Swedbank’s systems. FI’s investigation showed that Swedbank made a change to a business-critical IT system without following the bank’s internal procedures and processes. Furthermore, FI found that Swedbank did not have suitable control mechanisms in place to catch the failure. FI concluded that Swedbank did not have satisfactory internal control of the change in the bank’s IT system, which is a violation that gives FI grounds to intervene.
Aggravating factors included: (i) Swedbank’s lack of internal controls; (ii) the fact that Swedbank is a systemically important bank; (iii) the number of affected customers; and (iv) the risk of an adverse impact on financial stability caused by the events. In mitigation, FI found that Swedbank has already taken, and intends to take further, measures to strengthen its internal controls.
What to do: Companies, particularly those subject to financial services regulation, should consider carefully what policies and procedures they have in place to manage IT operational resilience, and ensure they remain up to date and are applied in practice. Looking ahead, European-regulated entities will face increasingly onerous operational resilience requirements under the EU Digital Operational Resilience Act. See our posts for background on DORA and the significant management obligations it imposes.
Norwegian Data Protection Authority fines medical device company c.$250,000 for late data breach notification
What happened: The Norwegian DPA fined Argon, a medical device company, c.$250,000 for only reporting a personal data breach after it had completed its investigation. In July 2021, Argon identified a personal data breach affecting its employees but only notified the regulator two months later, after it completed a review of the incident and its consequences. The Norwegian DPA:
- Reiterated that the GDPR 72-hour deadline for notification begins when the data controller becomes aware that a personal data breach has occurred (and not once all the circumstances of the breach are understood); and
- Emphasised that controllers cannot wait until after a detailed investigation has been carried out before complying with notification requirements.
What to do: Companies may want to review their incident response procedures and processes to ensure that they facilitate compliance with breach notification deadlines and, in any event, make clear that notification may need to take place even before investigations have been completed.
Italian DPA fines insurance company for failures in effective post-acquisition integration
What happened: The Italian DPA published its decision to fine insurance broker Assiteca Spa €120,000 for a variety of data protection shortcomings linked to websites offering insurance quotations that came into the business via a 2021 acquisition. The DPA took issue in particular with the (i) absence of consent to processing for promotional purposes; (ii) lack of transparency regarding data processing; (iii) personal data retention; and (iv) lack of technical and organisational measures to prevent prejudice to data subjects – all of which were rooted in the company’s poor post-acquisition integration of its target’s operations. For example, post-merger, the personal data of nearly 9,700 of the target’s customers was retained without their knowledge and exposed to potential processing for promotional purposes without express consent.
What to do: Consider carefully, and plan for, post-acquisition integration risk linked to taking on board additional personal data and processing activities. As the fine shows, buyers may be liable for inherited non-compliance.
French CNIL announces its investigation priorities for 2023
What happened: The CNIL announced its investigation priorities, foreshadowing further scrutiny of tracking technologies. Each year, the CNIL conducts hundreds of investigations (345 in 2022) following complaints, data breach reports, and relevant news events. As part of its investigation policy, the CNIL defines priority topics for enforcement and compliance sweeps. 2023’s priorities include:
- user tracking by mobile applications, building upon the CNIL’s action plan regarding good practices in the development of mobile applications;
- cookie compliance, which has been a hot topic for a while now, with several recent sanctions (see our previous roundup) and updated guidelines; and
- the security of health data.
What to do: The CNIL’s priorities confirm the importance that European national supervisory authorities place on cookie and tracking technology compliance for mobile applications. Businesses with touchpoints in France may want to review their use of tracking tools, in particular cookies, and whether it aligns with the CNIL’s guidelines.
Italian Garante fines data controller €50,000 and data processor €30,000 for failing to conclude a data processing agreement
What happened: The Garante published its €50,000 and €30,000 fines against a data controller and data processor, respectively, arising out of an employee complaint about vehicle tracking. The controller, Giessegi Industria Mobilii (“Giessegi”), installed geolocation devices provided by a data processor, Verizon Connect Italy (“Verizon”), to track delivery vehicles.
The Garante faulted both entities for:
- Failing to conclude a data processing agreement, noting that the terms of service between the entities, which qualified Verizon as a “processor”, did not meet the GDPR’s mandatory content requirements; and
- Processing personal data without any legal basis following the termination of Giessegi’s agreement with the delivery company.
The Garante further faulted and penalised the controller for failing to:
- Provide data subjects with an adequate privacy notice; and
- Conduct a data protection impact assessment, even though location data is “highly personal” and employees were considered vulnerable data subjects by the Garante.
What to do: Companies may want to review vendor contracts to ensure they contain the data processing terms mandated by GDPR Article 28, and consider whether a data protection impact assessment is needed to ensure GDPR compliance before onboarding third-party service providers.
UK ICO asked to force deletion of YouTube algorithms
What happened: A complaint lodged with the UK Information Commissioner’s Office called on the regulator to force YouTube to delete algorithms trained on children’s data that it claimed was illegally gathered. While the ICO confirmed that it has broad powers to require organisations to take all necessary steps to comply with data protection law, the deletion of software code as a specific remedy could – per the ICO – possibly exceed its mandate. This has yet to be tested and contrasts strongly with the U.S. Federal Trade Commission, which has ordered algorithmic deletion as part of several data privacy case settlements. See our blog post here.
What to do: Businesses may want to track developments in this area and review our previous post to identify best practices in avoiding algorithmic disgorgement.
Irish DPC fines Bank of Ireland Group €750,000 for banking app personal data breaches
What happened: The Irish DPC published its final decision against the Bank of Ireland Group Plc (“BOI”) after BOI notified the DPC of 10 data breaches relating to the BOI365 banking app.
The DPC fined BOI €750,000, finding that BOI had underestimated the severity and likelihood of risks to the rights and freedoms of data subjects as a result of processing certain banking app data and failed to implement appropriate safeguards to protect that data. Specifically, the DPC found that BOI had (i) insufficiently frequent and targeted staff training; (ii) inadequate data governance policies directed at mitigating the risk of human error in data processing; (iii) insufficient sampling of certain data for quality assurance purposes; and (iv) inadequate mechanisms to enable the proactive detection and reporting of data breaches. The DPC ordered BOI to bring its processing into compliance with the GDPR’s data security requirements.
What to do: The decision highlights regulators’ expectation that businesses properly account for the risk of identity theft, fraud and financial loss in their risk calculus; in this case, the DPC was clear that these factors meant the severity of the risk was high. Businesses can also take from the decision a reminder to implement appropriate mechanisms to test for and mitigate the risk of human error where individuals are involved in high-risk processing activities.
Irish DPC fines medical group €460,000 for “data mishandling”
What happened: Ireland’s Data Protection Commission (“DPC”) fined Dublin-based medical group Centric Health Ltd €460,000 following a 2019 ransomware attack. The DPC held that the group mishandled data by inadvertently destroying “2,500 patient files and other data”, as well as leaving 70,000 patients’ files inaccessible for a period after the attack due to difficulties with accessing the backups. According to the decision, the deletions happened after Centric paid a ransom to the attackers in return for the decryption key, which could not be applied to the affected data as it had been deleted in the interim.
The DPC placed particular importance on the nature of the data that was deleted, stressing that health data requires an especially high level of security given its loss could impact on patient care.
The decision also found that the group fell short in the security testing of its servers and had insufficient back-up protection, having deviated from its standard practice of storing back-ups off-site.
What to do: Companies may want to review their back-ups and associated policies and procedures as part of their broader ransomware and cybersecurity preparedness efforts. Companies may also want to ensure that incident response plans and other relevant procedures contain controls to prevent the (premature) deletion of data affected by cybersecurity incidents.
To subscribe to the Data Blog, please click here.
The cover art used in this blog post was generated by DALL-E.