Privacy and Data Spring Snapshot

Tuesday 28th May 2024

In this edition, we cover:

  • the Data Protection and Digital Information Bill status update;
  • the new Ofcom draft Children’s Safety Code of Practice;
  • the recent MoD cyber attack;
  • the EU AI Act;
  • the ICO’s new fining guidance; and
  • the ICO ‘public sector’ approach demonstrated in recent enforcement action.


The Data Protection and Digital Information Bill: Status Update

Following the announcement of the general election set to take place on 4 July 2024, the Data Protection and Digital Information Bill did not form part of the ‘wash up’ process and therefore will not become law later this year as originally planned. The future of the UK’s new data protection legislation is therefore uncertain, and it will fall to the next government to take forward the bill or replacement legislation. In the meantime, the UK GDPR and the Data Protection Act 2018 continue to apply in the UK.

Ofcom releases draft Children’s Safety Code of Practice

Ofcom has now published its draft Children’s Safety Codes of Practice, which set out how it expects online services to meet their new legal obligations under the Online Safety Act 2023 (OSA). The draft Codes set out 40 practical steps that businesses must take to keep children safer online.

Amongst other obligations, websites and apps caught by the OSA must introduce robust age-checks to prevent children from seeing harmful content, and social media companies must ‘tame toxic algorithms’ which recommend harmful content to children. Responses to the draft Codes must be provided by 17 July 2024, following which Ofcom will review the feedback and produce a final version.

Read the full guidance here

MoD Cyber Attack

On 7 May, it was reported that the Ministry of Defence’s payroll system, which is managed by an external contractor, had been hacked, and the records of more than 250,000 current and former armed forces personnel and MoD staff were breached. The data is described as ‘personal HMRC-style information’ and relates to current and former members of the Royal Navy, Army and Royal Air Force over a period of several years.

Incidents like this are a reminder to any public or private sector organisation engaging third parties for services like IT and payroll systems to ensure that such third parties have robust security measures in place to protect personal and sensitive personal data.

Read our team’s comments in the press

The EU AI Act

The EU approved the first ever comprehensive regulatory framework for AI in March – the EU Artificial Intelligence Act (EU AI Act). The EU AI Act is designed to regulate AI systems and prohibit those that pose an unacceptable risk to people’s rights, safety or livelihoods. To this effect, it categorises AI systems by the level of risk posed to individuals, ranging from high-risk systems to minimal/no risk systems. It is aimed at businesses using AI and operating in the EU (even if not based in the EU).

The Act’s provisions will become applicable gradually over a period of 24 months, giving those subject to it time to implement measures and become compliant. The most onerous provisions will enter into force at a later stage, to account for the effort involved and the effect on businesses. The Act will generally apply to any UK organisation operating in the EU.

High-level summary of the AI Act

ICO publishes new fining guidance

In March 2024, the Information Commissioner’s Office (ICO) published new guidance setting out its approach to issuing penalty notices and calculating fines. The ICO will take a holistic approach, considering the seriousness of the infringement; any relevant aggravating or mitigating factors; and the effectiveness, proportionality and dissuasiveness of issuing a penalty notice.

The new guidance is another reminder that organisations which can show they have taken proactive steps to implement and maintain appropriate data protection and security measures are likely to fare better when the ICO is considering the appropriate penalty.

Find a summary of the guidance here


ICO ‘public sector’ approach demonstrated in fine for HIV data breach

The ICO’s approach to enforcement action against organisations in the public sector (or, for example, charities which are publicly funded) has once again been brought into the spotlight.

The Central Young Men’s Christian Association was investigated by the ICO after an email to individuals participating in a programme for people living with HIV was sent using “CC” rather than “BCC”, inadvertently revealing the email addresses of all 166 recipients.

Initially, a £300,000 fine was recommended; this was subsequently reduced to £7,500 in line with the ICO’s public sector approach, under which fines for public sector bodies and publicly funded organisations are reduced and the ICO’s wider enforcement powers are used to raise standards.
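For organisations running mailings of this kind, the practical safeguard is to keep recipients’ addresses out of the shared ‘To’ and ‘CC’ fields altogether. By way of illustration only (this is not drawn from the ICO’s guidance or the facts of the case), the short Python sketch below uses the standard smtplib and email libraries to send each recipient their own copy of a message; the server address, credentials and recipient list are placeholders.

    # Illustrative sketch only: send a programme update so that recipients
    # cannot see one another's addresses. The server, credentials and
    # addresses below are placeholders.
    import smtplib
    from email.message import EmailMessage

    RECIPIENTS = ["participant1@example.org", "participant2@example.org"]  # hypothetical list

    with smtplib.SMTP("smtp.example.org", 587) as server:
        server.starttls()
        server.login("updates@example.org", "app-password")  # placeholder credentials
        for address in RECIPIENTS:
            msg = EmailMessage()
            msg["From"] = "updates@example.org"
            msg["To"] = address  # each message is addressed to a single recipient
            msg["Subject"] = "Programme update"
            msg.set_content("Body of the update goes here.")
            server.send_message(msg)  # no shared 'To' or 'CC' line to expose other recipients

Sending individual messages (or using a dedicated mailing-list tool) removes the single point of failure that a mistyped ‘CC’ field creates.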

If you have any questions or would like to discuss your business’ privacy and data protection compliance, feel free to contact one of our experts here.