Data protection

School reprimanded for using facial recognition without appropriate safeguards

Employment and immigration · 28.08.2024 · 7 mins read

Key takeaways

Opt-out consent is not sufficient

Schools must obtain clear, informed agreement.

DPIAs are vital for new technologies

Risk assessments are a legal requirement.

Stakeholder engagement must be prioritised

Consulting pupils and parents strengthens compliance efforts.

The Information Commissioner’s Office (ICO) has recently published details of a reprimand given to an Essex school for breach of the data protection principles after it introduced facial recognition technology in its school canteen without first taking the appropriate data protection safeguards.

In March 2023, a school in Essex started using facial recognition technology in its canteen to take cashless payments from its 1,200 pupils aged 11-18. Data protection law gives extra protection to the processing of biometric data, and ICO guidance confirms that data controllers must carry out a mandatory data protection impact assessment before using innovative technology and/or processing biometric data, such as facial recognition technology. However, the school failed to comply fully with data protection principles before introducing this technology, and the ICO has since issued it with a reprimand. According to the ICO’s report of the reprimand, the school’s key data protection failures were:

  1. Failure to secure the appropriate consent for use of facial recognition technology: When it introduced the facial recognition technology, the school sent a letter to parents with a slip to return if they did not want their child to participate. This was problematic for several reasons. Firstly, an ‘opt-out’ system is not a valid form of consent for data processing - explicit permission for the processing must be obtained. The school continued to rely wrongly on this inadequate ‘opt-out’ consent for over six months before it eventually secured the necessary affirmative ‘opt-in’ consent. Secondly, most of the pupils were themselves old enough to provide their own valid consent, so the school’s letter seeking parental consent (via opt-out) deprived those pupils of the ability to exercise their data protection rights and freedoms.
     

  2. Failure to carry out a data protection impact assessment (DPIA): The school had failed to carry out a DPIA before it introduced and started using the facial recognition technology. This failure in turn meant that the school had made no prior assessment of the data protection risks the new facial recognition system caused to the sensitive data of the school’s pupils. The ICO report reminds data controllers that ‘introducing measures such as facial recognition technology should not be taken lightly, particularly when it involves children’ and that ‘a DPIA is required by law – it’s not a tick-box exercise. It’s a vital tool that protects the rights of users, provides accountability and encourages organisations to think about data protection at the start of a project.’
     

  3. Failure to properly consult the data protection officer and stakeholders: The school also failed to seek opinions from its data protection officer, or to consult with parents and pupils, before implementing the facial recognition technology.
