The UK data and privacy watchdog has said it is launching an investigation into the use of facial recognition cameras in the King’s Cross area of central London.
A report recently claimed the controversial technology is used on the 67-acre King’s Cross Central site, home to King’s Cross and St Pancras International stations, as well as restaurants, shops and cafes.
“These cameras use a number of detection and tracking methods, including facial recognition, but also have sophisticated systems in place to protect the privacy of the general public,” Argent, the property developer for the area, told the Financial Times.
It is not clear how long the system has been active, nor how many cameras are in use.
The Information Commissioner’s Office (ICO) says it is “deeply concerned about the growing use of facial recognition technology in public spaces” and is seeking “detailed information” about how it is used.
“Scanning people’s faces as they lawfully go about their daily lives, in order to identify them, is a potential threat to privacy that should concern us all,” said Information Commissioner Elizabeth Denham.
“That is especially the case if it is done without people’s knowledge or understanding.
“My office and the judiciary are both independently considering the legal issues and whether the current framework has kept pace with emerging technologies and people’s expectations about how their most sensitive personal data is used.
“We have launched an investigation following concerns reported in the media regarding the use of live facial recognition in the King’s Cross area of central London, which thousands of people pass through every day.
“As well as requiring detailed information from the relevant organisations about how the technology is used, we will also inspect the system and its operation on-site to assess whether or not it complies with data protection law.”
Earlier this week, London Mayor Sadiq Khan said he had written to the chief executive of the King’s Cross development to raise his concerns about the use of facial recognition.
Last month, the House of Commons Science and Technology Committee suggested that authorities cease trials of such technology until a legal framework for them is established.
In a report on the Government’s approach to biometrics and forensics, the MPs referred to automatic facial recognition testing carried out by the Metropolitan Police and South Wales Police.
It noted that an evaluation of both trials by the Biometrics and Forensics Ethics Group raised questions about accuracy and bias.
Concerns were also raised that police custody images of individuals not convicted of any crime are not being deleted.