Google executives have hit back at suggestions the company is not doing enough to tackle child abuse images online.
On Tuesday, the head of the National Crime Agency (NCA), Lynne Owens, said that tech companies “must do more” to deal with online child abuse to help law enforcement catch the worst paedophiles.
This follows lawyer William Chapman, representing three victims of online sexual abuse, telling an inquiry into the issue that tech firms are “reckless or indifferent” in their approach.
However, Kent Walker, Google’s senior vice president for global affairs and the firm’s chief legal officer, said the company had spent years fighting the issue.
“I think we have taken significant measures for many years – since the earliest times of the company – to deal with the scourge of child sexual abuse material online,” he told the Press Association.
“We have teams devoted to identifying it and removing it from Google search and other platforms.
“We have worked with the government quite closely to get their input, including a variety of both governmental and quasi-governmental groups to be able to make sure that we are not just identifying it but reporting it to appropriate authorities so that they can take action against the people who are propagating this kind of material.”
Mr Walker was speaking alongside other Google executives and engineers in Munich at the opening of the company’s Safety Engineering Centre (GSEC) in the German city.
It is to be used as a hub to develop new tools the firm says will help protect the privacy and security of Google users and their data.
“We’re constantly trying to figure out how do we use information in ways that are helpful to users in a whole variety of settings, but simultaneously be respectful of the variety of privacy preferences that people have for those,” Mr Walker said.
“So, this centre is a capstone in a sense, but it’s also a first step, a foundation as we continue to make that progress.”
However, Mr Walker did admit the tech giant still needed to improve its systems for finding and removing dangerous content, highlighting the terror attack in Christchurch, New Zealand, as an example.
The attack on mosques in the city in March was live-streamed to Facebook by the gunman. Versions of the video were repeatedly circulated on other platforms – including Google-owned YouTube – with some taking hours to be removed.
Google is taking part in a summit in Paris on Wednesday involving French President Emmanuel Macron and New Zealand’s Prime Minister Jacinda Ardern. The meeting, known as the Christchurch Call, follows the shootings in that city and is aimed at addressing terrorist and violent content online.
Mr Walker said: “In the situation of Christchurch, we were able to avoid having live-streaming on our platforms, but then subsequently we were subjected to a really somewhat unprecedented attack on our services by different groups on the internet which had been seeded by the shooter.
“So we were seeing roughly one copy of that video being uploaded every second, in ways that were trying to overwhelm our defences effectively, to the point where we actually made a decision to shut down any future uploads of any version of that video – including from legitimate news sources like the Australian Broadcasting Corporation – simply because we couldn’t at scale disambiguate what was meant to be incendiary and what was news and documentary coverage.
“So these are challenges, but we’re constantly improving our policies and our technologies for being able to block abuse of our products, and you’re seeing that change on YouTube over the course of the last several years, and you’ll see more to come.”
Google executives also expressed their support for greater regulation of the technology sector and confirmed the company would participate in the current consultation on the UK Government’s white paper on online harms, which was published last month.
Kristie Canegallo, the firm’s vice president of trust and safety and a former US government official, said the company was also keen to hear more from governments about the specific issues they wanted to focus on.
“As we think about regulation, and helping to solve some of these challenges, what’s helpful for us is to understand what specific problems regulators are trying to solve, to be mindful of the potential unintended consequences of regulation and so, really, having a dialogue about that is really important,” she said.
“So we look forward to participating in the (consultation) period that we have now to provide our feedback because we certainly do understand and appreciate what the UK Government is trying to do here.
“It’s important that we understand what problem they’re trying to solve and that we not legislate in something that could have unintended consequences that actually don’t help solve that problem.”
Ms Canegallo, who is due to appear before the ongoing Independent Inquiry into Child Sexual Abuse (IICSA) later this week, added that a collaborative effort was needed to eradicate such content from the internet.
“I think this is an area where Google has done so much in terms of investing in industry-leading technology, and I look forward to talking to the inquiry to help them understand how we approach these challenges. But, much like some of these other challenges, we have a role to play, law enforcement has a role to play, NGOs have a role to play, and it’s going to take all of us to get it working,” she said.