
Monday 18 November 2013

How Microsoft, Google and ISPs aim to halt child abuse images

Ministers and technology executives attend Downing Street summit called by David Cameron

David Cameron ordered a meeting of ministers and technology executives to tackle the issue of child abuse images. Photograph: Kacper Pempel/Reuters
David Cameron has made a personal mission of being seen to rid the internet of child abuse material.
In June, he berated the web's biggest technology firms for not doing more to tackle the production, distribution and accessibility of abuse material online, and summoned Google, Facebook, Yahoo and others to a "council of war" at which he demanded more funding and technical support to identify and block images of child abuse online.

On Monday, Cameron called together senior ministers and technology executives for a cyber-security summit to announce the latest progress in identifying and blocking child abuse material from the internet.

Who is there?
Search firms Google and Microsoft will attend the meeting at Number 10, along with broadband providers BT, TalkTalk and Virgin Media, the children's charity the NSPCC, the web monitoring body the Internet Watch Foundation and the UK's National Crime Agency.

How big is the problem?
The exact number of indecent images of children, both photos and video, is hard to pin down, but research by the UK's Child Exploitation and Online Protection agency (Ceop) notes that the number of individual pieces of content runs into the millions, while the number of children abused in producing the material is thought to be in the tens of thousands. UK police seized 2.5m pieces of content in 2011. By far the biggest threat is material spread on private, peer-to-peer networks, where individuals share files with each other directly, bypassing safety filters and search indexes. The vast majority of indecent and illegal imagery is distributed this way.

What kind of content is being shared?
As well as an increase in the volume of content, Ceop's research has found that photos and video of child abuse are becoming more violent and sadistic, and increasingly involve younger children and toddlers who would not be able to identify their abusers. The majority of content involves white children and perpetrators, and images are sometimes used by abusers to normalise the abuse and encourage children to re-enact the activity. Offenders, Ceop notes, are generally socially skilled, educated individuals with access to children through family, work or their home.

What's the difference between 'child abuse material' and 'child pornography'?
'Pornography' attempts to legitimise the material. The NSPCC says the term gives "a misleading and potentially trivialising impression of what is a very serious crime", and favours the term "child abuse images" instead.

Photos and video of sexual acts with children document a criminal act – the abuse of a child – and need to be labelled as such. One victim told Ceop of the terror of being pictured in abusive material: "I am scared because there are photos of me on the internet that other paedophiles know what I look like. I don't know if they know where I live."

What is being announced?
Google and Microsoft have expanded systems to block child abuse material from their search engines. Google has dedicated 200 engineers to ramping up these technologies and is sending a small number of them to the Internet Watch Foundation (IWF), which investigates URLs reported as linking to abuse material. Google now more strictly controls search results for 100,000 terms related to underage sex abuse, has edited its auto-complete feature, and will point searches for the most explicit terms towards warnings and links to expert help.
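Google has not published how its query-level controls work, but the general idea can be sketched as matching queries against a curated blocklist and redirecting the worst to a warning page. The terms, function names and redirect target below are purely illustrative:

```python
# Hypothetical sketch of query-level filtering against a curated blocklist.
# Production search systems are proprietary and far more sophisticated;
# the placeholder terms and warning URL here are illustrative only.

BLOCKED_TERMS = {"example flagged term", "another flagged term"}

def handle_query(query: str) -> str:
    # Normalise whitespace and case before matching.
    normalised = " ".join(query.lower().split())
    if any(term in normalised for term in BLOCKED_TERMS):
        # Explicit queries are pointed at warnings and expert help
        # rather than returning any results.
        return "REDIRECT: /warning-and-support"
    return f"SEARCH: {normalised}"

if __name__ == "__main__":
    print(handle_query("harmless query"))        # SEARCH: harmless query
    print(handle_query("Example Flagged Term"))  # REDIRECT: /warning-and-support
```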

Google has developed a Video ID tool that uses digital fingerprinting to identify and block child abuse videos, even when they have been edited and repurposed. Microsoft says it is looking at implementing Video ID on its own video services, and already has a similar tool for photos called PhotoDNA.
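Neither company has published the details of these fingerprinting algorithms, but the underlying technique of perceptual hashing can be illustrated with a simple "average hash": a compact fingerprint that stays nearly identical when an image is resized, recompressed or lightly edited. This is a stand-in for the real, far more robust algorithms, not a description of PhotoDNA or Video ID:

```python
# A minimal illustration of perceptual fingerprinting, the general idea
# behind tools like PhotoDNA and Video ID. This uses a simple "average
# hash"; the production algorithms are proprietary and far more robust.
from PIL import Image  # pip install Pillow

def average_hash(path: str, size: int = 8) -> int:
    """Reduce an image to a 64-bit fingerprint that survives minor edits."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    # Each bit records whether a pixel is brighter than the image's mean.
    return sum(1 << i for i, p in enumerate(pixels) if p > mean)

def hamming_distance(a: int, b: int) -> int:
    """Count differing bits; a small distance means near-identical images."""
    return bin(a ^ b).count("1")
```

Matching works by comparing a candidate's fingerprint against a database of known fingerprints: a Hamming distance below a small threshold (a few bits) flags a likely match even after cropping, recompression or colour changes, which is why edited copies can still be caught.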
Crucially, Google and Microsoft are working with Ceop and the IWF to explore how to identify and stop distribution of material on P2P networks by removing direct links to content.

How does this affect parents?
The changes announced by Google and Microsoft today should make child abuse material harder to find and easier and quicker to remove, though very easy access to legal adult content online is a separate issue. Anyone taking out a new broadband contract by the end of 2013 will find age-appropriate safety filters switched on automatically, and concerned parents with existing broadband contracts can switch on their ISP's filters now. Guides to internet safety, such as the NSPCC's resources, explain the risks children face on social networks and mobile services and how to activate age-restricted safety features.

What happens next?
The next step is for Google and Microsoft, working with Ceop and the IWF, to investigate and develop a reporting process for links to content hosted on P2P networks. Currently, the IWF's remit covers only investigating and blacklisting URLs of content on the open web that is indexed by search engines, which accounts for only a small minority of child abuse material. With an expanded remit and extra staff, the IWF could then begin to unpick 'the dark web', where content is shared with impunity.
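In outline, ISP-side blocking against a blacklist like the IWF's amounts to checking each requested URL against a list of known-bad entries. The sketch below is a deliberately simplified, hypothetical illustration; real deployments such as BT's Cleanfeed use a two-stage design that routes only traffic to suspect IP addresses through a URL-level filter:

```python
# Hypothetical sketch of ISP-side URL filtering against a blacklist such
# as the IWF's. The blacklist entries here are placeholders; real systems
# distribute and match the list in far more careful, layered ways.
from urllib.parse import urlsplit

BLACKLIST = {("blocked.example.com", "/some/path")}  # placeholder entries

def is_blocked(url: str) -> bool:
    # Match on (hostname, path) so an entire domain isn't blocked
    # when only a single page is on the list.
    parts = urlsplit(url)
    return (parts.hostname, parts.path) in BLACKLIST

if __name__ == "__main__":
    print(is_blocked("http://blocked.example.com/some/path"))  # True
    print(is_blocked("http://example.org/"))                   # False
```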
Another major challenge is tackling grooming in public spaces such as social networks, where conversations may appear benign but abusers build relationships with children and eventually arrange to meet them offline.
