The Internet Watch Foundation (IWF) has claimed positive results from integrating a new feature into its Intelligrade system for recording online images of child sexual abuse.
The charity, which is supported by the global internet industry, has made it possible to record details of more than one child per image in the dataset used by tech companies and law enforcement bodies to protect children around the world.
It received financial support for the move from the Countering Online Harms fund of UK domain name registrar Nominet.
IWF said the improvements were made earlier this year; since then, more than 60,000 children who would previously have been ‘invisible’ have been incorporated into the dataset.
Grading images
Intelligrade enables the charity to accurately grade individual images of child sexual abuse while automatically generating unique hashes as digital fingerprints. Contextual metadata such as the age of the child and severity of the abuse is added to the hashes.
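The pairing of a fingerprint with contextual metadata can be illustrated with a minimal sketch. The field names and the use of SHA-256 here are assumptions for illustration only; systems like Intelligrade typically use perceptual hashes designed to survive resizing and re-encoding.

```python
import hashlib
from dataclasses import dataclass

@dataclass
class HashedRecord:
    # Hexadecimal digest acting as the image's digital fingerprint
    image_hash: str
    # Contextual metadata attached to the hash (field names are illustrative)
    age: int
    severity: str

def fingerprint(image_bytes: bytes, age: int, severity: str) -> HashedRecord:
    # SHA-256 stands in for the fingerprinting step; the real system's
    # hashing scheme is not described in the article.
    digest = hashlib.sha256(image_bytes).hexdigest()
    return HashedRecord(image_hash=digest, age=age, severity=severity)
```

The key property is determinism: the same image bytes always yield the same hash, so a fingerprint generated once can later be matched without re-examining the image itself.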
Previously, if an image featured more than one child, only information about the youngest one could be recorded. This approach was born out of the necessity to remove confirmed child sexual abuse material (CSAM) from the internet as swiftly as possible, and logging multiple ages could delay that process.
Following the integration of the multichild feature, analysts can now more easily track information about all the children seen in an image.
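The change from recording only the youngest child to recording every child is, in data-modelling terms, a move from a single field to a list of per-child entries. This sketch assumes hypothetical type and field names; only the recorded attributes (sex, age, skin tone) come from the article.

```python
from dataclasses import dataclass, field

@dataclass
class ChildDetail:
    # Per-child attributes named in the article: sex, age and skin tone
    sex: str
    age: int
    skin_tone: str

@dataclass
class ImageAssessment:
    image_hash: str
    # Previously only the youngest child could be logged; the multichild
    # feature allows one entry per child seen in the image.
    children: list[ChildDetail] = field(default_factory=list)

assessment = ImageAssessment(image_hash="d41d8c")  # illustrative digest
assessment.children.append(ChildDetail(sex="F", age=7, skin_tone="light"))
assessment.children.append(ChildDetail(sex="M", age=11, skin_tone="dark"))
youngest = min(c.age for c in assessment.children)
```

Under the old single-child approach, only the entry corresponding to `youngest` would have survived; the list keeps both children visible in the record.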
Hashed images are loaded on to the IWF Hash List, which is then provided to companies, law enforcement agencies and governments around the world that work cooperatively to block and remove the criminal content.
To date, there are more than two million hashes on the Hash List, helping to prevent the abuse images from being shared again and again.
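Matching an upload against the Hash List amounts to a set-membership test: hash the incoming image and check whether the digest appears among the known entries. This is a minimal sketch assuming SHA-256 as the fingerprint; the actual matching pipeline is not described in the article.

```python
import hashlib

# Illustrative hash list: in practice this would hold millions of digests.
known_hashes = {hashlib.sha256(b"known-abuse-image").hexdigest()}

def should_block(image_bytes: bytes) -> bool:
    # Hash the upload and test membership against the list.
    return hashlib.sha256(image_bytes).hexdigest() in known_hashes

print(should_block(b"known-abuse-image"))  # True
print(should_block(b"harmless-image"))     # False
```

Because lookup cost does not grow with list size when a set or similar index is used, the approach scales to the two million hashes the list now holds.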
IWF said the new capability provides vital intelligence to the tech industry, policy makers and police, and supports efforts to develop better solutions to tackle CSAM online.
Boosting capability
Its chief technology officer, Dan Sexton, said: “This tech advancement significantly boosts our ability to capture more robust information about all the children featured in the child sexual abuse images that we assess.
“To my knowledge we are now the only organisation in the world that can record information about all the victims seen in still images; images that can sometimes depict the most severe types of exploitation.
“Being able to add in details for each individual child, recording their sex, age and skin tone, will give the IWF a much richer, fuller picture about the impact of child sexual abuse online.
“This way, every child is counted, which I know gives assessors much greater satisfaction in their challenging roles as they can now feel that they are helping even more children when the images are hashed and then removed or blocked online.”
The IWF works closely with UK police’s Child Abuse Image Database (CAID) and plays a vital role in assessing CSAM taken from devices during investigations. These images are then uploaded back into CAID with a full assessment, along with the different legal categories of child sexual abuse material, which align with UK sentencing guidelines.