Apple’s move to proactively tackle child sexual abuse material an important step toward industry‑wide, victim‑focused action, says Canadian Centre for Child Protection

Proposed image scanning technology only part of the solution; many other problems remain unaddressed.

August 10, 2021
For Immediate Release

Winnipeg, Canada — The Canadian Centre for Child Protection (C3P) welcomes news of Apple’s plans to tackle the spread of child sexual abuse material (CSAM) online through the deployment of proactive image detection technology across most of its platforms in the U.S.

Last week, Apple announced plans to roll out anti‑CSAM features for its iMessage, iCloud, and Siri/Search applications. The move will see Apple’s system warn children and parents about sensitive content and scan on‑device images against data banks of known images of CSAM.

“Proactive detection of harmful and illegal images is something we expect all electronic service providers (ESPs) to be seriously engaged in. The plan proposed by Apple for its users is a win for survivors who have had their CSAM repeatedly shared for years and I certainly hope other companies operating in the digital space will also step up and do their part,” says Lianna McDonald, Executive Director for C3P.

The adoption of robust proactive detection technology, one of the key recommendations in C3P’s recent report, has limitations, however: it cannot prevent or block the sharing of new, previously unknown material, which has no match in the data banks of existing CSAM.

“It’s tempting to believe that algorithms and technology alone can prevent online harm to children, but as we’ve seen over the years, these tools have limitations and represent only part of the solution. Human moderation, age verification, swift complaint response and content removal processes — these are all key features that help keep children safe online,” says McDonald.

Apple has not yet announced plans to expand this program to Canada. Should this occur, C3P would welcome the opportunity to assist Apple in understanding the Canadian context surrounding CSAM and the distribution of non‑consensual intimate images.

Media relations contact:
(204) 560-0723
communications@protectchildren.ca

-30-

About the Canadian Centre for Child Protection: The Canadian Centre for Child Protection (C3P) is a national charity dedicated to the personal safety of all children. The organization’s goal is to reduce the sexual abuse and exploitation of children through programs, services, and resources for Canadian families, educators, child‑serving organizations, law enforcement, and other parties. C3P also operates Cybertip.ca, Canada’s national tipline to report child sexual abuse and exploitation on the internet, and Project Arachnid, a web platform designed to detect known images of CSAM on the clear and dark web and issue removal notices to industry.