WINNIPEG -- The Canadian Centre for Child Protection is calling for regulatory changes that would better monitor the posting and sharing of child sexual abuse materials in the digital sphere.
"We have an epidemic online and we have an epidemic because we have failed to put any guardrails in and around what is happening in the online world," said Signy Arnason, associate executive director for the Manitoba-based Canadian Centre for Child Protection.
"We regulate all other spaces, everything in the offline world to protect children and somehow we've abandoned kids online," said Arnason.
The renewed call to action comes as the Canadian Centre for Child Protection presented federal lawmakers with findings from a probe into the amount of child sexual abuse material (CSAM) found on a popular pornographic website.
Using its CSAM-detecting web platform, Project Arachnid, the Centre examined how many instances of such content appeared over a three-year period on PornHub, a pornographic website owned by the Canadian company MindGeek.
The Centre found 193 instances of what it believes to be CSAM, some involving images of children who may have been as young as six to eight years old.
"We do not believe the above numbers are representative of the scope and scale of this problem," said Lianna McDonald, Executive Director of Canadian Centre for Child Protection during a parliamentary committee held in Ottawa Monday.
Daniel Bernhard, the executive director of the advocacy group 'Friends of Canadian Broadcasting,' also spoke at the committee and called for increased oversight of tech companies and platforms.
"If a relatively small not-for-profit organization in Manitoba is able to deploy technology that can find this material, surely a company the size of MindGeek can do the same," said Bernhard.
"There is a difference between hosting material and actively recommending it to people and, in that sense, the platforms are arguably more responsible than the users themselves," he said.
MindGeek, in a statement provided to CTV News, says it "has zero tolerance for non-consensual content, child sexual abuse material (CSAM), and any other content that lacks the consent of all parties depicted."
The statement adds the company is "continually improving [its] processes" and "will continue to work with law enforcement globally to stamp out CSAM and non-consensual material on our platforms and on the internet as a whole."
But the posting and sharing of CSAM online is a problem that goes beyond one website or company.
A 2020 report by the Canadian Centre for Child Protection lists social media sites, like Facebook and Twitter, and messaging platforms, such as WhatsApp, alongside pornographic websites as digital spaces where images and videos of children being sexually abused can be shared.
Manitoba's acting advocate for children and youth identifies the sharing of CSAM in digital spaces as a serious concern in the province.
"The distribution of child sexual abuse materials online is an ongoing concern in Manitoba, which our office receives calls about much too often," reads a statement provided to CTV News by the office of Acting Manitoba Advocate for Children and Youth Ainsley Krone.
"Young people in our province are being exploited by adults every day and because sexual exploitation is often hidden from view, we only know about a small portion of this type of child abuse."
Families Minister Rochelle Squires said in a statement that "this is a matter of federal jurisdiction, though I certainly support measures that strengthen the protection of children."
"I am eager to see a strong response from the federal government to ensure that all children in Manitoba, and across Canada, are safe from this kind of exploitation," the statement reads.
As for immediate changes that could help address the issue, the Canadian Centre for Child Protection would like to see tech companies and platforms make a concerted effort to verify the ages of the individuals who appear in an image or video.
Right now, Arnason says the onus is largely on sexual abuse survivors to notify a website of an instance of CSAM. Often those requests are ignored, she says, until a group like the Canadian Centre for Child Protection submits a complaint on their behalf.
Instead, Arnason would like to see a "reverse onus" put in place.
"At the end of the day we've got to start putting the responsibility on the other side," said Arnason. "Companies need to be sure that what in fact is being put on their platforms is of someone of age that's consented to be posted in that particular image or video."