Statement on behalf of the Phoenix 11: Survivors of child sexual abuse material demand action from tech one year after key players pledged support for online anti-exploitation principles
For Immediate Release
Winnipeg, Canada — Today the Phoenix 11, a collective of survivors whose child sexual abuse was recorded and, in the majority of cases, distributed online, issued the following statement on the one-year anniversary of the establishment of the Five Country Ministerial group’s (5 Eyes) Voluntary Principles to Counter Online Child Sexual Exploitation and Abuse (Voluntary Principles):
“One year ago we, the Phoenix 11, were invited to participate in a roundtable meeting in Washington, D.C., with White House representatives, ministers from the 5 Eyes, the National Center for Missing & Exploited Children (NCMEC), and prominent members of the tech industry.
We spoke vulnerably in front of executives from some of the most widely used online platforms. We shared openly about being sexually abused as children, having that abuse recorded, and living with the constant fear that the most horrific moments of our lives were being widely and continuously distributed. We wanted to put a face to the statistics of child sexual abuse material (CSAM) and make a difference in the way tech companies work to protect children and rid their platforms of this abusive content.
Following this event, on March 5, 2020, the 5 Eyes released a set of Voluntary Principles with the goal of ensuring that the tech industry prioritizes child safety across its platforms and protects survivors. Six companies publicly pledged their support for the Voluntary Principles and committed to do better for children: Facebook, Google, Microsoft, Twitter, Snap, and Roblox.
On the anniversary of this declaration, we are disappointed by what we see.
It seems this has not been taken seriously, especially at a time when reports of online child exploitation are at an all-time high, driven by pandemic restrictions worldwide. With more children at home and online, they are at greater risk than ever. We know this better than anyone, and it worries us greatly. It shows us that industry does not view this as a priority, does not understand the urgency, or both.
We want to know from the companies that have supported the Voluntary Principles: what changes have you put in place to address these principles? Over the past year, significant resources have been committed to combating the spread of misinformation on your platforms. Have the same resources been put towards implementing the Voluntary Principles? We require more transparency about your progress and effectiveness.
In June 2020, the Tech Coalition announced Project Protect, which included supporting the Voluntary Principles as part of its efforts to track progress in reducing the amount of CSAM. We ask: how many members of the Tech Coalition have adopted the Voluntary Principles to date?
We are also asking the 5 Eyes: what other tech companies have supported the Voluntary Principles? It appears that only a handful have adopted them in the 12 months since the announcement. This is unacceptable. What pressure is being placed on the hundreds of tech companies that have not supported the Voluntary Principles? What excuse could there be for not supporting these principles?
One year ago, industry pledged to do better, and we are still waiting.”