Dec 21, 2020
Facebook child abuse detection hit by new EU rules
Facebook has switched off some of its child abuse detection tools in Europe in response to new rules from the EU. The company said it had no choice but to do so, since the new privacy directive bans automatic scanning of private messages.

The change applies only to messaging services, not to all content uploaded to Facebook.

The problem has emerged despite warnings from child protection advocates that the new privacy rules would effectively ban automated systems that scan for child sexual abuse images and other illegal content.

Few companies handle the same sheer volume of private messages as Facebook, which runs its own Messenger service and owns Instagram.

"The European Commission and child safety experts have said that the directive does not provide a legal basis for these tools," Facebook said in a blog post explaining the issue.