EU Plan to Scan Private Messages for Child Abuse Images Puts Encryption at Risk


“As such, there is only one logical solution: client-side scanning, where the content is examined when it is decrypted on the user’s device for them to view or read,” says Woodward. Last year, Apple announced it would introduce client-side scanning technology that would scan people’s iPhones, rather than Apple’s servers, to check photos being uploaded to iCloud against known CSAM. The move sparked outrage from civil rights groups and figures such as Edward Snowden over the potential for surveillance, and Apple paused its plans a month after first announcing them. (Apple declined to comment for this story.)
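To make that ordering concrete, here is a minimal sketch of where client-side scanning would sit in a messaging pipeline. Every name in it is hypothetical; there is no standard client-side-scanning API, and real messengers use proper ciphers rather than the toy one below:

def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    # Stand-in for the messenger's end-to-end decryption (toy XOR cipher).
    return bytes(c ^ k for c, k in zip(ciphertext, key))

def matches_known_fingerprints(plaintext: bytes) -> bool:
    # Stand-in for a lookup against a database of known CSAM
    # fingerprints (see the hashing sketch later in this article).
    return False

def receive_and_display(ciphertext: bytes, key: bytes) -> bytes:
    # The server only ever sees ciphertext, so under end-to-end
    # encryption the check can only happen here, on the user's device,
    # after decryption and before the content is shown to the user.
    plaintext = decrypt(ciphertext, key)
    if matches_known_fingerprints(plaintext):
        pass  # proposals would require flagging or reporting at this point
    return plaintext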

For technology companies, detecting CSAM on their platforms and scanning some communications is nothing new. Companies operating in the United States must report any CSAM they find, or that users report to them, to the National Center for Missing & Exploited Children (NCMEC), a US-based nonprofit. Last year alone, more than 29 million reports were made to NCMEC, containing 39 million images and 44 million videos. Under the new EU rules, the EU Center will receive CSAM reports from technology companies.

“Many companies are not doing the screening today,” Johansson told a news conference introducing the legislation. “This is not a proposal on encryption, it is a proposal on child sexual abuse material,” she said, adding that the law is “not about reading the communication” but about detecting illegal abuse content.

Right now, tech companies are finding CSAM online in different ways, and the amount of CSAM being found is increasing as companies get better at detecting and reporting abuse, though some are much better than others. In some cases, AI is being used to hunt for previously unseen CSAM. Duplicates of existing abuse photos and videos can be detected using “hashing systems,” where abusive content is assigned a fingerprint that can be spotted when it is uploaded to the web again. More than 200 companies, from Google to Apple, use Microsoft’s PhotoDNA hashing system to scan millions of files shared online. However, to do this, systems need access to the messages and files people are sending, which isn’t possible when end-to-end encryption is in place.
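As a rough sketch of the fingerprinting idea (simplified for illustration: the exact cryptographic hash used here only matches identical files, whereas perceptual systems like PhotoDNA are designed to match an image even after resizing or re-encoding, and the database entry is a made-up placeholder):

import hashlib

# Hypothetical fingerprint database of known abuse imagery. Real
# databases hold millions of entries maintained by bodies like NCMEC.
KNOWN_FINGERPRINTS = {
    "0" * 64,  # placeholder entry, not a real fingerprint
}

def fingerprint(file_bytes: bytes) -> str:
    # SHA-256 stands in for a perceptual hash here. It only catches
    # exact duplicates: change one pixel and the hash changes entirely,
    # which is exactly the weakness perceptual hashing avoids.
    return hashlib.sha256(file_bytes).hexdigest()

def is_known_match(file_bytes: bytes) -> bool:
    # The check needs the raw file bytes, which is why it cannot run on
    # a server that only ever sees end-to-end-encrypted ciphertext.
    return fingerprint(file_bytes) in KNOWN_FINGERPRINTS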

“In addition to detecting CSAM, obligations to detect the solicitation of children (‘grooming’) can only mean that conversations will need to be read 24/7,” says Diego Naranjo, head of policy at the civil liberties group European Digital Rights. “This is a disaster for the confidentiality of communications. Companies will be asked (through detection orders) or encouraged (through risk mitigation measures) to offer less secure services for everyone if they want to comply with these obligations.”

Debates about protecting children online, and how that can be done alongside end-to-end encryption, are hugely complex, technical, and entangled with the horrors of crimes against vulnerable young people. A study by Unicef, the UN’s children’s fund, published in 2020, says encryption is needed to protect people’s privacy, including children’s, but adds that it “prevents” efforts to remove content and identify the people sharing it. For years, law enforcement agencies around the world have pushed for ways to circumvent or weaken encryption. “I’m not saying privacy at all costs, and I think we can all agree that child abuse is disgusting,” says Woodward, “but there needs to be a proper, public, dispassionate debate about whether the risks of what could emerge are worth the real effectiveness in the fight against child abuse.”

Increasingly, researchers and technology companies have been focusing on safety tools that can exist alongside end-to-end encryption. Proposals include using the metadata of encrypted messages (the who, how, what, and why of messages, not their content) to analyze people’s behavior and potentially spot criminality. A recent report by the nonprofit Business for Social Responsibility (BSR), commissioned by Meta, found that end-to-end encryption is an overwhelmingly positive force for upholding people’s human rights. It offered 45 recommendations for how encryption and safety can go together without requiring access to people’s communications. When the report was published in April, Lindsey Andersen, BSR’s associate director for human rights, told WIRED: “Contrary to popular belief, there really is a lot that can be done even without access to messages.”
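One hypothetical sketch of what metadata-only analysis could look like (the signal, field names, and threshold below are invented for illustration; BSR’s report does not prescribe any particular algorithm):

from collections import defaultdict
from dataclasses import dataclass

@dataclass
class MessageMetadata:
    # The "who" and "when" of a message; no content is included.
    sender: str
    recipient: str
    timestamp: float
    recipient_is_new_contact: bool  # hypothetical platform-side signal

def flag_unusual_senders(events, threshold=50):
    # Flag accounts that message an unusually large number of brand-new
    # contacts, a pattern a platform can observe without reading any
    # message bodies. The threshold is a placeholder; a real system
    # would be tuned carefully to limit false positives.
    new_contacts = defaultdict(set)
    for e in events:
        if e.recipient_is_new_contact:
            new_contacts[e.sender].add(e.recipient)
    return {s for s, rs in new_contacts.items() if len(rs) >= threshold}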


