Europe puts Apple’s CSAM plans back in the spotlight

Apple may have put some of its plans to scan devices for child sexual abuse material (CSAM) on hold, but the European Commission has put them right back in the spotlight with a move to force messaging services to begin monitoring for such material.

CSAM is emerging as a privacy test

In terms of child protection, the move is a good thing. CSAM is a far bigger problem than many people realize; victims of this appalling trade end up with shattered lives.

What’s happening, according to Euractiv, is that the European Commission is planning to introduce measures requiring messaging services to scan for CSAM. However, the Commission does seem to understand some of the arguments privacy advocates raised against Apple’s original proposals, and is insisting on some restrictions, specifically:

  • The scanning technology must be ‘effective.’
  • It must be ‘suitably reliable.’
  • And it must avoid collection of “any other information from the relevant communications than the information strictly necessary to detect.”

Of course, ensuring the system is “reliable” is a challenge.

Just what is reliable?

When Apple announced its own take on CSAM scanning on its platforms, Imperial College London researchers soon warned the technology behind the system was easy to fool, calling it “not ready for deployment.”
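For readers wondering why such systems can be “easy to fool”: perceptual-hash matching of the kind Apple described compares compact image fingerprints by similarity rather than exact file contents. The toy Python sketch below uses a simple average hash — not Apple’s actual NeuralHash — with made-up images and a made-up threshold, purely to illustrate both the appeal and the fragility of the approach: a mild brightness change still matches, while a small targeted edit near the hash’s decision boundary breaks the match entirely.

```python
# Toy sketch of perceptual-hash matching -- a simple "average hash",
# NOT Apple's NeuralHash. Images are 8x8 grayscale grids invented for
# illustration only.

def average_hash(pixels):
    """Reduce an 8x8 grid of grayscale values (0-255) to a 64-bit fingerprint:
    each bit records whether that pixel is at least as bright as the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming(a, b):
    """Count differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

# Hypothetical database holding the fingerprint of one "known" image.
known_image = [[10 * (r + c) for c in range(8)] for r in range(8)]
known_hashes = {average_hash(known_image)}

def matches(pixels, threshold=5):
    """Flag an image whose fingerprint is within `threshold` bits of a known one."""
    return any(hamming(average_hash(pixels), k) <= threshold for k in known_hashes)

# A uniformly brightened copy still matches -- the hash tolerates small global changes.
near_copy = [[p + 3 for p in row] for row in known_image]
print(matches(near_copy))   # True

# But darkening only the pixels sitting right at the brightness threshold flips
# 8 of 64 bits, and the visually very similar image no longer matches.
evasion = [[p - 15 if p == 70 else p for p in row] for row in known_image]
print(matches(evasion))     # False
```

Apple’s real system is far more sophisticated than this sketch, but researchers demonstrated analogous weaknesses against it — including unrelated images engineered to share a fingerprint — which is what prompted the “not ready for deployment” warning.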

Apple subsequently stepped back from those plans and later introduced a system to monitor for such content in its Messages app. It has not yet extended this to on-device analysis of people’s Photos libraries, as it had originally intended, though it remains quite possible the company scans photos stored in iCloud, as other image-hosting firms do.

When it comes to Europe’s proposals, there’s hope that the requirement to create “suitably reliable” systems will eventually carry some burden of proof. These restrictions won’t completely set people’s minds at rest, since the threat of such technologies being abused by repressive or authoritarian governments remains, but they do at least set in motion steps that could coalesce into an understanding of what people’s online privacy rights should be.

At the same time, the EC’s proposals seem to threaten the use of end-to-end encryption, which Apple continues to argue must be protected.

Toward a digital bill of privacy rights

The lack of a clear and agreed-upon set of rights to protect online privacy is becoming increasingly critical as the world becomes more connected. At the same time, Europe is also insisting on regulations — such as mandatory sideloading — that may erode privacy and security on devices. These two strands seem philosophically opposed, but it is possible that as regulators and lawmakers consider the complexity of these issues, they will begin to see some glimmer of light.

I think this is what Apple is working to encourage, as it seems increasingly vital (even the World Economic Forum agrees) that an international standard defining digital rights be developed.

Europe understands this; it put forward a declaration on digital rights and principles for EU residents in early 2022.

When it did, Executive Vice-President for a Europe Fit for the Digital Age Margrethe Vestager said in a statement: “We want safe technologies that work for people, and that respect our rights and values. Also when we are online. And we want everyone to be empowered to take an active part in our increasingly digitized societies. This declaration gives us a clear reference point to the rights and principles for the online world.”

What should those rights be?

Apple executives have been actively lobbying for frameworks around such rights for some time. Ever since Apple CEO Tim Cook’s powerful speech on digital surveillance in 2018, the company has constantly and (mostly) consistently lobbied for agreement around personal privacy. Cook’s company continues to work toward providing such rights unilaterally on its own platforms, while also calling for that protection to be universal. Apple has argued for the following four pillars:

  • Users should have the right to have personal data minimized.
  • Users should have the right to know what data is collected on them.
  • Users should have the right to access that data.
  • Users should have the right to have that data kept secure.

While we are all aware that some business models will be forced to change as a result of any such set of rights, the introduction of some digital certainty would, at least, help promote a level playing field in tech.

And the need to strike a well-thought-out balance between individual rights and collective responsibility seems stronger today than ever before.

Please follow me on Twitter, or join me in the AppleHolic’s bar & grill and Apple Discussions groups on MeWe.

Copyright © 2022 IDG Communications, Inc.
