Later this month, Apple will roll out an upgraded version of its software with the ability to scan for child abuse images on iPhones. For now, there is a separation between Apple's servers and your iPhone, which means Apple cannot scan anything stored on the device. The new software, according to privacy experts, will change all that and open Pandora's box. One leading voice on this subject is Edward Snowden, the man who exposed the surveillance machinery in the US.
Here is his view on the matter, selected and edited for ease of reading:
“Apple’s new system, regardless of how anyone tries to justify it, will permanently redefine what belongs to you, and what belongs to them.
“Under the new design, your phone will now perform these searches on Apple’s behalf before your photos have even reached their iCloud servers, and if enough ‘forbidden content’ is discovered, law enforcement will be notified (in the US for now). Apple plans to erase the boundary dividing which devices work for you, and which devices work for them.
“Apple says their system, in its initial, v1.0 design, has a narrow focus: it only scrutinizes photos intended to be uploaded to iCloud (although for 85% of its customers, that means EVERY photo), and it does not scrutinize them beyond a simple comparison against a database of specific examples of previously-identified child sexual abuse material (CSAM).
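To make the mechanism Snowden is describing concrete, here is a deliberately simplified sketch of on-device hash matching against a database of known images. Apple's actual system uses a perceptual hash (“NeuralHash”) with cryptographic blinding and threshold secret sharing, none of which is reproduced here; the plain SHA-256 exact match, the sample database, and the threshold value below are all illustrative assumptions, not Apple's implementation.

```python
import hashlib

# Hypothetical database of hashes of known prohibited images.
# The real system uses a perceptual hash and a blinded database;
# exact SHA-256 matching here is only an illustrative stand-in.
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

# Apple's design also reports only after a threshold of matches;
# the value 3 is an arbitrary placeholder.
MATCH_THRESHOLD = 3

def scan_photos(photos: list[bytes]) -> bool:
    """Return True if enough photos match the known-hash database."""
    matches = sum(
        1 for photo in photos
        if hashlib.sha256(photo).hexdigest() in KNOWN_HASHES
    )
    return matches >= MATCH_THRESHOLD
```

The key point of contention is where this function runs: executed on the device before upload, it shifts the boundary between what the phone does for its owner and what it does for Apple.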
“What happens when a party in India demands they start scanning for memes associated with a separatist movement? What happens when the UK demands they scan for a library of terrorist imagery? How long do we have left before the iPhone in your pocket begins quietly filing reports about encountering ‘extremist’ political material, or about your presence at a ‘civil disturbance’?
“Or simply about your iPhone’s possession of a video clip that contains, or maybe-or-maybe-not contains, a blurry image of a passer-by who resembles, according to an algorithm, ‘a person of interest’?
“If Apple demonstrates the capability and willingness to continuously, remotely search every phone for evidence of one particular type of crime, these are questions for which they will have no answer. And yet an answer will come – and it will come from the worst lawmakers of the worst governments. This is not a slippery slope. It’s a cliff.
“Apple’s proposal to make their phones inform on and betray their owners marks the dawn of a dark future, one to be written in the blood of the political opposition of a hundred countries that will exploit this system to the hilt. See, the day after this system goes live, it will no longer matter whether or not Apple ever enables end-to-end encryption, because our iPhones will be reporting their contents before our keys are even used.

“They are inventing a world in which every product you purchase owes its highest loyalty to someone other than its owner.”
Commentary: What Snowden has raised is something that can no longer be ignored, whether or not Apple backtracks. What we now know is that software exists to scan mobile devices for their contents. We also know this is not new: Google reportedly already scans content on Android devices in some form.
What raises eyebrows this time is that a tech company that took a stand for privacy is now acting against its core value. It is now clear that little will keep technology users' privacy sacred.
The only option for users of these tools is to re-examine their relationship with technology, perhaps to the point of going back to devices considered dumb, like feature phones.
As a society, we are getting closer to a situation that may require a digital diet. We may need to forego the advantages that come with connectedness to safeguard our privacy.
As long as tech companies remain more powerful than governments, personal protections against technology may be the only protection left for mankind. After all, we don’t need to capture every special moment with a smartphone – there’s a dedicated camera for that; we don’t have to make calls via an app, as the telephone still exists for phone calls.
A less digital approach to life may slow the pace of our civilization; however, it may be a necessary price to pay while we establish rules of engagement about how we want technology to work for us. The problem with tech is not that we cannot regulate it; it is that it moves faster than society can come to terms with its changes. The other alternative is that leading tech companies simply open their ears and do what is good for society.
IOL TECH