Sebastian Schaub, CEO, Hide.me
They say a picture is worth a thousand words. Back in January 2019, ahead of CES, Apple unveiled a huge billboard with the caption, “What happens on your iPhone, stays on your iPhone”. This was clearly a reference to Apple’s efforts at protecting user privacy, with the billboard also including a link to Apple’s privacy website. At the time, Apple was keen to highlight how its suite of products was designed to protect users’ privacy. So there is certainly surprise (perhaps even shock) at Apple’s recent plans regarding the hash verification of images stored on users’ iPhones. Why should any iPhone owner (or Apple customer) be concerned?
Only last week, Apple unveiled its proposals to deal with the problem of child abuse material on its platforms (within the United States, initially), via a series of updates to its various operating systems. The element drawing the most fire following this announcement is its child sexual abuse material (CSAM) detection system. Essentially, this system would see Apple devices comparing the images on the device against a list of known CSAM image hashes provided by the US National Center for Missing and Exploited Children (NCMEC), with the check performed before an image is uploaded to iCloud Photos.
Apple has said that, underpinning the detection system, the device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. The safety voucher is then uploaded to iCloud Photos along with the image, and once an account passes an unspecified threshold of matches, Apple will step in and manually review the vouchers. At that point, if Apple decides that the images are CSAM, it will disable the user’s account and report the findings to NCMEC. Apple has tried to allay general fears by pointing out that users would still be able to appeal to have an account reinstated, and that its threshold parameters make it highly unlikely that any user would be mistakenly flagged – a one-in-a-trillion chance per year, according to Apple.
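The mechanics described above can be sketched in miniature. The snippet below is an illustrative simplification only: it stands in SHA-256 for Apple’s perceptual “NeuralHash”, plain dictionaries for the encrypted safety vouchers, and an invented threshold value (Apple has not published its real one, and the real system uses cryptographic techniques such as private set intersection and threshold secret sharing so that neither side learns match results early).

```python
import hashlib

# Hypothetical stand-in for the NCMEC known-image hash list.
# (Real system: perceptual NeuralHash values, not SHA-256 of bytes.)
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-1").hexdigest(),
    hashlib.sha256(b"known-image-2").hexdigest(),
}

REVIEW_THRESHOLD = 2  # Invented value; Apple has not disclosed its threshold.

def make_voucher(image_bytes: bytes) -> dict:
    """Simplified 'safety voucher': the image hash plus a match flag.
    (In the real design the match result is encrypted, not readable.)"""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return {"hash": digest, "match": digest in KNOWN_HASHES}

def should_review(vouchers: list) -> bool:
    """Manual review only triggers once matches reach the threshold."""
    return sum(v["match"] for v in vouchers) >= REVIEW_THRESHOLD

vouchers = [make_voucher(b) for b in
            (b"holiday-photo", b"known-image-1", b"known-image-2")]
print(should_review(vouchers))  # Two matches meet the threshold of two
```

The key design point the sketch captures is that matching happens per image on the device, but enforcement (manual review, account action) only happens server-side once the match count crosses the threshold.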
So there you have it. In a nutshell, Apple announced to the world a seemingly great idea to help combat the tide of CSAM that sadly exists. And whilst nobody would argue against the core sentiment of the exercise, has Apple not gone against the promise (even if the implementation is framed as non-intrusive) that it posted on a giant billboard for all the world to see? Perhaps it should now read, “What happens on your iPhone, stays on your iPhone (unless we change our minds, which we have).” Isn’t that an abuse of trust?
Essentially, Apple’s solution is disingenuous when it comes to what happens after suspicious content is reportedly found. Apple currently says that ‘only’ when the threshold is exceeded will it manually intervene to investigate the safety vouchers and the flagged images. However, in order to manually review a match, Apple must necessarily have access to the content; in other words, the content must be transferred to Apple. Furthermore, as a user, you get no direct feedback from the system – you wouldn’t even know if any of your photos had matched against the database. Lots of smoke and mirrors, you might say.
Who is to say that Apple might not decide to extend the capabilities of such a system to interrogate other files in the future? The very existence of technology like this will cause huge concern amongst citizens who live under authoritarian or totalitarian regimes, where such technology enables those in power to effectively spy on their own people. If governments wanted to, they could readily adapt a system like this to scan private content for just about anything they wished – political speech included. Don’t forget, Apple devices are readily available all over the world, and those myriad countries will have myriad definitions of what is ‘right’.
Ultimately, we feel that Apple’s decision amounts to a privacy nightmare, and a very bad day for civil liberties at that. It is very clear, irrespective of the rationale that led to this course of action, that Apple has reneged on its long-standing promise of privacy for all of its users.
About the Author
Sebastian is the founder of hide.me VPN and has been working in the internet security industry for over a decade. He started hide.me VPN nine years ago to make internet security and privacy accessible to everybody.