Apple takes a step towards opening the back door

7 August 2021 Technology & Digitalization

Child sexual abuse is abhorrent. For many smartphone owners, though, the idea that government or police could read content on a device that is almost an extension of themselves is repugnant too. The imperatives of tackling crime and protecting privacy collide in Apple’s decision to scan US iPhones for child abuse imagery. Campaigners for better child protection will celebrate. But the move sets a weighty precedent.

Apple has long rejected pressure to insert a “back door” in code that would allow law enforcement, in certain circumstances, to access its devices. It has twice resisted FBI demands to help it unlock phones, after shootings in San Bernardino, California, in 2015, and Florida in 2019 — though Apple said it had provided data including iCloud backups. As encryption has become key to many products and services, Facebook and other tech groups have also opposed moves to allow “exceptional access”.

Encrypted devices and messaging are a boon to organised crime, terrorists and child abusers. But Big Tech and privacy advocates have argued, with strong justification, that creating any kind of back door opens the way for hackers, cyber criminals or unscrupulous governments to abuse it.

Apple’s “neuralMatch” is not — quite — a back door, in the sense of providing direct access to content via the operating system. Apple, moreover, already decrypts photos on its iCloud servers if required by law enforcement. The precedent is that its technology will now proactively screen images on iPhones — breaking down the ringfence that had surrounded its devices — looking for matches with those on a US database of known child abuse images. Matches are flagged when photos are uploaded to iCloud, studied by human reviewers, and sent to law enforcement if verified.
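
For readers who want a concrete picture of the screening step, the sketch below shows the general shape of hash-based matching. It is an illustration only, not Apple's implementation: the company's system is reported to use a perceptual "NeuralHash" that tolerates resizing and re-encoding, plus cryptographic protections so that matches are revealed only after a threshold number accumulates, whereas this toy version uses an exact SHA-256 fingerprint and a hypothetical in-memory database.

import hashlib

# Hypothetical stand-in for the database of known child-abuse image
# fingerprints (in the US, maintained by NCMEC). The entry below is
# simply the SHA-256 of b"test" so the demo at the bottom produces a match.
KNOWN_FINGERPRINTS = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(image_bytes: bytes) -> str:
    # Simplification: an exact cryptographic hash. Apple's system is
    # reported to use a perceptual hash instead, so that resized or
    # re-encoded copies of the same image still match.
    return hashlib.sha256(image_bytes).hexdigest()

def screen_before_upload(image_bytes: bytes) -> bool:
    # Runs on the device when a photo is queued for iCloud upload.
    # A True result means "flag for human review", not an accusation.
    return fingerprint(image_bytes) in KNOWN_FINGERPRINTS

if __name__ == "__main__":
    print(screen_before_upload(b"test"))         # True: matches the database
    print(screen_before_upload(b"holiday.jpg"))  # False: no match

The choice of a perceptual rather than cryptographic hash matters: an exact hash breaks if a single pixel changes, while a perceptual one is designed to survive the routine edits images undergo as they circulate.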

Privacy campaigners warn that by allowing such pattern-matching of photos on iPhones, Apple is opening itself and others to pressure from governments to do the same for other types of content, such as imagery of opposition protests. Companies could refuse, but might face legislative moves to compel them. Disclosures about the thousands of people apparently targeted by Pegasus spyware from Israel’s NSO have shown that plenty of governments are happy to use backdoor mechanisms.

Apple may hope that by co-operating with US authorities in countering one of the most morally vile activities that exploits digital encryption it can fend off legislation forcing it to go further. The US, UK, Australia, New Zealand and Canada have called on tech companies to include mechanisms that would enable governments — with appropriate legal authority — to gain access to data. The danger is that Apple will simply whet appetites. Some rivals are privately furious, feeling the Cupertino-based company has broken ranks and conceded an important principle.

Some in the security community speculate, though, that Apple may be preparing to introduce encryption protections for data on iCloud that do not currently exist. Offering help in finding child abuse material might then be a trade-off for reducing the access that law enforcement currently has to iCloud, by encrypting other data stored on it. This could provide welcome extra protection to, say, dissidents in Hong Kong. Apple and other foreign groups have in recent years been compelled to store Chinese users’ data in a data centre inside the country.

Cooperation between Big Tech and law enforcement is essential in legitimate efforts to fight crime and safeguard security, but “back doors” are fraught with hazard. Not just its users but billions of phone users the world over will hope Apple’s move does not prove to be the thin end of a much larger wedge.