In a statement sent to the committee behind the UK's proposed investigatory powers bill, Apple expressed its concerns over the scope of the bill and how it may affect the security of millions of Apple's customers through the creation of so-called back doors. As reported by The Guardian, Apple's main concern lies in the potential for the bill to weaken encryption for law-abiding users:
In its submission, Apple said: "The creation of backdoors and intercept capabilities would weaken the protections built into Apple products and endanger all our customers. A key left under the doormat would not just be there for the good guys. The bad guys would find it too."
Apple goes on to comment on the language of the bill as it applies to companies no matter where they are based, meaning domestic UK legislation could have a direct effect on users of Apple products around the world:
"For the consumer in, say, Germany, this might represent hacking of their data by an Irish business on behalf of the UK state under a bulk warrant – activity which the provider is not even allowed to confirm or deny. Maintaining trust in such circumstances will be extremely difficult."
Apple is calling for changes in the language of the bill to address these concerns. The bill itself was introduced in November, and is still in the committee stage.
Apple's concerns very much echo what CEO Tim Cook has espoused in recent months as governments have increasingly ramped up their calls for cooperation from tech companies in an effort to gain access to encrypted communications. Just this week, Cook repeated his concerns in an interview with CBS' 60 Minutes, receiving yet more backlash from U.S. Senator Tom Cotton, who sits on the Senate Select Committee on Intelligence. Says Cotton:
"Apple is a distinctive company that has improved the lives of millions of Americans. But Tim Cook omitted critical facts about data encryption on 60 Minutes last night. He claimed that Apple does not comply with lawful subpoenas because it cannot. While it may be true that Apple doesn't have access to encrypted data, that's only because it designed its messaging service that way. As a society, we don't allow phone companies to design their systems to avoid lawful, court-ordered searches. If we apply a different legal standard to companies like Apple, Google, and Facebook, we can expect them to become the preferred messaging services of child pornographers, drug traffickers, and terrorists alike – which neither these companies nor law enforcement want. Our society needs to address this urgent challenge now before more lives are lost or shattered."
Despite Cotton's argument, one has to ask whether a wholesale dragnet of user information is worth the cost: potentially exposing the sensitive data and communications of an entire user base in order to access the nefarious communiqués of the unsavory characters using such services. Companies like Apple and Google, as well as independent security researchers, have been nearly unanimous in stating that creating the sort of back door government officials are calling for would, by its very nature, compromise the safety and privacy of all communications and data that pass through these services.