A number of privacy experts are expressing concern over how Apple will be able to regulate the use of face data of iPhone X users by app developers.
Apple may struggle to control how the face data of iPhone X users is used once developers transfer it to their remote servers.
Apple is allowing app developers to access face data stored on iPhone X handsets, provided individual users grant permission, so that developers can build apps that make use of facial features. Likely scenarios include creating virtual masks for users’ faces or mapping a user’s face onto popular game characters.
Such access, if granted, may open a Pandora’s box in the near future. A number of privacy groups, including the American Civil Liberties Union and the Center for Democracy and Technology, are worried that Apple may not be able to control how developers use this sensitive biometric data once they transfer it to their own servers.
Apple, for its part, has said that the data shared with developers will not be detailed enough to allow malicious actors to mimic a user’s face and unlock an iPhone. The new Face ID feature in the iPhone X, which may be used in future iPhone models as well, relies on the TrueDepth camera system: an infrared camera, a flood illuminator, proximity and ambient light sensors, and specialised hardware that projects more than 30,000 invisible dots to map out a user’s unique facial features.
By the same token, the data available to developers will not be detailed enough for them to create full visual maps of users’ faces. Even so, mere facial contours can be used to create graphic representations for other purposes, such as producing images for fake passports or other forged identities.
There are also concerns about developers sharing such data with third parties, who may in turn use it for advertising purposes. Apple insists that it has a clear policy of removing developers who violate its privacy policies, but back in 2011 the company admitted to the U.S. Congress that it had never removed an app for violating those policies.
According to Apple’s terms, developers must obtain clear consent from users before using their face data for legitimate features in their apps. They are also prohibited from sharing such data with third parties or selling it to advertisers.
While such policies may help keep users’ face data secure, there is no clear answer as to what will happen if hackers breach the remote servers managed by app developers. Back in September, app developers feared that Apple’s own developer site had been hacked after they noticed that all of their account addresses had been changed to addresses in Russia. In 2013, hackers did breach Apple’s developer site and walked away with the names, physical addresses and email addresses of several developers.
Moreover, Apple’s app review process relies on random spot checks rather than source code reviews, which makes such reviews far from comprehensive.
‘Apple does have a pretty good historical track record of holding developers accountable who violate their agreements, but they have to catch them first – and sometimes that’s the hard part. It means household names probably won’t exploit this, but there’s still a lot of room for bottom feeders,’ said Jay Stanley, a senior policy analyst at the American Civil Liberties Union.
Privacy groups are so concerned because, even though the data stored by Apple’s Face ID feature is complex and encrypted on the iPhone itself rather than in iCloud, the hacker community will doubtless try to defeat it and may succeed sooner or later. Giving app developers access to such data only makes it more vulnerable to hacks.
‘No single authentication technique is beyond the reach of attackers. Devices will be hacked and sensors will be tricked. It is important to layer such technology with adaptive authentication methods, such as IP reputation, phone number fraud prevention capabilities or behavioural biometrics. Security is very much about layers,’ said Stephen Cox, Chief Security Architect at SecureAuth.