Apple will let you unlock the iPhone X with your face – a move likely to bring facial recognition to the masses, along with concerns over how the technology may be used for nefarious purposes.
Apple’s newest device, set to go on sale November 3, is designed to be unlocked with a facial scan, with privacy safeguards built in: the scan data will be stored only on the phone itself, not in any external database.
Unlocking one’s phone with a face scan may offer added convenience and security for iPhone users, according to Apple, which claims the “neural engine” behind Face ID cannot be tricked by a photo or a hacker. While other devices have offered facial recognition, Apple is the first to pack the technology for a three-dimensional facial scan into a hand-held phone.
But despite Apple’s safeguards, privacy activists fear that widespread use of facial recognition would “normalise” the technology and open the door to broader use of a largely unregulated tool by law enforcement, marketers and others.
“Apple has done a number of things well for privacy but it’s not always going to be about the iPhone X,” said Jay Stanley, a policy analyst with the American Civil Liberties Union.
“There are real reasons to worry that facial recognition will work its way into our culture and become a surveillance technology that is abused.”
A study last year by Georgetown University researchers found that nearly half of all Americans appear, without their consent, in a law enforcement database that includes facial recognition.
Civil liberties groups have sued over the FBI’s use of its “next generation” biometric database, which includes facial profiles, claiming it has a high error rate and the potential for tracking innocent people.
“We don’t want police officers having a watch list embedded in their body cameras scanning faces on the sidewalk,” said Stanley.
Clare Garvie – the Georgetown University Law School associate who led the 2016 study on facial recognition databases – agreed that Apple is taking a responsible approach but others might not.
“My concern is that the public is going to become inured or complacent about this,” Garvie said.
Widespread use of facial recognition “could make our lives more trackable by advertisers, by law enforcement and maybe someday by private individuals,” she said.
Garvie said her research found significant errors in law enforcement facial recognition databases, opening up the possibility someone could be wrongly identified as a criminal suspect.
Another worry, she said, is that police could track individuals who have committed no crime simply for participating in demonstrations. Shanghai and other Chinese cities have recently started deploying facial recognition to catch those who flout the rules of the road, including jaywalkers.
Facial recognition and related technologies can also be used by retail stores to identify potential shoplifters, and by casinos to pinpoint undesirable gamblers. The technology can even be used to deliver personalized marketing messages – and could have other, potentially unnerving, applications.
Last year, a Russian photographer figured out how to match the faces of porn stars with their social media profiles to “doxx” them, or reveal their true identities.
This type of use “can create huge problems,” said Garvie. “We have to consider the worst possible uses of the technology.”
Apple’s system projects 30,000 infrared dots onto the face to create a digital map, which is stored in a “secure enclave” on the device, according to a white paper issued by the company on its security. It said the chances of a “random” person being able to unlock the device are one in a million, compared with one in 50,000 for its Touch ID fingerprint sensor.
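For a rough sense of what those odds mean in practice, the short Python sketch below is purely illustrative (it is not Apple’s code or methodology): it takes the one-in-a-million and one-in-50,000 figures from Apple’s white paper at face value and assumes, for the sake of the example, that 1,000 random strangers each try to unlock the phone once.

```python
# Illustrative only: compares Apple's quoted false-accept odds for Face ID and
# Touch ID. The rates come from Apple's security white paper; the 1,000-stranger
# scenario is an assumption chosen purely for illustration.

FACE_ID_FALSE_ACCEPT = 1 / 1_000_000   # quoted odds a random face unlocks the phone
TOUCH_ID_FALSE_ACCEPT = 1 / 50_000     # quoted odds a random fingerprint unlocks it

def chance_of_any_false_unlock(per_person_rate: float, strangers: int) -> float:
    """Probability that at least one of `strangers` random people unlocks the
    phone, treating each attempt as independent with the quoted per-person rate."""
    return 1 - (1 - per_person_rate) ** strangers

for label, rate in [("Face ID", FACE_ID_FALSE_ACCEPT), ("Touch ID", TOUCH_ID_FALSE_ACCEPT)]:
    print(f"{label}: {chance_of_any_false_unlock(rate, 1_000):.2%} chance over 1,000 strangers")
```

Under those assumptions, Face ID would be fooled roughly 0.1 percent of the time, versus about 2 percent for Touch ID.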
Apple’s Face ID is likely to touch off fresh legal battles over whether police can require someone to unlock a device. Face ID “brings the company deeper into a legal debate” that stemmed from the introduction of fingerprint identification on smartphones, according to ACLU staff attorney Brett Max Kaufman.
Kaufman said in a blog post that courts will be grappling with the constitutional guarantees against unreasonable searches and self-incrimination when a suspect is forced to unlock a device. U.S. courts have generally ruled that compelling a user to give up a passcode violates those rights because the passcode is “testimonial” – but the situation becomes murkier when biometrics are involved.
Apple appears to have anticipated this situation by allowing a user to press two buttons for two seconds to disable Face ID and require a passcode, but Garvie said court battles over compelling the use of Face ID are likely. Regardless of these concerns, Apple’s move is likely to bring about widespread use of facial recognition technology.
“What Apple is doing here will popularize and get people more comfortable with the technology,” said Patrick Moorhead, principal analyst at Moor Insights & Strategy, who follows the sector.
“If I look at Apple’s track record of making things easy for consumers, I’m optimistic users are going to like this.”

Garvie added that it is important to have conversations about facial recognition because there is little regulation governing the use of the technology.
“The technology may well be inevitable,” she said.
“It is going to become part of everyone’s lives if it isn’t already.”