Apple iPhone X's FaceID Technology: What It Could Mean For Civil Liberties
Apple's new facial recognition feature for unlocking the iPhone X has raised questions about privacy and the technology's susceptibility to hacking.
Apple's iPhone X is set to go on sale Nov. 3, and anticipation is high for its slew of new features, including facial scanning. The device can be unlocked with face recognition software: a user simply looks at the phone to unlock it.
This convenient new technology is set to replace numeric and pattern locks and comes with a number of privacy safeguards. The data the phone acquires to recognize a particular face is stored only on the device, not in any external database.
A CNET video shows a demo of the Face ID technology given by Phil Schiller, Apple's senior vice president of worldwide marketing.
Apple says the neural engine behind Face ID cannot be tricked by a photograph or a hacker. But when Apple introduces a new technology, wide adoption tends to follow, making facial recognition an almost inevitable part of daily life.
"What Apple is doing here will popularize and get people more comfortable with the technology," said Patrick Moorhead, principal analyst at Moor Insights and Strategy, who follows the sector, in a Phys.Org report.
"If I look at Apple's track record of making things easy for consumers, I'm optimistic users are going to like this," he added.
Clare Garvie — Georgetown University Law School associate and lead of a 2016 study on facial recognition databases — said that "the technology may well be inevitable; it is going to become part of everyone's lives if it isn't already."
The problem with introducing such technology to the masses, according to privacy activists, is that it will "normalize the technology and open the door to broader use by law enforcement, marketers or others of a largely unregulated tool," said a Phys.Org report.
The report said the study found that nearly half of all American adults are in a law enforcement database that includes facial recognition, without having given their consent.
Civil liberties groups have already sued the FBI over the use of its biometric database, which includes facial profiles. They claim it has a high error rate and that an unproven technology greatly increases the potential for tracking innocent people.
"We don't want police officers having a watch list embedded in their body cameras scanning faces on the sidewalk," said Jay Stanley, a policy analyst with the American Civil Liberties Union.
"Apple has done a number of things well for privacy but it’s not always going to be about the iPhone X," he said in the report.
"There are real reasons to worry that facial recognition will work its way into our culture and become a surveillance technology that is abused," he added.
Last year, a Russian photographer figured out how to match the faces of porn stars with their social media profiles to "doxx" them, or reveal their true identities, exposing their private information and lives on the internet for everyone to see. Such misuse poses a serious security concern.
This type of use "can create huge problems," said Garvie. "We have to consider the worst possible uses of the technology."
The research conducted by Garvie and her team found significant errors in law enforcement facial recognition databases, which could increase the chances of wrongful arrests in the future. According to the Phys.Org report, "Shanghai and other Chinese cities have recently started deploying facial recognition to catch those who flout the rules of the road, including jaywalkers."
The Face ID technology projects 30,000 infrared (IR) dots to create a digital map of a person's face, and that data is stored in a secure location on the device. According to the report, the chances of a random person being able to unlock the device are one in a million, compared with one in 50,000 for the current Touch ID fingerprint system.
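For app developers, Face ID is expected to work through the same LocalAuthentication framework Apple already provides for Touch ID, so an app only learns whether authentication succeeded and never sees the face data itself. The sketch below assumes a hypothetical unlockWithBiometrics helper in an iOS app; it is an illustration, not Apple's own unlock code.

```swift
import LocalAuthentication

// Minimal sketch: asking iOS to authenticate the user with biometrics.
// Face ID and Touch ID share the LocalAuthentication API, so the app
// receives only a success/failure result, never the biometric data.
func unlockWithBiometrics() {
    let context = LAContext()
    var error: NSError?

    // Check whether the device supports biometrics and the user has enrolled.
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) else {
        print("Biometrics unavailable: \(error?.localizedDescription ?? "unknown reason")")
        return
    }

    // Prompt the user; the reason string appears in the system dialog.
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock the app") { success, authError in
        if success {
            print("Authenticated")  // proceed to the unlocked state
        } else {
            print("Authentication failed: \(authError?.localizedDescription ?? "unknown error")")
        }
    }
}
```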