For iPhone X users, and users of other smart devices with facial recognition capabilities, opting in may be the price of convenience and "security." But when you're just out in public and your face gets scanned by law enforcement, there is no opting in, at least not yet.
Recently, Microsoft, the ACLU, and other major voices in the smart tech revolution have demanded that lawmakers act quickly to prevent facial recognition software from eroding individual civil rights. While the technology is impressive, it's far from perfect, and civil rights are easily trampled when law enforcement isn't clear on how the tech squares with constitutional protections.
Civil Rights Disrupted
Sure, many states, like Illinois, have laws that protect individuals from companies collecting biometric information without consent, but what happens when it is the government that wants to use this tech? Concerns about accuracy and privacy are likely to give way to calls for public safety. But, as many have noted, the lack of regulation is frightening enough that some tech companies have refused to do business with governments and law enforcement agencies.
As with DNA samples and fingerprints, criminal offenders can be subjected to biometric scanning and have their data stored in a national database accessible to law enforcement. We're already pretty close to that now, given that the FBI has boasted about its capabilities and that nearly half of the population's facial biometrics are already in a database.
From Fiction to Reality
The development of facial recognition software, and its adoption by law enforcement, is a significant technological advance. It holds great promise for increasing public safety through deterrence, particularly as the Internet of Things makes home and business security cameras ever more affordable and plentiful.
But, as the ACLU and Microsoft have urged lawmakers to recognize, the framework governing legitimate use of facial recognition tech is virtually nonexistent, and one is needed to ensure the government doesn't go too far, or at least to define what "going too far" would even entail. When is a scan a search? Should probable cause, a lower standard, or any standard at all be required before running an image through a facial recognition system?