It’s often the case that new technologies arrive on the scene faster than our society and its legal code can keep up. Sometimes that lag can be a good thing. For instance, 3D printing allows people to print unregulated gun parts, letting gun owners circumvent the onerous laws of our government, which has struggled to come up with new laws to restrict the technology.
When technology advances at a breakneck pace, however, it can also be quite dangerous for our liberties. This is especially true with regard to privacy. If a new technology makes it easy for the government to track us, you can bet the government will take its sweet time updating the legal code in a way that protects us from surveillance.
That certainly seems to be the case with facial recognition software. During a recent Congressional Oversight Committee hearing, members of both political parties sounded the alarm on the FBI’s use of the technology and read the written testimony of Electronic Frontier Foundation senior staff attorney Jennifer Lynch.
Lynch detailed the stunning scope of the FBI’s photo collection. In addition to collecting criminal and civil mug shots, the agency currently has “memorandums of understanding” with 16 states that mean every driver’s license photo from those states is accessible to the agency—without the drivers’ consent. The FBI also has access to photos from the U.S. State Department’s passport and visa records.
Lynch argued that “Americans should not be forced to submit to criminal face recognition searches merely because they want to drive a car. They shouldn’t have to worry their data will be misused by unethical government officials with unchecked access to face recognition databases. And they shouldn’t have to fear that their every move will be tracked if face recognition is linked to the networks of surveillance cameras that blanket many cities.”
“But without meaningful legal protections, this is where we may be headed,” Lynch stated. “Without laws in place, it could be relatively easy for the government and private companies to amass databases of images of all Americans and use those databases to identify and track people in real time as they move from place to place throughout their daily lives.”
All told, law enforcement agencies around the country have access to 400 million photos in facial recognition databases, covering roughly half of American adults. Most of these people have never committed a crime, and none of them gave any consent to being included.
At first glance it may sound harmless to be in one of these databases. Movies and TV shows make it sound like this technology can help law enforcement swiftly and precisely nab suspects. So what do you have to fear if you haven’t committed a crime? It turns out that in real life, facial recognition is far from perfect.
Internal FBI documents obtained in a Freedom of Information Act lawsuit by the nonprofit Electronic Privacy Information Center indicate that the FBI’s own database, called the Next Generation Identification Interstate Photo System, or NGI-IPS, had an acceptable margin of error of 20 percent — that is, a 1-in-5 chance of “recognizing” the wrong person.
And research published in the October 2015 issue of the scientific journal PLOS ONE by researchers at the universities of Sydney and New South Wales in Australia found that the humans who interpret such data build in an extra error margin approaching 30 percent.
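As a rough illustration of how those two figures compound, here is a back-of-the-envelope Python sketch. It assumes, purely for illustration, that the algorithm’s error rate and the human reviewer’s error rate are independent; that independence is my simplifying assumption, not a claim made by either the FBI documents or the PLOS ONE study.

```python
# Back-of-the-envelope sketch: how a 20% algorithmic error rate and a
# ~30% human-interpretation error rate compound, ASSUMING the two
# stages fail independently (a simplifying assumption for illustration).

algo_error = 0.20    # FBI's "acceptable" NGI-IPS margin of error
human_error = 0.30   # extra error margin from human interpreters (PLOS ONE)

# A match survives the pipeline correctly only if BOTH stages get it right.
correct = (1 - algo_error) * (1 - human_error)
combined_error = 1 - correct

print(f"Chance both stages are right: {correct:.0%}")
print(f"Chance of an error somewhere: {combined_error:.0%}")
```

Under that (generous) independence assumption, barely more than half of matches would survive both stages correctly, which is the point: stacking an imperfect machine on an imperfect reviewer makes the system less reliable, not more.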
If we ever allow our government to roll out facial recognition cameras on a wider scale, lots of innocent people are going to be hurt. Whether by mistake or by malice, it will become shockingly easy for law enforcement to identify ordinary people as criminals. The surveillance control grid will not only be inescapable, it will be unwieldy and rife with abuse.
It’s often said that you should never trade freedom for safety. In this case, we wouldn’t receive any kind of safety.
Written by Daniel Language