GRAHAM: Artificial Intelligence will increase policing
Column: Considerations of Crime
We have all heard about the increase in facial recognition and artificial intelligence.
Whether this sparks scenes from the movie "Smart House" or gets your gears turning about new possibilities, I think we can all agree that it feels a little outlandish how far this technology has come.
What we do not necessarily realize is the extent to which this technology is already being used in volatile, high-stakes areas — primarily, criminal justice. Criminal justice is a topic that many find easy to understand, and therefore easy to market. Though not all Americans fully understand the tax code, almost all, if not all, have a general understanding of what it means when a paper runs the headline, "Crime Spikes in College Town."
This instillation of fear encourages many to give up their rights, primarily privacy, in exchange for "increased security."
This phenomenon is especially true when it comes to technological developments. Rights are slowly being stripped away, all with the promise of safety and security. We started installing cameras in stores and at red lights, put "Ring" systems on our doors and even began forcing people who were arrested (not even convicted) to provide DNA samples to the police.
This brings us to the modern reality — facial recognition.
Many are familiar with the general concept of facial recognition. An algorithm creates a “faceprint” of you, and then connects it to other known images of you. In a law enforcement or criminal justice context, this usually comes from security cameras or government surveillance.
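For readers curious about what "connecting a faceprint to known images" means in practice, here is a minimal, hypothetical sketch. Real systems use deep neural networks to turn a photo into an embedding vector (the "faceprint"); the numbers, names and threshold below are invented purely for illustration.

```python
import numpy as np

def cosine_similarity(a, b):
    # How closely two faceprint vectors point in the same direction (1.0 = identical)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical database: identity -> faceprint vector (made-up numbers)
database = {
    "person_a": np.array([0.9, 0.1, 0.3]),
    "person_b": np.array([0.2, 0.8, 0.5]),
}

def identify(query, db, threshold=0.95):
    """Return the best-matching identity, or None if no match clears the threshold."""
    best_name, best_score = None, -1.0
    for name, faceprint in db.items():
        score = cosine_similarity(query, faceprint)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None

# A new photo's faceprint, close to person_a's stored vector
query = np.array([0.88, 0.12, 0.31])
print(identify(query, database))
```

The unsettling part, as the column notes, is only the scale: the same comparison run against billions of scraped photos instead of two toy vectors.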
Now, there is a radical new company introducing a different kind of facial recognition. It compiled a major database of billions of photos from Facebook, Venmo, Twitter, education websites and more. All you would have to do is take a photo of someone on the street, upload it to the application and you would be able to identify them in seconds.
This company is called Clearview AI, and it is attempting to remain anonymous. The New York Times' podcast "The Daily" investigated the company, uncovering a fake address, secretive investors and false identities for its employees. Its goal is to remain entirely private (ironic, is it not?), with one employee telling The New York Times reporter Kashmir Hill, "the company has told us not to talk to you." Eventually, Hill was able to get information, revealing that more than 600 law enforcement agencies have already begun using or testing the app.
The appeal of this app is its ease and effectiveness. Some may even promote it. A police officer, during his interview with Hill, said that he had solved in 20 seconds a case that his department had been unable to crack after pursuing more than 30 other dead ends. So what is the problem? It is a great app that will allow law enforcement to protect their jurisdictions with ease and efficacy.
The primary problem is that this is still a private company. It currently sells only to law enforcement, but that means it is able to monitor what law enforcement is doing. It knows whom officers upload, search or investigate. It also possesses the power to remove specific officers or agents from the app, giving the company leverage over law enforcement agencies across the country.
This is not a narrow problem. The use of facial recognition is spreading to all industries, creating a chain reaction. The software is available to marketing companies and retailers, and even churches are using it to track attendance. Perhaps most terrifying, though, is its new role in schools.
Just last week, a New York school district approved the implementation of facial recognition technology in its schools. The project was initially approved back in 2017 but faced criticism from civil liberties groups.
It caught on, though, because the technology was initially recommended to the school in 2015 in the aftermath of Sandy Hook. There it is again — the promise of safety for the small price of forfeited liberty. The district technically complied with all the regulations for getting the new technology approved, but the approval might not represent the majority, since many residents did not attend the June 2019 meeting, according to The New York Times.
Facial recognition and its ramifications are not faraway problems. Its use in public schools could become the norm, eventually leading to its installation on the campuses of public universities (yes, that includes Rutgers). This means that facial recognition may be used for "safety," or simply become commonplace — taking attendance, securing exams and even tracking bus usage or monitoring residence halls.
Smile for the cameras!
Jessica Graham is a School of Arts and Sciences senior majoring in political science. Her column, "Considerations of Crime," runs on alternate Wednesdays.
*Columns, cartoons and letters do not necessarily reflect the views of the Targum Publishing Company or its staff.
YOUR VOICE | The Daily Targum welcomes submissions from all readers. Due to space limitations in our print newspaper, letters to the editor must not exceed 500 words. Guest columns and commentaries must be between 700 and 850 words. All authors must include their name, phone number, class year and college affiliation or department to be considered for publication. Please submit via email to [email protected] by 4 p.m. to be considered for the following day's publication.