On Thursday, the U.S. Court of Appeals for the Ninth Circuit became the first appellate court in the nation to directly address the privacy harms posed by face recognition technology. The decision is a significant advance in the fight against the threats of face surveillance, sounding the alarm on the potential for this technology to seriously violate people’s privacy.
In Patel v. Facebook, a group of Facebook users from Illinois allege that Facebook violated the Illinois Biometric Information Privacy Act (BIPA) by using face recognition technology on the users’ photographs without their knowledge and consent. BIPA is the oldest and strongest biometric privacy law in the country, requiring companies to obtain informed consent before collecting a person’s biometric identifiers, including face recognition scans. Importantly, the law provides individuals in Illinois with a right to sue for damages if a company has violated their rights.
Facebook’s primary argument in the case was that in order to establish “standing” to sue, the plaintiffs should have to demonstrate some concrete injury beyond a violation of BIPA's requirement of notice and consent. As we argued in an amicus brief last year, surreptitious use of face recognition technology does cause harm, by subjecting people to unwanted tracking and by leaving them vulnerable to data breaches and invasive surveillance. Given the rapid proliferation of face surveillance technology in recent years, it is critical that Illinoisans are able to enforce BIPA’s protections against unwanted collection of their biometric information. A requirement that a person must demonstrate monetary loss or similar injury in order to sue would seriously undermine BIPA’s intent to safeguard against abusive collection of biometric data in the first place.
In Thursday’s ruling, the Ninth Circuit agreed, holding that “the development of a face template using facial-recognition technology without consent (as alleged here) invades an individual’s private affairs and concrete interests.”
To reach that conclusion, the court looked not only to the long-recognized entitlement of people to sue private parties over violations of common-law privacy rights but also to evolving Fourth Amendment protections against law enforcement surveillance. This includes the landmark decision in Carpenter v. United States, an ACLU case about police access to cell phone location data decided last year. As the Ninth Circuit explained, drawing from language in Carpenter, “[i]n its recent Fourth Amendment jurisprudence, the Supreme Court has recognized that advances in technology can increase the potential for unreasonable intrusions into personal privacy… As in the Fourth Amendment context, the facial-recognition technology at issue here can obtain information that is ‘detailed, encyclopedic, and effortlessly compiled,’ which would be almost impossible without such technology.”
The Ninth Circuit’s ruling is important not only because it explains why surreptitious use of face recognition by corporations harms people’s privacy interests, but also because it puts law enforcement on notice that recent Supreme Court cases regulating other forms of electronic surveillance have something to say about face surveillance technology.
Indeed, the potential for this technology to enable the government to pervasively identify and track anyone (and everyone) as they go about their daily lives is one of the reasons the ACLU is urging lawmakers across the country to halt law enforcement use of face surveillance systems. This decision puts both corporations and law enforcement agencies on notice that face surveillance technology poses unique risks to people’s privacy and safety.
The Ninth Circuit’s ruling also demonstrates the importance of privacy laws including strong private rights of action, affirming people’s right to turn to the federal courts for redress when their rights have been violated. Without a right to sue, privacy guarantees will often prove ephemeral. As state legislatures and Congress move forward on consumer privacy legislation, they should follow Illinois’ lead by including private rights of action in these statutes.
The Varying Laws Governing Facial Recognition Technology
News coverage abounds about the latest breakthroughs in facial recognition technology. But while this technology is an impressive technical achievement, it poses real privacy risks for those unwittingly subjected to facial recognition in public. Most recently, facial recognition has been paired with vast quantities of images scraped from the internet and social media, producing an unusually powerful tool that can uncover an individual's identity, including name, address, and interests, from a single photograph. In response to these rapid advances, cities and states have begun developing an array of legal approaches to regulating facial recognition technology, some scrambling to limit or prohibit its use, others enthusiastically embracing it. In this patchwork legal landscape, it can be challenging to know where and when the technology can be used, and for what purposes.
Facial Recognition and Privacy Concerns
At its core, facial recognition technology consists of computer programs that analyze images of human faces and compare them against other face images in order to identify or verify the individuals pictured. The technology can be used for a myriad of purposes, such as unlocking a smartphone by recognizing its owner, streamlining check-in at a hotel or rental car facility, letting a customer try on makeup virtually, or sending a parent images of his or her child at school or day camp. The technology also has obvious surveillance and law enforcement uses, and concern about the privacy implications of its use in these contexts has driven many jurisdictions to place limits on facial recognition technology.
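For readers curious about the mechanics, the comparison step described above can be illustrated with a short, simplified sketch. It assumes that a separate model has already reduced each face image to a numeric "embedding" vector; the function names, threshold, and sample vectors below are hypothetical and chosen purely for illustration, not taken from any particular product.

```python
# A minimal sketch of the matching step at the heart of face recognition,
# assuming some model has already converted each face image into a numeric
# embedding vector (the model itself is outside the scope of this sketch).
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Return how similar two face embeddings are (closer to 1.0 = more alike)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(probe: np.ndarray, gallery: dict, threshold: float = 0.8):
    """Compare a probe embedding against a gallery of known identities.

    Returns the best-matching name if its similarity clears the threshold,
    otherwise None (no confident identification).
    """
    best_name, best_score = None, -1.0
    for name, reference in gallery.items():
        score = cosine_similarity(probe, reference)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None

# Illustrative, made-up embeddings; real systems use vectors with hundreds of dimensions.
gallery = {"alice": np.array([0.9, 0.1, 0.2]), "bob": np.array([0.1, 0.8, 0.5])}
probe = np.array([0.88, 0.12, 0.22])
print(identify(probe, gallery))  # prints "alice" because the similarity clears the threshold
```

The privacy concerns discussed in this essay arise when the "gallery" in a system like this is built from images of people who never consented to being enrolled in it.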
San Francisco and Oakland, California, and Brookline, Cambridge, Northampton, and Somerville, Massachusetts, have all banned the use of facial recognition technology by city agencies. The city council in Portland, Oregon has proposed going a step further, banning the use of the technology in both the public and private sectors to the extent it is or might be used for security purposes.
Law enforcement agencies in many other cities have also taken stances on the technology. Some, like the Seattle Police Department, have stopped using facial recognition technology amid concerns about biased and inaccurate results. The Plano, Texas Police Department touts the benefits it reaps from the technology, which it credits with solving numerous cases. Others, like the Detroit Police Department, permit the use of facial recognition technology only under certain conditions, such as when it seems reasonably likely to aid the investigation of violent crimes. At the state level, three states have banned facial recognition technology in police body cameras: Oregon, New Hampshire, and, most recently, California.
Getting Consent
Laws regarding the use of facial recognition technology are not limited to the public sector. Several states have worked biometric information into their existing data privacy laws or created new laws specifically geared toward biometric data collection. Illinois was the first state to address the collection of biometric data by private businesses. Its Biometric Information Privacy Act (BIPA), passed in 2008, places significant limitations on how private entities can collect and use a person's biometric data. It requires that a business obtain informed consent prior to collecting biometric data. It also prohibits profiting off biometric data, allows only a limited right to disclose collected data, sets forth data protection obligations for businesses, and creates a private right of action for individuals whose data has been collected or used in violation of the law, even if the individual cannot show actual harm from the violation. Illinois has seen numerous class actions premised on the law in the decade since it took effect.
Texas law, like Illinois law, requires individuals or companies that collect biometric data to inform individuals before capturing a biometric identifier and to obtain the individual's consent. Unlike the Illinois law, however, the Texas biometric privacy statute does not require a written release. Like the Illinois law, it prohibits the sale of biometric information and sets restrictions on how such information is stored.
Washington state's biometric privacy statute took effect in 2017. Like the Texas law, it does not specify that consent to the collection of biometric data be in writing, nor does it create a private cause of action against violators. Unlike its Illinois and Texas counterparts, however, the Washington law carves out an exemption from its notice and consent requirements: businesses may collect and store biometric information without providing notice and obtaining consent so long as the information is collected for "security purposes," defined to include the collection, storage, and use of the information to prevent shoplifting, fraud, and theft. The Washington law also permits companies to sell biometric information under limited circumstances.
California's law, the most recent of the group, applies on a somewhat more limited scale than the Illinois, Texas, and Washington laws. It targets any company that (1) operates in California and (2) either earns at least $25 million in annual revenue, gathers data on more than 50,000 users, or makes more than half its money from user data. The law treats biometric information, including images of a person's face, as personal information, and it gives consumers rights to protect that information, such as the right to obtain information about the collection and sale of their personal information and the right to opt out of its sale. Although California lawmakers contemplated requiring businesses to disclose their use of facial recognition technology by conspicuously posting physical signs at every entrance to any facility that uses it, the proposed legislation failed to garner sufficient interest to pass and has been ordered inactive.
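As a rough illustration of how those applicability thresholds combine, the sketch below encodes the two-part test described above. It is a simplification for illustration only, not a statement of the statute's full scope or legal advice, and the function and parameter names are hypothetical.

```python
# Hypothetical sketch of the applicability test described above: a company is
# covered if it operates in California and meets at least one of the three
# size/data thresholds. Simplified for illustration; not legal advice.
def california_law_applies(operates_in_california: bool,
                           annual_revenue_usd: float,
                           users_with_data_collected: int,
                           share_of_revenue_from_user_data: float) -> bool:
    if not operates_in_california:
        return False
    return (annual_revenue_usd >= 25_000_000
            or users_with_data_collected > 50_000
            or share_of_revenue_from_user_data > 0.5)

# A company with $30M in revenue is covered even if it collects little user data.
print(california_law_applies(True, 30_000_000, 10_000, 0.1))  # True
```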
Stay Engaged
Lawmakers in cities and states around the country continue to explore options to limit how the public and private sectors use facial recognition technology. Municipal agencies, businesses, and inventors must keep themselves apprised of this increasingly complex regulatory environment. If you live in an area considering such a law, you should participate in the process. If you operate a business in a state or city with such laws, or collect biometric data on individuals living in any of those areas, you must keep abreast of changes in this field. A single company may need to carefully comply with a patchwork of different laws, and because this is an ever-expanding and changing area of law, what was acceptable and legal previously could prove problematic or illegal in the future.