
Opinion

On Live Facial Recognition in the City: We Are Not Guinea Pigs, and We Are Not Disposable

New Orleans fights a facial recognition ordinance as residents warn of privacy risks, mass surveillance, and threats to immigrant communities.

Getty Images, PhanuwatNandee

Every day, I ride my bike down my block in Milan, a tight-knit residential neighborhood in central New Orleans. And every day, a surveillance camera follows me down the block.

Despite the rosy rhetoric of pro-surveillance politicians and facial recognition vendors, that camera doesn’t make me safer. In fact, it puts everyone in New Orleans at risk.


On Aug. 21, the New Orleans City Council withdrew a live facial recognition ordinance after months of community organizations fighting back and loudly opposing this dangerous measure. A council member's office confirmed that the ordinance was removed pending edits, suggesting that a new version will be introduced. If this or a similar surveillance ordinance is approved, Louisiana would become the first state in the nation with a citywide biometric surveillance network capable of tracking hundreds of thousands of residents in real time.

That’s not a step we want to take. Once invasive surveillance technology like that ends up in the hands of the government, there are no guardrails or oversight mechanisms powerful enough to protect our freedom and our privacy from bad actors, corrupt politicians, hackers, and anyone who doesn’t have our best interest at heart.

Expanding real-time facial recognition to all city cameras would mark an unprecedented shift toward mass surveillance for the whole country. It would build the infrastructure for a database recording our facial features, personal characteristics, and whereabouts every time we stepped outside our front doors. All of that data, even if eventually deleted, can be used to train artificial intelligence to get better at recognizing and tracking us over time.

Disturbingly, a collection of cameras positioned across New Orleans is already capable of tracking residents’ every move, recording our data, and trying to match our faces to databases of millions of images of people. These cameras were never approved by the people of New Orleans. They were set up by Project NOLA, a crime prevention nonprofit group, which we now know because of bombshell revelations in the Washington Post. Project NOLA has been secretly spying on New Orleans residents with live facial recognition cameras for years. These cameras are at undisclosed locations around the city, and most importantly, police use of this technology has been outlawed since the local community rallied behind a surveillance ban in 2021.

Enough is enough. Time and again, New Orleans has been used as a testing ground for disempowering programs aimed at our Black and brown communities: not only secretive, racist mass surveillance tech but also a racist charter school system that has degraded our youth's education. We have been treated as a sacrifice zone for oil, gas, and plastic plants that destroy our ecosystem and poison our health, leaving us with the highest rates of cancer in the country. We are not guinea pigs, and we are not disposable.

As an immigrant, I am desperately sounding the alarm about how devastating this surveillance ordinance would be for all New Orleanians, including our migrant communities. All over the country, our people are being snatched off the street, our families are being separated, and in New Orleans, even our U.S. citizen babies with cancer are being deported. If we roll out real-time facial recognition in New Orleans, we have to expect that our facial recognition data will be demanded by ICE, requested by Louisiana police, or even hacked by anti-immigrant groups, empowering Trump's agenda of terrorizing our immigrant communities and violating their fundamental human rights.

Instead of doubling down and investing in costly, racist technology, we should refocus on the root causes of crime and harm. Twenty-six percent of all adults in New Orleans have low literacy levels. At 22.6%, our poverty rate dwarfs the national average of 10%. Dystopian face surveillance doesn't solve those problems, but it does put us all at risk. The good news is that we already know how to do better: just last week, The Advocate editorialized about the many community programs and nonprofit efforts that are successfully reducing crime in Louisiana year after year.

Our elected officials have a duty to their constituents: to protect our freedoms, defend our dignity, and keep us safe. Our problems can’t be solved with more cameras and surveillance; they have deep systemic roots that have to be addressed.


Edith Romero is a Honduran community organizer with Eye On Surveillance, a researcher, writer, and Public Voices fellow of The OpEd Project, The National Latina Institute for Reproductive Justice, and the Every Page Foundation.


Read More

Government Cyber Security Breach

An urgent look at the risks of unregulated artificial intelligence—from job loss and environmental strain to national security threats—and the growing political battle to regulate AI in the United States.

Getty Images, Douglas Rissing

AI Has Put Humanity on the Ballot

AI may not be the only existential threat out there, but it is coming for us the fastest. When I started law school in 2022, AI could barely handle basic math, but by graduation, it could pass the bar exam. Instead of taking the bar myself, I rolled immediately into a Master of Laws in Global Business Law at Columbia, where I took classes like Regulation of the Digital Economy and Applied AI in Legal Practice. By the end of the program, managing partners were comparing using AI to working with a team of associates; the CEO of Anthropic is now warning that it will be more capable than everyone in less than two years.

AI is dangerous in ways we are just beginning to see. Data centers that power AI require vast amounts of water to keep their servers cool, yet two-thirds of them are in places already facing high water stress, with researchers estimating that water needs could grow from 60 billion liters in 2022 to as high as 275 billion liters by 2028. By then, data centers' share of U.S. electricity consumption could nearly triple.

Posters are displayed next to Sen. Ted Cruz (R-TX) as he speaks at a news conference to unveil the Take It Down Act to protect victims against non-consensual intimate image abuse, on Capitol Hill on June 18, 2024 in Washington, DC.

A lawsuit against xAI over AI-generated deepfakes targeting teenage girls exposes a growing crisis in schools. As laws struggle to keep up, this story explores AI accountability, teen safety, and what educators and parents must do now.

Getty Images, Andrew Harnik

Deepfakes: The New Face of Cyberbullying and Why Parents, Schools, and Lawmakers Must Act

As a former teacher who worked in a high school when Snapchat was born, I witnessed the birth of sexting and its impact on teens. I recall asking a parent whether he was checking his daughter's phone for inappropriate messages. His response was, "Sometimes you just don't want to know." But the federal lawsuit filed last week against Elon Musk's xAI has put a national spotlight on AI-generated deepfakes and the teenage girls they target. Parents and teachers can't ignore the crisis inside our schools.

AI Companies Built the Tool. The Grok Lawsuit Says They Own the Damage.

Whether or not the theory of French prosecutors is true, that Elon Musk deliberately allowed the sexualized image controversy to grow in order to drive up activity on the platform and boost the company's valuation, the point stands: when a company decides to build a tool, knows it can be weaponized, and releases it anyway, it is making a risk-based decision in the belief that it can act without consequence. The Grok lawsuit could make these types of business decisions much more costly.

Sketch collage of an IT specialist writing code for a website security application.

Amazon’s court loss over Just Walk Out highlights a deeper issue: employers are increasingly collecting workers’ biometric data without meaningful consent. Explore the growing conflict between workplace surveillance, privacy rights, and outdated U.S. laws.

Getty Images, Deagreez

The Quiet Rise of Employee Surveillance

Amazon’s loss in court over its attempt to shield the source code behind its Just Walk Out technology is a small win for shoppers, but the bigger story is how employers are quietly collecting biometric data from their own workers.

From factories to Fortune 500 companies, employers are demanding fingerprints, palmprints, retinal scans, facial scans, or even voice prints. These biometric technologies are eroding the boundary between workplace oversight and employee autonomy, often without consent or meaningful regulation.

Close-up of a woman wearing modern smart glasses, with a futuristic screen suggesting augmented reality.

Apple’s upcoming AI-powered wearables highlight growing privacy risks as the right to record police faces increasing threats. The death of Alex Pretti raises urgent questions about surveillance, civil liberties, and accountability in the digital age.

Getty Images, aislan13

AI Wearables and the Rising Risk of Recording Police

Last month, Apple announced the development of three wearable smart devices, all equipped with built-in cameras. The company has its sights set on 2027 for the release of its new smart glasses, AI pendant, and AirPods with a built-in camera, all of which will offer AI functionality to users. As the market for wearable products with smart-recording capabilities expands, so does the risk that comes with how users choose to use the technology.

In Minneapolis in January, Alex Pretti was killed after an encounter with federal agents while filming them with his phone. He was not a suspect in a crime. He was not interfering, but was doing what millions of Americans now instinctively do when they see state power in motion: witnessing.
