Here on PetaPixel, the focus is often how people use cameras to create beautiful art, capture incredible moments, and document important events. However, sometimes there’s an intersection between camera technology and ethics that is just as important, if less visually interesting.
An exhaustive investigation by The Washington Post found that surveillance cameras purchased using federal crime-fighting grants are being used to spy on, reprimand, and even evict public housing residents. Sometimes the alleged violations are minor.
Surveillance Systems are Hounding Residents
In Steubenville, Ohio, local officials installed a surveillance system under the guise of getting ongoing gang violence under control. However, as Washington Post reporter Douglas MacMillan writes, “…residents of Steubenville public housing soon learned the cameras were pointed at them.”
One resident was recorded spitting in the hallway, while another removed a cart from a communal laundry facility. In both cases, footage captured by the new surveillance cameras was presented as evidence in court, and each resident was evicted.
Another resident, Melanie Otis, 52, was recorded lending her key fob to someone, which is admittedly a rules violation. However, Otis is visually impaired, and the person was a friend who was delivering her groceries. Otis wasn’t evicted after she explained the situation, but the surveillance footage was initially used to threaten eviction.
‘Big Brother’ is Always Watching, and is Getting Smarter
“In public housing facilities across America, local officials are installing a new generation of powerful and pervasive surveillance systems, imposing an outsize level of scrutiny on some of the nation’s poorest citizens. Housing agencies have been purchasing the tools — some equipped with facial recognition and other artificial intelligence capabilities — with no guidance or limits on their use, though the risks are poorly understood and little evidence exists that they make communities safer,” writes MacMillan.
The use of artificial intelligence and facial recognition software is especially concerning. In Scott County, Virginia, cameras scan everybody who walks by, analyzing faces and searching for people who aren’t allowed to live in public housing.
In New Bedford, Massachusetts, specialized software chews through hours of recordings in pursuit of people that violate overnight guest rules.
Public housing officials in Rolette, North Dakota, have installed a mind-boggling 107 cameras to closely monitor 100 residents. MacMillan notes that this is significantly more cameras per person than at Red Hawk Casino and approaches the ratio at New York’s infamous Rikers Island jail complex.
Why so Many Cameras?
According to data The Post collected from institutions and security vendors, public housing in New York City, Omaha, Milwaukee, and Rolette has one camera for every 19, 10, three, and 1.1 residents, respectively. That’s more cameras per person than at Wrigley Field, the extremely busy Los Angeles International Airport, or the Louvre in Paris.
The U.S. Department of Housing and Urban Development (HUD) is partially responsible for the surveillance surge. HUD provides federal crime-fighting grants, with the intent of increasing “safety” in public housing.
Housing agencies believe that this goal is being achieved. However, at what cost?
Taxpayers are footing the bill for public housing agencies to spy on their residents. In some cases, this surveillance undoubtedly helps document legitimate and genuinely dangerous behavior.
However, The Washington Post’s reporting, based on extensive interviews with residents and legal aid attorneys, court records, and correspondence with housing administrators, found that in many cases the federally funded cameras are being used to punish and sometimes evict people over “minor violations of housing rules.”
Excessive Surveillance Disproportionately Affects People of Color
The Washington Post acknowledges that no hard data shows how often surveillance cameras are used to evict people from public housing. However, the general practice of using surveillance footage in evictions is well documented.

Further, the practice disproportionately affects people of color. The overwhelming majority of the 1.6 million Americans living in public housing are non-white, and these residents are under round-the-clock surveillance.
Residents who are evicted, even for minor lease violations, can struggle to find new housing, which in turn undermines their ability to keep or find work. Eviction from public housing creates a vicious circle, subjecting people who are already facing challenges to an even more precarious existence. Many people live in public housing precisely because financial limitations or other circumstances make it hard to find a place to live.
The Government’s Response
HUD spokesperson Christina Wilkes tells The Washington Post in an email that the agency didn’t intend for its safety and security grants to be used by housing administrators to punish and evict residents.
However, using the cameras for that purpose “is not a violation of the grant terms.”
Callousness of Eviction
Melody McClurg, the executive director of the Jefferson Metropolitan Housing Authority in Steubenville, Ohio, says that “people choose to get evicted by their actions.” She explains that the cameras, which are exclusively found in public areas, are just one of the authority’s ways of enforcing the rules that all tenants are expected to follow.
By that logic, the man who spat in the hallway, the woman who took the laundry cart, and even Melanie Otis brought the scrutiny on themselves.
Tania Acabou of New Bedford, Massachusetts, was evicted from public housing in 2021 after the local housing authority used its cameras to monitor her over “several months.” Acabou, a single mother, worked days and attended school at night while her ex-husband came over to help care for the couple’s two children.
The housing authority believed that she had an overnight guest for more than 21 nights per year, the limit stipulated by the authority’s policy, and asserted that Acabou’s ex-husband was living at the property.
The housing authority used software to closely monitor Acabou’s front door, logging and saving every instance of motion detected there.
When the property manager suspected Acabou’s ex-husband of subverting surveillance by departing through the back door, she installed a portable camera in the backyard that pointed at the rear entrance.
“It got to the point where it was like harassment. They really made my life hell,” Acabou, 33, tells The Washington Post.
Sam Ackah, the security director for the New Bedford Housing Authority, says that the agency doesn’t aim to evict people and works to establish agreements with its residents. Ackah claims that to protect residents who follow the rules, the housing authority must monitor those who don’t.
In Acabou’s case, Ackah says they tried to work with Acabou to stop allowing her ex-husband to live there. Acabou maintains that her ex was not living there and claims to have presented the housing authority with evidence proving that he lived at a different residence.
Ackah notes that the New Bedford Housing Authority bought the surveillance cameras using its budget. Ackah also says that the cameras help catch violators and unregistered guests. He asserts that it’s crucial to monitor guests because registered guests undergo a vetting process to check for offenses and criminal behavior.
Racial Biases in Facial Recognition
Some residents have no issue with the cameras, and others aren’t concerned about being surveilled as long as they obey the rules. That attitude, however, overlooks an established problem with the facial recognition technology used by many housing authorities.
Some states, including Alabama, Colorado, and Virginia, have recognized some of the dangers and risks of facial recognition and have banned law enforcement from using these tools. A federal study showing the racial bias of many facial recognition systems underlines the importance of that legislation.
False matches should concern all residents, as they may find themselves under increasing scrutiny for no reason other than poorly implemented technology. This is especially concerning for women and people of color, who are more often the victims of false matches by facial recognition software.
When The Washington Post presented HUD with evidence of housing authorities’ growing use of surveillance tools, Dominique Blom, the agency’s general deputy assistant secretary of public and Indian housing, said that HUD would prohibit future grant recipients from spending federal money on facial recognition. She said the tools “are not foolproof.”
“This sends a signal to the housing community that this is the type of technology that the department is cautioning against,” Blom said of the new grant guidelines.
Creating a Police State
Many security personnel in public housing authorities, including John Stasiulewicz, a former detective who works for the Steubenville housing authority, view themselves as an “arm of the police department.”
Stasiulewicz, who goes by “Stosh,” monitors feeds from 161 cameras in public housing. MacMillan writes, “This means public housing residents — who are nearly three times more likely to be Black than other Steubenville residents, census records show — are about 25 times more likely to have their daily lives observed by government-controlled cameras.”
Even in cases where cameras help detect legitimate crime, most rule-abiding residents live in a pseudo-police state. In some cases, the surveillance systems in public housing are an extension of an existing police state.
The argument that those who aren’t doing anything wrong have nothing to hide holds little water when it’s been established that facial recognition technology is unreliable and can misidentify people.
Further, the supposition that surveillance helps reduce, prevent, or resolve violent crime is debatable.
While some surveillance cameras have been used as admissible evidence in courts, sometimes helping lead to convictions, there’s not a pattern of evidence that suggests that an increase in surveillance cameras in public housing is having a tangible impact on crime reduction.
What is beyond doubt, however, is the effect of constant surveillance on residents. In some cases, surveillance directly or partially leads to life-altering adverse outcomes that are seemingly unjustifiable, even if technically legal. There is sometimes a rift between what’s right and what’s legally permissible, and in the case of eviction from public housing, legally acceptable outcomes can be devastating.
Increasing Surveillance Results in More Evictions
The federal government enacted a moratorium on evictions in light of the COVID-19 pandemic. When the moratorium lifted in late 2021, evictions predictably increased. Princeton University’s Eviction Lab notes that in the 34 cities across 10 states it tracks, public housing authorities carried out at least 5,576 evictions in 2022, roughly twice as many as the previous year.
However, what’s not necessarily as expected is that public housing evictions grew faster than overall evictions, according to the Eviction Lab’s associate director, Peter Hepburn.
Are the higher eviction rates in public housing due to greater surveillance? It’s difficult to say. However, Gavin Bates, a legal aid attorney in New Bedford, says the New Bedford Housing Authority “regularly” uses its surveillance system during eviction proceedings.
“Quite often, when there is camera footage of an event, the event does show a rules violation of some kind. But there are also a lot of unrepresented folks who just believe that there is camera footage when they are told, the footage is never produced, and they do not know their rights and often make bad decisions as a result,” Bates says.
This alludes to an important point: the proliferation of surveillance creates a culture of paranoia among the observed. Even when the evidence isn’t presented, it’s easy to believe it exists. And how many people who live in public housing are in a financial position to mount a drawn-out legal fight?
Algorithms Used with Malice
The Post has identified six cities in the U.S. where cameras with facial recognition technology watch public housing residents. Another seven agencies have cameras that can recognize faces, although they claim that the recognition isn’t in use.
While the complete list is available on The Washington Post‘s website, the general takeaway is that housing agencies are using facial recognition technology to identify criminal suspects, monitor for lease violations, assist police, look for banned individuals, grant tenant-specific access to certain buildings, and in one case, only for police emergencies, whatever that precisely means.
A lot of money is being spent on these camera technologies, but how much care is being given to the algorithms behind the cameras?
Computer algorithms and artificial intelligence are notoriously problematic when dealing with race.
ACLU writes, “There is ample evidence of the discriminatory harm that AI tools can cause to already marginalized groups. After all, AI is built by humans and deployed in systems and institutions that have been marked by entrenched discrimination — from the criminal legal system, to housing, to the workplace, to our financial systems. Bias is often baked into the outcomes the AI is asked to predict. Likewise, bias is in the data used to train the AI — data that is often discriminatory or unrepresentative for people of color, women, or other marginalized groups — and can rear its head throughout the AI’s design, development, implementation, and use.”
When problematic AI is implemented to control something as integral as housing access, it’s worthy of scrutiny — certainly more scrutiny than it seems to be getting by the very people using the technology as part of their work for housing authorities and agencies.
A 2019 study showed that Asian and Black people were up to 100 times more likely to be misidentified than white men by facial recognition systems.
The Public Housing Surveillance Industrial Complex
Surveillance system manufacturers and the associated software developers have received a lot of taxpayer money to institute surveillance networks in and around public housing.
A notice published by HUD on April 21, 2023, outlines a ban on automated surveillance and facial recognition technology, although the documentation doesn’t define what that includes. Further, the restriction doesn’t apply to agencies that already received grant money.
In recent years, grants for increased security have been around $10 million per year. However, individual agencies can use other federal funds on security cameras.
Finding Evidence to Fit a Purpose
It’s also worth considering how ubiquitous cameras enable housing authorities to find ways to evict people they have preexisting issues with, whether or not those issues involve actual lease violations.

If a person has run afoul of a housing authority, security personnel can watch them with laser focus and simply wait for them to make a mistake, even one that doesn’t negatively affect other tenants or the property itself. There are many ways to violate a lease.
As Acabou found out, some agencies have gone to great lengths to evict someone, even if that individual’s only apparent violation seems to be trying to improve their education and create a better life for themselves and their family.
Camera Technology Run Amok
Cameras are powerful tools for documenting life’s best and most important moments. However, they can also be pervasive and misused, sometimes even with noble intentions.
People living in many public housing developments across the U.S. are subjected to near-constant surveillance. While some argue that this improves safety and allows housing authorities to enforce rules and protect their residents more effectively, others believe that the increased surveillance disproportionately harms people of color.
Further, AI technology used by some surveillance systems has been shown to inherit some of the systemic and institutional racism that negatively impacts people of color every single day. Now people aren’t even safe from racism at home.
Improvements in camera technology and AI are helpful for many reasons. However, they should not be left unchecked. Not all advancements are improvements.