SIGHT

Postcards: US renters fall foul of algorithms in search for a home


CAREY L BIRON, of Thomson Reuters Foundation, reports on how automated screening programs are under scrutiny amid broader concerns about the potential of algorithms to lock in bias and perpetuate inequality…

Washington DC, US
Thomson Reuters Foundation

Candice’s latest rejection email for a housing rental in Washington, DC, looked much like all her others – a generic response with no proper explanation of the decision.

“It will just say, ‘unfortunately, right now we didn’t accept your application,’” said Candice, 36, who has lived in the US capital for most of her life and is now looking for a larger home for herself and her three children.

“I felt it was computer-generated. And of course, computers – they’re faulty,” she told the Thomson Reuters Foundation, asking to be identified by only her first name.


A man looks at advertisements for luxury apartments and homes in the window of a real estate sales business in Manhattan’s Upper East Side neighbourhood in New York City, New York, US, on 19th October, 2021. PICTURE: Reuters/Mike Segar

Candice, who said she had received about 10 such rejections in recent months, is currently unemployed but benefits from welfare assistance that would cover her rent in full.

She and other would-be tenants think their applications for rental housing are falling foul not of landlords, but of automated screening programs that scan credit scores, eviction or criminal histories and even social media activity to determine if an applicant is a rental risk.


The widely used programs are facing increased scrutiny from lawmakers in Washington, DC, and beyond amid broader concern about the potential of algorithms to lock in bias and perpetuate inequality.

Susie McClannahan, who manages the fair housing rights program at the Equal Rights Center civil rights group and has worked with Candice, calls it the “black box of algorithmic discrimination”.

Rental applicants are “being denied at properties for reasons they don’t know, and that the provider might not even know,” McClannahan said, adding that some third-party screening systems mined data they were banned from using, such as old criminal convictions.

“For renters with housing vouchers and low-income renters…it’s making it harder for them to find housing in a city that’s already in the midst of a housing crisis,” she said.

City lawmakers are taking note. In September, they debated legislation to ban “discrimination by algorithms,” including in housing – one of several efforts nationwide.

And last month, the White House released a “Blueprint for an AI Bill of Rights,” warning that “discrimination by algorithms” is unacceptable.

Regulatory action on the issue is likely in the coming year, said Ben Winters, counsel at the Electronic Privacy Information Center (EPIC) watchdog group.

“We’re at a transition point,” he said.


A rental sign in front of a house. PICTURE: Feverpitched/iStockphoto.

The tenant-screening industry, worth around $US1 billion, is drawing interest from tech startups and venture capital, according to the Tech Equity Collaborative, a watchdog group.

There are hundreds of tenant-screening tools available in the United States, supplanting a process traditionally undertaken by landlords, said Cynthia Khoo, a senior associate with Georgetown University’s Center on Privacy & Technology.

While that process was also open to discrimination, she said today’s automated tools operate far more efficiently, at greater scale and greater speed, and with access to far more data.

“These are new technological tools being used to carry out the same age-old discrimination we’re familiar with,” she said, adding that they were even less transparent.

As regulators in California and Colorado, and at the Federal Trade Commission, work on the issue, many are watching the capital’s Stop Discrimination by Algorithms Act (SDAA) as a potential blueprint.

“This is the most robust legislation in the US,” Khoo said of the bill.



The current draft states that algorithms cannot discriminate against any groups already protected under local law, said Winters, while applicants would have to be alerted to the use of these systems and given explanations if rejected. 

Most firms using these tools would have to audit their algorithms to make sure they knew what the programs were doing, he said, and applicants would be able to sue over potential infractions.

In response to a request for comment, the Consumer Data Industry Association, a trade group, referred to testimony it gave in opposition to the SDAA, as well as a letter sent to the DC Council in October by nine financial services groups.

The letter noted that companies were already prohibited from discrimination in credit or other financial services, and that the DC bill would increase the potential for fraud and hit credit access.

“Algorithms make credit decisions more accurate, fair, faster and more affordable by judging applicants on their creditworthiness,” the groups said.

“Algorithms also eliminate some of the risk of the biases that can be found in human interactions and can help identify products and services designed to benefit communities, including historically underserved populations, helping close the racial wealth gap.”


Houses in a US neighbourhood. PICTURE: AlenaMozhjer/iStockphoto

Yet some question whether algorithms drawing on public data can be objective when the data itself is tainted, said Catherine D’Ignazio, an associate professor of urban science and planning at the Massachusetts Institute of Technology.

Data such as credit scores may seem objective but is often the product of decades of racism and marginalisation – thus baking bias into the math, she said.

The idea of algorithmic fairness suggests that “everyone starts equally and is treated equally. But history hasn’t treated people equally.”
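D’Ignazio’s point can be shown with a toy sketch in Python (the data, names and logic below are entirely hypothetical – this is not any vendor’s actual system): a rule fitted only to biased historical decisions simply reproduces those decisions, even for applicants who are financially identical.

```python
# Toy illustration: a screening rule "trained" on biased historical
# decisions reproduces them. All records here are invented.

from collections import defaultdict

# Hypothetical past decisions: identical incomes, different outcomes.
history = [
    {"group": "A", "income": 50_000, "approved": True},
    {"group": "A", "income": 50_000, "approved": True},
    {"group": "B", "income": 50_000, "approved": False},
    {"group": "B", "income": 50_000, "approved": False},
]

def fit_majority_rule(records):
    """Build a rule that approves each group by majority vote
    over its historical outcomes - ignoring finances entirely."""
    votes = defaultdict(list)
    for r in records:
        votes[r["group"]].append(r["approved"])
    decision = {g: sum(v) > len(v) / 2 for g, v in votes.items()}
    return lambda applicant: decision[applicant["group"]]

rule = fit_majority_rule(history)

# Two financially identical applicants get different outcomes:
print(rule({"group": "A", "income": 50_000}))  # True
print(rule({"group": "B", "income": 50_000}))  # False
```

The "model" never looks at income at all; the historical pattern alone drives the outcome, which is the sense in which tainted data bakes bias into the math.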



Still, recognising this disconnect offers an opportunity for change for the better, D’Ignazio said.

“Tainted” historical data can also skew home valuations, said John Liss, founder of True Footage, whose company launched last year with an eye to addressing appraisal gaps between white and minority homeowners by using a combination of automation and human oversight.

For years, home appraisals often did not seem tied to data, Liss said – to the particular detriment of Black and Hispanic homeowners.

While bringing automation into the appraisal process helps to address this in part, he said, “automated valuation models are extremely dangerous because they’re tainted” by historical data.

For True Footage, he said, the key is to have human appraisers, increasingly drawn from historically marginalised communities, involved in interpreting the data.

“There’s a place for technology,” Liss said. “(But) having a human at the wheel to interpret the data is much more accurate.” 

 
