Particle.news

SafeRent Settles $2.3M Lawsuit Over AI Tenant Screening Discrimination

The settlement requires SafeRent to stop using AI-generated scores for housing voucher applicants and addresses claims of racial and income bias.

  • SafeRent Solutions agreed to a $2.3 million settlement after being accused of discrimination in its AI-powered tenant screening system.
  • The lawsuit alleged that SafeRent's algorithm assigned disproportionately lower scores to Black and Hispanic applicants and to low-income tenants using housing vouchers.
  • As part of the settlement, SafeRent will no longer use AI-generated scores or provide recommendations for applicants with housing vouchers.
  • The case highlighted broader concerns about AI systems in housing and other industries, with critics arguing such algorithms can unintentionally perpetuate biases.
  • The case is one of the first legal challenges to AI-driven tenant screening tools, and the settlement sets a precedent for greater accountability in algorithmic decision-making.