Legislators at many levels of government in the United States have proposed prohibiting the use of software algorithms that rely on non-public competitor data to assist sellers in making pricing decisions. Of the many proposals made in 2024,[1] only local ordinances in San Francisco and Philadelphia passed. This post summarizes the legislative efforts currently underway. Some focus on pricing software used by landlords of residential rental apartments. Others are broader in scope and would apply to the many other industries that employ pricing software, such as hospitality, travel, and tourism.
On February 6, 2025, Senator Amy Klobuchar (D-MN) introduced the Preventing Algorithmic Collusion Act (S. 232),[2] which aims to prevent companies from using software algorithms to collusively raise prices, building on similar federal legislation that Congress failed to pass in 2024. The legislative effort reflects the evolving landscape of consumer protection in the new age of artificial intelligence software. The bill would make it presumptively unlawful for competitors to share their non-public information through a pricing algorithm to raise prices. It would prohibit the use of non-public, competitively sensitive information from a competitor to train an artificial intelligence algorithm. And it would require companies that use algorithms to set prices to disclose that fact and allow antitrust enforcers to audit the pricing algorithm to address concerns regarding harm to consumers.
California is contemplating similar legislation. The California Preventing Algorithmic Collusion Act of 2025 (SB 295)[3] was introduced by Senator Melissa Hurtado (D-Bakersfield) in the California State Senate on February 6, 2025 (and is similar to legislation the senator introduced in 2024). The Act would prohibit the use or distribution of pricing algorithms that incorporate “competitor data,” a term the bill does not define. It also mandates that entities with $5 million or more in annual revenue provide the California Attorney General with reports on their pricing algorithms and disclose to customers when prices are set by such algorithms. Violations could lead to civil actions by the California Attorney General or district attorneys, with penalties including civil fines, forfeiture of corporate rights, and dissolution of the offending entity.
Although neither the federal nor the California proposal targets a specific industry, both respond in part to a contentious debate between landlords and tenants of residential rental apartments that has risen to prominence in the past few years. Owners of large multifamily residential buildings increasingly employ revenue management software to optimize rental pricing. Such software uses artificial intelligence to analyze nonpublic competitor data such as rents, occupancy rates, and lease terms. The software vendor typically obtains the nonpublic data from its customer base of building owners who license the software. Landlords and the vendors of this software contend that these tools enhance price discovery and operational efficiency. However, various legislative and regulatory bodies have expressed concern that such tools may facilitate price-fixing, leading to inflated rents and reduced competition, particularly in tight housing markets. This has prompted legislative and regulatory responses at the local, state, and federal levels, with a focus on protecting renters from unfair practices.
San Francisco and Philadelphia have taken the lead, enacting ordinances specifically targeting revenue management software used to set rents for residential rental apartments. A similar ordinance is under consideration in San Jose, California.
In July 2024, the San Francisco Board of Supervisors passed Ordinance No. 224-24,[4] prohibiting the sale or use of revenue management software for rental housing. It is the first such prohibition enacted in the United States. The ordinance defines the targeted software as algorithmic devices that set, recommend, or advise on rents or occupancy levels using nonpublic competitor data. Remedies may be sought by the city or tenants and include civil penalties up to $1,000 per unit per month, plus damages, restitution, and attorneys’ fees.
In October 2024, the Philadelphia City Council passed Bill No. 240823[5] prohibiting the use of algorithm-driven revenue management tools that use nonpublic competitor data to recommend residential rental prices, fees, terms, or occupancy levels. The ordinance allows the city to sue on behalf of residents and establishes private rights of action. Remedies include treble damages and statutory damages of up to $2,000 per unit per day.
In San Jose, city councilmembers proposed an ordinance in September 2024 that would ban the sale or use of revenue management software for rental housing, following San Francisco’s model. Penalties for violations would include the “return of illegal profits” and up to $1,000 per violation. The ordinance was considered and deferred at an October 2024 meeting of the council. In San Diego, the city council has directed the City Attorney to draft a proposed ordinance banning the use of algorithmic price-fixing software.
Finally, the New Jersey State Legislature is considering a ban on the use of algorithmic price-fixing software in residential rental property markets. Assembly Bill A.4872,[6] introduced in September 2024, is currently pending before the Appropriations Committee. The bill would prohibit the use of software algorithms that employ non-public data to recommend rental prices, lease renewal terms, or occupancy levels to building owners. Doing so would violate the New Jersey Antitrust Act, under which enforcement actions could be taken and remedies sought.
The legislative efforts to regulate algorithmic price-fixing reflect a growing concern with the perceived risks posed by AI-driven pricing tools, particularly in markets where affordability is already a pressing concern. While laws in cities such as San Francisco and Philadelphia aim to shield consumers from inflated prices, the broader debate highlights the need for a nuanced approach that balances consumer protection with the advantages of technological innovation. As artificial intelligence continues to advance, regulatory frameworks must evolve in tandem, remaining flexible enough to tackle new challenges without undermining the efficiency gains these tools can provide. Collaboration among policymakers, industry leaders, and consumer advocates will be essential to develop policies that promote the legitimate interests of the many stakeholders involved in an increasingly digital economy.
[1] In 2024, proposed legislation failed to pass in California, Colorado, Connecticut, Illinois, New Hampshire, New York, Oklahoma, and Rhode Island.
[2] https://www.congress.gov/bill/119th-congress/senate-bill/232/text/is?format=xml&overview=closed
[3] https://legiscan.com/CA/text/SB295/id/3107394#:~:text=The%20bill%20would%20prohibit%20a,is%20to%20that%20extent%20void
[4] https://www.sf.gov/information--sec-3710c-use-and-sale-algorithmic-devices-prohibited
[5] https://phila.legistar.com/View.ashx?M=F&ID=13548432&GUID=DAEBB175-3E16-4BFD-A7BF-4B1DCF3CB850
[6] https://legiscan.com/NJ/drafts/A4872/2024. Senate Bill S.3699 is the assembly bill’s companion legislation.
