The legal woes keep piling up for the Texas-based tech company RealPage, which is accused of using algorithmic software to enable landlords across the country to collude to inflate rents. The landmark cases will likely decide whether AI can be used to circumvent antitrust laws and evade other regulations.
The Washington, DC, attorney general, Brian Schwalb, recently became the first to sue RealPage (along with 14 of DC’s biggest landlords) for using that algorithmic software to set prices. More than 90 percent of large apartment buildings in the DC metro area (those with more than 50 units) use the RealPage software, according to the DC attorney general’s office. The suit accuses them of illegally colluding to set rent prices above competitive levels.
On Nov. 15, the US Department of Justice stepped into a separate massive antitrust lawsuit, backing tenants who are suing RealPage and apartment rental giants from around the country.
RealPage is accused of acting as an information-sharing middleman for real estate rental giants. The company is facing several lawsuits contending that the property managers agreed to set prices through RealPage’s software, which also allowed the companies to share data on vacancy rates and prices in many of the US’ most expensive markets.
The lawsuits against RealPage and the rental management companies contend that RealPage’s software covers at least 16 million units across the US and that private equity-owned property management companies are the most enthusiastic adopters of the RealPage technology. RealPage itself is a private equity-owned venture. Many of the rental markets dominated by large landlords have seen astronomical growth in rental prices in recent years (even before the pandemic), as well as a rising number of evictions and spikes in homelessness.
Reporting, the lawsuits, and RealPage’s own statements show that the company’s software often advised that it was more profitable for mega landlords to keep rents elevated and accept higher vacancy rates, contradicting the old landlord practice of getting heads in beds even if that meant lowering rents.
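A back-of-the-envelope sketch shows how that math can work. The numbers below are purely hypothetical (they come from neither RealPage nor the lawsuits), but they illustrate how a building that holds rents high can gross more revenue even while sitting emptier:

```python
# Purely hypothetical numbers to illustrate the "higher rents, higher vacancy"
# logic described in the reporting -- not figures from RealPage or any landlord.

UNITS = 100  # apartments in a hypothetical building

def annual_revenue(rent_per_month: float, occupancy_rate: float) -> float:
    """Gross annual rent collected at a given occupancy level."""
    return UNITS * occupancy_rate * rent_per_month * 12

# "Heads in beds": cut rents to stay nearly full.
heads_in_beds = annual_revenue(rent_per_month=1_800, occupancy_rate=0.97)

# Algorithm-style strategy: hold rents high and tolerate more vacancy.
hold_rents_high = annual_revenue(rent_per_month=2_000, occupancy_rate=0.92)

print(f"Heads in beds:   ${heads_in_beds:,.0f}")    # $2,095,200
print(f"Hold rents high: ${hold_rents_high:,.0f}")  # $2,208,000
```

In this toy example, leaving five more units empty costs less than the across-the-board rent cut needed to fill them, which is exactly the trade-off the old heads-in-beds rule of thumb ignores.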
The following are some of the real estate goliaths named in the lawsuits that allegedly used RealPage software to collude and keep rents artificially high:
- Greystar: The nation’s largest property management firm, with nearly 794,000 multifamily units and student beds under management, roughly 100,000 of them student beds. In December, it was nominated for six (count ‘em, six!) 2022 Private Equity Real Estate Awards.
- Trammell Crow Company, headquartered in Dallas, is a subsidiary of CBRE Group, the world’s largest commercial real estate services and investment firm.
- Lincoln Property Co. Manages or leases over 403 million square feet across the US.
- FPI Management. Currently manages just over 155,000 units in 18 states.
- Avenue5 manages $22 billion in multifamily and single-family assets nationwide.
- Equity Residential, the 5th largest owner of apartments in the United States, primarily in Southern California, San Francisco, Washington, D.C., New York City, Boston, Seattle, Denver, Atlanta, Dallas/Ft. Worth, and Austin.
- Mid-America Apartment Communities. As of June 30, 2022, it owned or had an ownership interest in 101,229 homes in 16 states throughout the Southeast, Southwest, and Mid-Atlantic regions.
- Essex Property Trust (62,000 units). This fully integrated real estate investment trust (REIT) acquires, develops, redevelops, and manages multifamily apartment communities located in supply-constrained markets on the west coast.
- Thrive Community Management (18,700 units in Washington and Oregon).
- AvalonBay Communities, Inc. As of September 30, 2022, the Company owned or held a direct or indirect ownership interest in 293 apartment communities containing 88,405 apartment homes in 12 states and DC.
- Cushman & Wakefield, with a portfolio of 172,000 units.
- Security Properties. Its portfolio reflects interests in 113 assets encompassing some 22,354 multifamily housing units.
- Cardinal Group Holdings, LLC. Manages 89,000 units and more than 100,000 beds, with a heavy presence in student housing.
- CA Ventures Global Services LLC. Manages more than 60,000 beds in 69 university markets.
- DP Preiss Co. Specializes in student housing and has more than 30,000 beds in 12 states.
The lawsuits against RealPage and these landlords, consolidated in federal court in Nashville, Tennessee, are being watched closely, as the outcome will have far-reaching consequences across nearly every industry where AI price-setting technologies are proliferating.
RealPage and the landlords are currently attempting to have the lawsuits dismissed based on two arguments:
- The defendants claim that the complaints must be dismissed because RealPage recommends, rather than mandates, certain prices.
The DOJ weighed in with the following response on Nov. 15:
The complaints allege that RealPage exerts pressure to enforce its recommendations and that landlords have outsourced pricing decisions to RealPage. Additionally, the DOJ argues, even sharing the information is unlawful: In each of these cases, the challenged price-fixing scheme disrupted the competitive process. It is not necessary for liability that the scheme succeed in raising prices… So long as the evidence shows a mutual understanding among the competing landlords to use RealPage’s prices as a starting point, the scheme is per se unlawful.
The second argument from RealPage and the mega landlords:
- The defendants also argue that their conduct is not price fixing and hence the per se rule cannot apply because “[c]ourts have little to no experience evaluating whether use of revenue management software is unlawful…”
The DOJ’s response: a defendant cannot “shield it[self] from the consequences” of organizing a price-fixing cartel merely because it organizes the cartel through disseminating a software program. Judicial experience is relevant only in determining whether to adopt “a new per se rule”— not in deciding whether to apply an established per se rule.
The DOJ’s argument is essentially that price-fixing law applies to AI. Should that be confirmed by the court, it would have an enormous impact across the country.
“Algorithms are the new frontier,” the DOJ said in its filing. “And, given the amount of information an algorithm can access and digest, this new frontier poses an even greater anti-competitive threat than the last.”
And revelations are now coming out in waves that show nearly every corner of the economy using AI algorithms to fix prices – or worse.
UnitedHealth is now facing lawsuits over its scheme to use AI, via its subsidiary NaviHealth, to cut off care for the elderly and disabled.
Unfortunately, such practices aren’t new, but outsourcing the job to AI might be. Reporting by STAT’s Casey Ross and Bob Herman showed that:
Internal documents show that a UnitedHealth subsidiary called NaviHealth set a target for 2023 to keep rehab stays of patients in private Medicare plans within 1% of the days projected by the algorithm. Former employees said missing the target for patients under their watch meant exposing themselves to discipline, including possible termination, regardless of whether the additional days were justified under Medicare coverage rules….
The stringent performance goal was part of a broader effort to reduce expensive nursing home care for frail patients with privatized Medicare plans, the internal documents show. The strategy was conceived and executed by former top Medicare officials whose policies became a blueprint for UnitedHealth to reap hundreds of millions of dollars annually by shredding the government’s safety net with payment denials backed by an algorithm.
(NaviHealth is also used by Humana, the company with the second most private Medicare enrollees behind UnitedHealth.)
Using AI instead of humans to make care-denial decisions is presumably faster and, from the insurers’ perspective, more effective: UnitedHealth Group’s profits soared, according to STAT. The story also quotes a former UnitedHealth employee who says she was fired for approving coverage for care she felt was justified but that went against the AI system.
RealPage had a similar system for humans who dared to contradict its AI. From ProPublica:
An update to the software tracked not only clients’ acceptance rate, but also the identity of the landlords’ staff members who had requested a deviation from RealPage’s price, the lawsuit said. Compensation for some property management personnel was even tied to compliance with the company’s recommendations, it said.
Elsewhere in the world of algorithmic price fixing, the DOJ filed an antitrust lawsuit against Agri Stats Inc. in September for running anticompetitive information exchanges among broiler chicken, pork and turkey processors. Agri Stats allegedly collects, integrates and distributes price, cost and output information among competing meat processors, which allows them to coordinate output and prices in order to maximize profits. That, in turn, means grocery stores and consumers pay much more.
And there’s always Amazon:
The FTC claimed that Amazon temporarily suspended the algorithm during Prime Day and the holiday shopping season, times when there was more public and customer scrutiny.
Read more: https://t.co/yngQ7zOoq9
— unusual_whales (@unusual_whales) November 19, 2023
Algorithmic collusion is relatively new – or at least understanding and identifying it is. It remains an open question how much the widespread implementation of AI algorithms is contributing to higher prices for everything, but as the Kansas City Fed noted back in January, “Markups could account for more than half of 2021 inflation.” Principal Deputy Assistant Attorney General Doha Mekki admitted as much early this year at an antitrust conference in Miami:
An overly formalistic approach to information exchange risks permitting – or even endorsing – frameworks that may lead to higher prices, suppressed wages, or stifled innovation. A softening of competition through tacit coordination, facilitated by information sharing, distorts free market competition in the process.
Notwithstanding the serious risks that are associated with unlawful information exchanges, some of the Division’s older guidance documents set out so-called “safety zones” for information exchanges – i.e. circumstances under which the Division would exercise its prosecutorial discretion not to challenge companies that exchanged competitively-sensitive information. The safety zones were written at a time when information was shared in manila envelopes and through fax machines. Today, data is shared, analyzed, and used in ways that would be unrecognizable decades ago. We must account for these changes as we consider how best to enforce the antitrust laws.
A 2021 Yale study found that prices can rise above competitive levels when between two and seven competitors in a field use algorithmic pricing. So now comes the task of enshrining the illegality of algorithmic collusion (and enforcing it), which shouldn’t be that complicated, according to the SMU Science and Technology Law Review. Researchers there wrote earlier this year that while the use of AI might be new, the antitrust laws that govern this type of behavior have been around for more than 140 years. And while AI might increase the speed of price fixing by acting on up-to-the-second data, the underlying conduct is otherwise nothing new:
Let’s just change the terms of the hypothetical slightly to understand why. Everywhere the word “algorithm” appears, please just insert the words “a guy named Bob.” Is it ok for a guy named Bob to collect confidential price strategy information from all the participants in a market, and then tell everybody how they should price? If it isn’t ok for a guy named Bob to do it, then it probably isn’t ok for an algorithm to do it either.
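To make the analogy concrete, here is a deliberately simplified sketch, with hypothetical names and numbers, of what “a guy named Bob” (or an algorithm standing in for him) is alleged to do: pool each competitor’s confidential pricing data and hand everyone back a common recommendation. It is not RealPage’s actual software, just the hypothetical restated in code:

```python
# A toy restatement of the "guy named Bob" hypothetical -- NOT RealPage's
# software. The point: swapping a human middleman for a function changes
# nothing about who is sharing what with whom.

from statistics import mean

def a_guy_named_bob(confidential_rents: dict[str, float]) -> dict[str, float]:
    """Collect every competitor's non-public rent, then tell them all what to charge."""
    market_average = mean(confidential_rents.values())
    markup = 1.05  # nudge everyone above the pooled average
    recommendation = round(market_average * markup, 2)
    return {landlord: recommendation for landlord in confidential_rents}

# Each landlord privately submits pricing data it would never show a rival...
submissions = {"Landlord A": 1850.0, "Landlord B": 1990.0, "Landlord C": 2060.0}

# ...and every one of them gets back the same coordinated "recommendation."
print(a_guy_named_bob(submissions))
# {'Landlord A': 2065.0, 'Landlord B': 2065.0, 'Landlord C': 2065.0}
```

Whether the middleman is Bob or a block of code, the information flow (and, the SMU researchers argue, the legal analysis) is the same.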