We all love Zillow browsing. But there’s a dark side to the real estate app.
The clear and present danger of artificial intelligence is not robots enslaving humans, but the capacity of A.I. to dehumanize our lives, work and decision-making in thousands of subtle ways we do not always see. One of those ways has been in real estate markets, where home values are assessed instantly via algorithms written by companies like Redfin, Realtor.com and—to its great regret—the online real estate company Zillow.
In the last two years, Zillow and its rivals have participated in a real estate boom fueled by low interest rates and Covid-19 stimulus checks. The homebuying frenzy, along with a housing shortage worsened by a decline in the construction of single-family homes, has led to a dramatic spike in home prices. Just in the second quarter of 2021, the median price of a single-family home climbed 22.9 percent, to $357,900, the biggest such jump since the National Association of Realtors began keeping records in 1968. That is good news for real estate investors and house-flippers, but a problem if you care about offering every American a chance at an affordable home.
Housing prices have risen and fallen in the past, but artificial intelligence is a new factor in this cycle, thanks in part to Zillow. Earlier this month, The Wall Street Journal reported, the online real estate company killed off Zillow Offers, an "iBuyer" (for "instant buyer") business it had started in 2018; the operation purchased homes that Zillow's algorithm said were undervalued, then renovated and sold them at a profit. This strategy had helped contribute to higher home prices and the speculative boom. But as things turned out, Zillow underestimated the dangers of letting A.I. make important choices about housing, and it failed to appreciate how algorithms sometimes cannot grasp the nuances of human thinking and decision-making.
Zillow thought it had a competitive edge thanks to its Zestimate tool, which estimates the value of a home from its location, size and other variables; Zillow has been used by everybody, from families looking for new homes to people gawking at their neighbors' mansions. This summer, there were reports of Zillow offering homeowners tens of thousands of dollars more than their asking price—and in cash, a proposition difficult to refuse. It was an example of A.I. accelerating a trend, in this case ballooning real estate prices, and perhaps contributing to gentrification in certain urban neighborhoods by encouraging people to move out of their homes.
But the strategy didn’t work because, it turned out, the algorithm could not accurately model what humans value when they buy property. It likely overvalued some property characteristics while neglecting intangibles like hometown loyalty, the quality of local school districts and proximity to parks. As a result, Zillow said it expected to lose between 5 and 7 percent of its investment in selling off the inventory of some 18,000 homes it had acquired or committed to buy.
The company, which had once said it could make $20 billion a year from iBuyer, now says it will have to reduce its workforce by 25 percent. “We’ve determined the unpredictability in forecasting home prices far exceeds what we anticipated,” Zillow chief executive Rich Barton admitted in a company statement.
This is a story about the limits of algorithmic decision-making: Even during the salad days of a profitable industry, A.I. failed to make money. In that way, it was all too human.
But the Zillow misadventure also clarifies a broader dysfunction in the economy and a moral problem. In “Fratelli Tutti,” Pope Francis defended the right to private property but observed that it “can only be considered a secondary natural right, derived from the principle of the universal destination of created goods.” As Francis observed, “it often happens that secondary rights displace primary and overriding rights, in practice making them irrelevant.”
Housing is one of the most important goods that must be “universally destined.” And besides meeting the need for shelter, Georgetown University’s Jamie Kralovec told me, better urban planning has the potential “to build just and equitable use of the neighborhood, and bring about all these things Pope Francis talks about, like social friendship and solidarity.” Like hometown loyalty, these concepts are hard to plug into algorithms.
Investors and speculators of all types seek to make as much money as they can, and thanks to A.I., they now have better tools to do it. The New York Times last week profiled a California-based real estate investor looking to build up a property portfolio in Austin, Tex. The investor, the Times reported, used online searches and algorithms and “resolved to acquire 10 homes within a 12-minute drive” of Apple’s offices. “For $1 million down,” the piece read, “he’d own $5 million in assets that he would rent out for top dollar and that he believed would double in value in five years and double again by 12 years.”
That is an example of a human using A.I. as a machine to increase their productivity, but it underscores the risk that “A.I. systems can be used in ways that amplify unjust social biases,” as Shannon Vallor, a professor of philosophy now at the University of Edinburgh, told me as I was researching a 2018 story on the ethical questions surrounding artificial intelligence. “If there’s a pattern, A.I. will amplify that pattern.”
In other words, A.I. is a tool that can make bad trends worse and good trends better. When it comes to housing, our society will have to choose a direction.