Mortgage Daily

Published On: January 5, 2012

Like most things today, the residential real estate space is driven by technology. It is common for appraisers to use handheld devices in their everyday work, making laptops seem clunky by comparison and clipboards with paper thoroughly outmoded. Loan officers are virtual and mobile, and field services providers update their clients using cloud technology and geocoding to help validate their work.

So it is no surprise that automated valuation models have also made tremendous strides in the past few years. Greeted with skepticism when first introduced more than a decade ago, they have since made remarkable advances in accuracy and usefulness.

But how accurate are they? Have they evolved sufficiently to become valuable tools for originators, servicers and everyone with a rooting interest in obtaining precise values for residential real estate? The answer is definitely yes, but users need to understand more about them than ever before in order to select the appropriate AVM(s) for their use and to implement them properly.

Where once there were only a few AVMs commercially available, there are now multiple options, many of them specialized for particular uses. And though AVMs once struggled to fill holes in available data, there are now so many sources of data that the first challenge for developers is deciding what to do with the information they receive.

Fueling AVM calculations today is a volume of data unlike anything available a decade ago; feeding it into a model is like drinking from a fire hose. Ten years ago the sources of data were finite, well guarded by their owners or simply not accessible in a usable digital format. Buying data to stream into an automated valuation model was expensive for providers, and as a result, the market availability of AVMs was limited. When data sources were more scattered and less reliable, the industry struggled to obtain some of the essential building blocks for the models, and AVM developers struggled to improve hit rates and accuracy alike.

Today there is so much data that it must be deployed with great care. AVM quality is now largely a matter of data purity, and of understanding that not all data is of comparable accuracy.

Fortunately, at the same time developers have seen improvements to data sources, there has been a vast increase in computing power to support the new requirements of managing AVM inputs. Algorithms are better than ever and modelers understand a lot more about markets and trending than they did in the late 1990s when AVMs began to take root, resulting in improved interpretation of the higher volume of information.

There may be multiple sources of data regarding square footage, for example, or differing counts of bedrooms and bathrooms. Short sales, foreclosures and serial refinances have also introduced inconsistencies into many property attributes in databases. The key, then, is careful identification of these discrepancies and knowing which inputs are the most reliable in order to improve the accuracy of the model.
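To make the idea concrete, the reconciliation step might look like the sketch below. The source names, field names and the simple trust-priority rule are hypothetical illustrations, not any vendor's actual method; real AVM pipelines use far more sophisticated weighting.

```python
# Reconcile conflicting property attributes from multiple data sources.
# Source priority and field names are hypothetical, for illustration only.
SOURCE_PRIORITY = ["mls", "county_assessor", "prior_appraisal"]  # most to least trusted

def reconcile(records):
    """Pick one value per attribute, preferring the most trusted source,
    and flag any attribute on which the sources disagree."""
    resolved, flags = {}, []
    attributes = {attr for rec in records.values() for attr in rec}
    for attr in attributes:
        values = {src: records[src][attr] for src in SOURCE_PRIORITY
                  if src in records and attr in records[src]}
        if len(set(values.values())) > 1:
            flags.append(attr)  # discrepancy: candidate for review
        for src in SOURCE_PRIORITY:
            if src in values:
                resolved[attr] = values[src]
                break
    return resolved, flags

records = {
    "mls":             {"sqft": 1850, "beds": 3, "baths": 2},
    "county_assessor": {"sqft": 1790, "beds": 3, "baths": 2},
}
resolved, flags = reconcile(records)
print(resolved["sqft"], flags)  # 1850 ['sqft']
```

Here the two sources disagree on square footage, so the attribute is flagged while the value from the higher-priority source is carried forward into the model.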

Together, more complex algorithms and richer data sources, such as MLS feeds, produce more accurate AVMs. But there is one final hurdle before an AVM is truly useful: it must be managed and tested. The job of AVM management includes the continual task of seeking out discrepancies. It is essential to keep data updated and accurate, and to monitor performance continually, so that the accuracy observed in testing correlates directly to the AVM's use in real-world scenarios.

Confidence scores, an integral part of most AVMs, help users understand and appreciate the median errors present in the reports. Another common metric is the forecasted standard deviation, or FSD, which provides an alternative statistical approach to expressing error rates. There are dozens of metric standards, and the numerous AVM vendors use different methodologies to illustrate the reliability of their AVMs, but the intent is similar: making the information more useful for the various business models present among mortgage lenders.
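A back-test of the kind these metrics summarize can be sketched as follows. The sale pairs are invented, and the sample standard deviation shown is only a back-test analogue of FSD; a true FSD is a forward-looking, vendor-specific forecast of error dispersion.

```python
import statistics

# Illustrative accuracy metrics computed from hypothetical back-testing
# pairs of (avm_estimate, actual_sale_price).
pairs = [(205_000, 200_000), (188_000, 195_000), (310_000, 298_000),
         (152_000, 160_000), (275_000, 270_000)]

errors = [(est - sale) / sale for est, sale in pairs]        # signed % errors
median_abs_error = statistics.median(abs(e) for e in errors)
error_stdev = statistics.stdev(errors)                       # FSD-like spread
hit_rate_10 = sum(abs(e) <= 0.10 for e in errors) / len(errors)

print(f"median absolute error: {median_abs_error:.1%}")
print(f"error std deviation (FSD analogue): {error_stdev:.1%}")
print(f"estimates within +/-10% of sale price: {hit_rate_10:.0%}")
```

Vendors report variations on these same ideas, such as the share of estimates falling within 10 percent of the eventual sale price, which is why lenders must compare methodologies rather than raw numbers.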

Different lenders have different lending footprints. Some may be stronger in the non-disclosure states, so they may prefer AVM vendors with an affinity for valuations in those states. Another lender may emphasize California lending, and will want an AVM provider showing proficiency in the West. Like other products, AVMs often have geographic distinctions: some do very well in the Midwest, while others are stronger along the Eastern Seaboard. It is another reason why lenders must test and develop the metrics that show which AVMs work best for their specific purposes. These testing parameters can be subjective from one AVM user to the next, but the importance of testing itself, in order to meet specific business objectives and maintain compliance with regulatory standards, is immediately clear.
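Such a geographic comparison can be set up as a simple back-test grouped by state. The vendors, states and error figures below are hypothetical, intended only to show the shape of the analysis.

```python
from collections import defaultdict
import statistics

# Hypothetical back-test rows: (state, AVM vendor, absolute % error).
results = [
    ("CA", "avm_a", 0.04), ("CA", "avm_a", 0.06), ("CA", "avm_b", 0.09),
    ("CA", "avm_b", 0.11), ("TX", "avm_a", 0.12), ("TX", "avm_a", 0.10),
    ("TX", "avm_b", 0.05), ("TX", "avm_b", 0.07),
]

by_state_vendor = defaultdict(list)
for state, vendor, err in results:
    by_state_vendor[(state, vendor)].append(err)

# Median absolute error per (state, vendor), then the best vendor per state.
medians = {k: statistics.median(v) for k, v in by_state_vendor.items()}
states = {s for s, _ in medians}
best = {s: min((v for (st, v) in medians if st == s),
               key=lambda v, s=s: medians[(s, v)]) for s in states}
print(best)
```

In this invented sample, one vendor wins in California and the other in Texas, which mirrors the geographic specialization the article describes and why a single national ranking can mislead.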

As for the future of the AVM, it is reasonable to expect developers to keep adjusting their models and better interpreting the available data to improve accuracy, until a truly new data source becomes available to incorporate into the algorithms. Until that point, developers will approach a plateau for AVM accuracy where the error range becomes so small, perhaps under plus or minus 5 percent, that we will effectively be dealing with the psychological thresholds affecting valuation.

AVM providers will universally experience great difficulty crossing this barrier because it is influenced by factors that have little to do with trends and data and more to do with subjective human interactions. For example, a house listed for (and previously valued at) $200,000 may receive a lukewarm offer at $180,000 that is accepted because a motivated owner has a specific need to sell. On the other hand, the same property might be visited on an especially beautiful day when many shoppers are out, drawing multiple offers to a seller willing to hold out for the highest bid and ultimately accepting $210,000. This type of scenario happens frequently due to the psychological motivations of the human parties. It will prove exceedingly difficult for an AVM to close that last 5 percent gap, as even the on-site valuation categories, appraisals and BPOs, are not able to account for this interpersonal influence.

The industry will continue to see improvements in data and models, but is currently in the latter part of the cycle and the error curve is flattening out. AVMs are far more accurate than they were a short decade ago and are an exceedingly helpful tool in a lender’s arsenal.
