
Some in the real estate world think ChatGPT represents a great new tool to save money and serve customers better. Unfortunately, it’s not ready for prime time.
Some in the real estate industry are arguing that AI answers to lead-qualifying and simple questions from buyers and sellers are a far more accurate, dependable and inexpensive alternative to existing chat functions on brokerage websites or dedicated phone staff. Nothing could be further from the truth.
Agency and property disclosures are the two primary sources of major residential real estate litigation in the U.S. Unfortunately, if some real estate professionals’ advice is followed, buyers and sellers will have to increasingly rely on AI for these types of answers. Rather than making our disclosure obligations easier, AI answers that are riddled with misinformation will make them infinitely more difficult.
To show how many problems ChatGPT can create for your business, I asked it five common questions buyers ask agents. To be fair, ChatGPT did give partially correct advice, but in each example below, the correct advice was coupled with incorrect advice that could lead to litigation.
Misinformation and Risky Recommendations
Question 1: What is the condition of the property?
Seems simple, right?
Here’s what ChatGPT told me: “If the agent has personally viewed the property, they can describe any visible defects or issues that they have noticed, such as signs of wear and tear, damage, or needed repairs. They can also provide information on the age and condition of major systems and components, such as the roof, HVAC, electrical, plumbing, and appliances.”
Most agents should be able to spot the problems in this right away. When buyers inquire about the condition of the property, the only correct answer is to advise them to hire their own inspector and to do their own due diligence. Under no circumstances should agents try to evaluate any of the home’s systems or components, nor should they attempt to determine the age and condition of those systems.
Question 2: What are the average closing costs and how much should I expect to pay?
ChatGPT missed something here: “For a homebuyer, typical closing costs may include fees for the mortgage loan, such as application fees, appraisal fees, and loan origination fees. Other costs may include title search and insurance fees, survey fees, property taxes, and homeowner’s insurance. On average, buyers can expect to pay between $3,000 and $7,500 in closing costs. For a home seller, typical closing costs may include fees for the real estate agent, title search and insurance fees, transfer taxes, and attorney fees. On average, sellers can expect to pay between 1% and 3% of the sale price of the home in closing costs.”
ChatGPT correctly states what types of fees are included in the closing costs for the buyer, but as soon as it veers off into stating the potential amounts of closing costs, it’s providing the consumer and agent with information that is highly regulated by TRID (TILA-RESPA Integrated Disclosures). TRID was implemented in 2015 to harmonize the disclosure requirements and regulations of the Real Estate Settlement Procedures Act (RESPA) and the Truth in Lending Act (TILA).
Agents should advise their clients that they will receive an estimate of the mortgage and closing costs from the escrow/title/lender that follows the established TRID guidelines.
Where ChatGPT gets it entirely wrong is in stating that “between 1% and 3% of the sale price” covers both the Realtor fees and the other closing costs. This is an egregious error.
Steering and Bad Legal Advice
Question 3: How should Realtors advise their buyers what to do if the buyer wants to back out of their deal?
What ChatGPT said: “If a buyer wants to back out of a deal, it is the realtor’s responsibility to advise them on the legal and financial consequences of doing so.”
This is another egregious error. Realtors cannot give legal advice, nor can they comment on the financial consequences of cancelling a transaction. The best course of action is for the agent to turn this over to their manager/broker/owner who can consult with the firm’s attorney on the specifics involved in each particular case.
ChatGPT did correctly suggest one other thing the Realtor can do: “Explore alternatives such as renegotiating terms of the contract.” Nevertheless, it followed that advice with another error, suggesting the buyer’s agent should consider finding “a new buyer to take over the deal.”
Question 4: What is the neighborhood like?
ChatGPT chose to jump right on this common landmine with both feet: “The Realtor should provide a detailed response that covers all the important factors that the buyer is likely to be interested in, such as the safety of the area, local amenities, public transportation, and property values. The Realtor should also use data and statistics to support their response such as crime rates, school ratings, and property appreciation rates.”
There are multiple issues with the ChatGPT statement above, including its advice that the Realtor discuss the safety of the area, crime rates, school ratings and property appreciation rates. A primary danger here is violating Fair Housing laws prohibiting “steering.”
Here’s how the National Association of Realtors defines that illegal and unethical practice: “The process of influencing a buyer’s choice of communities based upon the buyer’s race, color, religion, gender, disability, familial status, or national origin.”
Steering limits the housing opportunities available to that buyer. If an agent expresses his or her own positive or negative views about certain communities or schools, the purpose of which is to direct a buyer either towards or away from a community, then that agent may be stating a housing preference based on race or familial status or religion. These would be violations of the Fair Housing Act and of NAR’s Code of Ethics.
Nowhere is this more of an issue than when the question of schools comes up during the homebuyer search. Discussions about schools can raise questions about steering if there is a correlation between the quality of the schools and neighborhood racial composition – or if characterizations such as “a school with low test scores” or “a community with declining schools” become code words for racial or other differences in the community.
What you can do is provide your buyers with a report from NARRPR.com (Realtors Property Resource) that offers not only neighborhood information, but also the latest statistics on recent sales and where the market is currently trending. A second site, Attom Data’s HomeDisclosure.com, provides a wealth of additional data. Both sites are free to Realtors.
Going Beyond Its Remit
Question 5: “What kind of mortgage options are available to me?”
ChatGPT did correctly advise: “Encourage the buyer to get pre-approved by a lender before making an offer on the property.” It should have stopped there.
Instead, ChatGPT advised the agent to: “Gather information about the various types of mortgages available and to discuss the pros and cons of each type of mortgage taking into consideration the buyers’ credit score and debt ratios.”
Agents have no business discussing pros and cons of the various types of financing, nor are they authorized to obtain and discuss credit scores or debt ratios. They can tell their clients that there are fixed and adjustable-rate mortgages, low-down payment loans such as FHA, USDA and VA, and that down payment assistance is available for qualified buyers. Any discussion of these programs, however, should be left to a mortgage professional.
The best course of action is to have all buyers pre-qualified prior to taking them out to look at property. Also, if you refer your buyers to a lender affiliated with your brokerage, be sure to make the appropriate affiliate disclosures to the buyer.
As you can see, ChatGPT is loaded with landmines. Right now, it’s best that real estate professionals handle it with extreme caution.
Bernice Ross is a nationally syndicated columnist, author, trainer and speaker on real estate topics. She can be reached at bernice@realestatecoach.com.