Criminals are using artificial intelligence to impersonate real estate professionals and infiltrate transactions – do your clients know the difference?
Sophisticated scammers are using deepfake audio and video technology to mimic agents and other parties and take control of real estate transactions, security experts warn. Last month, police in Hong Kong reported that a finance worker was duped into wiring $25 million to scammers after joining a live videoconference in which deepfake technology was used to impersonate his company's CFO and other staff members. He learned only after making the transfer that the colleagues on the call had been fabricated.
CertifID's recent report, "2024 State of Wire Fraud," documents this trend as part of the emerging real estate fraud threat landscape: bad actors are using artificial intelligence tools to impersonate real estate professionals' written and voice communications under fake agent names. Tyler Adams, CEO of CertifID, says fraudsters' use of AI makes it harder than ever to discern legitimate information in real estate transactions. "Older generations could rely on misspellings in an email address as an early indicator that communications could be fraudulent," says Adams. "Those days have come and gone. Communications now appear legitimate, and there's less and less opportunity to identify what's fraudulent."
Adams says real estate professionals should step up awareness efforts with their clients and advise them to always double-check the information they receive. "Education early and often is the key," he says. Consumers need continual reminders that fraud can strike at any stage of a transaction, which is why building awareness early is essential to keeping them vigilant.
Cara Carlin, director of communications for the Better Business Bureau in Arkansas, has raised the alarm about AI risks to real estate. AI-generated listings, fake "spider" seller personas, AI-generated conversations with property owners, and even voice impersonations are threatening the industry, Carlin warned CBS affiliate THV-11 News in Little Rock. "Unfortunately, there aren't a lot of telltale red flags," she said, but one sign of attempted deception is an owner or agent who is unwilling to meet face to face.
Deepfake videos could trick real estate agents into listing properties that don't exist or drafting sales contracts that misrepresent a home's condition, and AI could even be used to gain access to client data, says real estate safety expert Tracey Hawkins on REALTOR® Magazine's podcast series Drive With NAR: The Safety Series. To protect buyers and sellers against such risks, Hawkins recommends verifying identities, implementing secure payment procedures, and adopting stringent security measures.
Face-to-face contact may become even more critical in a deepfake world. Security experts emphasize verifying sensitive information through a source other than the email in which it arrived, since that email may come from a fraudster. Hawkins also urges clients and colleagues to use secure channels, such as encrypted email and messaging apps, when discussing sensitive real estate data with one another.