A warning for brokers: ‘The AI said so’ won’t work with mortgage regulators

Brokers and lenders must be vigilant about the use of AI in the mortgage process

Ask anybody in the mortgage industry what one of the major storylines of 2026 is likely to be, and almost every one of them will say it’s artificial intelligence (AI).

Recent trade shows in Las Vegas and Nashville had vendor halls packed with companies selling their latest and greatest technology to eager, if not overwhelmed, mortgage professionals.

Before turning that new AI technology loose on mortgage files, one mortgage attorney wants to remind brokers to ensure the new tech doesn’t run afoul of loan compliance rules.

Peter Idziak, a senior associate and mortgage attorney at Polunsky Beitel Green, said the first step is to figure out exactly what the new technology actually does.

“There are a couple of aspects, like when you're talking about AI, what are you really talking about?” Idziak told Mortgage Professional America. “Because every vendor is slapping AI on something. Is it really doing any generative AI? Is it really doing any thinking for itself? Or is it more like a very smart data analysis?”

‘The AI said so’

Idziak said one of the first things he tells clients is to ensure they understand how they handle data in relation to AI technology.

“One of the first things that we have to tell our clients is to be aware that if you have consumer data, you can't just export that off your system into ChatGPT,” Idziak said. “Because people want to experiment. But recognize that if you have privacy protections and data security, you need to respect that.”

Because there are so many rules and regulations in the mortgage industry, it is up to brokers and lenders to question vendors thoroughly to ensure their software complies with all applicable laws, including the Equal Credit Opportunity Act (ECOA) and the Fair Housing Act (FHA).

“Whenever you're engaging vendors, what's their training data?” he said. “From a fair lending perspective, we have ECOA, so you can't discriminate based on sex, race, or national origin. But if you have a vendor from outside the space that's come in saying, ‘Hey, I'm gonna help you underwrite your loans.’ One, what data do they train on? Two, how does it do its thinking, and how is it producing the result?

“Because, for ECOA adverse action notices, you need to have a reason. You can't just say ‘The AI said so.’ Well, why did it say so? ‘I don't know. It's a black box.’  So that's what people consider the more generative side.”

For agentic AI, which is typically a more autonomous product, Idziak cautions clients to hold the technology to the same standards they would a human employee. If it messes up, the courts will likely hold the company liable regardless of whether AI or a human made the mistake.

“With agentic AI, the chatbots, what's your testing regime?” Idziak said. “How are you going to treat this? I've been of the view that you should treat it as a human employee. If you wouldn't tolerate something from a human employee, or if you would be held liable for what your human employee does, consider the agentic AI to be the same.

“Consumers will try to jailbreak your AI. They will try and get it to give you answers and then come to you and say, ‘Well, this is what your chatbot told me, you know.’ And you’ll say, ‘That's because you figured out a clever way to ask it to do something.’”

Cost savings? Maybe not right away

One of the sales pitches vendors make is how much money AI technology will save a company. Idziak said that while that may be true over the long haul, companies trying out so many new tech systems may actually be spending more in the short term.

“Anything that can lower the cost of origination,” he said. “Although anecdotally, you hear the talk right now, it's increasing your cost of origination, because you have to buy all these new tools. You've got to test them out and see what works.”

And while brokers and lenders will need to ensure their new AI technology complies with all mortgage rules, Idziak sees value in automating menial tasks like data entry. This could prevent mistakes down the line and save money on costly corrections.

“Where you've seen a lot of clients that have really seen value from integration across their systems, and then being able to either directly import and export or extract data from the documents you have,” Idziak said. “There are a lot of areas in our industry where people are physically typing in information that has already been typed in once before. Not only is it slower, but there's a risk of error.

“In a lot of states, if you make a mistake in the deed of trust, you have to file a corrective affidavit. It takes employee time and takes recording fees. It's not a trivial cost. So to the extent that you can eliminate those errors, I think it's valuable, and from a compliance side, there's minimal risk to that.”
