AI could do nearly 12% of your work already, says new study

Major MIT study shows the mortgage industry could already slash costs substantially

Artificial intelligence is no longer just a buzzword tossed around at fintech conferences. For Australia’s mortgage industry, it is rapidly becoming a force inside broker CRMs, bank credit centres and non‑bank processing hubs — and new research suggests it could already perform a sizeable share of the work those systems contain.

MIT’s Project Iceberg, a large‑scale simulation of the U.S. labour market, concludes that today’s AI tools are technically capable of performing tasks worth 11.7% of total U.S. wage value — about US$1.2 trillion a year. The model represents roughly 151 million workers across 923 occupations, mapped to more than 32,000 distinct skills. 

For Australian mortgage professionals, those numbers land close to home. Few sectors blend financial analysis, document review and administrative routines as tightly as home lending does.

 

A microscope on the work behind every loan

Project Iceberg does not try to forecast next year’s unemployment rate. Instead, it asks a more precise question: for each occupation in the economy, what fraction of its skills are ones that AI has already demonstrated it can perform in at least one real context? 

To answer it, the researchers treat each worker as an “agent” with a bundle of tasks and skills, then match those against thousands of existing AI tools. The resulting measure, the Iceberg Index, captures what they call technical exposure — “where AI can perform occupational tasks, not displacement outcomes or adoption timelines.” 
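To make that matching concrete, here is a minimal sketch of how a skills-based exposure measure of this kind might be computed. The occupations, wages, time shares and the list of AI-capable skills below are invented for illustration; they are not Project Iceberg's data or code.

```python
# Toy illustration of a skills-based "technical exposure" measure.
# Every figure and skill label here is made up for the example.

occupations = {
    # occupation: (annual wage bill, {skill: share of the job's time})
    "credit assessor": (90_000, {"extract data from documents": 0.4,
                                 "apply credit policy rules": 0.3,
                                 "negotiate complex files": 0.3}),
    "loan processor": (70_000, {"extract data from documents": 0.6,
                                "draft standard letters": 0.2,
                                "chase missing documents": 0.2}),
}

# Skills an existing AI tool has demonstrated in at least one real context
# (a hypothetical list, standing in for the thousands of tools Iceberg maps).
ai_capable = {"extract data from documents",
              "apply credit policy rules",
              "draft standard letters"}

def exposure_index(occupations, ai_capable):
    """Share of total wage value sitting in AI-capable skills."""
    exposed = total = 0.0
    for wage, skills in occupations.values():
        total += wage
        exposed += wage * sum(share for skill, share in skills.items()
                              if skill in ai_capable)
    return exposed / total

print(f"Technical exposure: {exposure_index(occupations, ai_capable):.1%}")
```

Run over these two toy occupations, the figure comes out far higher than the economy-wide 11.7%, which is the point: exposure concentrates where document-heavy, rules-driven work dominates.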

In plain terms, the 11.7% figure is not a prediction of layoffs. It is a map of where employers already have a choice: keep using humans, deploy machines, or design some hybrid of the two.

In mortgages, where a loan application can pass through dozens of small, rule‑driven steps — many still propped up by spreadsheets and email chains — that choice is moving from theoretical to practical very quickly.

 

The real action is below the surface

It is tempting to think AI risk lives mainly with software engineers and data scientists. Project Iceberg suggests otherwise.

When the researchers look only at current AI adoption in computing and technology occupations, they find exposure of about 2.2% of total wage value — roughly US$211 billion. They label this the “Surface Index”, and describe it as “only the tip of the iceberg.” 

Beneath that visible tip sits a much larger mass of potential automation. Technical capability “extends far below the surface through cognitive automation spanning administrative, financial, and professional services,” they write, bringing total exposure to 11.7% and about US$1.2 trillion in wages. 

One summary notes that high Iceberg scores in several U.S. states are driven by “cognitive work—financial analysis, administrative coordination, and professional services.” For Australia, where brokers, banks and non‑banks all rely on large operational teams to sift through documents, apply credit policy and communicate with borrowers and brokers, the parallel is obvious.

 

How AI already “sees” an Australian mortgage file

The Iceberg report does not single out “mortgage broker” or “credit assessor” by name. Instead, it focuses on tasks AI has already mastered: reading documents, extracting data, applying rules and producing standard text.

Those are not abstract activities in this industry.

Document and data processing

The researchers point out that financial institutions now deploy AI for “document processing and analytical support,” while healthcare systems automate “administrative tasks.” 

Australian mortgage operations combine both patterns. Every day, systems — and the people behind them — ingest payslips, tax returns, BAS statements, bank statements, credit reports and contracts of sale. They extract incomes, liabilities, living expenses and red flags into lender systems. They classify documents and chase missing items.

Project Iceberg’s message is that these are exactly the kinds of tasks AI is already good at.
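As a toy illustration of that extraction step, the snippet below pulls a gross-pay figure out of a line of payslip text. The text and pattern are invented; production document AI relies on OCR and learned models rather than a single regular expression, but the structured output is the same idea.

```python
import re

# Invented payslip line; a real system would first OCR a scanned document.
payslip_text = "Gross pay this period: $4,850.00   Net pay: $3,712.40"

# Pull the gross-pay amount into a number a lender system can store.
match = re.search(r"Gross pay[^$]*\$([\d,]+\.\d{2})", payslip_text)
gross_pay = float(match.group(1).replace(",", "")) if match else None

print(f"Extracted gross pay: {gross_pay}")  # 4850.0
```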

Rules, thresholds and triage

The same applies to policy checks and triage. Eligibility against product matrices, debt‑to‑income and net‑surplus calculations, and routing of clean versus messy files are all rules‑based operations. Technically fiddly, but conceptually well within the reach of software designed to spot patterns and apply conditions.

Iceberg suggests that, at a technical level, current AI could already handle a substantial share of this work.
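To show what those rules-based checks look like in code, here is a minimal sketch that applies a debt-to-income cap and a net monthly surplus floor, then routes the file. The thresholds and field names are hypothetical, not any lender's actual credit policy.

```python
from dataclasses import dataclass

@dataclass
class Application:
    monthly_income: float      # verified net monthly income
    monthly_expenses: float    # assessed living expenses
    monthly_repayments: float  # proposed loan plus existing commitments
    total_debt: float          # all debt including the new loan
    annual_income: float       # gross annual income
    documents_complete: bool

# Hypothetical policy thresholds, for illustration only.
MAX_DTI = 6.0        # total debt / gross annual income
MIN_SURPLUS = 500.0  # required net monthly surplus, in dollars

def triage(app: Application) -> str:
    """Route a file to a queue under these toy rules."""
    if not app.documents_complete:
        return "chase missing documents"
    dti = app.total_debt / app.annual_income
    surplus = app.monthly_income - app.monthly_expenses - app.monthly_repayments
    if dti <= MAX_DTI and surplus >= MIN_SURPLUS:
        return "clean file: fast-track queue"
    return "refer to credit assessor"

print(triage(Application(8_000, 3_500, 2_800, 520_000, 110_000, True)))
```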

Templates, not novels

A lot of the industry’s written output — conditional approvals, variation letters, standard emails to brokers and customers — follows strict templates shaped by legal, credit and compliance teams. The variables change; the structure remains constant.

For generative models, this kind of constrained writing is straightforward.
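Here is a minimal sketch of that kind of constrained generation, using an invented conditional-approval template; only the variables change between letters, and a model (or plain string substitution, as here) fills them in.

```python
from string import Template

# Invented template; real wording is set by legal, credit and compliance teams.
CONDITIONAL_APPROVAL = Template(
    "Dear $borrower,\n\n"
    "Your application for a loan of $$${amount} over $term years has been "
    "conditionally approved, subject to:\n"
    "$conditions\n\n"
    "Please provide the outstanding items within 14 days."
)

print(CONDITIONAL_APPROVAL.substitute(
    borrower="A. Citizen",
    amount="650,000",
    term=30,
    conditions="- Satisfactory valuation of the security property\n"
               "- Evidence of current building insurance",
))
```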

None of that means loan writers, processors and assessors are about to disappear. It does mean that the portions of their day dominated by data entry, document classification and boilerplate communication are under real pressure.

 

A profession with decisions to make

Although Iceberg was built with governments in mind, its questions are increasingly landing on the desks of Australian lenders and aggregators.

One of the project’s goals is to help policymakers “identify exposure hotspots, prioritize training and infrastructure investments, and test interventions before committing billions to implementation.” For the mortgage profession, the same logic applies: understand where AI can bite, before you re‑platform, restructure or outsource.

Three questions stand out.

Where is your own iceberg?

Because the Iceberg Index is built on skills rather than titles, exposure varies widely inside the same role. A credit assessor who spends most of her day negotiating tricky self‑employed files is not in the same position as one who spends hours keying in data on straightforward PAYG applications.

For any given lender or brokerage, the first step is to break roles into tasks: gathering documents, entering data, running calculators, applying standard rules, explaining outcomes, negotiating with borrowers and brokers. Only then can leaders estimate what share of their wage bill sits in work that looks like the document processing and administrative tasks the report highlights. 
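A rough sketch of that first step: break each role into tasks, flag the ones that look like document processing or administration, and see how exposure diverges even within a single job title. All task mixes and the "exposed" set below are placeholders, not benchmarks from the report.

```python
# Toy task breakdowns for two people with the same job title.
EXPOSED_TASKS = {"data entry", "document classification", "standard letters"}

assessors = {
    "assessor A (mostly straightforward PAYG files)": {
        "data entry": 0.45, "document classification": 0.25,
        "standard letters": 0.10, "complex negotiation": 0.20},
    "assessor B (mostly tricky self-employed files)": {
        "data entry": 0.10, "document classification": 0.10,
        "standard letters": 0.05, "complex negotiation": 0.75},
}

for name, tasks in assessors.items():
    share = sum(t for task, t in tasks.items() if task in EXPOSED_TASKS)
    print(f"{name}: {share:.0%} of time in exposed tasks")
```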

What do you want to do with that 11.7%?

Iceberg’s headline number invites a strategic choice. Some institutions will use AI predominantly to cut costs in operations and credit, leaning hard on automation to improve cost‑to‑income ratios. Others may aim to keep headcount steadier, allowing AI to absorb volume and re‑work while redeploying people into areas machines still struggle with: complex lending, hardship and arrears support, regional and broker relationships, and outreach to segments that have historically been underserved.

In a market where borrowers complain about turnaround times and opaque decisions, the way that choice is made will shape reputations as much as balance sheets.

Can you explain your models — to ASIC, APRA and your customers?

The researchers note that traditional macro indicators such as GDP, income and unemployment explain “less than 5% of this skills-based variation” in exposure. That is a warning that change will show up deep inside processes before it ever appears in headline numbers.

For Australian lenders, it also raises familiar conduct questions in a new key: If AI influences credit decisions, can you still explain those decisions clearly to customers and regulators? Are vulnerable borrowers treated fairly? Can you show that automated rules and models don’t produce unintended bias across different groups or regions?

 

A narrowing window for the industry

Project Iceberg is, at heart, an early‑warning system. It exists so societies can prepare for AI‑driven change before it shows up as closed branches or long‑term unemployment. The authors are blunt: “The window to treat AI as a distant future issue is closing.” 

For the Australian mortgage profession, that window feels particularly narrow. Home lending sits at the intersection of finance and paperwork, shaped by responsible‑lending rules and rising customer expectations, and competing on both speed and trust.

AI is already capable of doing a meaningful slice of the work behind every approval and decline: scanning and sorting documents, running first‑pass eligibility checks, assembling disclosures and standard letters. The question now is less about what the technology can do, and more about what the industry chooses to do with it.

Used well, it could mean faster, clearer decisions and more time for genuine advice and support. Used poorly, it risks becoming just another blunt tool for cost‑cutting, with borrowers and frontline staff absorbing the shock.

Project Iceberg suggests the technology will not wait. The decisions — in credit policies, technology road maps and boardrooms — are now firmly in the mortgage profession’s hands.