Artificial intelligence is no longer a laboratory curiosity in housing finance. From screening borrowers and pricing credit to powering anomaly-detection systems for regulators, AI is reshaping mortgage origination at an unprecedented pace. The Government Accountability Office’s new study, Artificial Intelligence: Use and Oversight in Financial Services, released on May 19, 2025, provides a comprehensive look at this transformation. It confirms that advances in algorithms, vast data availability, and the post-ChatGPT surge in AI enthusiasm have embedded AI deeply into financial services, including the mortgage sector.
At Polygon Research, we see AI’s rapid adoption as both an opportunity and a challenge. The GAO report highlights significant benefits like cost savings and faster loan approvals alongside critical risks, such as bias and regulatory gaps. Our tools, HMDAVision and CensusVision, built on granular microdata, offer a solution: transparent, high-quality data to ensure AI in housing finance is fair, defensible, and effective. Here’s how the GAO’s findings align with our mission, and why our tools are essential for navigating this AI-driven landscape.
The GAO report details how AI is delivering tangible benefits to housing finance. Lenders are leveraging AI to streamline mortgage origination, with industry estimates suggesting that AI-driven underwriting can reduce loan processing times by up to 50%. This efficiency translates into lower operational costs and faster approvals, making homeownership more accessible for borrowers. For example, chatbots powered by AI can handle routine inquiries (think balance checks or address updates), saving an estimated $0.70 per interaction compared to human agents, according to a 2017 Juniper Research study cited by GAO stakeholders.
Beyond efficiency, AI holds the potential to enhance financial inclusion. The report notes that some AI vendors cite a 40% increase in loan approvals for women and borrowers of color after replacing traditional scorecards with AI models. By incorporating alternative data like rent and utility payments, AI can assess creditworthiness for those with thin or no credit histories, a group often excluded from conventional lending. This aligns with broader industry trends: the Financial Stability Board, referenced in the GAO report, highlights AI’s ability to extend credit to underserved populations, potentially reshaping access to housing finance.
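To make the mechanism concrete, here is a minimal sketch, not drawn from the GAO report or any vendor’s system, of how alternative-data signals such as on-time rent and utility payments might sit alongside traditional credit attributes in a simple scoring model. Every field name, weight, and data point below is hypothetical.

```python
# A toy credit model that adds alternative-data features (on-time rent and
# utility payment shares) next to traditional attributes. All data is
# synthetic and all names and weights are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5_000

credit_score = rng.normal(680, 60, n)          # traditional bureau score
thin_file = rng.random(n) < 0.15               # borrowers with no usable score
credit_score[thin_file] = 600                  # neutral fill for missing scores
dti = rng.uniform(0.1, 0.6, n)                 # debt-to-income ratio
rent_on_time = rng.beta(8, 2, n)               # share of on-time rent payments
utility_on_time = rng.beta(8, 2, n)            # share of on-time utility payments

# Synthetic repayment outcome loosely driven by all four signals.
logit = (0.01 * (credit_score - 650) - 4 * (dti - 0.35)
         + 2 * (rent_on_time - 0.8) + 1.5 * (utility_on_time - 0.8))
repaid = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X_trad = np.column_stack([credit_score, dti])
X_alt = np.column_stack([credit_score, dti, rent_on_time, utility_on_time])

for label, X in [("traditional only", X_trad), ("with alternative data", X_alt)]:
    model = LogisticRegression(max_iter=1000).fit(X, repaid)
    rate = model.predict(X)[thin_file].mean()  # predicted approvals among thin-file applicants
    print(f"{label}: predicted approval rate for thin-file borrowers = {rate:.1%}")
```

In practice, any such model would need validation and fair-lending testing far beyond this toy setup, which is exactly where the risks discussed next come in.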
Yet, the GAO report is clear that AI’s benefits come with significant risks. One of the most pressing is bias. AI models trained on historical lending data can inadvertently perpetuate past discrimination. For instance, if legacy practices favored certain demographics (say, by prioritizing ZIP codes with higher property values), an AI model might learn to replicate those patterns, leading to unfair credit denials or higher rates for protected classes. The report cites testimony from FinRegLab’s CEO, Melissa Koide, warning that complex AI models can infer race or gender from seemingly neutral data, amplifying disparate impacts that are harder to detect as models grow more sophisticated.
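As one illustration of how a lender might screen model outputs for this kind of disparate impact, the sketch below computes group-level approval rates and an adverse impact ratio from a table of model decisions. The column names and data are hypothetical, and the 0.80 threshold is the familiar four-fifths rule of thumb, not a legal standard or the GAO’s methodology.

```python
# Illustrative fair-lending screen: compare model approval rates across groups
# and flag any group whose rate falls below 80% of the highest-rate group
# (the "four-fifths" rule of thumb; not a legal standard). Column names are
# hypothetical and the data is synthetic.
import pandas as pd

decisions = pd.DataFrame({
    "applicant_group": ["A", "A", "A", "B", "B", "B", "B", "C", "C", "C"],
    "approved":        [1,   1,   0,   1,   0,   0,   1,   1,   1,   1],
})

rates = decisions.groupby("applicant_group")["approved"].mean()
reference = rates.max()                      # highest-approval group as benchmark
adverse_impact_ratio = rates / reference

report = pd.DataFrame({
    "approval_rate": rates.round(2),
    "air_vs_reference": adverse_impact_ratio.round(2),
    "flag": adverse_impact_ratio < 0.80,     # groups needing further fair-lending review
})
print(report)
```

A flag here is a prompt for deeper fair-lending review, not proof of discrimination; the harder cases the GAO describes involve proxies that never appear as an explicit group column at all.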
Data quality is another concern. Poor or incomplete inputs can lead to inaccurate predictions, risking financial losses for lenders and unfair outcomes for borrowers. The GAO also flags “hallucinations” in generative AI (credible but erroneous outputs), a particular worry when such models influence high-stakes decisions like mortgage approvals. Explainability compounds these issues: opaque AI systems make it difficult to justify adverse actions, potentially violating fair lending laws like the Equal Credit Opportunity Act (ECOA).
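For simpler, linear scoring models, one common way to address the explainability gap is to rank each applicant’s feature contributions and surface the largest negative drivers as candidate adverse action reasons. The sketch below shows that idea with hypothetical features, weights, and values; it is not a recipe for ECOA/Regulation B compliance, and more complex models require dedicated explanation techniques.

```python
# Illustrative reason-code generation for a linear credit score: each feature's
# contribution is its coefficient times the applicant's deviation from a
# reference profile (e.g., the portfolio mean), and the most negative
# contributions become candidate adverse action reasons. All names, weights,
# and values are hypothetical.
import numpy as np

features = ["credit_score", "dti", "rent_on_time_share", "months_reserves"]
coef = np.array([0.012, -5.0, 2.0, 0.05])          # model weights (hypothetical)
reference = np.array([700.0, 0.35, 0.90, 6.0])     # reference applicant profile
applicant = np.array([640.0, 0.48, 0.95, 2.0])     # the applicant being scored

contributions = coef * (applicant - reference)      # signed impact on the score
order = np.argsort(contributions)                   # most negative first

print("Top adverse action reason candidates:")
for i in order[:2]:
    print(f"  {features[i]}: contribution {contributions[i]:+.3f}")
```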
These risks aren’t abstract. They’re already prompting scrutiny from regulators and lenders alike, underscoring the need for robust data and oversight to keep AI in check.
The GAO report tells a story of a fragmented regulatory landscape for AI in financial services, with implications for housing finance. Most banking regulators, such as the Federal Reserve, FDIC, and OCC, rely on existing model-risk management and fair-lending frameworks to oversee AI. Some have issued AI-specific guidance or conducted targeted examinations, but the approach varies. The National Credit Union Administration (NCUA), which oversees credit unions critical to affordable housing, stands out for its limitations: it lacks the authority to examine the third-party technology providers many credit unions rely on, and its supervisory guidance has not kept pace with AI adoption.
The GAO reiterates its call for Congress to grant NCUA third-party examination powers and recommends updating its guidance. Until these changes materialize, credit unions and their borrowers face heightened exposure to AI’s pitfalls—a gap our tools are designed to bridge.
Regulators aren’t just overseeing AI; they’re using it. The GAO report details how agencies like the Federal Reserve, FDIC, and SEC deploy AI to enhance supervision, searching documents, detecting legal violations, and flagging outliers in financial reports. For example, the Federal Reserve uses AI to sift through bank examination files, cutting manual review time for documents that can span hundreds of pages. The NCUA employs AI to predict credit union financial health, pairing these insights with human judgment. Crucially, regulators treat AI as a decision-support tool, not an autonomous arbiter. Outputs are cross-checked against traditional supervisory data, ensuring accountability. This cautious adoption mirrors the industry’s need for balance: leveraging AI’s power while grounding it in transparent, reliable data.
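The report does not describe the agencies’ implementations, but the basic outlier-flagging pattern can be illustrated with a simple peer-group screen: measure how far each institution’s reported metric sits from its peer average and route extreme values to a human examiner. The metric, threshold, and figures below are all hypothetical.

```python
# Illustrative outlier screen over peer-group financial metrics: flag any
# institution whose reported value sits more than 2 standard deviations from
# the peer mean for human review. All names and figures are hypothetical.
import pandas as pd

reports = pd.DataFrame({
    "institution": ["CU-1", "CU-2", "CU-3", "CU-4", "CU-5", "CU-6"],
    "delinquency_rate": [0.012, 0.015, 0.011, 0.013, 0.014, 0.082],
})

mean = reports["delinquency_rate"].mean()
std = reports["delinquency_rate"].std()
reports["z_score"] = (reports["delinquency_rate"] - mean) / std
reports["refer_to_examiner"] = reports["z_score"].abs() > 2   # decision support, not a decision

print(reports)
```

Keeping the final call with the examiner mirrors the decision-support posture the GAO describes.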
The GAO report consistently returns to a core principle: fair and effective AI requires transparent, high-quality data. This is where Polygon Research excels. Our tools, HMDAVision and CensusVision, bundled as Polygon Vision, provide the granular, authoritative data needed to audit, understand, and improve AI models in housing finance.
Together, these tools create a transparent test bed for AI in housing finance. Lenders and regulators can benchmark model outcomes against loan-level HMDA data, connect those outcomes to census-tract and household characteristics, and stress-test underwriting or marketing models for inclusion, bias, and performance, as the sketch below illustrates.
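As a concrete example of that kind of audit, the sketch below joins loan-level records to census-tract attributes and compares model denial rates across tract income levels. The field names, categories, and data are hypothetical stand-ins for the HMDA and census microdata surfaced in Polygon Vision.

```python
# Illustrative audit join: attach census-tract attributes to loan-level records
# and compare model denial rates across tract income categories. Field names,
# categories, and data are hypothetical stand-ins for HMDA and census microdata.
import pandas as pd

loans = pd.DataFrame({
    "census_tract": ["T1", "T1", "T2", "T2", "T3", "T3", "T3"],
    "model_denied": [0, 1, 0, 0, 1, 1, 0],
})
tracts = pd.DataFrame({
    "census_tract": ["T1", "T2", "T3"],
    "tract_income_level": ["moderate", "upper", "low"],
})

audit = loans.merge(tracts, on="census_tract", how="left")
denial_by_income = (
    audit.groupby("tract_income_level")["model_denied"]
         .mean()
         .rename("denial_rate")
         .sort_values(ascending=False)
)
print(denial_by_income)
```

At production scale the same cross-check runs against full loan-level HMDA files and census microdata rather than a handful of synthetic rows.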
In an era where AI drives high-stakes decisions, data literacy and transparency are non-negotiable. Polygon Vision delivers both, empowering users to harness AI’s potential while mitigating its risks.
The GAO’s recommendations (expanded NCUA authority and updated guidance) may take months or years to become policy. Meanwhile, AI’s role in housing finance is accelerating. Lenders can’t afford to wait for regulatory clarity; they need tools today to ensure their AI models are fair, compliant, and effective.
Polygon Vision provides that edge. With our platform, you can explore the newly modeled 2024 HMDA data, released annually by the CFPB; link loan-level insights to census-tract and household microdata in seconds; and even stress-test AI underwriting or marketing models for inclusion, bias, and performance.
Don’t let regulatory gaps or opaque data slow you down. Start a free trial of Polygon Vision today and gain the clarity and confidence to lead in an AI-powered housing finance landscape.
Visit polygonresearch.com/pricing to activate your account. AI in housing finance is evolving rapidly. Polygon Vision ensures you set the standard.