
Mortgage Sellers and Servicers: Will You Be Ready to Comply with Freddie Mac’s AI Requirements by March 3?

Jan 9, 2026

If you’re selling loans to Freddie Mac or servicing loans for it, and artificial intelligence (AI) or machine learning (ML) shows up anywhere in your process — underwriting automation, document intelligence, chatbots, fraud detection, income calculation, payment processing, borrower outreach — you have a deadline: March 3, 2026.

Freddie Mac updated its Seller/Servicer Guide in March 2025, adding a new section on the governance of AI and machine learning. Bulletins 2025-16 and 2025-17, issued in December 2025, formally announced the change and established March 3, 2026 as the effective date for compliance. This isn’t guidance, and it isn’t optional. It’s a requirement — and it changes how mortgage sellers and servicers are expected to oversee AI-enabled vendors and tools across the entire loan lifecycle, from origination to servicing.

Related: State Fair Lending Enforcement Is Heating Up: Massachusetts Hits Lender with $2.5 Million Settlement 

What are Freddie Mac's new AI requirements?

The new section Freddie Mac added to its Seller/Servicer Guide covers AI/ML governance. It establishes a framework for how you are expected to develop, deploy, oversee, and monitor artificial intelligence used in connection with Freddie Mac loans. 

The framework focuses on three core principles: transparency, accountability, and ethical stewardship. In other words, you need to know where you're using AI, who's responsible for it, and how its use is controlled and monitored over time. 

Importantly, Freddie Mac has not defined “artificial intelligence” or “machine learning” in the Guide. That creates a real governance risk: sellers/servicers and their vendors may not be operating under the same understanding of what qualifies as AI or ML, especially when those capabilities are embedded in tools and marketed as “features” rather than models.

That ambiguity doesn’t reduce responsibility. It increases it. 

While Freddie Mac encourages sellers/servicers to get started early, the hard deadline is March 3, 2026 — any mortgage with an application received date on or after that date must comply.

What does this mean in practice? Sellers/servicers must be able to demonstrate:

  • Risk management. How you identify, measure, and manage AI risks across your organization. This isn't a one-time assessment — it's an ongoing program.
  • Transparency. Clear understanding of how your AI systems work, what data they use, and how decisions are made. No black boxes allowed.
  • Accountability. Documented oversight showing who's responsible for AI governance, how decisions get escalated, and where the buck stops.
  • Ethical standards. Processes to ensure AI is used responsibly, fairly, and in compliance with all applicable regulations.
  • Audit trail. The ability to prove all of the above when Freddie Mac comes knocking (a minimal sketch of one piece of that trail follows this list).
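
What does the simplest version of that audit trail look like? Before any platform enters the picture, the minimum viable version is an append-only record of who decided what, and why. Here’s a minimal sketch; the file format, field names, vendor, and example decision are all illustrative assumptions, not anything Freddie Mac prescribes.

```python
import json
from datetime import datetime, timezone

def log_decision(path: str, actor: str, decision: str, rationale: str) -> None:
    """Append one AI governance decision to an append-only JSON Lines log."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "actor": actor,          # who made or approved the decision
        "decision": decision,    # what was decided
        "rationale": rationale,  # why, in language a later reviewer can follow
    }
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")

# Hypothetical example: documenting acceptance of a vendor's monitoring gap
log_decision(
    "ai_governance_log.jsonl",
    actor="Chief Risk Officer",
    decision="Accepted residual risk on Acme DocTech's quarterly drift monitoring",
    rationale="Compensating internal QC sampling added until vendor moves to monthly.",
)
```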

Related: What is AI Auditing and Why Does It Matter? 

The Hidden Problem: Vendors Using AI

Here’s where this gets complicated: your vendors. 

Most mortgage sellers/servicers can name their core technology providers. What they often can’t do is say which of those vendors are using AI, how it’s being used, or whether any real governance exists behind it. 

And AI is almost certainly already embedded across the loan lifecycle. Your LOS may use machine learning to optimize workflows or flag risk. Document automation tools rely on AI to extract and classify data. Income verification, fraud detection, chatbots, appraisal tools, and pricing engines increasingly depend on predictive models and automated decisioning. On the servicing side, AI is used for payment processing, payment reminders, chatbots, and identifying borrowers at risk of delinquency — even when those capabilities are presented as routine functionality, not AI. 

That’s where Freddie Mac’s new framework exposes gaps. 

The question is no longer whether vendors are using AI. It’s whether you can demonstrate governance over it. Can you explain what data is being used, how decisions are made, how errors or bias are monitored, and who is accountable when outcomes go sideways? Can you show that vendor practices align with Freddie Mac’s expectations — not just in theory, but in practice? 

For most sellers and servicers, the honest answer is some version of “no” or “we assume so.” 

And that’s the problem. Under Freddie Mac’s framework, responsibility doesn’t transfer to your vendors. It stays with you. 

Related: How to Manage Third-Party AI Risk: 10 Tips for Financial Institutions 

What It Takes to Comply with Freddie Mac's AI Governance Requirements

It’s easy to read Freddie Mac’s new requirement and think of it as a policy update. In practice, it’s a scope problem. 

Compliance starts with visibility. Sellers/servicers must identify every place where AI or machine learning touches the loan lifecycle — not just the obvious systems, but every vendor, tool, and integration that influences borrower data or decision-making. For a mid-sized lender, that can mean two or three dozen vendors. For larger originators, it’s often far more. 
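
What should each inventory entry capture? There’s no prescribed format, but the core fields are fairly consistent across programs. Here’s a minimal sketch, assuming hypothetical field names and a made-up vendor; this is not a Freddie Mac template.

```python
from dataclasses import dataclass, field

@dataclass
class VendorAIRecord:
    """One entry in a centralized AI/ML inventory (illustrative fields only)."""
    vendor: str                      # e.g., the LOS or document-automation provider
    system: str                      # the specific tool or module
    lifecycle_stage: str             # "origination", "underwriting", "servicing", ...
    uses_ai: bool                    # confirmed with the vendor, not assumed
    ai_description: str              # what the model does, in plain language
    data_inputs: list[str] = field(default_factory=list)          # borrower data touched
    decisions_influenced: list[str] = field(default_factory=list)
    owner: str = ""                  # the person accountable internally
    last_assessed: str = ""          # ISO date of the most recent governance review

# Hypothetical example: a document-automation tool that extracts income data
record = VendorAIRecord(
    vendor="Acme DocTech",
    system="Income Extraction Module",
    lifecycle_stage="underwriting",
    uses_ai=True,
    ai_description="ML model classifies documents and extracts income fields",
    data_inputs=["pay stubs", "W-2s", "bank statements"],
    decisions_influenced=["income calculation"],
    owner="VP, Third-Party Risk",
    last_assessed="2026-01-15",
)
```

A flat spreadsheet with the same columns works just as well; what matters is that AI use is confirmed rather than assumed, and that every entry has an owner and a review date.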

And inventory is just the beginning. 

Once AI use is identified, sellers/servicers have to assess governance at the vendor level. That means asking how models work, what data they rely on, how performance is monitored, how issues are audited, and whether those practices align with Freddie Mac’s expectations. It also means tracking responses, following up when vendors are slow — or vague — and determining when gaps rise to the level of risk that requires escalation. 
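
One way to keep that assessment work from dissolving into email threads is to treat the question set and follow-up tracking as structured data. A hedged sketch follows; the questions and the 14-day follow-up window are illustrative choices, not Freddie Mac requirements.

```python
from datetime import date

# A standardized question set applied to every AI-enabled vendor (illustrative)
GOVERNANCE_QUESTIONS = [
    "What does the model do, and what decisions does it influence?",
    "What data does it train on and consume in production?",
    "How are performance, drift, and bias monitored?",
    "How are errors detected, escalated, and remediated?",
    "Who at the vendor is accountable for model governance?",
]

def overdue_followups(responses: dict[str, date | None],
                      sent_on: date, today: date, sla_days: int = 14) -> list[str]:
    """Return the questions a vendor still hasn't answered past the follow-up window."""
    if (today - sent_on).days < sla_days:
        return []
    return [q for q, answered_on in responses.items() if answered_on is None]

# Hypothetical usage: four answers still outstanding three weeks after the request
responses: dict[str, date | None] = {q: None for q in GOVERNANCE_QUESTIONS}
responses[GOVERNANCE_QUESTIONS[0]] = date(2026, 1, 20)
print(overdue_followups(responses, sent_on=date(2026, 1, 10), today=date(2026, 2, 2)))
```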

None of that work is theoretical. It’s operational, time-consuming, and difficult to do well without structure. Manual processes turn quickly into version-control problems, stale information, and oversight that looks solid on paper but collapses under scrutiny. 

Governance also has to be formalized internally. Sellers/servicers must be able to show — not just say — how AI risk is identified, who owns it, how oversight works, and what happens when issues surface. That requires documented policies and procedures, executive involvement, and clear communication across the organization. This isn’t a one-page policy exercise. It’s a program. 

And it doesn’t end once documentation is approved. 

Vendors change. AI models evolve. Data sources shift. New capabilities get rolled out quietly as “enhancements.” Governance only works if monitoring keeps pace with those changes — if risk is reassessed when systems are updated and oversight adapts as exposure grows. 

That’s where many compliance initiatives fail. Treating AI governance as a one-time project almost guarantees that, by the time assessments are complete, the earliest decisions will already be outdated. 

Eventually, Freddie Mac will ask sellers/servicers to show their work. At that point, lenders need to be able to produce a coherent picture: where AI is used, how vendors are governed, how oversight is maintained, and how leadership stays engaged. If that evidence lives in email threads, shared drives, and individual inboxes, the problem isn’t documentation — it’s governance. 

Related: 2026 Mortgage Industry Regulatory Update

Freddie Mac AI Governance Readiness for Mortgage Sellers/Servicers

Governance Area | What Freddie Mac Expects to See | What This Enables
Regulatory Change Management | Freddie Mac updates are identified promptly, assigned for impact analysis, and tracked through implementation with documented decisions. | Fewer surprises, more implementation runway, and defensible compliance decisions.
Vendor Inventory and AI Visibility | A centralized inventory of all loan lifecycle vendors, with clear identification of where AI or ML is used and what decisions it influences. | Clear ownership, faster reviews, and the ability to govern AI deliberately instead of reactively.
Risk Tiering and Materiality | Vendors are tiered based on impact to borrower outcomes and delivery risk, with AI use factored into risk classification. | Focused oversight where it matters most — and less noise where it doesn’t (see the sketch below this table).
Standardized Vendor Oversight | Consistent AI governance questions, documentation requests, and review standards applied across vendors. | Comparable assessments, easier escalation, and defensible consistency under review.
Ownership and Accountability | Clear ownership for AI governance, defined escalation paths, and documented risk acceptance decisions. | Decisions that can be explained, defended, and sustained over time.
Ongoing Monitoring | Vendor changes, certifications, and AI updates trigger reassessment and documentation updates. | Fewer blind spots as vendors evolve and risk profiles change.
Decision Traceability | Approvals, exceptions, and risk acceptance decisions are documented and remain visible over time. | Reduced rework, fewer fire drills, and stronger institutional memory.
Audit-Ready Evidence | Documentation can be produced quickly to show AI use, vendor governance, and oversight. | Faster responses, lower delivery risk, and confidence during reviews.
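
To make the risk-tiering row concrete: the tiering logic itself can be a handful of documented rules, as long as AI use is an explicit input rather than an afterthought. Here’s a minimal sketch, assuming a three-tier model and criteria of our own choosing; nothing here is prescribed by Freddie Mac.

```python
def vendor_tier(affects_borrower_outcomes: bool,
                uses_ai: bool,
                delivery_critical: bool) -> str:
    """Assign an oversight tier; AI use raises the tier, never lowers it.

    Tiers are illustrative: 1 = intensive oversight, 3 = routine monitoring.
    """
    if affects_borrower_outcomes and uses_ai:
        return "Tier 1: full governance review and ongoing model monitoring"
    if affects_borrower_outcomes or (uses_ai and delivery_critical):
        return "Tier 2: standardized AI questionnaire and annual reassessment"
    return "Tier 3: routine due diligence; watch for new AI capabilities"

# A pricing engine with embedded ML that influences borrower pricing
print(vendor_tier(affects_borrower_outcomes=True, uses_ai=True, delivery_critical=True))
```

The specific thresholds matter less than the fact that they are written down, applied to every vendor the same way, and revisited whenever a vendor’s AI use changes.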

 

The Real Cost of Getting This Wrong

Failing to implement the new Seller/Servicer Guide AI/ML governance requirements carries financial, operational, and competitive consequences.

Financial risk. If you can’t demonstrate compliant AI governance, Freddie Mac can decline to purchase a loan or require that it be repurchased.

When a loan is reviewed and AI-enabled tools played a role, Freddie Mac will look for evidence that those tools were governed appropriately at the time of delivery. If documentation is missing, inconsistent, or incomplete, the issue isn’t whether the tool worked as intended. It’s whether the delivery requirements were met. 

Operational risk. The operational consequences show up quickly if you don’t have a strong system in place. Teams end up reconstructing vendor approvals after the fact. Information is scattered across emails and spreadsheets. Different answers surface depending on who’s asked. What should be a straightforward explanation turns into uncertainty — and uncertainty doesn’t hold up well under review. 

Competitive disadvantage. Sellers/servicers with centralized governance and repeatable oversight can adapt to new requirements without slowing down. Others lose time and momentum, not because they lack expertise, but because their processes don’t scale. 

From here to March 3, there’s limited room to fix foundational gaps. Inventorying AI use, engaging vendors, assessing governance, documenting oversight, and aligning leadership all take time — especially when vendor responses lag. Manual processes compress that window even further. 

This is why AI governance matters now. The real risk isn’t abstract. It’s a loan you can’t sell — or one you’re forced to take back. 

Related: Guide to AI Risks  

What Good Governance Looks Like

The sellers/servicers who will sail through Freddie Mac's new requirements have a few things in common: 

  • Vendor visibility. They know exactly which vendors are using AI, what those systems do, and how they're governed. No surprises, no gaps, and no "we think they might be using AI, but we're not sure."
  • Standardized processes. Every vendor goes through the same due diligence process. Every assessment asks the same questions. Every gap gets flagged the same way. There's no variation based on who's doing the work.
  • Real-time status updates. They can tell you at any moment which vendors have been assessed, which are in progress, which are overdue for review, and which have outstanding issues. No hunting through email or asking around.
  • Centralized documentation. Everything lives in one place. When Freddie Mac asks for proof of AI governance, they pull a report. When a vendor updates their systems, the documentation gets updated. When executives need a dashboard, it's already there.
  • Proactive monitoring. They're not waiting for annual reviews to catch problems. They're getting alerts when certifications expire, when vendors report changes, when new risks emerge. They're staying ahead of issues instead of reacting to them (see the sketch after this list).
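
As one illustration of that last point, the sketch below flags vendor artifacts that are about to expire. The 90-day window and the tracked items are assumptions made for the example; a real TPRM platform would provide this kind of alerting natively.

```python
from datetime import date

def expiring_items(items: dict[str, date], today: date,
                   warn_days: int = 90) -> list[tuple[str, int]]:
    """Flag vendor certifications or attestations expiring within the warning window."""
    flagged = [(name, (expires - today).days)
               for name, expires in items.items()
               if (expires - today).days <= warn_days]
    return sorted(flagged, key=lambda pair: pair[1])  # most urgent first

# Hypothetical vendor artifacts tracked alongside the AI inventory
certs = {
    "Acme DocTech SOC 2 report": date(2026, 3, 15),
    "LOS vendor model-change attestation": date(2026, 6, 1),
}
for name, days_left in expiring_items(certs, today=date(2026, 2, 1)):
    print(f"ALERT: {name} expires in {days_left} days")
```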

The common thread? These sellers/servicers have moved beyond manual processes. They've recognized that managing AI governance across dozens of vendors isn't something you can do at scale with spreadsheets and email. 

Related: How to Build Better Governance with Stronger Policies 

The Bottom Line for Mortgage Sellers/Servicers

Freddie Mac’s AI governance requirements aren’t really about AI. They’re about whether your compliance and third-party risk programs are keeping pace with change. 

If this update came as a surprise, that’s an important signal. It suggests your compliance management program isn’t consistently surfacing regulatory changes that materially affect how you operate and oversee vendors. Early visibility matters: it creates time to assess impact, assign ownership, and translate new requirements into policies, oversight, and vendor expectations — before deadlines start driving decisions. 

It also often means AI risk is being managed informally, through individual vendor relationships and point solutions, rather than through a centralized, repeatable third-party risk management (TPRM) program.

Sellers/servicers that are well positioned for March 3 already have the fundamentals in place. They know their vendors. They understand where critical decisions are being influenced. They apply consistent due diligence, have clear ownership and documented risk tolerance, and maintain ongoing monitoring that doesn’t depend on institutional memory or individual inboxes. 

That’s what Freddie Mac is testing for. Not whether you can define AI, but whether you can govern third-party risk at scale. 

For organizations still relying on spreadsheets, email, and one-off assessments, this requirement exposes a broader issue. AI just happens to be the pressure point. The real challenge is managing vendor risk in a way that’s consistent, defensible, and sustainable as complexity grows. 

This is where strong compliance infrastructure matters — not as a reaction to a single bulletin, but as the foundation for absorbing regulatory change without starting over each time. 

March 3 isn’t the finish line. It’s a checkpoint. Sellers/servicers that recognize that will be far better positioned for whatever comes next. 

Ready to transform your third-party oversight? Take a tour of our automated vendor management solution and see how you can gain better visibility, automate compliance checks, and mitigate risks across the vendor lifecycle. 

Take a Product Tour

 

