The Growth Series

Navigating AI: A broker’s guide to understanding the FCA’s view

Many firms are already using AI, and the pace of innovation is such that the regulator is finding it challenging to stay ahead of AI developments.

The Financial Conduct Authority (FCA) has recently released an AI Update, which provides insights and guidelines for mortgage brokers. Based on this document, here’s a summary of what you need to consider to ensure compliance and leverage AI responsibly and effectively.

Embracing Innovation with Compliance

The FCA emphasises the transformative potential of AI in financial services. AI can streamline operations, enhance customer service, and support the advice process. However, alongside these benefits, the FCA highlights the importance of adopting AI in a manner that ensures consumer protection, market integrity and effective competition.

Key FCA guidelines for mortgage brokers

1. Proportionality and principles-based approach

The FCA’s regulatory framework is technology-agnostic, focusing on principles and outcomes rather than prescribing specific technologies. This allows for flexibility and innovation while ensuring that the adoption of AI aligns with regulatory objectives. Mortgage brokers should ensure that AI systems are integrated in a way that aligns with the FCA's principles of fairness, transparency, and accountability.

Working within a principles-based approach can be difficult, as clear, direct rules are much easier to interpret. With AI innovation moving at such a pace, it's understandable that the FCA takes this approach, but it doesn't make the guidance any easier to interpret.

2. Safety, security, and robustness

AI systems must operate safely and securely throughout their lifecycle. This includes continuous risk identification, mitigation, and management.

Where you use AI, ensure that the systems and tools for managing them are robust, with strong security measures and controls in place to protect against operational disruptions.

3. Fairness and non-discrimination

AI should not lead to unfair outcomes or discriminate against individuals. As a mortgage broker, you must consider the ethical implications of AI, ensuring that it does not perpetuate biases or create disparities in your service provision.

The FCA’s Consumer Duty requires firms to act in the best interests of customers, which includes designing AI systems that are fair and transparent.

The risk of bias in Large Language Models (LLMs) is a concern, as these models can inadvertently amplify existing biases present in their training data. The FCA is worried that this could lead to unfair outcomes, such as discriminatory advice practices or the exclusion of certain customer groups.

4. Transparency and explainability

The FCA mandates that AI systems should be explainable. This means that decisions made by AI, especially those affecting customers, should be understandable and justifiable.

Brokers must ensure that they can explain how their AI systems work and the basis of any decisions made by these systems to both regulators and customers. This may not be easy, given that AI is often described as a 'black box'.

The 'black box' problem refers to the challenge of understanding how AI systems, particularly complex models like neural networks, arrive at their decisions. This opacity can pose significant risks, especially in relation to advice and lending, where not knowing the rationale behind AI-generated decisions can lead to compliance issues and undermine trust.

5. Accountability and governance

Clear governance structures should be in place to oversee the use of AI. Senior management must be accountable for the deployment and operation of AI systems.

The FCA’s Senior Managers and Certification Regime (SM&CR) underscores the importance of accountability in managing AI risks.

Managing AI risks: approaches for small and large businesses

Both small and large businesses need to manage the risks associated with AI, but their approaches may differ significantly. Larger businesses typically have more resources to invest in comprehensive risk management frameworks, including dedicated AI governance teams and advanced technological solutions for monitoring and controlling AI systems. They are likely to have in-house expertise to conduct extensive risk assessments and implement robust security measures. 

Conversely, small businesses may need to adopt a more focused approach, leveraging third-party solutions and seeking external expertise to manage AI risks effectively. They might prioritise specific areas such as ensuring compliance with transparency requirements and focusing on mitigating immediate operational risks. For both types of businesses, aligning AI use with the FCA’s guidelines on fairness, transparency, and accountability remains crucial.

Preparing for the future

The FCA is committed to fostering a pro-innovation environment while ensuring robust regulatory oversight. This includes:

  • Collaborative efforts: Engaging with other regulators, industry stakeholders, and international bodies to develop cohesive AI regulations.
  • Ongoing research: Continuing to monitor AI adoption in financial markets and conducting research to understand its impact on consumers and market dynamics.
  • Supportive infrastructure: Providing tools and environments such as the Digital Sandbox and Regulatory Sandbox to test and develop AI solutions safely.

Some action points for you to consider…

1. Assess your AI systems: Conduct a thorough review of your AI systems to ensure they comply with FCA guidelines on safety, fairness, and transparency.

2. Enhance governance: Strengthen your governance frameworks to ensure clear accountability for AI use within your firm. Consider making a named individual responsible for this.

3. Stay informed: Keep abreast of ongoing regulatory developments and participate in industry discussions on AI to stay ahead of compliance requirements.

Integrating AI into your business presents significant opportunities for innovation and efficiency. It's important to align with the FCA's guidelines so that you can harness the power of AI while remaining compliant and safeguarding customer interests. As the regulatory landscape evolves, staying informed and adaptable will be key to leveraging AI successfully in your business.

Click here to view the FCA's AI Update publication

Learn more: A Broker’s guide to AI

Learn more: Choosing the right AI tools for you

Learn more: How mortgage brokers can use AI with social media

