
AI etiquette and compliance for brokers

Written by Jeremy Duncombe | Nov 17, 2025

More brokers are weaving AI into their everyday workflows, from first contact to ongoing support. Whether it’s drafting emails, summarising conversations, or creating product comparisons, AI can help reduce admin and improve response times. But that comes with new responsibilities.  

Beyond simply meeting regulatory obligations, how you use AI shapes whether clients experience trust, professionalism and good advice. That’s where AI etiquette comes in: a layer of thoughtful practice that ensures clients feel informed, protected and well-served.

With AI becoming part of daily routines, it’s worth taking a step back to look at how and where it’s being used.

Five AI etiquette rules every broker should follow

1. Always disclose when AI is used

Clients should know when technology has played a role in creating communications or summaries. If you’ve used AI to automate a follow-up email, pull together mortgage options, or create meeting notes, include a short note to make that clear. For example, “This message includes content generated with the help of an AI tool and has been reviewed before sending.”

Clients value transparency, especially when it comes to financial advice. Being upfront reduces confusion and avoids the risk of someone feeling misled later on.

2. Get consent before recording or transcribing

Using AI to transcribe or summarise a meeting is helpful, but only if the client agrees. Ask for permission before any recording or transcription takes place.

Make this part of your process. You might include a consent checkbox in your meeting booking form or confirm it in writing by email. Store that consent alongside the client’s file or within your CRM.

3. Keep a human in the loop

AI tools can help with speed and structure, but final responsibility always stays with the individual. Anything drafted or suggested by AI should be reviewed and adapted before sharing with a client. 

For example, if you’ve used AI to compare two product options, double-check that the facts are correct and the explanation matches your intended advice. Tone also matters - make sure what’s being said sounds like something you would naturally say or write.

4. Remove personal data from prompts

When using AI tools, avoid including client names, financial figures or identifiable details in your prompts - unless you’re working in a secure and compliant environment.

Instead of copying and pasting full case details, reword them in general terms. For example, refer to a “client with average credit” or a “landlord with three properties” rather than including exact figures or postcodes. This helps protect data and lowers the risk of exposing personal information in tools that may not be built for regulated environments.

5. Make it sound like you

AI can give you a solid first draft, but the message still needs to sound like it came from you. Whether it’s a recommendation email, a social post or an explainer for a hesitant buyer, use your brand’s unique tone of voice.

That might mean adding a line of reassurance you always use, changing a word that feels too stiff, or adjusting the structure so it reads how you’d naturally explain it. If you’ve got a team, consistency matters too. Don’t let your voice get lost in generic outputs. 


Compliance checklist: Staying on the right side of regulation

Using AI doesn’t change your obligations; it just means you need to be clearer about how decisions are made and recorded.

Start by saving the inputs and outputs of any AI-generated content that’s used in your process. That includes follow-up notes, advice summaries, or any information that ends up in a client file. Add those records to your CRM or folder structure so they’re easy to find if needed.

Don’t rely on AI to recommend products. Let it support you instead of taking its guidance as gospel.

It’s also worth writing down how you use AI across your firm: who can use it, when it’s allowed, and what the sign-off process looks like. That shows intent and gives your team a clear reference point.

For more on how this connects to regulation, check out the FCA’s expectations here:

Navigating AI: A broker’s guide to understanding the FCA’s view

How it works in practice: an example

You’ve got a remortgage meeting booked with a client who prefers email summaries. At the start of the call, you explain that it’ll be recorded to help generate a clean set of notes, and they agree. 

After the call, you upload the audio into an AI transcription tool. It creates a summary, picks out the action points and flags product mentions. You read through it, fix anything that feels off and add in a short note about next steps. 

The follow-up email includes a line saying the summary was supported by AI and reviewed for accuracy. You save the transcription, your edits and the final email in the client record for audit purposes.

Making AI work for you

AI can improve how you work, but how you use it still matters. Clear etiquette helps maintain trust, reduce risk, and make sure your advice stays your own. If you're bringing AI into your daily workflow, take time to shape it around how you already support clients.