On 5 September 2023, the Financial Services Consumer Panel (FSCP) published a final report on financial services firms’ personal data use, in which it considers whether that use is leading to bias and detriment for consumers with protected characteristics.
The report examines the assumption that some groups of consumers are experiencing bias and detriment related to their protected characteristics because of the way in which financial firms are using personal data and algorithms.
The report highlights the following key findings:
- Despite strong anecdotal evidence, it is challenging to establish categorically that algorithmic decision-making is the cause of bias, because the systems involved are opaque, making it difficult for third parties to truly understand what is happening ‘behind the scenes’.
- There are important debates to be had about where to draw the line on the ethical use of personal data in risk-based decisions: what is and is not fair, reasonable or proportionate.
- Clear areas of concern include consumers experiencing unfair bias relating to their ethnicity in access to products, the pricing of products and the services they receive, as well as unfair outcomes in relation to disability. Evidence of impact in relation to other protected characteristics is thinner, although this does not mean they should not remain an area of focus.
- Evidence and experts note the critical role that regulation will have in addressing the issues and concerns raised about the use of personal data and algorithms, and call for greater pressure on firms to exercise oversight of the personal data and artificial intelligence (AI) they use.
- The extent of the evidence on this topic, anecdotal or otherwise, justifies the FSCP’s concern.
In terms of actions to take as a result of these findings, the FSCP notes that:
- There is a need for conversation and public debate on the topic of bias and AI in financial services, and time is of the essence.
- Debate should focus on, and culminate in, a set of agreed principles on what personal data it is reasonable to use to determine access, price and outcomes when risk-based decisions are made in financial services.
- There is a need for more transparency about how data is being used in order to evidence areas of concern, and the FCA has a strong role to play in operationalising this.
- There is a need for firms to be more proactive in evidencing that no bias is occurring. Regulation would be the most effective way to drive change in this area and to provide clear lines of accountability for firms in terms of what they need to do and evidence.