Interim report: Review Into Bias in Algorithmic Decision-making

The review seeks to answer three questions across four sectors (policing, financial services, recruitment, and local government):

1) Data: Do organisations and regulators have access to the data they require to adequately identify and mitigate bias?

2) Tools and techniques: What statistical and technical solutions are available now or will be required in future to identify and mitigate bias and which represent best practice?

3) Governance: Who should be responsible for governing, auditing and assuring these algorithmic decision-making systems?

The review's emerging insights are:

1) While data is often the source of bias, it is also a tool for overcoming it. For example, an organisation may need to collect diversity data in order to assess whether its decisions are leading to biased outcomes: you cannot know whether you are discriminating by gender without knowing people's gender.
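To illustrate why outcome data broken down by group is needed, here is a minimal sketch of one common check: comparing selection rates across groups and flagging a large gap. The data, group labels, and the 0.8 threshold (the "four-fifths rule" used in US employment practice) are illustrative assumptions, not part of the report.

```python
# Illustrative sketch: you cannot measure discrimination by gender
# without recording gender alongside each decision.
# Data and group labels below are invented for this example.

def selection_rates(decisions):
    """Share of positive outcomes per group.

    decisions: iterable of (group, outcome) pairs, outcome in {0, 1}.
    """
    counts = {}
    for group, outcome in decisions:
        n, pos = counts.get(group, (0, 0))
        counts[group] = (n + 1, pos + outcome)
    return {g: pos / n for g, (n, pos) in counts.items()}

def disparate_impact_ratio(rates):
    """Lowest group selection rate divided by the highest.

    Values below 0.8 are often flagged for review
    (the 'four-fifths rule').
    """
    return min(rates.values()) / max(rates.values())

# Hypothetical hiring decisions: (gender, hired?)
decisions = [
    ("female", 1), ("female", 0), ("female", 1), ("female", 0),
    ("male", 1), ("male", 1), ("male", 1), ("male", 0),
]
rates = selection_rates(decisions)
print(rates)                        # {'female': 0.5, 'male': 0.75}
print(disparate_impact_ratio(rates))  # 0.666..., below the 0.8 threshold
```

Without the group column, neither rate can be computed, which is the report's point about diversity data.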

2) Tools that identify and mitigate bias are already being developed, particularly in areas like finance. But awareness of what currently exists, or is being developed, and what constitutes best practice is limited.

3) Systems need to be trustworthy, and as such we may need third-party auditors to verify claims about how algorithms operate.

Read here.
