Guidance

This is required guidance: it is legally required and an essential activity.

This Guide covers:

- England

- Scotland

- Wales

From:

- Equality and Human Rights Commission (EHRC)

Page last reviewed: 06 Feb 2023

Meeting your public sector equality duties

Public bodies should consider the public sector equality duty when thinking about whether to use digital healthcare technologies. This also applies to any digital healthcare technologies that public bodies are already using or that others are developing or using on their behalf.

Understanding the public sector equality duty

The public sector equality duty (the equality duty) was created under the Equality Act 2010. It covers the 9 protected characteristics: age, disability, gender reassignment, marriage and civil partnership, pregnancy and maternity, race, religion or belief, sex, and sexual orientation.

Using digital technologies may lead to discrimination and deepen inequalities in health or social care. This is because of inherent biases in the training and development of digital technologies, including the data used to train them. These biases can accumulate over time as a technology is used. You should monitor for discriminatory outcomes so that you can identify and tackle any bias or unintended impact on people with one or more protected characteristics.
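As a rough illustration of the kind of routine monitoring described above, the sketch below compares favourable-outcome rates between groups of people who share a protected characteristic and flags a gap for investigation. The group names, figures, and review threshold are all illustrative assumptions, not part of this guidance or any legal standard.

```python
# Minimal sketch of monitoring a digital technology's outcomes for bias.
# Assumes you can count, per group, how many people were assessed and how
# many received a favourable outcome. All figures here are illustrative.

def outcome_rates(counts):
    """Favourable-outcome rate for each group: favourable / total."""
    return {g: favourable / total for g, (favourable, total) in counts.items()}

def parity_gap(counts):
    """Largest difference in outcome rates between any two groups."""
    rates = outcome_rates(counts).values()
    return max(rates) - min(rates)

# Hypothetical monitoring data: (favourable outcomes, people assessed).
counts = {
    "group_a": (180, 200),  # 90% favourable
    "group_b": (150, 200),  # 75% favourable
}

THRESHOLD = 0.05  # an assumed review trigger, not a legal threshold
gap = parity_gap(counts)
if gap > THRESHOLD:
    print(f"Outcome gap of {gap:.0%} between groups - investigate for bias")
```

A gap alone does not prove unlawful discrimination, but flagging one prompts the kind of investigation and record-keeping that helps evidence due regard.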

For an overview of the equality duty, see the guides for public authorities in England, Scotland and Wales from the Equality and Human Rights Commission (EHRC). Note that the equality duty is ongoing. You should regularly monitor and evaluate digital technologies to make sure they are working as intended and not causing any unlawful discrimination.

The equality duty has 2 parts:

  • the general duty applies to public authorities and organisations carrying out public functions
  • specific duties apply only to public authorities named (or listed) in specific duties regulations

Meeting your general equality duties

The general equality duty requires public authorities and organisations to have due regard to the need to:

  • eliminate unlawful discrimination, harassment and victimisation and other conduct prohibited by the Act
  • advance equality of opportunity between people who share a protected characteristic and those who do not
  • foster good relations between people who share a protected characteristic and those who do not

Meeting your specific equality duties

The specific equality duties relevant to digital healthcare technologies are likely to be those relating to:

  • assessing equality impact (this applies in Scotland and Wales only)
  • procurement and commissioning (this applies in Scotland and Wales only)
  • setting equality objectives and publishing information to show compliance with the general duty (and with equality outcomes in Scotland)

Workforce-related obligations are also likely to be relevant if you use digital healthcare technologies as an employer.

Doing an equality impact assessment is not a legal requirement in England but is good practice. Public bodies, if challenged, should be able to evidence how they have considered the potential equality implications of the digital healthcare technologies they are using or proposing to use. Doing a risk assessment is a legal requirement in England to meet the safety standard DCB0160, and this could include an assessment for bias.

Useful resources

See the EHRC’s guide to artificial intelligence in public services, which includes a checklist for public bodies in England and non-devolved and cross-border public bodies. It explains how to comply with the equality duty if you do not have a specific duty to do an equality impact assessment.

See the NHS Race and Health Observatory for resources to identify and tackle health inequalities experienced by Black and ethnic minority communities in England.

Note that inappropriate use of digital healthcare technologies may lead to breaches of laws such as the Data Protection Act 2018 and the Human Rights Act 1998.
