
Researchers from the University of Westminster have developed a self-assessment chatbot as part of a strategy to reduce health inequalities among different ethnic groups.
The Chatbot-Assisted Self-Assessment (CASA) protocol was co-developed with input from underrepresented groups and has been used to develop the conversational AI tools.
The university said the work breaks new ground in using the technology to provide ethnically diverse users with personalised health assessments and actionable recommendations.
It reported that participants in the UK-based research study said they were comfortable disclosing anonymous sensitive health information to secure chatbots, indicating their potential as supplementary tools for health education and self-assessment.
The study also found that participants considered chatbots that explain the medical questions used in self-assessment to be appropriate for discussing sensitive issues such as sexual health screening.
They also emphasised the importance of anonymity and trust in AI systems.
Tackling inequalities
Dr Tom Nadarzynski, who led the study at the University of Westminster, commented: "The CASA protocol demonstrates how AI can be co-designed with diverse communities to enhance engagement, trust and accessibility in healthcare. By ensuring that chatbots are inclusive, we can tackle longstanding health inequalities."
The university said that, while the study initially focused on sexual health, the CASA protocol is adaptable to other areas, including chronic disease management and mental health support, and holds potential for wider healthcare applications that address critical health disparities.
The work has been funded by the NHS AI Lab and The Health Foundation.
Read the full study in PLOS Digital Health.