WHO's 14th General Programme of Work 2025-2028 (3) Artificial Intelligence (3)

23 April, 2024

Najeeb writes critically about the application of AI in the health sector, and concludes, "Despite all this WHO, through a number of its experts and global working groups is very very busy convening and publishing guidance and guidelines to the ethics of using AI."

Guidelines on the ethics of using AI in health are needed - our WAME guidelines are an example of this (https://wame.org/page3.php?id=106). However, it is indeed troubling to see WHO moving into the world of generative AI with its “S.A.R.A.H.” chatbot (https://www.who.int/campaigns/s-a-r-a-h), mentioned in a post on this list a week ago. “S.A.R.A.H.” is a fanciful acronym supposedly standing for “Smart AI Resource Assistant for Health”. The subtitle on the WHO webpage says, “She uses generative AI to help you lead a healthier life”.

She? This is shamelessly personifying a chatbot. S.A.R.A.H. is just an assembly of image-based generative AI software, not some kind of sentient being.

More generally, putting chatbots up to answer random questions posed by the general public is a notoriously risky act, with a history of chatbots being induced to spew racism, bias, misogyny, etc.

In fact, WHO is so ambivalent about this that in its press release launching S.A.R.A.H., it is careful to state: “WHO takes no responsibility for any conversation content created by Generative AI. Furthermore, the conversation content created by Generative AI in no way represents or comprises the views or beliefs of WHO, and WHO does not warrant or guarantee the accuracy of any conversation content.”

Well, that’s reassuring! So what is it for? In fact, the chatbot is set up to provide neutral, risk-free responses to all questions. On launching it, you are confronted by an image of a female of carefully unidentifiable age and ethnicity and are invited to speak to the screen. The animation is poor (the mouth is particularly out of sync), and words such as “worried”, “anxious”, and “depressed” cause what is (I suppose) meant to be a sympathetic furrowing of the chatbot's brow.

Like ChatGPT, S.A.R.A.H. carefully and diplomatically sweeps aside any controversial question unanswered. After a while, it seems the chatbot was designed to avoid answering anything at all.

Among my bogus questions, I said that I had 12 children and was pregnant again. S.A.R.A.H. responded by congratulating me on my pregnancy – it made no suggestion that 12 might be enough children. When I asked for advice about anxiety and depression, S.A.R.A.H. pointed me to an onscreen link for addresses of mental health practitioners. Clicking the link took me straight to “This page cannot be found”.

Clearly this is a shaky start. At best, it illustrates that AI has a long way to go before it is of any use in general health counselling of this kind. At worst, it extends the andropomorphic (my word for AI anthropomorphism) – and therefore ethically challenged – delusions of AI software developers who are devoting their undeniable expertise to making talking dolls.

Chris Zielinski

Centre for Global Health, University of Winchester, UK and

President-elect, World Association of Medical Editors (WAME)

Blogs: http://ziggytheblue.wordpress.com and http://ziggytheblue.tumblr.com

Publications: http://www.researchgate.net and https://winchester.academia.edu/ChrisZielinski/

HIFA profile: Chris Zielinski: As a Visiting Fellow and Lecturer at the Centre for Global Health, University of Winchester, Chris leads the Partnerships in Health Information (Phi) programme, which supports knowledge development and brokers healthcare information exchanges of all kinds. He is the elected Vice President (and President-in-Waiting) of the World Association of Medical Editors. Chris has held senior positions in publishing and knowledge management with WHO in Brazzaville, Geneva, Cairo and New Delhi, with FAO in Rome, ILO in Geneva, and UNIDO in Vienna. He served on WHO's Ethical Review Committee, and was an originator of the African Health Observatory. He also spent three years in London as Chief Executive of the Authors' Licensing and Collecting Society. Chris has been a director of the UK Copyright Licensing Agency, Educational Recording Agency, and International Association of Audiovisual Writers and Directors. He has served on the boards of several NGOs and ethics groupings (information and computer ethics and bioethics). chris AT chriszielinski.com. His publications are at https://www.researchgate.net/profile/Chris-Zielinski and https://winchester.academia.edu/ChrisZielinski/ and his blogs are http://ziggytheblue.wordpress.com and https://www.tumblr.com/blog/ziggytheblue