Artificial Intelligence – an overview by James Norris


Artificial intelligence (AI) has become a ‘hot topic’ over the past few months. Its mass adoption and usage are transformative in both professional and personal contexts. AI also evokes a number of sociological and ethical questions about authenticity, ownership and the validity of the content generated. Legal frameworks, rules and regulations are now being developed to provide guidance and safeguarding whilst this area continues to grow.

The recent ‘Conversation with Cicely Saunders’ was facilitated by Dr. Justin Sanders. He used an AI programme to generate responses about palliative care from an AI persona of Cicely Saunders.

The responses provided are of a very high quality and appear to come from the late Cicely Saunders (1918-2005). The AI interview is a great example of how technology can be used to raise awareness of the importance of palliative care and of documenting and sharing end of life wishes and preferences.

This impact is due in part to the quality of the content provided and the novel use of this technology within a palliative care context.

Cicely Saunders’ work, writings and recorded content continue to inform and amplify her influence. Publications written since her death, institutions that bear her name and the further development of the modern hospice movement continue to expand upon both her work and her legacy.

The ‘Conversation with Cicely Saunders’ article will now be used to ask questions about AI and its use within personal, societal and professional contexts.


AI and Cicely Saunders

Content created by Cicely Saunders before she died can be referenced and used as a primary source for research and learning. This includes a TV interview that was broadcast in 1983 and is now available online (YouTube). Such content also forms part of her digital legacy.

AI content is generated in real time and is not a primary source. The content generated can be factual or fictional.

In the ‘Conversation with Cicely Saunders’, the tone of voice might mirror that of the late Cicely Saunders; however, it is not hers. The information might appear to come from her but, despite its appearance, it was not created by her.

The article states that “using a new artificial intelligence program designed to create opportunities to converse with different historical and fictional characters, I set out to have a conversation about palliative care with her.”

This is, of course, not technically correct. The platform enables people to converse with an AI engine that interprets information scraped from the internet and processed through machine learning. Responses to the questions asked are those deemed most appropriate based on the information gathered and assembled.

The social etiquette around how we describe and reference content created by AI platforms will change and evolve in the months and years to come.


AI using organic (primary) sources

In 2016 a number of Holocaust survivors told their stories. These were recorded for an interactive, 3D video experience. The project involved interviewing survivors specifically about their stories and experiences within a historical context.

Once recorded and edited, around 200 video projections were used to display the footage in a three-dimensional way. Audience members are now invited to ask questions aloud in order to interact with the installation. A microphone receives the questions asked and a natural language algorithm delivers a video response from the pre-recorded content. Just under 2,000 responses are available from the Holocaust survivor, as shown on BBC Click (2016).
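To illustrate the general idea (this is a minimal sketch, not the project’s actual algorithm), a natural language matcher of this kind maps a visitor’s question to the closest pre-recorded clip. The catalogue, clip names and the crude word-overlap scoring below are all assumptions for illustration; real systems use far richer language models.

```python
def tokenize(text):
    # Lowercase the text and keep purely alphabetic words only.
    return {word for word in text.lower().split() if word.isalpha()}

def best_response(question, recorded_answers):
    """Return the pre-recorded clip whose prompt best overlaps the question.

    recorded_answers maps each prompt the survivor answered to a clip id.
    Scoring here is simple shared-word count; a stand-in for real NLP.
    """
    q_words = tokenize(question)

    def score(item):
        prompt, _clip = item
        return len(q_words & tokenize(prompt))

    _prompt, clip = max(recorded_answers.items(), key=score)
    return clip

# Hypothetical catalogue: prompt asked during recording -> video clip id
catalogue = {
    "where were you born": "clip_001.mp4",
    "how did you survive the war": "clip_002.mp4",
    "what message do you have for young people": "clip_003.mp4",
}

print(best_response("What message would you give young people today?", catalogue))
```

Even this toy version shows why the installation can stay authentic: the algorithm only ever selects from answers the survivor actually recorded; it never generates new speech.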

In this circumstance, AI is used but the information provided is organic and true to the question asked. Because pre-recorded video responses are shown, the authenticity of the content is clear and the content delivered can be cited as a primary source.

The sources used and the reasoning behind the Cicely Saunders AI responses are unknown. Some of the answers provided might be factually correct quotations. In some circumstances, however, when relevant sources are not found, fictional responses might be provided.


Ownership, ethics, memory and legacy

AI raises a number of questions around ownership, ethics, memory and legacy.

Would Cicely Saunders, for example, have wanted society to learn from her work through the use of interactive AI or through traditional means?

Might the use of AI start to obscure Cicely Saunders’ teaching, so that rather than referencing one of her publications, professionals reference an AI platform instead for speed, cost or convenience?

Similar questions might also be asked about obtaining information from search engines (like Google) and from unregulated, third-party publications published online.


AI Avatars 

When an image and/or a bank of someone’s voice recordings is uploaded and integrated into an AI application, an avatar of that person can be generated.

The grieving process and memory are two areas where emotions might be amplified due to the increased sensory nature of the AI content generated.

Virtual reality has been used to explore loss and grief in various immersive and expository ways.

In 2020, virtual reality was used to create an encounter with an avatar of Jang Ji-sung’s late daughter. The project created a fictional virtual reality narrative using immersive visuals, voice and haptic VR gloves (special gloves that make the person wearing them feel as if they are touching a 3D environment).



AI will continue to evolve and change the ways in which we consume information. It will impact the ways in which we behave, in ways we cannot yet imagine. AI will also undoubtedly be used to further explore areas relating to death and bereavement in the months and years to come.

As technologies evolve, the personification of AI-generated content will further sharpen the questions asked above and open up a multitude of new ethical and philosophical questions.


James Norris is the founder of MyWishes and the Digital Legacy Association.

James advises various governmental and non-governmental organisations across the globe in areas relating to death and the internet. He provides thought leadership in areas relating to death, bereavement, technology and the internet on TV, online, in print and in scientific journals.

Appearances and publications range from BBC Breakfast and The Big Questions to the New Scientist and Vice magazine. He currently holds a post as the Digital Research Fellow at Michael Sobell House (London) and as the Design Research Fellow at the Velindre Cancer Centre (Cardiff).

James launched the Digital Legacy Association in 2015 to support professionals and the general public in areas relating to digital death. The Association argues that ‘planning for death digitally should form a holistic approach to advance care planning’.

He also founded the free-to-use care planning platform ‘MyWishes’. The service adopts a holistic, ‘digital first’ and public health approach to end of life planning.

