The latest fad surrounding ChatGPT veers disturbingly towards the macabre
Artificial Intelligence (AI) is rapidly changing the way we approach various aspects of life, and obituary writing is no exception. From ChatGPT to CelebrateAlly, AI tools are increasingly being used by funeral homes and grieving families to draft obituaries, eulogies, and memorial texts.
This new trend, while offering benefits such as time-saving and standardization, also presents several potential risks and ethical concerns.
Funeral directors report that AI can produce a polished obituary in minutes from basic inputs, and the funeral industry is integrating large language models into its workflows to meet demand amid staffing shortages. Bereaved people and funeral homes turn to chatbots and dedicated obituary generators to draft tributes when they lack the time, energy, or writing confidence to do it themselves. Proponents argue that AI can enable human connection by lowering the barrier to producing a memorial.
However, the use of AI in obituary writing is not without its challenges. Generative models sometimes "hallucinate," adding misattributed hobbies, incorrect dates, or fabricated anecdotes, which can cause distress and undermine trust in memorials. Critics argue that AI-produced obituaries can feel impersonal or overly polished, stripping away the imperfect, personal voice that makes tributes meaningful.
Moreover, impersonal or incorrect memorials risk retraumatizing bereaved people and may fail to honor the deceased's dignity if sensitive details are mishandled. Privacy concerns also arise when personal, often sensitive life details are fed into third-party AI tools. The commercialization and commodification of intimate rituals such as memorialization are a further area of concern.
The automation of obituary writing can shift memorialization towards productized services, potentially privileging efficiency over personalized care. Bias and cultural insensitivity are also concerns, as models trained on broad datasets may omit, distort, or stereotype culturally specific rites, identities, or language that are essential to respectful memorialization.
Accountability and legal questions also arise when AI produces false or defamatory statements about the deceased, making it unclear who bears responsibility.
Industry voices emphasize efficiency and new capabilities while acknowledging tradeoffs between speed and the “human touch” of bespoke writing. Journalists and critics warn that automation can produce emotionally jarring errors and that widespread use prompts debate over whether technology should be enlisted for intimate grieving tasks.
To mitigate these risks, AI drafts should be treated as starting points, with a knowledgeable human reviewing and personalizing every line for factual accuracy and tone. Facts should be verified independently, and privacy protected by limiting the sensitive data submitted to third-party tools. Providers should obtain families' consent, offer them genuine choices in workflows, and maintain cultural competence to avoid insensitive outputs.
Empirical research on the long-term effects of AI use in grieving and memorial practices remains limited, and regulatory and professional guidance is still emerging for data protection, consumer disclosures, and accountability in the death-care sector.