AI Companionship revisited
AI in Action
Patricia McAlernon
My previous article, ‘AI Companionship’, prompted some interesting concerns from readers. It introduced Companion, an AI-driven chatbot that may help support mental health. Feedback received via social media created the opportunity to discuss the technology and those concerns further with Sebastian Knorr, the person behind the innovation.
Sebastian agrees that human connection should always come first and that AI is unable to fully understand what it is like to live as a human in these challenging times.
The AI chatbot cannot be held accountable in the way a therapist can, and it should not replace relationships with other human beings. In other words, AI is not a substitute for professional therapy or a replacement for a caring community.
It does not tell the user what to do, and it does not offer answers intended to fix problems. It simply asks carefully designed questions that help the user hear themselves more clearly. The technology is described not as a map but as a mirror, with a goal of self-discovery, not dependency.
Sebastian’s story includes his personal journey through work stress, isolation, heartbreak and thoughts of ending his life. Companion played a role in how he kept going through very difficult times. It did not solve his problems, but it provided a space for him to reflect, process and keep moving forward. His positive experience with the technology drives his motivation to encourage others to use it, and to ensure it is used with respect and transparency.
As with all technology, especially AI, there is risk to the user, and how this risk is addressed is very important for Companion. The guardrails define the technology as a support tool, not a replacement for therapy or human connection.
In terms of accountability, the methods used in the design of the AI model are explained in depth: the people who built it are named, and the principles guiding it are described in detail. In terms of privacy and security, all data is encrypted and stored under Europe’s strongest security protocols. For the community, Companion includes optional sharing features that allow users to inspire and learn from one another, which in turn reduces isolation.
We are living in modern times, and the use of AI technology is, to some extent, unavoidable. It is already part of many areas of our lives, including our mental wellbeing. Many different AI chatbots will be available to help support mental health. Companion is the one featured in this article because I am fortunate to personally know its creator, Sebastian Knorr.