People seek therapy because they need to be nourished. Good therapy, like good food, provides that satisfying nourishment and deep sense of well-being.
But personal therapy, like good wholesome food, doesn’t come cheap. Just as cheap, easily accessible fast food satisfies our hunger, might AI provide the cost-effective answer when it comes to therapy? The proof of the pudding, we find, is in the eating.
Accessing treatment when you most need it is not always easy, especially with long NHS waiting lists and limited services. Private therapy is often costly, and finding the right therapist, one you connect with and who can give you the support you need, may take time. Will the revolution in AI provide the answer you seek? Like a fast-food outlet or a ready meal, it is there: affordable on any budget, accessible, immediate, and promising satisfaction. And there are those who argue that this could be the solution to the mental health backlog we face, the quick, easy fix we all seek.
No such thing as a free lunch
It is not just in the field of therapy and mental health that AI is heralded as the solution we’ve all been looking for, with excitement about how it might improve our lives. But there are fears too about its lack of regulation and about its impersonal limitations.
It is not so long ago that we heralded the era of cheap ultra-processed food as the technological answer to a growing problem. Easy budget meals, long shelf-life foods, convenience and accessibility: UPFs offered us so much. Only now are we recognising the real cost of cheap fake food. According to the world’s largest review of its kind, ultra-processed food is directly linked to 32 harmful effects on health, including a higher risk of heart disease, cancer, type 2 diabetes, poor mental health and early death. Cutting costs comes at a price.
Fake food is bad for us. Will fake relationships be any better?
Artificial intelligence, it’s claimed, could be the way forward in meeting the huge demand for mental health treatment. That depends on how you define “therapy”.
AI might well be able to provide us with information, so long as we are asking a question that others have asked before, and so long as we don’t need the answer tailored to our own unique set of circumstances.
In many therapies, information comprises only a small part of what a therapist provides, and it isn’t ‘information’ that delivers the nourishing benefit of treatment.
Therapy is about connection. With a good therapist, the client builds up a relationship of trust, a human relationship, and within the safety and comfort of that connection can explore their own inner feelings, anxieties and pain. Over time, through that uniquely human connection, clients can heal and become whole, and wholesome. There is nothing fake or synthetic about it; rather, it is an intrinsically human experience.
Will AI-based therapy leave you hungry for more?
At some stage, it may be possible to create a robot that can respond effectively to the nuance of expression, the almost imperceptible change in tone of voice, the unspoken communication that sensorimotor psychotherapists work on to such impressive effect. But I for one doubt it.
Artificial Intelligence has its place – but it lacks the wholesomeness that real authentic treatment provides, and it can’t offer the lasting nourishment that the real thing delivers.
AI may be able to tell you what steps you need to take to be authentic with others. After all, you could read that in a book. But can it give you the opportunity to experience it? Often that’s what therapy is about: an opportunity to rehearse what it’s like to have increased openness and visibility in a relationship with another. The chance to feel accepted by someone else despite our shameful secrets. To explore what we really want without fear of judgement. Can you really achieve this if there isn’t another person there to witness you? Or, like eating ultra-processed food, will ingesting something fake be bad for your health?
Is it prepared with love?
There’s no food as good as that which is prepared for us by someone who loves us and wants to nourish us. Any son or daughter will tell you this. Compare that with something that is mass produced. And the taste? It’s as if the meal knows it.
When therapy is offered by someone with a living, beating heart experiencing positive emotions for the person they’re relating to, that person often senses it. As individuals we operate like finely tuned ecosystems. There is two-way communication between our brain and our heart. Our heart has its own cardiac nervous system consisting of some 40,000 neurons, and with each heartbeat it sends messages to the brain. Positive emotions lead to a smooth, coherent heart rhythm, which calms our brain and our body.
When we sit in a room with someone else, our ecosystem extends beyond our body. The heart’s electromagnetic energy field is 5,000 times greater than the brain’s, and it influences our own bodily rhythms. But it can also be detected, via ECG (electrocardiogram), in another person sitting nearby. So when you sit in a room with a therapist who feels warmly and compassionately towards you, you will often sense it. I’m guessing we are a long way off a machine being able to soothe and nourish you in the way that another person can.
Should AI carry a health warning?
AI might also offer us some prompts between therapy sessions: reminders of the things we know keep us feeling healthier, things we somehow forget when our rational brain goes offline and our emotional brain takes over. Maybe it can also remind us that others are there, offering a safety net. But in the end, it lacks the real, lasting nourishment that wholesome, connective therapy provides. It may give an apparent quick fix, but as with UPF, ultimately that ‘fix’ may be doing more harm than we realise.
Therapy, human to human, allows clients to play out their own fears, often their own shame, in safety, and by doing so, learn how to become their authentic self in their everyday lives and in their relationships with those who are precious or important to them. It’s hard to build that level of trust when interacting with an impersonal bot, however ‘authentic’ the voice or the image might appear.
Perhaps AI therapy should come with a health warning attached.
Reviews of therapy apps aren’t great. You can check the apps out for yourself. But if you want to know a bit about what therapy with a human might feel like, please feel free to get in touch for a free consultation.