The role of therapists in the digital media debate
What would you say to those who think the digital media 'debate' is political and beyond our remit as psychotherapists?
As the saying often attributed to Oscar Wilde goes, everything in life is about sex apart from sex, which is about power. And in his 1924 novel The Magic Mountain, Thomas Mann asserted that everything is politics, because all human interactions are about power.
In a democratic nation, it’s incumbent on all of us to keep abreast of what is happening in government and the debates that influence public policy. As psychotherapists, we deal with the impact of public policy on our clients every day. There is a common misconception that therapy is only about looking inwards. Whilst inner work is central, it’s not the full story. As a women’s psychotherapist and coach, I find that the impact of the environment - including public policy on gynaecological health, VAWG, women’s workplace rights and AI - is intrinsically relevant to the work.
The tech elite influence geo-politics, economics and culture, and AI is at the forefront of this.
As I wrote in a recent LinkedIn post:
As a therapist specialising in women’s mental health, people often ask: "Why are you presenting at AI conferences?"
The answer is simple:
We cannot build a healthy future for women if the architects of that future don’t understand the female psyche.
Organisations such as the BACP have an important role to play in advocating not just for their members but also for all of our clients. As a BACP media representative, I also try to reflect and represent the issues that matter to my clients.
Since re-training as a psychotherapist - I both trained and worked with Rape Crisis - I have been supporting women for over a decade. During my research ahead of my presentation on the ethical considerations of AI at the BACP Annual Private Practice Conference in September 2025, I discovered that all of my clients use chatbots for mental health and/or relationship support. This was echoed in my media work when I was interviewed on BBC Radio 5 Live and then on BBC Morning Live, each programme featuring different women using chatbots in this way.
Every woman I have met - professionally or personally - has experience of sexual abuse (I use the term abuse here to reflect the legal categorisation of sexual harassment as sexual violence or abuse). So when stories around non-consensual image generation or sexist websites like ‘check her body count’ are dominating the headlines, my clients are inevitably going to be talking about them in our therapy and coaching sessions.
Power is a recurring theme in the therapy room. I was recently commissioned to write and broadcast a ‘Lent Talk’ for BBC Radio 4. ‘Power and Relationships’ describes my own journey as a survivor/therapist and how working with the BACP and Rape Crisis helped me heal, and helped me help others heal, through shared power and post-traumatic growth. This was broadcast on International Women’s Day. On Mother’s Day, Mina Smallman - campaigner for women’s safety and police reform, and mother of Bibaa and Nicole Smallman, who were tragically murdered - broadcast ‘Power and Vulnerability’. Both of these broadcasts reflect the impact of public policy on women.
Should we - and how can we - use our experience, knowledge and expertise both in service to clients and for the greater societal good?
Psychoeducation is a really important part of my work, and as a BACP media representative, I am always evaluating how what I talk about in the media reflects my practice. My supervisor has always encouraged me to bring my whole self to work; in this vein, to ignore my knowledge and experience of working in tech would be to suppress part of that self. When I presented at the BACP workplace conference in February - to some 600+ attendees, many of whom were not therapists - the feedback I received was that it was both informative and refreshing to have a presenter with knowledge and experience of working in technology as well as being a therapist.
During my time at Google, I coached and mentored female founders. Working with these inspirational women enabled me to observe how they didn’t differentiate between the professional and the personal in our sessions. To them, it was all just life. This meant that it was as common for us to discuss rounds of IVF as it was to discuss rounds of investment. This was the lightbulb moment that led me to re-train as a therapist. I recognised that whilst I was a fairly decent coach, my ability to help this cohort in particular would be massively improved by deepening my knowledge of psychological approaches.
Since retraining, I still work with female founders as a business advisor. Femtech and women’s health startups are particularly penalised by algorithmic bias. The existence of organisations such as CensHERship demonstrates the extent of this, with 95% of women’s health content creators experiencing censorship online. In my own experience as a business advisor and coach, I have listened to countless stories of algorithmic suppression of products designed for female health, while at the same time products designed for men’s health - and indeed pornography - remain unchecked and often amplified. The result is the creation of echo chambers of hyper-sexualised content that is readily available to men and boys.
One symptom of this is the rise of non-consensual image creation on platforms such as Grok, but also sexist AI-powered websites such as ‘check her body count’, which I wrote about in the Metro recently.
But it’s important for us to be aware that all algorithms are biased - because we are. In a patriarchal society - where women’s achievements have been largely suppressed, minimised or erased, and women’s needs have been discounted or ignored - information about women is woefully inadequate. Inevitably, this means that women are both under- and misrepresented in the content that is fed to the algorithm.
In 2018, Amazon had to withdraw its in-house AI recruitment tool because it systematically downgraded applications from women. And in 2025, Workday - a global HR software platform used by both public and private sector organisations - was the subject of a class action lawsuit in the US alleging that its AI-powered recruitment screening discriminated on the grounds of race, age and disability.
Having just presented at the FMA (Family Mediators Association) conference this week, I commented that some of my clients may well be some of their clients. Everything that happens in the world of technology, and in particular with AI, is relevant to us as practitioners, because it’s already shaping our clients’ lives.