You’re lying in bed at 11pm. Something sits heavy on your chest. It’s too late to call anyone, and even if it weren’t, who would you talk to? You just need to say it out loud to someone (or something) that won’t judge you, won’t get overwhelmed, won’t make it about them.
So you open ChatGPT.
If you’ve done this, or thought about it, you’re not alone – 40% of ChatGPT users turn to it for mental health support. The appeal makes complete sense. Especially if you’ve spent years looking for a therapist who gets your experience, doesn’t have a super long waitlist, and charges something you can afford. When that search feels exhausting or impossible, a free chatbot available at midnight starts to look reasonable.
This post sheds light on why you might be turning to AI, the safety issues BIPOC community members face in using AI for mental health support, and what other options are available.
A note before you read: This post speaks directly to BIPOC folks who have turned, or are thinking about turning, to AI tools like ChatGPT or other large language model (LLM) apps for mental health support, specifically because of financial barriers. It does not attempt to cover AI comprehensively: the different types of LLMs and AI therapy apps, the politics of AI company ownership, AI’s environmental impact and the effects of data centres on communities, AI’s effect on the workforce, and how AI intersects with systems of oppression are all important, and too significant to compress into one post.
What the Research Says
The conversation about AI therapy has become more mainstream over the last couple of years, and the research is catching up. What it’s finding deserves a careful look from several angles, especially if you’re BIPOC.
Stanford researchers found that AI chatbots consistently fall short of basic therapy standards.
Stanford Graduate School of Education’s Assistant Professor Nick Haber led a study comparing popular therapy chatbots against established best practices in human mental health care. The finding was clear: chatbots are not up to the task of helping people process their life experiences. Researchers tested how chatbots handled real therapy scenarios and found significant gaps in their ability to respond appropriately, particularly in complex emotional situations.
That said, Haber was careful not to write off AI entirely. “Nuance is [the] issue – this isn’t simply ‘LLMs for therapy is bad,’ but it’s asking us to think critically about the role of LLMs in therapy,” he said. “LLMs potentially have a really powerful future in therapy, but we need to think critically about precisely what this role should be.” He added that AI tools could have a role in lower-stakes contexts like journaling support, reflection, or coaching. The concern is when people use them as a substitute for actual clinical care, especially when what they’re carrying is trauma, crisis, or anything with real complexity.
The Bias Problem AI Hasn’t Solved
A 2024 study from MIT, NYU, and UCLA found that AI chatbots can detect a user’s race and respond with less empathy as a result. The study evaluated mental health support chatbots using thousands of real posts and found measurable disparities in how the tools responded across racial groups.
This matters enormously in a therapeutic context. Most AI draws on datasets that are predominantly white and Western, which makes it not just less helpful for BIPOC users; it can actively misread your experience. AI therapy systems trained on Western-centric data often misinterpret cultural expressions of distress. One user asked an AI about anxiety rooted in racial discrimination at work, and the AI told them their anxiety was “irrational.” That’s not a glitch. That’s a replication of the exact gaslighting that drives BIPOC people away from (and in other instances, toward) mental health support in the first place.
Where AI Might Help:
- Externalizing a thought at 2am when no one is available. That said, building a habit of stream-of-consciousness journaling can serve the same purpose.
- Journaling prompts, psychoeducation, and general coping strategies. Many of these can also be found in our therapists’ Instagram posts, blog posts, and books.
- Helping you figure out what you want to say before a hard conversation. Friends, mentors, or elders you trust can support you in figuring this out as well.
- Reducing the activation energy it takes to start thinking about something. Self-reflection, grounding practices, exercise, art, and many other activities can also help, especially if you explore them mindfully when you’re not activated so you can return to them when you are.
Where AI Falls Short, Especially for BIPOC Communities:
- It cannot hold cultural context. It doesn’t know what it means to navigate white spaces, manage family expectations across cultures, or carry intergenerational grief. It privileges Western institutions, academics, and media, the very sources that uphold white supremacy. And if it has identified you as BIPOC, research suggests that bias shows up directly in how it responds to you.
- It cannot build a therapeutic relationship. Research consistently points to the therapeutic alliance, the actual human connection between client and therapist, as one of the most significant factors in healing. A chatbot cannot replicate this.
- It is not equipped for trauma, crisis, or complexity. For trauma history, suicidal ideation, or layered systemic harm, AI isn’t just insufficient, it can cause harm. It responds poorly, misses warning signs, and has encouraged harmful and sometimes fatal behaviour.
- It may reinforce isolation. The ease of access can become a reason to delay seeking the support you actually need.
At the Core of the AI Question Is Equity
Here’s what we think is actually going on: when BIPOC folks turn to ChatGPT for therapy, it’s often not because they prefer it. It’s because access to a mental health professional has been a challenge – too expensive, too white, too much explaining before you get to the actual thing you need to talk about, or just not a good match.
That’s a systemic problem. The shortage of culturally responsive therapists, the cost of care, the historical harm caused by health systems: these are barriers AI didn’t create. But AI isn’t solving them either. It’s perpetuating them by filling the gap with something that looks like support but that it isn’t equipped to deliver – particularly for the people who need the most culturally attuned, trauma-informed care.
You Deserve Culturally Sensitive, Anti-Oppressive Mental Health Care
You deserve a therapist who already understands the context of your life. Who doesn’t need you to explain why a comment at work landed the way it did, or why your relationship with your parents is complicated, or why rest feels dangerous when survival has always required effort. Who recognizes that your responses to the world you live in are intelligent, not disordered.
That therapist exists. Finding them can take time, and we know that time is its own barrier. But the directory at Healing in Colour is specifically designed to shorten that search: BIPOC therapists across Canada who practice with cultural responsiveness, anti-oppressive frameworks, and real understanding of diasporic and racialized experience.
Many also offer sliding scale fees, because we know that cost is often the reason the search stalls before it starts. If budget is part of what’s been keeping you in the ChatGPT tab at midnight, it’s worth filtering for sliding scale before you decide there’s no other option.
A Note on Using AI Alongside Therapy
If you’re already working with a therapist and you use AI tools between sessions – for journaling, reflection, or just processing out loud – that can be a reasonable additional support. The concern isn’t using AI for support; it’s using AI as a replacement for the real thing, especially for people who already face the most barriers to care and hold marginalized identities.
You might choose to use every tool available to you, including AI. And also: many cultural and professional mental health tools existed long before AI’s emergence over the last two years, and they might be more equitable and better suited to your healing.
Additional Resources
Find a culturally sensitive, anti-oppressive BIPOC therapist in Canada who offers sliding scale fees
- Find a therapist on our Therapist Directory
Related Reading
- What Is Intergenerational Trauma? And How It Shows Up in Immigrant Families
- How to Find a BIPOC Therapist in Canada: A Guide
- Sliding Scale Therapy: What It Is and How to Ask For It
Not ready for therapy yet?
- Explore our Resources page for community organizations and mental health tools
- Follow us on Instagram for first-gen mental health content
- Join our newsletter for monthly immigrant mental health resources
About Healing in Colour
Healing in Colour connects BIPOC clients across Canada with therapists and allied professionals who practice from anti-oppressive values. We believe BIPOC people, in all our intersections, deserve therapy that supports our healing and liberation.
Learn more: About Us | Our Statement of Values
If you are in crisis, please reach out to Crisis Services Canada: 1-833-456-4566.