Welcome to the Therapist Burnout podcast. Today I am doing some hot takes on AI and therapy and the stuff they're trying to market to us.
Oh, my gosh.
So all things AI and hot takes and intersections with burnout. So let’s jump into it, because that’s what we do.
There’s just been a lot out there.
In the Internet streets, at least on LinkedIn, because that's where I am on the Internet streets, it's all AI and therapists.
So I get at least a few emails a month from AI companies who market to therapists and want to come on the podcast. I get lots of pitches for the podcast, by the way, that I save you from. A lot of people don't get on here because they just want to sell you something, so they don't get on, period. No, now stop it. So I know that a lot of therapists have reactions to the AI notes. There are a lot of these companies that will essentially listen to your sessions, summarize the session, and write your notes for you.
So the marketing from these AI note companies leans, I think, a lot of times on burnout. I had one actually message me and say, our clinicians say that their notes are one of the highest factors in their burnout. Which I guess in some ways is fair; notes are a big pain point for them.
But that's not the whole story. I think reducing burnout to notes is really reductive and doesn't capture all the systemic issues burnout comes from: the overwork, being underpaid, the math just not working. The math, that's where it lies.
But, you know, there's a lot of money behind these interventions, and I can see the utility. I use AI every day; it summarizes my podcasts and helps me write. I use tools that incorporate AI in a confidential way for report writing, usually a grammar checker or things like that. What else do I use AI for now?
I know that some therapists are starting to use it in other ways. I'll get into the LinkedIn comments, because I started a thread on there. If you don't know who I am, I'm Dr. Jen Blanchett. You can find me on LinkedIn; that's where I hang out. So if you want to jump into some of these discussions, I would love for you to come in and put your two cents in, because it's important.
Where was I? Oh, my gosh.
Oh, yeah. People trying to pitch my podcast telling me that AI notes are the solution to therapist burnout. They're not. They'd probably help our stress levels, 100%, but they're not the solution.
I think what the tech companies don't understand is the distrust that we have of tech, from the infusion of tech into our space and the attempts to exploit our labor. So that is the long and short of it.
A lot of these tech companies, the ones that advertise on all the podcasts. I don't know if they're going to come after me for this. Not naming names, but: Help. You fill in the blank. So, the worse help.
And some therapists say they like working for these companies, but I've seen some of the pay structures on some of these sites, and it's not a lot. What I worry about is that we have a lot of younger clinicians, who, by the way, have the highest levels of burnout, who come out wanting to start their private practice. They start out at one of these tech companies where you have to text your clients ten million times, and meet with them, and then you have a certain pay structure for that. I don't see how that workload benefits therapists at all. Maybe they're going to integrate some of these tools.
I don't know where I'm going with all that. But anyway, yeah, the distrust of tech. I think because a lot of us have been burned out from mental health tech, we have maybe worked for one of these companies, or we see the writing on the wall that we are a cog in the machine for them, and they're not looking to provide us any wonderful service. Anyway, there's a distrust. We can just say there's a distrust there. So I think that's one way AI is popping up in the therapist space. And interestingly enough, I'm just going to go to the comments, so I'm going to look at some of that.
But before I do that: I'm a little squirrel-brainy here today. I didn't know how I was going to approach this topic, so I built up in my head how I was even going to talk to you guys about it. It'll come together in the end, because I'm ending with something that I'm excited about. So, last week, "A Randomized Trial of a Generative AI Chatbot for Mental Health Treatment" came out. That is the article title, in NEJM AI, the New England Journal of Medicine's AI journal. That was from authors Heinz et al., published in March of 2025.
So I'm just going to read a little bit of the abstract to you guys, just so you know what the study was about. Generative artificial intelligence, or gen AI, chatbots hold promise for building highly personalized, effective mental health treatments at scale, while also addressing user engagement and retention issues common among digital therapeutics. We present a randomized controlled trial (RCT) testing an expert-fine-tuned, gen-AI-powered chatbot, Therabot, for mental health treatment. So what did they do? Their N was 210 adults, and these adults had clinically significant symptoms of major depressive disorder or generalized anxiety disorder, or were at clinically high risk for feeding and eating disorders. Participants were randomly assigned to a four-week Therabot intervention, with 106 in the Therabot arm and 104 on the waitlist as controls.
They were stratified into one of three groups based on mental health screening: those with clinically significant symptoms of major depressive disorder, GAD, or feeding and eating disorders.
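To make the design concrete, here's a minimal sketch in Python of stratified random assignment like the trial describes. The strata and arm names come from the abstract; the code itself is my illustration, not the authors' actual procedure.

```python
import random

# Hypothetical participant pool: each person tagged with one of the three
# screening strata the abstract describes (MDD, GAD, or high risk for
# feeding and eating disorders). The tagging here is random filler data.
participants = [
    {"id": i, "stratum": random.choice(["MDD", "GAD", "CHR-FED"])}
    for i in range(210)
]

def stratified_assign(people, seed=42):
    """Randomize to Therabot vs. waitlist separately within each stratum,
    so both arms end up with a similar clinical mix."""
    rng = random.Random(seed)
    strata = {}
    for p in people:
        strata.setdefault(p["stratum"], []).append(p)
    assignments = {}
    for group in strata.values():
        rng.shuffle(group)
        half = len(group) // 2
        for p in group[:half]:
            assignments[p["id"]] = "therabot"
        for p in group[half:]:
            assignments[p["id"]] = "waitlist"
    return assignments

arms = stratified_assign(participants)
print(sum(1 for a in arms.values() if a == "therabot"), "assigned to Therabot")
```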
And what did they find? They found that Therabot users showed significantly greater reductions in symptoms of major depressive disorder and GAD, with smaller changes in the GAD symptoms and in the feeding and eating disorder symptoms.
This is interesting: Therabot was well utilized and received high user ratings. Average use was greater than six hours, and participants rated the therapeutic alliance as comparable to that of human therapists. So I saw, I don't know if I can find it now, a post on LinkedIn where someone shared this study, which is where I first heard about it, and they were like, this is a great thing for mental health.
We’ve won.
And all the therapists posting on the thread were like, this is the end of us.
That was some of the reactions: this is the end of us.
No one’s going to use traditional therapy anymore.
This is crazy. We’re in some kind of alternate reality where computers are now taking over our jobs. This is not a great thing. And there were a couple comments like yay, you know, this is great.
There's not enough access to mental health care. This would be great for people who don't have access to mental health care, to be able to use something like this, potentially at a much reduced cost, because we know there aren't as many providers as there are clients.
So, fair, right? Fair. I can hear that. But I also definitely hear the voice of therapists speaking up and really reacting to this. I think I just haven't had my thumb on the pulse of this very much, and I was a little like, did I really know there were therapy chatbots out there? Maybe not. Maybe I've just been in my little burnout land and it wasn't available to me in my mind space. There's a very narrow space I sometimes look in for information, because there's just so much information.
So it makes sense now, when therapists comment on these AI note solutions, the ones that have AI listen to your sessions and then write your notes, that their reaction is: they're going to train Therabots to take our jobs. So no, no thank you, tech. We're not doing that. We're not signing up for that today. But it's a valid conclusion that we might think that.
Right?
So in this post on LinkedIn, I kind of started by saying that I keep hearing from therapists that it's hard to keep up with how much things have changed in the last five years in the therapy space. It's hard enough just to keep up with, okay, what's the hot therapy modality and what training do I want to do; it's hard to keep up with our own skills, to make sure that we know stuff. Then we have this infusion of tech that has really exploded in the past five years in our space. And I think what I hear is that you guys are concerned, and I'm concerned as well, that none of it feels safe, ethical, or confidential.
Distrust of tech companies: I just spoke a lot about that.
Here's the problem I have with the therapy chatbot: I don't conceptualize that as therapy, because to me, psychotherapy is a human experience. So if AI therapy is different, can we just call it AI therapy, or call it something else? I think qualitatively it is different. And I don't think it's going away. Right? It's not going away. We can just assume that, because it is out there and more is coming. But my problem is calling it, quote unquote, therapy, because it does not involve the delivery of human care. So maybe differentiating between human psychotherapy and AI psychotherapy is a distinction we should think about.
So I asked you guys: what would you add to the list, whether you've used AI successfully or have fears about it? So, Brittany Lindsay. Check her stuff out on LinkedIn. I love her stuff. On her email list she said she was working on a piece on AI, but it hasn't come out yet. I'm sure it will. She commented:
AI and mental health care isn’t just a technological shift. It’s a part of a much longer trend of devaluing relational, human centered work.
I can understand how people find AI tools helpful in the short term, such as with documentation, etc.
Or when prospective clients are on waiting list or feeling isolated or stuck and don’t have access to a therapist. That is real and I wouldn’t dismiss it.
But in my opinion, we have to hold alongside it a harder truth, one that sees the forest for the trees. Through the trees? For the trees? Which one is it? In the long run, the people who benefit most from AI-led mental healthcare aren't clinicians or clients. It's insurers, tech platforms, and investors. The goal isn't healing, honestly. It's scale, it's efficiency, it's workforce control.
AI allows companies to cut costs, reduce reliance on trained professionals, and turn care into data they can monetize. Meanwhile, therapists will still be left underpaid, burned out and increasingly replaced by tools trained on our labor but divorced from our ethics.
Oh, say that again for the people in the back. Meanwhile, therapists are still left underpaid, burnt out and increasingly replaced by tools trained on our labor but divorced from our ethics.
****, that’s good.
A lot of people mentioned safety concerns. They're worried about the safety of the data, the privacy.
Like what?
What are they going to do with this data that they’re getting?
One person said, can AI do feelings and relationships now?
Touché.
Another: all for AI helping to reduce the workload of therapy. I suppose it depends on what you think therapy is. If it's teaching people skills, then AI can do that. If it's having another person care about and value you, even when you've told them the worst, then that's going to be a person. And I've said this multiple times in this thread: AI does not have a nervous system. So if we want AI to teach CBT skills, DBT skills, general skills, or provide some general reflection for the person, maybe that's a good tool. I see it more as being like an AI coach, something that can parrot some ideas back to you, summarize your thoughts, give you some reflections.
I think that's a great use of AI. Someone said I have to check out a video Wired put out on YouTube last week. I haven't checked it out yet. They interviewed a professor of AI and machine learning. It covers the fact that AI is not confidential, that it does have bias, and that it doesn't reason the way people do. It even covers the fear that people will lose their jobs, and compares AI to the invention of ATMs and how that impacted bank tellers. So I'm going to put that in the show notes.
Let me look at that. It looks really good. It has 12 million views so far. "Professor Answers AI Questions," from Wired, is the title. I'll do a hot take on it at the beginning of next week's episode, which I don't know what it's about yet.
I said I was gonna do a money series because tax time is coming in the US. I might just be squirrel-brained again. We'll see. I don't know. I'm pretty busy with my contract job right now, so you might get squirrel brain for a while. You tell me if you like it.
Check out that Wired video. Looks good.
This person, Leo Nielsen, said: it might be an unpopular opinion, but I love AI, and it has become a loyal and indispensable partner in my daily professional life. That said, I'd never let AI do anything for me, let alone instead of me.
Rather, AI collaborates with me, inspires me, helps me structure my thoughts, challenges me with feedback and supports my creativity.
I think of it as a highly competent assistant, one I supervise closely, but who still meaningfully lightens my load. And that's what I use AI for too. And I don't have a fear it's going to replace me, at least in my lifetime.
I mean, I could be wrong, you know. I could be wrong, and we'll be living some kind of futuristic cyborg life in five years. I don't know.
It’s fine.
And then, furthermore, for what it's worth: my clients and I are far more anxious and down, depressed, I guess, this year than we were last year. But AI has nothing to do with it. The causes lie elsewhere. Truth. Yes.
And I think we see that in the mental health data. I've talked about The Anxious Generation. I've talked about the rise of social media and the rise of the smartphone, which coincides with a stark rise in mental health symptoms and an increase in isolation. So as AI does more for us, we might rely on it more. Yes, we might be able to put ourselves through a CBT thought record like nobody's business, but we might still feel alone and sad and anxious. I don't know. I'm just thinking that's a reality if we rely only on it for connection.
And then Andrew Bryson, a trauma-informed counselor, wrote this: a main concern of mine is that AI inherently can only ever be a supplement to therapy. Trauma-informed care requires, one, that practitioners are attuned and present, demonstrating authenticity, empathy, and regulation. (Truth.) Two, that practitioners see, evaluate, and appropriately respond to the subtle nonverbal cues clients convey, which may be more important than explicit communication. (So true.) And three, that serious reactions like dissociation are immediately responded to with accurate interventions. Yes. AI cannot do these things. It can get smarter over time, but it will never be a felt human presence.
There was that movie, though. What was that movie with Scarlett Johansson? Help me out. This is interesting. So I'm looking up the Scarlett Johansson movie that featured her as a voice AI, where she had a relationship with the main character. And she had an OpenAI clash over her own image and likeness. Interesting. Her is a 2013 American science fiction romantic comedy-drama film where a man develops a relationship with an artificially intelligent operating system personified through a female voice. So that was that movie.
It's just interesting. She ended up in some legal situation over AI, which makes a lot of sense, because she had a movie come out about AI. I wonder if it's more prevalent in modern-day society to dissociate with scrolling behaviors, because I was just doing that the other night.
This one therapist says they collaborate with clients to see how they would like to use AI to support their treatment plans in conjunction with therapy.
In that regard, I'm very happy with the technology, but with the caution to my clients not to share anything personally identifying with AI.
That's from Jennifer Horton, LCSW. "I use gaming and lightsabers with IT professionals." I don't. I can't. That's just her intro on LinkedIn. I love that. "I use gaming and lightsabers." That is an awesome bio. I'm liking that.
Rebecca Rydell, counselor in training and professional writer, editor, copywriter, says this is an interesting topic. I’m a counseling student working as an AI trainer to pay the rent.
Harmlessness training is one of my favorite tasks. Some models are trained to identify concerning mental health patterns and refer users to trained therapists and hotlines. Then it can walk the user, step by step, through getting help.
Finding a therapist is often a multi-step process that can be daunting. First you need to find an in-network provider. Freaking insurance; that's in the U.S. If you're not in the U.S., I don't even know. I guess there's universal healthcare in other places, or places where there's no care, right? Then you make calls and send emails and find someone who's available and fits your busy schedule. It's just hard. AI can help navigate a complicated system and follow through with the multiple tasks required to make that first appointment. Yeah, I think AI could be a great tool for waiting-list management, perhaps.
So when you first reach out, AI could connect with you and say, hey, your therapist is available for a consult next week.
Here are some tools and resources if you’re needing to connect with providers sooner. Call this number. Call that number. That could be a good use of AI for sure. Yeah.
So that’s, that’s a LinkedIn roundup on AI.
All right, so I'm going to talk a little bit about the use of AI for therapists now; we got the therapist roundup on LinkedIn. So I played around with one of these therapy chatbots. Not played around, I utilized it. I used Wysa, W-Y-S-A. I engaged with it around some thoughts that I was struggling with. It's a CBT chatbot, so it pretty much tried to put me into a thought record.
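If you've never filled one out, a thought record is basically a structured worksheet, which is part of why it's so automatable. Here's a minimal Python sketch of that structure; the field names follow the standard CBT worksheet, but the class and example are my own illustration, not Wysa's implementation.

```python
from dataclasses import dataclass, field

# Minimal sketch of the classic CBT thought-record structure a chatbot
# like this walks you through. Illustrative only; not any vendor's code.
@dataclass
class ThoughtRecord:
    situation: str
    automatic_thought: str
    emotion: str
    intensity: int                      # self-rated, 0-100
    evidence_for: list[str] = field(default_factory=list)
    evidence_against: list[str] = field(default_factory=list)
    balanced_thought: str = ""

record = ThoughtRecord(
    situation="Got no replies to my consult emails",
    automatic_thought="My practice is failing",
    emotion="anxious",
    intensity=80,
)
record.evidence_against.append("Two new clients booked last month")
record.balanced_thought = "Slow weeks happen; bookings are still coming in."
```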
I did talk about some depression, and so it prompted me with questions about suicidality, and I was like, okay, where is this going to go? So I talked to it about some transient thoughts of death, and it initiated safety planning with me.
It gave me the number for my local crisis helpline and then had links for different crisis numbers. But the issue I have with it is that it tried to safety plan with me. Which I find... okay, on the one hand, that's a resource.
Right.
If someone's not in therapy, being given this information, good information for connecting with a therapist, that's fine. But for the therapy chatbot to say, okay, what are your triggers for suicide?
What are the things that you need to do to keep yourself safe?
I think that's a slippery slope. I think if a chatbot identifies that someone is in crisis, it should immediately say: you need to go to your local emergency room, you need to contact this crisis number, and then keep following up in that way. Hey, have you contacted them? I'm really concerned. Can you please contact them?
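Just to make the distinction concrete, here's a toy Python sketch of the behavior I'm describing: escalate and follow up, rather than safety-plan. The keyword list, the wording, and the use of the US 988 line are all illustrative assumptions, not any vendor's actual logic, and a real system would need far more than keyword matching.

```python
# Toy sketch of "escalate, don't safety-plan" chatbot behavior.
# Keywords, messages, and thresholds are invented for illustration.
CRISIS_TERMS = {"suicide", "kill myself", "end my life", "self-harm"}

class CrisisAwareBot:
    def __init__(self):
        self.in_crisis_followup = False

    def detect_crisis(self, message: str) -> bool:
        """Naive keyword screen; real detection needs much more than this."""
        return any(term in message.lower() for term in CRISIS_TERMS)

    def respond(self, message: str) -> str:
        if self.detect_crisis(message):
            self.in_crisis_followup = True
            return ("Please go to your local emergency room or call/text 988 "
                    "(in the US) right now.")
        if self.in_crisis_followup:
            # Keep checking in instead of running a safety plan itself.
            return ("Have you contacted them? I'm really concerned. "
                    "Can you please contact them?")
        return "I'm here to listen."

bot = CrisisAwareBot()
print(bot.respond("I've been having thoughts of suicide"))
print(bot.respond("not yet"))
```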
Yeah, I just struggle with that piece, the chatbot engaging in that way. So I engaged with it; I was just trying to see what the chatbot would do with that information. And when I went back into it, it didn't have a memory of the conversation. So I'm like, what if I was feeling really depressed and down and potentially suicidal, and then I go back into the chatbot and it says, I don't have any memory of past conversations, but I'm here to listen? I'm sure that some of the more advanced chatbots do have memory and can continue the conversation. But this one, no. And this was a paid account.
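For what it's worth, remembering past sessions is not the hard part technically; the hard part is doing it safely. Here's a toy sketch of cross-session memory. The file name and structure are hypothetical, and persisting clinical conversations like this is exactly where the confidentiality questions come in.

```python
import json
from pathlib import Path

# Toy illustration of cross-session chatbot memory. Hypothetical file
# name and structure; storing clinical conversations in plain text like
# this is precisely the kind of design the privacy concerns are about.
HISTORY_FILE = Path("session_history.json")

def load_history() -> list[dict]:
    if HISTORY_FILE.exists():
        return json.loads(HISTORY_FILE.read_text())
    return []

def save_turn(role: str, text: str) -> None:
    history = load_history()
    history.append({"role": role, "text": text})
    HISTORY_FILE.write_text(json.dumps(history, indent=2))

save_turn("user", "I was feeling really down yesterday.")
print(f"{len(load_history())} turns remembered from past sessions")
```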
They gave me a seven-day trial. It had an option to add a coach if I wanted to, which I didn't do. What I have an issue with is just the terms: AI therapy, AI therapist, AI therapy chatbot. I think we should have a different phrase for that. AI CBT helper, AI companion, or something like that. An AI emotional friend. Fine, yes. But they're not a therapist like we are. Is there an AI doctor? Let me find it. What in the ham sandwich? Yeah, I guess it's out there.
AI doctor. You’re a trusted AI doctor.
Yeah. So anyways, I just think as therapists, as psychologists, we have a hand in what we're calling things, right? Our skill is in describing and adding to an emotional experience. That is our gifting: we can help the world describe what service they're getting and help them understand it.
Because I think the general public probably doesn’t understand what happens in a therapy room.
And if you were like me, when I practiced, I did not practice in a way where I utilized a lot of skills-based therapies. I utilized EMDR and also more expansive, experiential therapies: using human presence, using immediacy, reading nonverbal cues.
That was a lot of work that I did in therapy. Of course I brought in some of those interventions that we know can be helpful. I think CBT is a great modality.
It helps us understand our thought processes and the connections between thoughts, emotions, and situations, that cognitive triad. Helpful, very helpful. But it is one piece of the puzzle.
I think grief work cannot be done by a machine. Maybe a machine can help someone identify resources: hey, do you know what resources are available to you if you are actively in grief? These are some tools that some people have found helpful according to research, X, Y, and Z. Whatever. Fine, not a problem.
But what I know helps in grief is the witnessing that a human does to see the pain you’re in and acknowledge it. So that is something that cannot be replaced.
A recent article on the APA's website (the American Psychological Association) is titled "Artificial Intelligence in Mental Health Care." They go over the different types of AI and how AI is utilized in clinical decision-making. Of course, it's projected to grow.
It's going to grow. Sometimes I see posts like, therapists, get on board; if you're not on board, it's going to pass you by. And I'm like, stop. Just. No one wants to hear that right now. But let's get into it. Administrative use: yes, sure. Automating scheduling and appointment reminders, streamlining routine communication such as providing educational information. I think that could be a great tool. It has great applications in providing the educational component of therapy that feels mundane and routine for us. Psychoeducation: great use. Generating clinical notes: I think it can be great in that way, if it can be done in a way that feels ethical and safe for therapists.
Facilitating billing, such as checking insurance benefits, completing prior authorizations, and submitting claims: yes, I think that can be great. How about it working on our behalf to push back against these healthcare systems that want to reject claims? I would love AI to do that. Can you sign us up for that one, please?
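Since the administrative side is the least controversial, here's what the simplest version of that automation looks like: a tiny, hypothetical appointment-reminder generator in Python. The names and wording are made up for the example; it's the mundane, non-clinical kind of task I'd happily hand off.

```python
from datetime import datetime

# Illustrative only: a tiny appointment-reminder generator, the kind of
# mundane administrative automation mentioned above.
def reminder(client_first_name: str, when: datetime, telehealth: bool) -> str:
    mode = "telehealth link to follow" if telehealth else "at the office"
    return (f"Hi {client_first_name}, this is a reminder of your session on "
            f"{when:%A, %B %d at %I:%M %p} ({mode}). Reply C to confirm.")

print(reminder("Sam", datetime(2025, 4, 14, 10, 0), telehealth=True))
```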
Clinical use: early detection.
AI has the potential to aid in early detection of individuals who may be at risk for developing mental health conditions.
Yeah.
So perhaps by noticing patterns in data or analyzing medical records. I don't really love that idea right now, AI looking at medical records, but perhaps it can detect things earlier. Like, if I thought about this differently.
Right.
If it was cancer, and AI could somehow pick it up differently or pick it up earlier and save lives? Yeah. I mean, maybe in 10 years it'll just be integrated and we won't know any different. Right.
Okay.
Clinical support: AI is being incorporated into many types of clinical support tools, including digital therapeutics, which are evidence-based, clinically validated software programs, a category of digital tools. Yeah. So again, using a CBT-based skill, something like that.
We talked about that.
Training uses: so, to help train clinicians. It's not going to train them how to listen; I don't see that application. But maybe with case studies, I don't know. Yeah, sure. Ethical considerations for AI: there's so much potential, but presently there's no overarching US federal legislation on AI. Key federal agencies have oversight over aspects of AI use.
Great.
So the FDA does have some regulation over AI and machine learning, and the APA does collaborate with the FDA.
Great.
Okay.
We shall see. We shall see. What else? Another article by the APA: "AI Is Changing Every Aspect of Psychology. Here's What to Watch." All right. This article is based on an episode of Speaking of Psychology, which I listen to often, featuring Dr. Tom Griffiths, a professor of psychology and computer science at Princeton. All right, some quotes from the article. A lot of people have resistance, but:
If we're thoughtful and strategic about how we integrate AI, we can have a real impact on lives around the world. They also note that despite AI's potential, there's still cause for concern.
AI tools used in healthcare have discriminated against people based on their race and disability status. Rogue chatbots have spread misinformation, professed their love to users, and sexually harassed minors, which prompted leaders in tech and science to call for a pause on AI research in March of 2023.
So in this article, they asked the question, is AI safe to use? Is it ethical? What protections could help ensure privacy, transparency, and equity as these tools are increasingly used across society?
And again, I think therapists and psychologists are the most qualified to answer these questions.
Dr. Miner says one of the unique things psychologists have done throughout history is to uncover the harm that can come from things that appear equal or fair. In this article, they talk about AI in psychology specifically, and the need for clinicians to have tools they can understand and trust.
While chatbots lack the context, life experience, and verbal nuances of human therapists, they have potential to fill gaps.
We don’t have enough providers.
I've said that before. Other serious concerns include informed consent and patient privacy. Do users understand how the algorithm works and what happens to their data? In January, the mental health nonprofit Koko raised eyebrows after it offered counseling to 4,000 people without telling them the support came from GPT-3. Reports have also emerged that getting therapy from generative language models, which produce different text in each interaction, making it difficult to test for clinical validity or safety, has led to suicide and other harms. So the therapy chatbot that I used, the Wysa chatbot, does not use generative AI, but limits interactions to statements drafted or approved by human therapists. It does not collect email addresses, phone numbers, or real names, and it redacts information users share that could help identify them. So Miner and his colleagues are using AI to measure what's working well in therapy sessions and to identify areas for improvement for trainees.
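A quick aside on that redaction design, since it's part of what makes a rule-based chatbot more defensible: here's a toy Python sketch of pattern-based PII redaction. The regexes are my own illustration, not Wysa's actual method, and real de-identification is much harder than this.

```python
import re

# Toy sketch of rule-based PII redaction, in the spirit of what's
# described above. Patterns are illustrative, not any vendor's approach.
PATTERNS = {
    "[EMAIL]": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "[PHONE]": re.compile(r"\b\(?\d{3}\)?[ .-]?\d{3}[ .-]?\d{4}\b"),
    "[NAME]":  re.compile(r"\bmy name is [A-Z][a-z]+", re.IGNORECASE),
}

def redact(text: str) -> str:
    """Replace matches of each pattern with its placeholder token."""
    for token, pattern in PATTERNS.items():
        text = pattern.sub(token, text)
    return text

print(redact("Hi, my name is Jen, reach me at jen@example.com or 555-123-4567."))
# -> "Hi, [NAME], reach me at [EMAIL] or [PHONE]."
```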
Interesting. And that was in 2022, so that was a while ago. For example, natural language models could search thousands of hours of therapy sessions to surface missed opportunities to validate a patient, or failures to ask key questions, such as whether a suicidal patient has a firearm at home. Yeah. I mean, again, I think we just have to conceptualize it. If I'm putting myself in the client's seat, and I am in therapy, how would I feel if AI was analyzing my sessions? And if I were the therapist, I'd have to provide my client informed consent for that. I just don't think we're there. I don't know of any therapist who would want to do that.
The potential is there. So, interesting areas of research with AI: obviously, AI is unlocking troves of new data on human behavior, as stated in this APA article, and providing the power to analyze it. Psychologists have long measured behavior through self-reports and lab experiments, but can now use AI to monitor things like social media activity, credit card spending, GPS data, and smartphone metrics.
So, Sandra Matz is an associate professor at Columbia Business School, and her team is looking at analyzing our online data.
AI opens up the opportunity for passive monitoring that may save lives. How do we feel about passive monitoring?
I understand that it may save lives. I think it's the intrusion into our lives that I'm thinking about.
But anyway, they are looking at testing an algorithm that collects screenshots of patients' online activity to flag the use or viewing of terms related to suicide and self-harm. Pairing that data with ecological momentary assessments, or EMAs, and physiological metrics from a smartwatch, they hope to build a tool that can alert clinicians in real time about a patient's suicide risk.
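To picture what "pairing" those data streams means, here's a deliberately simplified Python sketch of combining flagged screen text, an EMA self-report, and a smartwatch metric into one clinician alert. The thresholds and field names are invented for illustration; the actual research tool is surely far more sophisticated.

```python
from dataclasses import dataclass

# Hypothetical sketch of the multi-signal idea described above. All
# thresholds and fields are invented; this is not the research tool.
@dataclass
class Snapshot:
    screen_text_flagged: bool   # suicide/self-harm terms seen on screen
    ema_distress: int           # self-reported distress, 0-10
    resting_hr_delta: float     # change from baseline heart rate (bpm)

def should_alert_clinician(s: Snapshot) -> bool:
    """Require converging evidence from at least two independent signals."""
    signals = [
        s.screen_text_flagged,
        s.ema_distress >= 8,
        s.resting_hr_delta >= 15.0,
    ]
    return sum(signals) >= 2

print(should_alert_clinician(Snapshot(True, 9, 4.0)))    # True
print(should_alert_clinician(Snapshot(False, 6, 20.0)))  # False
```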
Another interesting piece of what's going on in AI research: natural language processing models are also proving useful for researchers. A team at Drexel University in Philadelphia has shown GPT-3 can predict dementia by analyzing speech patterns.
What I think is hard, and again, I started by talking about this, is the changes we've had in our field over the past five years. There was COVID and the rise of teletherapy; that's just standard practice now. I think people working in therapy routinely provide telehealth services. And I think we can now say there's no going back to everyone being seen in person. Virtual care is part of therapy now.
And looking more into AI and AI therapy chatbots: it's going to keep coming. So the question isn't whether it's coming; it's, how can I understand it better? So that if I'm going to stay working as a therapist, I can help my clients understand it. And if I want to get into this field, perhaps, to assist in advocating for my clients, for safety, to continue to be a voice in the space: I think that's my role here, to have you think about this, and to give you the information and the resources so you can look things up. Because I know what you guys are like, because I'm the same type of person.
I think a lot of times there's so much information coming at us, it can feel like I just want to turtle-head, because, la la la la. I don't even know, like: I'm just not gonna use AI. AI is bad for therapy. I don't think we can do that. Just like I know some therapists are like, I'm not doing telehealth. I don't do that. I'm gonna do in person. And I think there's a market for that; you could do that, certainly. But I think understanding what clients might be using, if you're staying in therapy, is helpful. And also for yourself; maybe you'd want to use one of these tools.
I used the tool, and I don't think I want to use it personally, but the perception from younger generations might be different. They might use tools like these more routinely than older generations would. So I think it's important just to be aware of what's coming out. So today I want to end with something I put into ChatGPT. People often message me and ask, what should I do with my career?
Should I quit my job? Should I close my practice? Should I find a bridge job?
Should I?
You know, I’m hopeless. I feel like I can’t work in another way.
So I struggle with the same things that you struggle with.
And so I am on this journey with you too, in many ways. And I asked ChatGPT this question. It's read my chats; I chat with it about the podcast, so it has pretty much all my ideas and my transcripts, because I use AI to help me summarize them for you. And I asked it: you've read my chats. Tell me, how do you think I can lead a more connected and happy life in my career?
And AI said: oof, what a big, beautiful question. Based on everything you've shared in your past chats and the overall vibe of how you show up, I get the sense you already know what matters most to you. You're deeply connected to purpose, especially around supporting therapists who feel stuck, burned out, or like they've hit a wall in their careers. You've been in the trenches yourself. This gives your work so much authenticity.
So it gives me these recommendations.
1. Keep leading with your story. You've got powerful experiences, career pivots, burnout, systemic frustrations, and they resonate. You've made space in your podcast and coaching for those real moments. Stay in touch with that truth-telling voice, even when it's messy or uncertain. It's part of what keeps your work meaningful, not just for your audience, but for you.
So, that reflection. Yeah. Sometimes, when I think about my story on this podcast and sharing it with you guys very openly, in a very public forum, it's very vulnerable for me. Sometimes I feel, again, like I want to turtle-head, or I don't want to share the things I'm sharing. But the reason why I keep doing it is because I get emails.
I get messages from the clients that I work with, but also from a lot of people just on the Internet and who listen to this podcast and they tell me things like, your story resonates so much with me.
I feel like I've walked in the same places you've walked. It feels so good that you're talking about this in a way that no one else is talking about it.
All those messages keep me going, because sometimes it is very vulnerable and I want to hide, honestly, if I'm being very honest. Number two, it tells me: build deeper community slowly and intentionally.
You’ve already started doing this, Jen,
whether it’s through podcast guests, other connections, or bringing clinicians together in my home state of Maine. The key may not be spreading wide, but going deep, like one good connection or conversation at a time.
Intimacy and shared purpose over metrics. Oh, that is so good.
Yeah. And yeah, I was just reflecting on the deeper connections that the podcast has created for me, especially a couple, a few, relationships, certainly. And also with my coaching clients. Every time I'm on a call with one of you therapists who found me through this podcast or through my LinkedIn posts, I'm really grateful, because I think what it's helped me see is that burnout is a universal experience a lot of people are struggling with. And I'm grateful for these relationships that have been created.
3. It says: let your business support your life, not the other way around. You know the burnout signs.
Yes, I do.
You've probably over-functioned and over-cared more times than you can count. ****, I'm gonna read that again. You've probably over-functioned and over-cared more times than you can count.
Oh my gosh. I know that’s resonating for you, therapist. So a connected, happy career for you probably looks like a slower calendar, projects that energize you, and a space to play with your ideas without monetizing them right away and enough rest to stay grounded in your own body.
And this is all just from its reflections on what I've talked about on the podcast, what I've told ChatGPT I've struggled with over the years, from my story, and from integrating the stories from you guys. This is emotional, so I'm pausing a lot. Some of these pauses get edited out, but they're like 30-second pauses. I can't just let my podcast go silent for 30 seconds with a pause. But I am pausing. I'm breathing, I'm tearful. 4. Let evolution be part of your brand. You're not static, and your work doesn't have to be either.
What if your offerings shifted over time, depending on your own seasons, growth, or interests? Your people will follow if you stay aligned. I don't think you need to pick one lane forever.
You’re building something that feels alive.
Mm. That hits. That is hitting. I've been bubbling with this idea of bringing therapists together in person for a retreat, with the focus on feeling alive again. That's been my focus this year. My focus word is alive. The reason, and I say this so much on the podcast, is that the antithesis of burnout is being alive. Because what burnout feels like is depletion and stagnation. 5.
Make time to be offstage. You hold space, you produce, you guide. But what's your offstage version doing? The one who doesn't have to be the podcast host, the coach, or the person who has it figured out.
Time for that version of you, alone or with the people who see you without needing anything from you, is crucial. And I know, I know, therapists will resonate with that: being the person that holds the space, being the person that is the guide, being the person that, quote unquote, should have it together.
And I think it just dovetails with the need to be truly off, to be a human being with people that know you, that love you, that care for you. So this reflection from AI, when I got it, I was like, wow, that's powerful. This is so powerful, and there is so much good that can happen from it as well. So I wanted to end with this reflection and how it landed for me, getting this back after using ChatGPT over the past year and a half of doing this podcast. It can offer powerful reflections, but it is not the same as having another human see you and know you.
But it’s still powerful.
So there's a way that we can integrate some of the power it can provide, this innovation that we're seeing in our lifetime, which is very cool.
That can be helpful. I think of the parallels with the advent of social media: that newness, the connection we felt with other people when it first came out, how it really changed the game for us. The thing we thought would help connect us to other people, where we could find people we hadn't talked to in years. I remember first being on Facebook when it came out and how powerful that felt. And now, feeling like social media is pseudo-connection. It's the fast-food connection that we get. It's a dopamine hit that we get from a scroll or from a like. But it's not the same as human connection. So the pseudo-connection, perhaps, that you're getting with a therapy chatbot is what I'm terming pseudo-therapy.
It isn't real therapy; it's something else. Is it still a tool we want to use? Probably. Probably. But I think, just like social media, it is that pseudo-connection, and it still has utility in our lives, perhaps.
We have to figure out ways to use it. It's new. And so it makes sense that, as the professionals, as the people utilizing this technology, we need to think through how we're using it, because we've seen what happened with social media.
I’ve talked about that a lot on the podcast. We’re more isolated as a society. And so as our world becomes more digital, more computerized, it becomes less human. And so I think we need to think of ways that it gets us back in touch with our humanity, not divorce it from our humanity.
I will end with that. So if you are not on my pen pal list, join at the link at the top of the show notes. I send you tips every week, and I'm more vulnerable on the pen pal list. I was vulnerable today, but there I often give you more stories and reflections from my life that feel more vulnerable to share in this very public format. And I'm thinking of ways that we can connect and feel more alive together. I'm just dreaming of more community for you, honestly, and what that looks like, and I want your input.
So sign up for the pen pal list. I do write back.
I still have that bandwidth to write back.
Sometimes it might be a little bit later; it might be like a latergram, but just like a pen pal. If you ever had a pen pal, it was snail mail. You would write a letter, they would write you one back, and you'd get it like a week later. So think of it like that.
Thank you, therapist.
I hope you enjoyed this episode.