AI Therapy & Therapists Using AI to Make Therapy Better

🕑 6 minute read

There is growing interest in using artificial intelligence (AI) in therapy and mental healthcare, on both the therapist’s and the client’s side. Especially since the pandemic, we hear a lot about clients turning to AI chatbots instead of seeing a therapist to address their mental health concerns. So the question arises: is AI going to help clinicians or put them out of their jobs?

But chatbots are not the only use case for AI in mental health. There are plenty of other AI systems and therapeutic tools built on machine learning, some of which are designed to help mental health professionals provide better care. These tools might not even be visible to clients. Let’s review some of these technologies.

AI-based chatbots that can be used in between sessions

Chatbots as virtual assistants

There are bots that schedule appointments and answer inquiries about therapy. Most businesses already use something like this to manage inbound requests from potential clients. Therapists can do the same, and HIPAA-compliant bots are available.

Chatbots provide homework and analyze journals

Some companies offer AI-enhanced diaries in which clients are encouraged to record their thoughts between sessions, and simple AI analysis is performed on the entries, usually based on sentiment and keywords. Limbic, for example, offers such a tool. A rough idea of what keyword-and-sentiment analysis can look like is sketched below.
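As an illustration only, and not a description of how Limbic or any other vendor actually works, a very simple keyword-and-sentiment pass over a journal entry might look like the following sketch. The word lists, scoring rule, and sample entry are invented for the example.

```python
# Illustrative sketch of simple keyword-and-sentiment journal analysis.
# The word lists and scoring rule are invented for this example and are
# not taken from Limbic or any other product.

NEGATIVE_WORDS = {"hopeless", "worthless", "anxious", "exhausted", "alone"}
POSITIVE_WORDS = {"grateful", "calm", "proud", "hopeful", "rested"}

def analyze_entry(text: str) -> dict:
    """Return a crude mood score and the mood keywords found in an entry."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    negatives = words & NEGATIVE_WORDS
    positives = words & POSITIVE_WORDS
    total = len(negatives) + len(positives)
    # Score runs from -1 (only negative keywords) to +1 (only positive ones).
    score = (len(positives) - len(negatives)) / total if total else 0.0
    return {"score": round(score, 2), "keywords": sorted(negatives | positives)}

entry = "I felt anxious and alone most of the week, but I was proud of one long walk."
print(analyze_entry(entry))
# {'score': -0.33, 'keywords': ['alone', 'anxious', 'proud']}
```

A real product would rely on trained language models rather than hand-picked word lists, but the output, a rough mood score plus the phrases that produced it, is the kind of between-session signal a therapist might glance at before a session.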


Chatbots provide CBT interventions in the form of a structured conversation

Well-known companies such as Woebot and Wysa fall into this category. The interaction with them is very streamlined, and AI is not heavily involved: it is primarily a scripted dialog built around well-established CBT interventions. Clients sometimes complain that it is not very tailored and does not feel personal, but alongside therapy it can be a valuable addition, a modern form of homework.

Unsupervised chatbots chatting with a client

This is a more experimental field, and it is questionable whether it can even be classified as clinical or evidence-based. These bots simply discuss topics with the client based on the statistical patterns they have learned from absorbing large amounts of internet content. Replika is an example: the client creates their own avatar, a friend who is supposed to help them with their well-being. Replika positions itself as a therapy proxy, but it is unclear whether the bot provides any value beyond easing symptoms of loneliness in the short run.

Between-session communication is also vital, and documenting it effectively is just as critical. Mentalyc is an AI tool that focuses on streamlining essential clinical documentation tasks, enabling mental health professionals to spend less time on notes and more time on care. By automating documentation and organizing records, Mentalyc helps ensure accuracy and compliance without extra administrative burden.

Benefits of using AI bots in therapy

Some of the benefits of using chatbots in therapy are as follows.

Increased access

AI-based therapies can be accessed remotely, which can be especially helpful for people who live in rural areas or have mobility issues.

Reduced stigma

Some people may feel more comfortable discussing sensitive issues with an AI than with a human therapist.


Consistency

AI-based therapies can provide consistent and standardized care, which can be helpful for people who have difficulty forming a rapport with a human therapist or who have had negative experiences in therapy.

However, the prevailing view is that AI-based therapies are not a replacement for human therapy. They can be a valuable supplement to traditional therapy, but they should not be used as the sole form of treatment. It’s also important to carefully consider the ethical implications of using AI in therapy, as well as the potential risks and limitations.

AI scribes and session analytics

AI that writes progress notes for private practice

Mentalyc is one tool that does exactly this. With Mentalyc, you can easily feed session information into the AI program and have it generate a personalized progress note automatically. This saves time and energy and helps keep patient records accurate.

Plus, with Mentalyc’s built-in natural language processing capabilities, the generated progress notes are tailored to each specific patient, making them both comprehensive and easy to read. The ability to quickly create accurate therapy progress notes helps make private practices more efficient while improving overall patient care.

AI that tracks interventions and transcribes Zoom calls for community health providers

Eleos provides a baseline for a note, but its core value proposition is to label interventions during sessions.

AI giving feedback to therapists in training

Based on academic research, Lyssn offers tools aimed at training early-career clinicians. Lyssn also provides verbatim transcripts of whole sessions, but it does not generate session notes.

AI note-taking in psychotherapy involves using artificial intelligence to automatically transcribe and analyze the content of therapy sessions. This can be done using voice recognition software or other AI-based tools.
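To make the transcription step concrete, here is a minimal sketch using the open-source openai-whisper library. It is one possible way to produce a verbatim transcript, not a description of Mentalyc’s, Eleos’s, or Lyssn’s pipelines, and the file name is hypothetical.

```python
# Minimal speech-to-text sketch using the open-source "openai-whisper" package
# (pip install openai-whisper). This only illustrates the transcription step
# and is not a description of any vendor's actual pipeline.
import whisper

# Load a small pretrained speech-recognition model.
model = whisper.load_model("base")

# "session_recording.mp3" is a hypothetical, consent-obtained recording.
result = model.transcribe("session_recording.mp3")

# The verbatim transcript that downstream note generation would analyze.
print(result["text"])
```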


Benefits of using AI for note-taking in psychotherapy

Some potential benefits of using AI for note-taking in psychotherapy include:

Increased efficiency

AI can help therapists save time by automatically transcribing and organizing their notes, allowing them to focus on providing care to their clients.

Improved accuracy

Artificial Intelligence can help ensure that notes are transcribed accurately and consistently, which can be especially helpful if a therapist has difficulty writing legibly or works with clients with strong accents.

Enhanced analysis

Artificial Intelligence also can help therapists analyze the content of their sessions in ways that may not be possible for a human, such as identifying patterns and trends over time.

Improved compliance

AI tools such as Mentalyc focus on documenting medical necessity and organizing notes in a way that helps practices pass audits.

Saved time

The average time spent on a note outside of a session is 15–20 minutes. With AI, it takes up to 3 minutes to read, adjust, and sign.

However, it’s important to carefully consider the ethical implications of using Artificial Intelligence for note-taking in psychotherapy, as well as the potential risks and limitations. Therapists should also be transparent with their clients about the use of AI in therapy and ensure that their clients are comfortable with this approach.

Mentalyc provides a client consent process and takes data security seriously, anonymizing transcripts and not storing raw session data. It was reviewed by ethics experts, lawyers, psychology professors, and clinicians.

Diagnostic and monitoring tools

Advancements in technology have made it possible to approximate the presentation of certain diagnoses (mostly depression and anxiety) from different data streams such as voice, mobile phone information, or interactions with games.

Voice Biomarkers

A number of tools have emerged on the market, including Kintsugi and Sondehealth. These tools claim to detect depression or anxiety from vocal features, though it is questionable how insightful and useful that is for a therapist.

Digital Phenotyping

A growing number of mobile apps claim that they can detect depression and anxiety based on mobile phone data. For example, geolocation can be used to see whether the client spends most of their time at home or takes some walks. Mindstrong, led by Thomas Insel, pioneered this approach about a decade ago. Since then, there have been many projects, but the use cases are not very clear.
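As a toy illustration of the geolocation idea only, not Mindstrong’s or any specific app’s method, the sketch below estimates what share of a day’s location samples fall within a small radius of a “home” coordinate. All numbers are invented for the example.

```python
# Toy "digital phenotyping" sketch: what fraction of GPS samples fall near home?
# The coordinates, radius, and sample data are invented for illustration only.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

HOME = (40.7128, -74.0060)   # hypothetical home coordinates
RADIUS_KM = 0.2              # count a sample as "at home" if within 200 m

samples = [                  # hypothetical location pings over one day
    (40.7127, -74.0059),
    (40.7130, -74.0062),
    (40.7306, -73.9866),     # an outing away from home
]

at_home = sum(haversine_km(lat, lon, *HOME) <= RADIUS_KM for lat, lon in samples)
print(f"time-at-home proxy: {at_home / len(samples):.0%}")  # prints 67%
```

Real digital-phenotyping research combines many such signals (movement, sleep proxies, typing patterns) and still struggles to map them onto clinical scores, which is part of why the use cases remain unclear.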

Most of these tools are marketed for monitoring the mental health of employees, which raises ethical concerns, and most of the apps are still in the research phase. Approximating PHQ-9 and GAD-7 scores from simple phone data comes across as simplistic and potentially violates privacy. But it is possible that, with time, this technology will become more precise and that companies with high security standards will succeed.

Games monitoring changes in diagnosis

Simple games that allow monitoring of depression and other disorders have recently hit the market, though they have been the subject of research for some time. Clients can engage with them between sessions or while waiting for therapy to start, and the game can flag elevated risk and alert the clinician. Some companies, like Thymia, eventually want to be able to treat disorders using games.

Beyond AI – Other advancements in clinical technology

VR exposure therapy

Amelia VR (previously called Psious) is an example of a company that offers this, and a German company called PsyCurio provides the same for the German market. These platforms typically include many different programs for simulating situations and exposing clients to their phobias, and they are intended for use in a clinical setting.

The Bottom Line

AI is proving to be beneficial to psychotherapy, just as it is to many other professions. New AI-based systems and software are replacing slow, inefficient workflows, and so far they have proved effective. Psychotherapists use them to make their practices better and faster, and therapists are encouraged to take advantage of these resources rather than overlook them and fall behind in this era of artificial intelligence.

So, if you haven’t yet started using AI in your private practice, Mentalyc is a good place to start. The platform lets you quickly and easily set up an AI-driven documentation workflow that helps you provide better care for your patients.

Disclaimer

All examples of mental health documentation are fictional and for informational purposes only.


Your Author

Maria Szandrach is the CEO and Co-Founder of Mentalyc, a clinical intelligence platform for mental health practitioners that transforms therapy sessions into insurance-ready notes, treatment plans, and progress insights.

With an MSc from London Business School and experience founding three startups, Maria combines entrepreneurial expertise with a personal mission – shaped by her own journey with an eating disorder – to make psychotherapy more effective and accessible.

Recognized as a thought leader in AI in mental health and clinical documentation, she leads Mentalyc in redefining how technology supports therapists, from documentation automation to tools like Alliance Genie™ and Impact Tracker that improve therapeutic outcomes and strengthen therapeutic relationships.
