AI supports high-quality, human-first care

We put technological advances to work to create clinician-informed, responsible AI tools. Our human therapists and psychiatric providers remain at the core of everything we do, and smart and ethical use of our proprietary AI empowers them to create better experiences and outcomes.


Between-session engagement

Talkcast personalized "podcasts" and tailored self-guided content support members' progress

Enhanced member safety

An automated, provider-facing alert system for detecting risk of self-harm adds a layer of safety

Supporting providers and clinical quality

AI-powered notes and summaries help providers deliver more personalized care

Introducing Talkcast personalized podcasts

To help members stay engaged and keep working toward their mental health goals between sessions, Talkspace therapists can now create a personalized podcast for an audience of one. After the therapist reviews a script generated from the member’s therapy objectives, our AI engine produces a custom Talkcast recording for the member. They can listen to it as often as they like, whenever and wherever works for them.


"It helps me review everything that my therapist & I talked about. I really like that it outlines an exercise that I can do."

Talkspace member

“This is great! My Talkspace clients really appreciate my personal support between sessions, and this new tool will be helpful!”

Talkspace provider

“At Talkspace, we are committed to integrating AI in ways that enhance the therapeutic experience while upholding the highest standards of clinical care and ethical responsibility. By leveraging AI and developing tools that are clinically led and ethical by design, we can continue to advance the accessibility, delivery, and quality of digital mental health care.”

Nikole Benders-Hadi, M.D., Chief Medical Officer at Talkspace

Published, peer-reviewed research on Talkspace therapy

Talkspace partners with major research institutions to validate the quality of our treatment methods.

Just-in-time crisis response: suicide alert system for telemedicine psychotherapy settings

This study presents the development and internal validation of a natural language processing (NLP) algorithm designed to detect and classify suicidal content in telehealth psychotherapy messages. Through a multi-phase approach, the researchers created a machine learning model that achieved an area under the curve (AUC) of 82.78% for identifying suicide risk at the individual sentence level, demonstrating potential for real-time risk detection in therapeutic contexts. The findings suggest that this model could enhance clinical decision-making and improve understanding of patient-therapist communication related to suicide risk.
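
As a rough illustration of the technique, the sketch below trains a sentence-level text classifier and scores it with AUC using scikit-learn. The data, features, and model are invented for demonstration; this is not the study's pipeline or Talkspace's production system.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Toy labeled sentences (1 = flagged for risk); real training data would be
# clinician-annotated and far larger.
sentences = [
    "I don't see the point of going on anymore",
    "Work was stressful this week but I managed",
    "I keep thinking about ending it all",
    "The breathing exercise we discussed helped",
] * 50
labels = [1, 0, 1, 0] * 50

X_train, X_test, y_train, y_test = train_test_split(
    sentences, labels, test_size=0.25, random_state=0, stratify=labels
)

# TF-IDF unigrams/bigrams feed a linear model; in practice any alert would
# still be routed to a human provider for review.
vec = TfidfVectorizer(ngram_range=(1, 2))
clf = LogisticRegression(max_iter=1000).fit(vec.fit_transform(X_train), y_train)

# AUC summarizes ranking quality across all possible alert thresholds.
scores = clf.predict_proba(vec.transform(X_test))[:, 1]
print(f"AUC: {roc_auc_score(y_test, scores):.2f}")
```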

Read the full report

Bantilan, N., Malgaroli, M., Ray, B., & Hull, T. D. (2021). Just-in-time crisis response: Suicide alert system for telemedicine psychotherapy settings. Psychotherapy Research, 31(3), 289-299.

Patterns of utilization and a case illustration of an interactive text-based psychotherapy delivery system

This study analyzed patterns of patient use of a text chat-based psychotherapy system and found that the demographic characteristics of users align closely with those of people seeking traditional face-to-face therapy, although the median age of users is significantly younger. Most participants had prior therapy experiences that were unsatisfactory, indicating that they turned to this alternative because of perceived barriers such as the cost and ineffectiveness of traditional therapy. The study highlights the appeal of text-based therapy's accessibility and flexibility, while raising important questions about the implications of a virtual therapeutic connection, including the challenges of conveying emotional warmth and maintaining a strong therapeutic alliance without face-to-face interaction.

Read the full report

Nitzburg, G. C., & Farber, B. A. (2019). Patterns of utilization and a case illustration of an interactive text-based psychotherapy delivery system. Journal of Clinical Psychology, 75(2), 247-259.

Analyzing Digital Evidence From a Telemental Health Platform to Assess Complex Psychological Responses to the COVID-19 Pandemic

This study examined the impact of the COVID-19 pandemic on anxiety and depression symptoms among patients using a digital mental health platform, finding a significant increase in anxiety severity but no notable change in depression symptoms. Utilizing machine learning and natural language processing, the researchers identified a range of additional symptoms related to COVID-19, such as acute stress and insomnia, which traditional measures might overlook. The findings suggest that a broader, dimensional approach to assessing symptoms is necessary for understanding the pandemic's lasting psychological effects and for developing personalized treatment strategies.
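
The headline comparison, whether anxiety severity shifted between periods, boils down to a paired test on per-member scores. Below is a minimal sketch with simulated GAD-7 data; it is not the study's code or data.

```python
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(0)
n = 200  # hypothetical number of members with scores in both periods

# Simulated GAD-7 anxiety scores (range 0-21) before and during the pandemic.
gad7_pre = rng.integers(0, 22, size=n)
gad7_during = np.clip(gad7_pre + rng.integers(-2, 6, size=n), 0, 21)

# Paired nonparametric test for a shift in anxiety severity.
stat, p = wilcoxon(gad7_pre, gad7_during)
print(f"Wilcoxon statistic = {stat:.1f}, p = {p:.3g}")
```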

Read the full report

Hull, T. D., Levine, J., Bantilan, N., Desai, A. N., & Majumder, M. S. (2021). Analyzing digital evidence from a telemental health platform to assess complex psychological responses to the COVID-19 pandemic: Content analysis of text messages. JMIR Formative Research, 5(2), e26190.

Modifiable predictors of suicidal ideation during psychotherapy for late-life major depression: A machine learning approach

Using machine learning, this study identifies two distinct trajectories of suicidal ideation in older adults undergoing treatment for major depression, with 31% experiencing an unfavorable trajectory and 69% achieving improvement. Key predictors of an unfavorable trajectory included baseline hopelessness, neuroticism, and low general self-efficacy, with hopelessness being the strongest predictor. The findings suggest that addressing these modifiable factors early in treatment may help improve outcomes for suicidal ideation, providing valuable insights for clinical assessment and intervention strategies.
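
As a hedged sketch of this kind of analysis, the snippet below fits a classifier to simulated baseline measures and ranks their importance for an unfavorable trajectory. Only the three predictor names come from the summary above; the data and model are invented for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
n = 300
features = ["hopelessness", "neuroticism", "self_efficacy"]

# Simulated standardized baseline measures.
X = rng.normal(size=(n, 3))

# Simulated outcome in which hopelessness dominates, mirroring the finding.
logits = 1.5 * X[:, 0] + 0.6 * X[:, 1] - 0.5 * X[:, 2]
y = (logits + rng.normal(scale=0.5, size=n) > 0).astype(int)

# Fit a forest and rank the modifiable factors by learned importance.
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
for name, imp in sorted(zip(features, clf.feature_importances_),
                        key=lambda t: -t[1]):
    print(f"{name}: importance {imp:.2f}")
```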

Read the full report

Alexopoulos, G. S., Raue, P. J., Banerjee, S., Mauer, E., Marino, P., Soliman, M., ... & Areán, P. A. (2021). Modifiable predictors of suicidal ideation during psychotherapy for late-life major depression: A machine learning approach. Translational Psychiatry, 11(1), 536.

Linguistic markers of anxiety and depression in Somatic Symptom and Related Disorders: Observational study of a digital intervention

This study explores the treatment of Somatic Symptom and Related Disorders (SSRD) using asynchronous messaging therapy. A total of 173 participants received therapy for eight weeks, with symptoms assessed using the PHQ-9 and GAD-7. Unsupervised random forest clustering identified an Improvement group (41.62%) exhibiting significant symptom reduction and a Non-Response group (58.38%) showing persistent symptoms. The Improvement group expressed more positive emotions initially and showed a decline in negative emotions over time, while the Non-Response group consistently discussed negative feelings. These findings suggest that linguistic markers may help predict treatment outcomes in SSRD.
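
One plausible reading of "unsupervised random forest clustering", sketched here with simulated PHQ-9 trajectories, is to embed the trajectories with random trees and then cluster the embedding into two groups. This is an assumption about the method, not the authors' code.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomTreesEmbedding

rng = np.random.default_rng(2)
weeks = 8

# Two simulated groups of weekly PHQ-9 trajectories (scores bounded 0-27):
# one improving, one persistently elevated, roughly matching the group sizes.
improving = np.clip(18 - 2 * np.arange(weeks) + rng.normal(0, 1.5, (72, weeks)), 0, 27)
persistent = np.clip(17 + rng.normal(0, 1.5, (101, weeks)), 0, 27)
X = np.vstack([improving, persistent])

# Random-trees embedding encodes each trajectory by the leaves it lands in,
# so distances in the embedding act as a forest-based similarity.
emb = RandomTreesEmbedding(n_estimators=100, random_state=0).fit_transform(X)
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(emb)
print(np.bincount(clusters))  # sizes of the two recovered trajectory groups
```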

Read the full report

Malgaroli, M., Hull, T. D., Calderon, A., & Simon, N. M. (2024). Linguistic markers of anxiety and depression in Somatic Symptom and Related Disorders: Observational study of a digital intervention. Journal of Affective Disorders, 352, 133-137.

More than 60,000 5-star reviews

Read why people love using Talkspace.
See all reviews


Put AI to work for your people

Talkspace’s AI Innovation Group continuously builds industry-leading AI tools under the guidance of our clinicians. Interested in learning how your organization can benefit from our proprietary technology? Let’s have a conversation.

Connect with Talkspace

Any questions?

Find trustworthy answers on all things mental health at Talkspace.


What is AI-supported therapy?

At Talkspace, we think of AI-supported therapy as the use of artificial intelligence to support the work of human therapists. One example: AI notes that help therapists keep detailed and accurate information about member sessions, so they can focus only on you. Another example is Talkcast personalized podcasts, which give members specifically tailored content to listen to between sessions. At Talkspace, AI doesn’t replace human therapists; it strengthens the human-centered therapeutic relationship.

What is the difference between a human therapist and an AI therapist?

Only human therapists possess true empathy, responsiveness, and the ability to form therapeutic relationships. Human therapists listen with sensitivity and discernment, adapt their questions and responses based on subtle cues like tone and body language, provide deeply personalized insights, and seek to understand your complex thoughts and emotions. AI “therapists” lack true emotional understanding and cannot form authentic relationships or offer the clinical expertise that experienced licensed therapists do.

How does Talkspace use AI in online therapy?

Talkspace uses AI-based technology to augment the work of licensed human therapists. For example, AI-generated notes and session prep summaries help therapists provide deeply personalized care, and an AI alert system scans member messages to detect risk of self-harm and, if risk is detected, sends the therapist an urgent alert. Talkspace also uses AI to help personalize the member experience, for example through Talkcast AI-generated “podcasts” created for individual members to listen to between therapy sessions.

Is AI-supported therapy safe, secure, and confidential?

AI-supported therapy must comply with healthcare privacy regulations like HIPAA and rely on safeguards such as message encryption and secure data storage. Talkspace maintains robust security measures and meets or exceeds all applicable laws addressing patient privacy and data storage, with a focus on safety, security, privacy, and compliance with relevant standards such as NIST and HIPAA.

Is it ethical to use AI for therapy?

Because AI is a rapidly evolving technology, ethical standards around its use, including its use in therapy, are still taking shape. At Talkspace, however, we have developed a clear framework for ensuring that AI tools are ethical by design, meaning that they enhance the therapeutic process without causing harm. We are committed to integrating AI under the guidance of human clinicians while upholding the highest standards of clinical care and ethical responsibility.

Can AI replace a human therapist?

No, AI cannot replace human therapists. AI chatbots may be able to offer mental health support by suggesting exercises or sharing information about good mental health practices. However, an essential part of therapy is the “therapeutic relationship” between human therapist and client, something AI cannot replicate. Human therapists bring empathy, responsiveness, sensitivity, and discernment to therapy, and can provide deeply personalized interactions in a way that AI can’t. Human therapists remain essential for forming therapeutic relationships, providing personalized guidance, and treating complex mental health conditions.

Does AI make online therapy more effective?

There is not yet hard evidence that AI makes online therapy more effective, but AI tools can enhance online therapy by supporting human therapists and by creating highly personalized educational content for clients to use between sessions. AI is also being used in research to help identify the most effective therapeutic approaches for different conditions. AI tools that support therapists will likely improve mental health outcomes, but it is too early to say so definitively; effectiveness will depend on proper implementation of AI tools and their integration with human care.

Is Talkspace online AI-supported therapy right for me?

If you think therapy is right for you, then yes, Talkspace AI-supported therapy is probably right for you. Talkspace provides convenient, accessible support from licensed human therapists, who may use AI tools to enhance their ability to provide the best care.

Is online therapy with AI covered by insurance?

Insurance coverage for different kinds of therapy varies by provider and plan. Talkspace online therapy is covered in-network by most major insurance plans, including Medicare and TRICARE. That means that if your insurance plan covers Talkspace, we will bill your plan directly and you’ll likely only pay a copay (average $10). The AI tools Talkspace uses to support human therapists and enhance the therapeutic experience are not billed as a separate service, either to insurance companies or to individuals; they are included in the regular cost of therapy.