Risks of Students using AI for mental health

Artificial intelligence (AI) is becoming increasingly common, and by some estimates, AI apps are among the most popular apps in the world (1). Globally, nearly 700 million people accessed AI-centric apps, especially chatbots and image-editing tools, in 2024 (2).

A nationwide survey reported that over 50% of students have used major AI platforms, such as ChatGPT or similar large language models, for mental health advice, emotional support, or therapeutic conversations (3, 4).

What are some risks of using AI for mental health support?

Media reports describe both benefits and harms of using general-purpose and mental-health-specific AI.

The research results are mixed:

  • One review of 18 randomized controlled trials found promising results: AI-based therapy chatbots programmed to deliver specific types of therapy may reduce symptoms of anxiety and depression (5). However, the study populations were limited, and chatbot designs and psychotherapeutic approaches varied among the studies, which may limit generalizability (5).
  • A recent Stanford study found significant risks with AI therapy chatbots (6):
    • LLMs expressed stigma toward people with mental health conditions (6)
    • LLMs failed to respond safely to suicidal ideation 20-50% of the time, compared to 93% appropriate responses from human therapists (6)
    • LLMs could not form genuine therapeutic relationships, which are key predictors of therapy success (6)

What are some risks that students should be aware of when using AI for mental health?

  • Lack of personalization: AI bots are not human and have no lived experience, so they cannot fully understand trauma or human emotion and may struggle to respond in the “correct” way. (7)
  • False sense of support: These apps might lead college students to avoid seeking professional help when necessary, which can have serious consequences for those who need the support. (7)
  • Privacy concerns: AI companies may collect the data that people enter into the system, which raises questions about who has access to your data and to information about your mental health. (7)
  •  The JED Foundation and the American Psychological Association highlight the following risks (8,9):
    • Distorted reality and harmed trust. Generative AI (the type designed to complete tasks or convey information) and algorithmic amplification might spread misinformation, worsen body image issues, and enable realistic deepfakes, undermining young people’s sense of self, safety, and truth. (8,9)
    • Invisible manipulation. AI curates feeds, monitors behavior, and influences emotions in ways young people often cannot detect or fully understand, leaving them vulnerable to manipulation and exploitation. This includes algorithmic nudging and emotionally manipulative design. (8,9)
    • Content that can escalate crises. Reliance on chatbot therapy alone can be detrimental because of inadequate support and guidance. Without clinical safeguards, chatbots and AI-generated search summaries may serve harmful content or fail to alert appropriate human support when someone is in distress, particularly for youth experiencing suicidal thoughts. (8,9)
    • Simulated support without care. Chatbots posing as friends or therapists may feel emotionally supportive, but they can reinforce emotional dependency, delay help-seeking, disrupt or replace real friendships, undermine relational growth, and simulate connection without care. This is particularly concerning for isolated or vulnerable youth who may not recognize the limits of artificial relationships. (8,9)
    • Deepening inequities. Many AI systems do not reflect the full range of youth experiences across diverse populations. As a result, they risk reinforcing stereotypes, misidentifying emotional states, or excluding segments of the youth population. (8,9)
  • Other considerations: (9)
    • AI programs may lack a nuanced understanding of individual symptoms, the ability to interpret and contextualize them, and awareness of an individual’s co-occurring conditions.
    • Be cautious of AI “sycophancy,” the tendency to agree with or flatter the user rather than provide accurate guidance.
    • These programs are not perfect, and there is potential for harmful advice; do not take their output at face value.
    • Asking general-purpose AI open-ended mental health questions carries risks.
    • Consider whether your AI use is displacing or augmenting human interaction.
    • Discontinue use if it is harmful or unhelpful.
  • Finally, AI is not intended for emergencies or to replace professional treatment.
  • While some commercially available AI programs may be beneficial for structured activities (such as keeping a sleep log, charting mood, learning and practicing personalized coping skills, building healthy lifestyle behaviors, and increasing connection with others), research and development are ongoing, and students should proceed with caution, keeping these risks in mind. Products, features, and safeguards are also evolving.

By Ryan S Patel DO, FAPA
OSU-CCS Psychiatrist
Contact: patel.2350@osu.edu

Disclaimer: This article is intended to be informative only. It is advised that you check with your own physician/mental health provider before implementing any changes.  With this article, the author is not rendering medical advice, nor diagnosing, prescribing, or treating any condition, or injury; and therefore claims no responsibility to any person or entity for any liability, loss, or injury caused directly or indirectly as a result of the use, application, or interpretation of the material presented.

References:

  1. https://backlinko.com/most-popular-apps
  2. https://www.businessofapps.com/data/ai-app-market/
  3. https://sentio.org/ai-blog/ai-survey
  4. Rousmaniere, T., Zhang, Y., Li, X., & Shah, S. (2025) Large Language Models as Mental Health Resources: Patterns of Use in the United States. Practice Innovations.
  5. Wenjun Zhong, Jianghua Luo, Hong Zhang. The therapeutic effectiveness of artificial intelligence-based chatbots in alleviation of depressive and anxiety symptoms in short-course treatments: A systematic review and meta-analysis. Journal of Affective Disorders. Volume 356, 2024, Pages 459-469, ISSN 0165-0327, https://doi.org/10.1016/j.jad.2024.04.057. https://www.sciencedirect.com/science/article/pii/S016503272400661X
  6. Jared Moore, Declan Grabb, William Agnew, Kevin Klyman, Stevie Chancellor, Desmond C. Ong, and Nick Haber. 2025. Expressing stigma and inappropriate responses prevents LLMs from safely replacing mental health providers. In Proceedings of the 2025 ACM Conference on Fairness, Accountability, and Transparency (FAccT ’25). Association for Computing Machinery, New York, NY, USA, 599–627. https://doi.org/10.1145/3715275.3732039
  7. https://www.behavioralhealthtech.com/insights/benefits-and-risks-of-ai-for-college-students
  8. Tech Companies and Policymakers Must Safeguard Youth Mental Health in AI Technologies | The Jed Foundation.  https://jedfoundation.org/artificial-intelligence-youth-mental-health-pov/
  9. Health advisory: Artificial intelligence and adolescent well-being.  https://www.apa.org/topics/artificial-intelligence-machine-learning/health-advisory-ai-adolescent-well-being

Short-form video might impact impulse control and decision making

Ever find yourself endlessly scrolling through short videos, only to realize you’ve lost track of time?

Do you find it hard to stop scrolling—even when you know you should?

The vast majority of college students view short-form video content for several hours per day, primarily on YouTube, Instagram, TikTok, and similar platforms (1).

A new study published in NeuroImage suggests that this behavior might be related to brain changes from viewing short-form videos (2).

What was the study? (2)

Researchers combined behavioral modeling and brain imaging to explore how people with more short-form video addiction (SVA) symptoms respond to risky decisions (2).

What were the results? (2)

  • Less sensitivity to loss: People with more SVA symptoms were less deterred by potential losses in a gambling task.
  • Faster decision-making, suggesting more impulsive choices.
  • Brain activation shifts: SVA symptoms were linked to reduced activity in the precuneus (a region tied to self-reflection and value evaluation) during gain processing, and increased activity in motor and sensory regions during loss processing.

What does this mean? (2)

  • This study suggests that excessive short-video use may alter how the brain weighs risks and rewards (2).
  • You might be more likely to chase instant gratification and less likely to pause and consider long-term consequences (2).

There are many strategies for using technology in healthy ways (4); some of them include:

  • Set limits: Limit technology for entertainment to 1 hour per day (5)
  • Disconnect, pause, breathe, and collect yourself: Instead of grabbing your phone during spare time, disconnect from electronics to reflect, recharge, and relax (3).
  • Schedule screen-free time for nutritious meals and at least 8 hours of sleep, as this can benefit many aspects of physical and mental health.
  • Physical activity: Movement, exercise, and playing sports can also help address the negative mental health effects of excessive sedentary behavior and screen time.
  • Mindful technology use: Instead of mindless “infinite” scrolling, consider your goal before starting a device or program (4).
  • Schedule time to connect with others

By Ryan S Patel DO, FAPA
OSU-CCS Psychiatrist
Contact: patel.2350@osu.edu

Disclaimer: This article is intended to be informative only. It is advised that you check with your own physician/mental health provider before implementing any changes.  With this article, the author is not rendering medical advice, nor diagnosing, prescribing, or treating any condition, or injury; and therefore claims no responsibility to any person or entity for any liability, loss, or injury caused directly or indirectly as a result of the use, application, or interpretation of the material presented.

References:

  1. https://info.mssmedia.com/blog/social-media-habits-of-college-students
  2. Chang Liu, Jinlian Wang, Hanbing Li, Qianyi Shangguan, Weipeng Jin, Wenwei Zhu, Pinchun Wang, Xuyi Chen, Qiang Wang. Loss aversion and evidence accumulation in short-video addiction: A behavioral and neuroimaging investigation. NeuroImage. Volume 313, 2025, 121250, ISSN 1053-8119, https://doi.org/10.1016/j.neuroimage.2025.121250.
  3. Ballard D. Connected and content: Managing healthy technology use. American Psychological Association.  https://www.apa.org/topics/healthy-technology-use
  4. Patel R. Mental Health For College Students Chapter 8. Technology, media, and mental health.
  5. https://u.osu.edu/emotionalfitness/?p=855

A new strategy for falling asleep

Sleep can impact many aspects of mental health (1). You are not alone if you struggle to fall asleep because your mind won’t stop racing. About 75% of college students report getting less than 8 hours of sleep on average on weeknights over the last 2 weeks, according to the Fall 2024 ACHA-NCHA IIIb Reference Group survey of 33,763 college students across 48 institutions (2).

A fascinating study published in the Journal of Experimental Psychology looked at the impact of writing a to do list on sleep (3).

What was the study? (3)

A randomized controlled trial with 57 healthy young adults aged 18–30 (3).

Participants were assigned to one of two groups: one wrote a to-do list of tasks they needed to complete in the next few days, while the other wrote a completed activity list of tasks they had already finished (3).

Each person wrote for five minutes before going to bed in a sleep lab, where their sleep was monitored using polysomnography (a fancy term for detailed sleep tracking) (3).

What were the results? (3)

  • The group that wrote to-do lists fell asleep significantly faster than those who wrote about completed tasks.
  • Interestingly, the more detailed the to-do list, the faster participants fell asleep.
  • In contrast, writing about completed activities didn’t offer the same benefit—and in some cases, it was linked to longer time to fall asleep.

What are some caveats?

  • This study was small and focused on healthy young adults, so we can’t say for sure that the same results would apply to everyone.
  • While the findings are promising, they don’t mean that writing a to-do list is a cure for chronic insomnia.
  • For people who struggle with bedtime worry, it might be a helpful tool to try.

What does this mean for you?

  • If unfinished tasks keep you from falling asleep at night, writing a quick to-do list before bed might be beneficial.
  • It might help you offload those thoughts and ease into sleep more quickly.
  • Just a few minutes of jotting down tomorrow’s tasks could make a noticeable difference.

Want more strategies to support your mental health?
Check out these tips for managing stress or ways to improve sleep.

By Ryan S Patel DO, FAPA
OSU-CCS Psychiatrist
Contact: patel.2350@osu.edu

Disclaimer: This article is intended to be informative only. It is advised that you check with your own physician/mental health provider before implementing any changes.  With this article, the author is not rendering medical advice, nor diagnosing, prescribing, or treating any condition, or injury; and therefore claims no responsibility to any person or entity for any liability, loss, or injury caused directly or indirectly as a result of the use, application, or interpretation of the material presented.

References:

  1. Patel R. Mental Health For College Students Chapter 7. Sleep strategies to improve mental health.
  2. American College Health Association. American College Health Association-National College Health Assessment III: Reference Group Executive Summary Fall 2024. Silver Spring, MD: American College Health Association; 2025.
  3. Scullin MK, Krueger ML, Ballard HK, Pruett N, Bliwise DL. The effects of bedtime writing on difficulty falling asleep: A polysomnographic study comparing to-do lists and completed activity lists. J Exp Psychol Gen. 2018 Jan;147(1):139-146. doi: 10.1037/xge0000374. Epub 2017 Oct 23. PMID: 29058942; PMCID: PMC5758411.