
Growing use of AI chatbots has prompted parents to seek regulations to safeguard children’s mental health.
In a recent Senate hearing, two parents testified against AI companies, saying their teenage children developed prolonged relationships with chatbots that led to their suicides. An analysis by digital safety company Aura found that children are having more in-depth conversations with AI companion apps, averaging 163 words per message compared with just 12 words in a typical message to a friend.
Florida State University’s Secil Caskurlu is an assistant professor of instructional systems and learning technologies. Her research focuses on the design, development and evaluation of technology-rich learning experiences that enhance student outcomes, including learning and engagement.
A few guidelines can create a safer experience with chatbots, she said.
“One of the most critical practices is remembering that you are interacting with artificial intelligence, not a human mind,” Caskurlu said. “It should be treated as a thinking partner, not a replacement for human intelligence. Chatbot responses can be useful for brainstorming, summarizing or drafting ideas, but they should never be taken as the final answer. This is because its decision-making is shaped by the data it was trained on and lacks moral and ethical reasoning.”
Martin Swanbrow Becker is an associate professor of psychological and counseling services in FSU’s educational psychology and learning systems department. His current research examines the personal and contextual factors that influence how adolescents and young adults move along a continuum of distress and suicidal experience, with a focus on stress, coping, resilience and help-seeking.
Swanbrow Becker believes community support is crucial when it comes to youth mental health, including suicide prevention.
“Youth may be increasingly turning to AI for help with their problems,” Swanbrow Becker said. “While AI can sometimes provide useful responses, it can also increase distress, particularly by building on and amplifying the concerns the person has. This highlights the importance of what we can do as a community. By encouraging each other to stay connected to our communities, saying something when we notice someone struggling, and supporting each other with access to mental health resources, we can help each other thrive.”
Media interested in discussing the ethical practices of AI may contact Secil Caskurlu at scaskurlu@fsu.edu.
Media looking for perspective and analysis on suicide prevention can reach out to Martin Swanbrow Becker at mswanbrowbecker@fsu.edu.
Secil Caskurlu, assistant professor of instructional systems and learning technologies, Anne Spencer Daves College of Education, Health, and Human Sciences
1. In your opinion, how can people make AI safer and better regulated?
AI can become safer and better regulated through a multidimensional approach that requires collaboration among developers, policymakers and an informed public. First, we need ethical guidelines and frameworks focused on fairness (AI tools treating all users fairly and mitigating algorithmic bias), transparency (an explainable AI decision-making process, so we can understand how and why a system arrives at an outcome) and accountability (clear responsibility for AI’s impact).
Just as important is empowering users through critical AI literacy. For instance, my recent research with K-12 teachers revealed that when teachers understand how AI systems work — that they are trained on data and have inherent biases and limitations — they become more cautious and responsible users.
2. What are some best practices for using chatbots like ChatGPT in a safe and ethical manner?
We must remember that every conversation with a chatbot can potentially be used as training data. Data privacy and surveillance have been major concerns, so never share sensitive or personal information. Furthermore, since chatbots are known for generating incorrect information or ‘hallucinating’ references, it is essential to fact-check outputs against reliable sources before relying on them, especially for critical information or advice. Always be the final editor, and when appropriate, discuss outputs with peers or mentors.
Martin Swanbrow Becker, associate professor of psychological and counseling services, Anne Spencer Daves College of Education, Health, and Human Sciences
1. How can we help people feel more comfortable asking for help when they’re struggling?
When people experience psychological distress and even thoughts of suicide, they often feel isolated, believing that others cannot understand what they’re going through and that they are alone in these experiences. Yet our research shows that over half of college students have thought about suicide at some point in their lives. We can help by recognizing that most of us experience mental health distress, and even thoughts of suicide, at some point in our lives, and by approaching conversations with compassion, care and empathy when we are concerned about someone. We should also reach out to those we are worried about to open a conversation and offer support. Groups can request training on how to help with a range of mental health topics, including suicide prevention, through the university counseling services.
2. What role does the community play in preventing suicide, and how can we all contribute to creating a safer environment for those at risk?
Enhancing psychological wellness and preventing suicide is a community effort. People often think about suicide when they feel they do not belong or believe they are a burden on others. And when someone feels disconnected from others, it can be harder for those around them to recognize their struggles or step in with support. We are all responsible for helping others in our community. While we will not be each other’s therapists, we can recognize when others are struggling, reach out to them and, when appropriate, help them access professional care, such as FSU’s Counseling & Psychological Services. We can also encourage people to seek support sooner, so they can address challenges before they become overwhelming.