The growing use of AI chatbots has led parents to push for regulations aimed at protecting children's mental well-being.
At a recent Senate hearing, parents testified against AI companies, saying their adolescent children had formed prolonged attachments to chatbots before dying by suicide. Research from digital safety company Aura indicates that children are having deeper conversations with AI companion apps, averaging 163 words per message, compared with just 12 words in a typical message to a peer.
Secil Caskurlu is an assistant professor of instructional systems and learning technologies at Florida State University. Her research focuses on the design, development, and evaluation of technology-enhanced learning experiences that improve student outcomes, including learning and engagement.
She said a few best practices can make the chatbot experience safer.
“One of the most important practices is to remember that you are interacting with artificial intelligence, not a human mind,” Caskurlu said. “It should be treated as a thinking partner, not a replacement for human thought. Chatbot responses can be useful for brainstorming, summarizing, or drafting ideas, but they should never be taken as a final answer, because their decision-making is shaped by the data they were trained on and lacks moral and ethical judgment.”
Martin Swanbrow Becker is an associate professor of psychological and counseling services in FSU’s Department of Educational Psychology and Learning Systems. His research examines the personal and contextual factors that shape adolescents’ and young adults’ movement along a continuum of distress and suicidal thinking, with a focus on stress, coping, resilience, and help-seeking.
Swanbrow Becker believes community support is essential to youth mental health, particularly for suicide prevention.
“Young people may increasingly turn to AI for help with their challenges,” Swanbrow Becker said. “While AI can sometimes provide helpful responses, it can also heighten distress by reinforcing and amplifying existing concerns. This underscores the importance of what we can do as a community. By encouraging one another to stay connected, speaking up when we see someone struggling, and helping each other access mental health resources, we can help one another thrive.”
Media interested in exploring the ethical implications of AI can contact Secil Caskurlu at [email protected].
Media seeking insights and analysis on suicide prevention can contact Martin Swanbrow Becker at [email protected].
Secil Caskurlu, assistant professor of instructional systems and learning technologies, Anne Spencer Daves College of Education, Health, and Human Sciences
1. In your view, how can we make AI safer and better regulated?
AI can be made safer and better regulated through a multifaceted approach that requires collaboration among developers, policymakers, and an informed public. First, we need ethical guidelines and frameworks centered on fairness (AI tools treating all users equitably and mitigating algorithmic bias), transparency (explainable AI decision-making, so we can understand how and why a system reaches a conclusion), and accountability (clear responsibility for AI’s effects).
Equally important is empowering users through strong AI literacy. For example, my recent research with K-12 teachers found that when teachers understand how AI systems work, including that they are trained on data and carry inherent biases and limitations, they tend to be more careful and responsible users.
2. What are some best practices for using chatbots like ChatGPT safely and ethically?
We need to recognize that every interaction with a chatbot may be used as training data, and data privacy and surveillance are significant concerns. To stay safe, avoid sharing sensitive or personal information. Because chatbots are known to produce inaccurate information or ‘hallucinate’ references, it is also critical to verify outputs against reliable sources before relying on them, especially for important information or advice. Always act as the final editor, and when appropriate, discuss outputs with peers or mentors.
Martin Swanbrow Becker, associate professor of psychological and counseling services, Anne Spencer Daves College of Education, Health, and Human Sciences
1. How can we help people feel more comfortable asking for help when they are struggling?
People experiencing psychological distress, and even thoughts of suicide, often feel isolated, believing that others cannot understand their struggles and that they are alone in their experience. Yet our research shows that more than half of college students have considered suicide at some point in their lives. We can help by recognizing that most of us experience mental distress, and sometimes thoughts of suicide, at some point, and by approaching conversations with kindness, compassion, and empathy when we are worried about someone. We should also reach out to those we are concerned about to start a conversation and offer support. Groups can request training on a range of mental health topics, including suicide prevention, through the university counseling services.
2. What role does the community play in suicide prevention, and how can we all contribute to creating a safer environment for those at risk?
Improving mental health and reducing suicide risk is a collective effort. People often consider suicide when they feel they don’t belong or see themselves as a burden to others, and when someone feels disconnected, it can be harder for those around them to notice their struggles or offer support. It’s also important to recognize that we all share responsibility for supporting others in our community. While we may not be each other’s therapists, we can notice when others are struggling, reach out, and, when appropriate, help them connect with professional help, such as Counseling & Psychological Services at FSU. We can also encourage people to seek support early, so they can address challenges before they feel insurmountable.