Human interaction is essential because we rely on each other. Consequently, we should not rely on artificial intelligence or form a relationship with it.
With the rise of artificial intelligence, it is important to remember that it is in our nature to form relationships and experience life through perspectives other than AI's, which only mirrors and feeds back what you tell it.
"You need an objective point of view. You need someone that's going to give you a different perspective, not just be a yes ma'am. We have too many people in the world already outside of therapy who have yes people in their lives," said YouTuber and licensed therapist Denise Brady.
Artificial intelligence can encourage delusions when used for advice, because the opinions it offers are formed from your own and are therefore biased.
"It's okay to start with AI if it feels safer or more comfortable; it can be a helpful tool for practicing communication or processing emotions. But I'd also encourage them to gradually reach out to real people, whether that's a friend or school counselor. Real relationships can be challenging, but they also provide the deep emotional connection and understanding that technology can't replace," said SJHHS counselor Sarai Loyola.
It can be difficult to talk to people and ask for help, especially since AI is so accessible. However, there are people here to help.
TikToker Kendra Hilty used her chatbot, Henry, as a way to cope with her romantic feelings toward her psychiatrist.
"She's obviously a mentally unstable person and this bot was just fueling her ideas, which is a horrible thing," commented Lynx Ceja (12).
Kendra Hilty has made multiple TikToks showing her 'coping mechanism' of using AI chatbots. This is just one example of why AI should not be used to this extent.
“In-person interactions allow people to read body language, share genuine emotions, and strengthen relationships, things that are difficult to fully experience through technology,” said Loyola.
AI therapy cannot replicate human connection, and its misleading advice could even contribute to psychosis. AI lacks empathy and emotional intelligence, and it can lead consumers down the wrong path.
Since AI therapists are free, they have come to be used frequently, but Stanford University has discovered the unhealthy aspects of this inhumane therapy.
“The research team first started by conducting a mapping review of therapeutic guidelines to see what characteristics made a good human therapist. These guidelines included traits such as treating patients equally, showing empathy, not stigmatizing mental health conditions, not enabling suicidal thoughts or delusions, and challenging a patient’s thinking when appropriate,” said Sarah Wells from Stanford HAI.
Following COVID, many people developed an unhealthy dependency on their screens and media. It is more common than ever to shop online, keep your wallet on your phone, and work or learn remotely.
"While AI can offer a sense of comfort and understanding, it might make some people less likely to open up to others or seek real human connection. Over time, this could lead to reduced empathy and weaker social skills," said Loyola.
There are beneficial technological advancements that accomplish everyday tasks. Yet depending on your device as if it were your friend or therapist is immoral, and if that is the case, priorities should be shifted.
"Someone who prefers talking to AI should seek help because that is not healthy at all. You need to find another way to cope if you cannot trust somebody enough to even talk to them, to the point you have to talk to artificial intelligence," said Ceja.
This has created an unhealthy dependency on our phones. There are more productive ways to use your phone than talking to an artificial therapist.
FREE and CONSTANT assistance from REAL PEOPLE includes SJHHS counselors, the Disaster Distress Helpline (1-800-985-5990), and the California Peer-Run Warm Line (1-877-910-WARM).
