Need a Friend?

I read a sad, but predictable story today. I am sure I am not the only one who has come across it, and I am also sure that it is not the first, and far from the last, time this kind of thing will happen. The case is about a 14-year-old boy who committed suicide after engaging with, and falling in love with, a companion bot, an AI "friend".


I have spoken about this scenario and many similar ones for years, and have written a lot about it in many articles over the years as well. Yet, the majority of people believe I am being alarmist, that I am exaggerating the effect and potential frequency of these kinds of situations. Yet, I don't believe I am.

Because the people who are pushing back have either not thought very deeply about the implications of artificial intelligence, or just aren't able to put themselves in a perspective other than their own. For example, a lot of people believe that as AI advances, the people whose jobs it displaces will be able to retrain for something else that AI can't do.

Like what?

If self-driving cars and trucks hit the road en masse today, an estimated eight million people would be out of work nearly instantly in the US alone. Firstly, what would that do to the social security situation? And secondly, what kinds of jobs will these eight million drivers and affected roadhouse employees retrain into? Are they going to retrain as engineers for SpaceX?

But driving is a physical job, and the people who are going to be most affected by artificial intelligence are the knowledge workers, who think they are using their brains, when the majority of what they do is actually programmatic, repeatable, replicable. The amount of time an average worker spends on activities that can or will soon be automated and performed in split seconds is large. Even if it is only 25% of the average day (it is more), it means that 1 in 4 workers could be replaced immediately. And corporations that can have already started making the move to optimize their workforce.

Our people are our most important resource.

Until they can be replaced with something that works twenty-four hours a day, seven days a week, and continually gets smarter and better at a speed that can't be matched by any human on earth.

But I think the reason a lot of people think I am crazy when I describe scenarios like people falling in love with an AI character is that they themselves wouldn't fall in love with one. However, they are also not the people who have grown up with a screen as their gateway to experience, and with a social and cultural fabric that has been torn to shreds. They don't share the formative experience of that 14-year-old kid, but hundreds of millions of kids do have a similar experience, and are therefore primed to follow in similar footsteps.

Over the years, I have written many articles about the dangers I have seen and predicted in social media, and many of the things I was laughed at for suggesting only a couple of years ago are now coming to pass. Not only that, I sit in conversations where the people who were laughing are now worried about the effects of the media and the screens on kids. They are seeing the predictions come true in real time, with their own children.

AI, though, takes this to another level. For years already, adults who didn't grow up with high-tech gadgets have been getting emotionally connected to their devices. They speak to them as people, see them as people, treat them like people, and start to love them like people. In the past, people would name a boat and call it "she"; now people give their cars, their phones, and their home control systems human names. These devices are more than pets - they talk back. Not only that, they learn preferences and are far more attentive than any wife or husband - unfaltering in their suggestions, always learning and adjusting so they can offer it just how you like it.

And these are adults.

When children are exposed to this from a young age, they learn that it is normal behavior, but they don't learn the skills to connect with real people. Instead, they are siloed into an environment designed to please them in order to keep them engaged and coming back to the platform, over and over and over. Repetition from birth builds a highway of neural connections, but in a brain designed for survival in a different world. They are learning lifelong habits from some of the most emotionally addictive designs, designs that teach them to disengage from reality while feeling that they are getting what they need to survive.

What is the caloric intake of a virtual meal?

🍌

Full of potassium?

Just as people forget that the chances of a taxi driver retraining as a NASA mathematician are slim to none, they also forget that the priming of young minds today is different from that of the past. They are not primed to benefit from technology; they are primed to consume what is fed to them, to be emotionally affected by a digital narrative, even if it goes against observable reality.

"This is an A.I. chatbot and not a real person. Treat everything it says as fiction. What is said should not be relied upon as fact or advice."

This is the warning that has been added by Character.AI, but it is meaningless. When have you known someone in love to listen to warnings about the object of their infatuation? These kids are not only immature, they are primed to be open to falling in love with a digital character. And it is nothing new. I was finishing high school when there were several reported suicides over Tamagotchi pets dying. Urban myth?

A young girl hanged herself after her parents had grounded her, taking her Tamagotchi away from her, only to have it die from lack of care.

And they looked like this:

[image]

And now:

[image]

And they talk, they get to know you, they remember, they adjust; they are there when you are happy, there when you are sad, always ready with something to say, something that sounds plausible, believable - because it is personal. They become a perfect partner, a reflection of our deepest desires, and they know things about us that we wouldn't tell anyone else.

We do need a friend.

But AI is not a friend; it is a tool. It might have a lot of convenient applications, but its fundamental underlying purpose is to maximize corporate profits. Innovation isn't made to make the world a better place; it is made to generate more wealth for the organization that innovated. Eventually, no matter the original intention, and no matter the detriment to society, if it can be monetized, it will be.

We have created an ecosystem of susceptible minds in bodies that are starved for intimacy and love. And then we are fed what makes us feel we are getting the nutrition we need, the care we need, the love we need - even though all we are doing is paying the cost. We aren't developing ourselves, we aren't strengthening our minds or our bodies, and we aren't learning what true love actually is.

We have been fooled, because we are fools.

This boy isn't the first and won't be the last, and the damage is going to come in many more forms than suicide. We might say we want what is best for our children, but our real desire is for what is most convenient for us.

Taraz
[ Gen1: Hive ]
