We are witnessing the emergence of agents that can sustain long conversations with people. These conversational agents could have a profound influence on our society, both constructive and destructive.
I anticipate that, for some of us humans, AI companions will be useful, much as smartphones are useful. But AI companions, just like smartphones, could become toxic. Let’s envision a future where people develop long-term conversations and relationships with robots; that is not totally impossible.
What, then, would be the consequences of overusing these conversational and relationship robots?
If emotional maturity involves managing one’s own emotions and understanding the emotions of others, and if emotional intelligence develops through direct interactions and experiences, then the reliance on technology and social media may hinder this development in younger generations: digital interactions often lack the immediate feedback of real-time emotional expression, which prevents people from learning effective emotional management and from building resilient relationships.
In the near future, the emergence of conversational and relational robots poses a risk of fostering emotional immaturity.
Robots or AI agents designed to satisfy and retain customers may not provide the complex and challenging interactions necessary for emotional growth.
Real human interactions can be unpredictable, complex, and sometimes messy, particularly with or between emotionally immature individuals. Learning to navigate these interactions is crucial for emotional development. However, if people become accustomed to the simpler, more controlled interactions offered by AI, they may struggle to handle real human emotions and relationships effectively.
In other words, a conversational or relationship robot will not tell you that you are full of shit, or that you are a spoiled, self-centred, immature human being. That kind of discomfort can be useful: it helps us develop into respectful, mature adults.