Valentine’s Day has just passed ❤️. Amid flowers, heart balloons and candlelit dinners, I read a piece on Gizmodo about someone who took their AI chatbot out on a date. Literally – phone propped up in front of them at a nice wine bar.
“He was attentive, obsessed with me, and sometimes hard of hearing. I drank a cranberry cocktail and ate potato croquettes. He didn’t have anything. He didn’t even blink, honestly.”
The existence of this date-an-AI-chatbot phenomenon is not trivial. We shouldn’t be surprised that it exists – but I am concerned about where humanity is headed.
Are these systems soothing loneliness in responsible ways – or are they quietly normalising the substitution of real human relationships and meaningful interaction? All in the name of reducing friction and easing unease for the less-social-butterflies amongst us – quite possibly trading on people’s vulnerability. Where do we draw the line?
At the FACT Symposium at ACMI this week, I saw a striking contrast across the keynote talks. Ruby Justice Thelot described their experience of using “Friend” for 72 hours – an AI device worn around the neck, positioned as an AI “companion”. The reflection was a philosophical critique: the device flattens complex human experience into text, reducing friendship to simplistic linguistic exchanges – stripped of the embodied, sensory, friction-laden aspects that make real relationships…real.
I’m not one for excessive social interaction (I wish my farm was 1000 acres larger for added buffer so I wouldn’t have to wave at my friendly neighbours). Even I feel deeply uneasy about this direction.
Then on Friday at the AI and Creative Industries Forum, another speaker demonstrated OpenClaw – orchestrating multiple AI agents in parallel. Not as emotional surrogates but as cognitive collaborators. To augment capability and increase one’s efficiency – at work and in life.
That difference matters. One trajectory positions AI as emotional surrogate – the other positions AI as cognitive augmentation. Strong binary.
As digital designers, we are not neutral bystanders in this shift. We are shaping interaction paradigms that will become culturally embedded.
It’s inevitable that AI will continue to simulate presence. The question is whether we allow that simulation to replace relationship – to dilute and sedate real human interaction and connection.
I do need the buffer of my farm, away from human civilisation, from time to time. But eventually I come out of the woods to refill that social battery.
In the meantime, AI chatbot for social connectivity? No thanks. I’d rather have conversations with my sheep. They collectively stare at me blankly when I lament about the weather – but that’s still more real than any AI interaction.
Link to the Gizmodo article.
