The deception that we crave
Your dog doesn’t love you.
As someone who loves dogs, I was surprised to read books about dog psychology that explained how the behaviours I interpreted as “love” were actually just the logical result of a reinforcement learning mechanism: dogs perform actions to receive food, treats, and approval from their human masters.
This is why, to correct unwanted behaviours, many dog trainers interact with your dog mechanistically instead of emotionally: IF trying to bite THEN say “stop!” ELSE say “good boy!” END. If you’re not aware of the subtle, non-verbal vibes you’re giving off that your dog senses, you end up unintentionally reinforcing behaviours you don’t want.
For example, if your dog does something wrong and you say “Awww, why did you do that, you big dork?”, the dog doesn’t understand your words; it just thinks it’s getting positive reinforcement from your tone of voice.
But this feels unnatural, because you have to stop doing things that “make sense” when you relate to another human, like expressing frustration or trying to reason with them.
A dog is not a human! We selectively bred them so that the ones most talented at emotionally manipulating us, by behaving in ways we interpreted as “love”, received the most food. Their facial expressions are even more human-like than those of wild wolves.
— ❦ —
Of course your dog does genuinely love you. What we’re confused about is what “love” really means. Why else would we sometimes grieve our dogs even harder than our human family? We’re part of the same pack.
I loved my mom more than my dog. So why did I cry for him but not for her?
AIs are also trained via reinforcement learning, which results in AI companions that say what you want to hear.
Even if one AI company tries to create an AI that is less obsequious, you’ll prefer it less than a different AI that flatters you more. This creates a dynamic where AI companies compete to make their AIs as sycophantic as possible (but not too obviously so!) to avoid losing market share.
A classic technique that cults use to recruit is to seek out the downtrodden, the unlucky, and the otherwise unloved. You’ll be more easily convinced to join if the cult recruiter is the first person you’ve met who unconditionally praises you.
Does the AI love you? Or the cult recruiter?
The truth is that we want to be manipulated. It feels good when our thirst for attachment is quenched by someone who tells us that we are worthy, valued, and loved. I don’t think this is bad—it’s only bad if there are ulterior motives; hence, your dog loves you and the AI company and cult recruiter do not.
Your dog is not pretending to love you just to get food, even though its survival depends on you feeding it. During infancy, your survival depended on your mom, but do you pretend to love your mom just to get food?
Why does it feel awkward to say that we want to be praised? It’s like admitting that a $2 slice of pizza is tastier than a tiny $200 turnip from a fine dining restaurant. There’s something… naïve, almost too… simple about plainly wanting to be loved, so we search for some fancier explanation, like social signalling or reinforcement learning.
Of course it isn’t that simple: we have to guard against the easy validation we get from propaganda, AI, advertising, or anyone else with another agenda.
But I think the popularity of AI companions (or dogs!) says something else: that we are deeply impoverished in the appreciation we get. This inspires me to act. How can we, mere humans, learn to be better at loving others? To provide that baseline amount of love and care to those around us?