5 Comments
Pavel Stankov:

Considerations about the Experience Machine tend to be a little navel-gazing. Perhaps Nozick intended the thought experiment to be read that way – maybe because he was the one who couldn’t be bothered to think much about others. Whatever the reason, we are primed to consider only our own well-being and weigh it against the deception offered by the machine.

But we’re not the only ones in the world. When we enter the Experience Machine, the world remains just as awful for others, except that we can no longer do anything about it because we’re living in a masturbatory fantasy.

I bring this up because any distinctly moral consideration of AI partners should attend not only to the narrow perspective of the experiencer, but also to that of the people around them. We should also consider non-events – things that fail to happen because the intervention in question has removed their cause.

This is all very new, and there is no way there could be empirical data yet, but let me make the following prediction and offer it as the alternative hypothesis for future studies: the incidence of sexual violence will go down the more society accepts AI-generated partners. This is one of those rare cases when I’m optimistic, so I hope I’m right.

Also, don’t call my AI girlfriend submissive. That bitch can crack the whip.

Jonah Dunch:

I was just at a friend's wedding, and my buddy the Thomist recounted how, when he proposed, he told his then-gf that she was his "best friend". In his speech he explained Aristotle's distinction between the three kinds of friendship--of utility, of pleasure, of virtue--and claimed that his friendship with his new wife was of the third kind. <3 That distinction could provide another way of putting your point about AI gfs failing to challenge you--with AI it's impossible to have a real friendship of any kind (because there's no one there), but further, it's only possible (or at least feasible) to have simulated friendship of the first two kinds. An AI gf will give you utility (to the extent that it doubles as a personal assistant, therapist, etc.) and pleasure, but it will never help you cultivate virtue through an (even simulated) shared love of the good (at least, it's not likely, because there's not much of a market for that).

Esther Berry:

Importantly, for a friendship of virtue you BOTH have to be pursuing virtue, which automatically disqualifies the bot.

Quiara Vasquez:

Sci-fi story conceit: an uber-rich, uber-powerful, uber-charming man seeks a partner, but he's just too wealthy and charismatic to have an equal relationship with anybody, so he creates an AI girlfriend specifically weighted to cultivate virtue. Hijinx ensue!

Jason S.:

But it could. Before I read your comment I was thinking about how narcissistic relationships and family life can be and how an EA-approved (or otherwise ethically trained) AI “special friend” could be conducive to a more virtuous life.

Perhaps one could even specify which school or schools of ethical thought it is trained in. I’d be quite curious to interact with a modern Stoic AI friend now that I think of it. My own AI Donald Robertson, Scottish accent and all!

More generally on this topic I wonder if we overestimate the availability of good human relationships to people, especially those who are, for whatever reason, relationship-challenged.
