Milo and the Ethics of Xbox Natal

Milo, the face of Xbox Natal

So who’s seen Milo?

Those of you who are avid gamers like me know what (who?) I’m talking about. The Natal system was unveiled last week during E3: a motion-sensing, controllerless interface for the Xbox. It’s bleeding-edge stuff. The system performs motion-capture in real time, can recognize faces and voices, and allows for a whole new level of interaction between the user and the console.

Milo is one kind of interaction we’ve never seen before – not outside sci-fi movies, anyhow. He’s a charming eight-year-old avatar developed by the people at Lionhead under Peter Molyneux.

If you still haven’t seen it, allow me to refresh your memory. Here’s footage from Milo’s E3 premiere.

In his remarks after the film, Molyneux says “You can meet what I believe is a real character. That understands you, that understands your voices, your emotions, that’s fascinated by your life.”

The guys at Kotaku got a chance to look at him in more detail, and found that he’ll even automatically keep track of major entertainment events in the real world, so he can talk about them with you:

Molyneux also talked about how the software could in theory track your daily Xbox 360 usage to help build out a conversation with you about your gaming habits. He also plans to have Milo track bigger cultural events, like American Idol, and get regular voice updates so the child can talk to you about things currently going on in the world.

What I find interesting here is the emotional dimension of the experience. Milo is designed as a way for the user to form a genuine emotional attachment to the system. It’s an area that Molyneux and Lionhead have been exploring for quite some time in their other games.

But Natal takes it to another level – one where we might have to start considering some ethical questions.

Killing the Kid?

See, Milo learns about you the longer you keep him turned on. The idea is that you build a shared history, where you can talk about things you’ve done together and expand the relationship. The guys from Kotaku discovered something else interesting, too: Milo is reset every few days.

Lionhead’s Project Natal demo Milo may look to be eight, but he’s never lived longer than 12 days.

Speaking with Peter Molyneux this week, the developer said that the child artificial intelligence for Xbox 360 tech demo Milo and Kate is usually “scrubbed” after about 200 hours. The longest Milo has “lived” is 300 hours, he said. Something done to help test the development of their virtual child and his ability to track experiences.

Molyneux repeated that Milo isn’t meant to be a living AI, but rather a cleverly-crafted combination of nuanced facial animation and artificial emotion that creates the illusion of life.
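To make the idea of “scrubbing” concrete, here’s a minimal sketch of what a learned-state store with a 200-hour reset policy might look like. This is purely illustrative Python of my own devising – the class, its fields, and the session events are assumptions, not anything Lionhead has published; only the 200-hour figure comes from Molyneux’s comments.

```python
# Purely hypothetical sketch -- not Lionhead's actual implementation.
class CompanionState:
    SCRUB_AFTER_HOURS = 200  # the figure Molyneux quoted to Kotaku

    def __init__(self):
        self.hours_alive = 0.0   # total time the companion has been "on"
        self.memories = []       # shared experiences learned from the user

    def record_session(self, hours, events):
        """Accumulate play time and remember this session's events."""
        self.hours_alive += hours
        self.memories.extend(events)
        if self.hours_alive >= self.SCRUB_AFTER_HOURS:
            self.scrub()

    def scrub(self):
        """Erase everything the companion has learned -- the ethical rub."""
        self.hours_alive = 0.0
        self.memories.clear()


milo = CompanionState()
milo.record_session(3.5, ["skipped stones at the pond", "talked about school"])
```

What strikes me is how ordinary that scrub() call looks in code, versus how it reads once you’ve framed the relationship emotionally – which is exactly where the next section goes.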

Enter the Turing Test

Intelligence is a hard thing to define. That’s why the Turing Test is so useful. If it talks like it’s got a mind, then it probably does have a mind. The illusion of interacting with an intelligence, after all, is no different from actually interacting with one.

Maybe Milo isn’t quite there yet. But from what I’ve heard and read, it sounds to me like Milo is designed explicitly to make you feel like he’s got a mind. The whole idea of Milo is that he’s a construct you can build a real relationship with.

So where does that leave us, ethically, if we ‘scrub’ him every 200 to 300 hours?

Nowhere, you might say. He’s not an intelligence. He’s just a clever computer program that’s designed to make us bond with him. We can erase him when we want.

Emotions and Relationships

The trouble with that line of reasoning is that emotions, not intelligence, make relationships real. And that’s where we get into interesting ground, ethically.

Most of us wouldn’t consider rats particularly intelligent. Killing them is not a crime, and is usually not frowned on. Often, it’s a useful profession.

So what about this girl here? Is it OK to grab a hammer and smash her pet?

Obviously not. Why? It’s not because of the rat. The rat is no different from the millions of others out there who eat our grain, chew our electric cables, spread disease and so on.

The only difference, ethically, is that this particular rat has an emotional relationship with this particular girl. It knows and recognizes her, and she reciprocates.

That’s why hurting it would be wrong; it erodes the emotional matrix of relationships that binds society together and gives our lives meaning.

Nor is this sort of emotional relationship limited to living things. You might have a fondness for your mobile phone, or maybe a particular ring or item of clothing. These are inanimate – and in most cases, entirely replaceable – objects. Nevertheless, my destroying or damaging them would be ethically wrong. Your emotions are what make it wrong – it’s nothing intrinsic to the object.

I haven’t played with Milo, so I don’t know how intelligent he actually seems, or how well he succeeds in eliciting the sort of emotional response he’s meant to. But he’ll be improved on. There’ll be a Milo v2.0.

What do we do with the old Milo when his replacement download becomes available? Suddenly, system upgrades aren’t so straightforward.

Peter Molyneux ended his E3 presentation with the words: “This is a landmark in computer entertainment.”

I couldn’t agree more.

[This post also cross-posted on Just Another Meme Vector]

EA Microtransactions display worrying signs of Evil

We’ve posted about microtransactions before on Objective 514. That post spawned quite a bit of discussion. Most of you seemed cool with paying for stuff – snazzy hats, for instance – as long as the stuff you bought didn’t actually affect game performance.

Well, guess what – the barely-established taboo has already been broken. By EA, no less. Same guys who are bringing us the free-to-play, financed-through-microtransactions game Battlefield Heroes. Now they’re taking one more step – charging for special, unlockable weapons in the upcoming release Bad Company. See the full story at Xbox 360 Fanboy. Comment here!

Monday Morning Child Abuse

Via Joystiq: a very, very bad thing to do to your children.

Comments? Observations? Reactions? Responses?