Preview: GDC - Operator's Side

By Brandon Sheffield.

March 8, 2003 11:59 PM PST


Today I finally got the chance to play around with the voice communicator we’ve been hearing so much about over the last month or so. The games set up were SOCOM and Operator’s Side. Not being especially interested in the already-released SOCOM, I focused my attention on Operator’s Side, the Voice Action Adventure game.

There’s a tutorial here, which I unintentionally bypassed, but it just familiarizes you with the basics of the voice command system. You wear a headset, through which the game’s sounds reach you and through which you speak back to it.

The story: a terrible catastrophe has occurred aboard the space station. Monsters are everywhere, and have already consumed most of the guests. You can view the entirety of the ship through its security cameras, and can control basic locking and unlocking functions for doors, lockers, et cetera with the square button.

You come across a young woman who’s holed herself up in a containment cell for safety. As a worker there, she has a headset on. You can direct her with vocal cues, since she doesn’t know what to do or where to go, and she has no map of the ship (is she perhaps a new employee? I’m not entirely sure).

She accepts basic commands, like walk, run, turn right, and turn left. You input these commands simply by saying them. It should be noted that this version of the game was in Japanese, so once I started playing, a crowd gathered round. Very few people there could actually play the game with any success (given the language barrier), so when a white guy could control the thing, everyone in the area was suddenly full of questions.

As I familiarized myself with the system, I discovered to my amazement and enjoyment that the girl would accept commands that were not on the list. I instructed her to look for another door. She did so. There was an attaché case on a chair. I wasn’t sure what to call it, so I referred to it as a ‘kaban’, or bag. She found it anyway. You can actually speak to her in complete sentences; it isn’t necessary to issue one-word commands.

Battles are very interesting, as she takes no initiative on her own. She does have a pistol on her person (she finds it in the opening sequence, in classic survival horror form), but doesn’t know what to do with it or how to fight. You must tell her when to advance, shoot, or run away and regroup, and even which enemy to target and what part of the body to shoot! It’s really quite advanced in terms of what you can do, and the inventive language you can use.

The trouble is that she oftentimes misunderstands my commands. My pronunciation is really quite good, so the majority of the time she knew what I was on about. But here’s an example. I said ‘hoka no doa wo sagashite’ – ‘look for another door’. She said ‘rokkā, ne?’ – ‘lockers, eh?’ She then went to open the locker door. So basically, when there’s a keyword trigger in the room, like locker, it’s best to use a different word for ‘other’. When I said ‘betsu no doa wo sagashite’ she got it.
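I can only speculate about what’s going on under the hood, but here’s a toy sketch (in Python, with a made-up vocabulary – none of this reflects the game’s actual code) of how a keyword-spotting system that favors in-room trigger words could end up hearing ‘hoka’ as ‘rokkā’ while ‘betsu’ sails through:

# Hypothetical sketch only: a crude matcher where words naming objects in the
# current room get a small priority bonus over the general command vocabulary.
from difflib import SequenceMatcher

ROOM_TRIGGERS = {"rokkaa": "locker", "jihanki": "vending machine"}            # invented room objects
COMMAND_VOCAB = {"betsu": "another", "doa": "door", "sagashite": "look for"}  # invented general words

def best_match(word):
    """Return the known word closest to what was 'heard', biased toward room triggers."""
    best, best_score = None, 0.0
    for vocab, bonus in ((ROOM_TRIGGERS, 0.1), (COMMAND_VOCAB, 0.0)):
        for known in vocab:
            score = SequenceMatcher(None, word, known).ratio() + bonus
            if score > best_score:
                best, best_score = known, score
    return best, best_score

print(best_match("hoka"))   # -> ('rokkaa', ~0.7): the trigger bonus wins, and off she goes to the locker
print(best_match("betsu"))  # -> ('betsu', 1.0): unambiguous, just as in my playthrough

Obviously the real system is doing acoustic matching on Japanese phonemes rather than comparing strings, but the lesson from the show floor is the same: when a room object shares sounds with your command word, pick a synonym.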

And sometimes she just doesn’t want to do what I say, not because of any will of her own, but because the command isn’t in her A.I.’s register. This makes sense, of course, but it’s rather unfortunate at times. For instance, I couldn’t get her to turn right while she was standing still. Technically there’s no reason she needs to, but it’s a simple command I’d like to be able to use. I can turn her while she’s walking, but when I ask her to turn in place she says ‘tomatteru ja nai’ – ‘I’ve already stopped!’

Not quite what I was looking for.

But it is extremely cool that she understands a very wide vocabulary. I discovered that if she’s walking, I can make her go faster not only by telling her to run, but by telling her to hurry. This means the language you use can really match the situation. You can effectively become part of the script, in the most real way imaginable.

Another nice point is that after you’ve examined an area, icons come up displaying the name or type of each item that’s available for further investigation. This way there’s no confusion about what you should say to get the girl to look at whatever you wish. But if you can’t read the kanji, you’re pretty much screwed. However, this is not true in all cases! For instance, when I opened the attaché case, I couldn’t read the kanji for its contents. So I told the girl to look inside the attaché case - she understood, and did what I asked.

This basically takes the interactivity and complexity of the text-based adventure, one-ups it, and makes it vocal. I liked this game a hell of a lot more than I thought I would, and played it for quite an extended period of time, more than any other game on the floor.

But I dare say I would feel silly doing this in English. Somehow the game’s being in another language makes me feel less bizarre about the fact that I’m talking to a machine. Plus it gives me a real sense of empowerment to know that I can command this game in Japanese. That said, I think that this would be an incredible tool for learning the Japanese language, and I really would encourage developers to pursue that angle.

It’s unlikely to happen, but I really do see some great possibilities there.

Overall, I’m incredibly impressed by how far this technology has advanced, and by the sheer lexicon of terms it can accept. In this way it might almost be possible to develop some sort of emotional relationship with the characters. This truly may be the next great step for games.

Brandon Sheffield

Operator's Side - Intro (0:33 - 4.00 MB)
"What an old camera..." (0:32 - 4.91 MB)
Attaché Case (0:49 - 5.88 MB)
Lockers and Vending Machines (0:46 - 5.40 MB)