Working out the Dialogue System

A visitor to the Robot University installation can interact with the robots in different ways. One of the robots involves dialogue interaction: the user selects messages to send to the robot, and it responds. The interaction has the user remotely running a mission on Mars, commanding the Mars rover through the clunky and sometimes anxious robot in front of them.

I have been playtest-driven with this project, and because editing programs are not always publishing programs I’ve had to keep changing what tools I use. For the first playtest I conducted with actors, I used the free online interactive narrative publishing tool inklewriter. I had begun writing the dialogue just in Word, but switched to this online service.

[Image: the dialogue open in inklewriter]

With inklewriter, I can write the multi-threaded narrative and have it playable immediately online. For the playtest, I simply opened the webpage on my iPad and held it while the player clicked through and responded to the robot. But inklewriter isn’t really a development tool, and I felt restricted by what I could do and what I knew how to do. So after that testing I switched to mind-mapping software. I had bought this software back when I was doing my PhD, and it is dead easy to use. For some reason, it really helps to see all the branches of text and their relationships visually.

[Image: the dialogue branches in MindMapper]

So I kept developing the narrative in MindMapper. But then we got closer to the point in the production schedule when work on the user interface (UI) could begin. I had been playtesting the interaction elements with paper, pen, and iPads for as long as I could, but of course I needed to see them up on the screen. So I did some research to find a tool whose output could be exported to Unity (the game engine we’re using). I found one I had come across before that seemed like a good solution: ChatMapper.

[Image: the dialogue in ChatMapper]

I could begin using it for free and just pay when I need to export. As you can see from the picture above, it has its own logic and system of display. Each dialogue interaction has its own box, and when there are multiple choices you need to move the responses outside the box and group them across a few boxes. You can also put in images to represent the roles, so I put an early sketch of Clunkybot in to represent the “non-player-character”, and a photo of Adam (the programmer) in to represent the “player-character”. In each interaction box, I also have to set who is “speaking” (the actor) and who is “listening” (the conversant). I can also set the mood, animation and so on, but for now I’m just sticking with the basic dialogue interaction. Because the programme has its own logic for handling the interactions, I’m now being influenced in part by what I can do.
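
It helps me to think of each of those boxes as a little record: who’s speaking, who’s listening, the line itself, and the responses that hang off it. Here’s a rough sketch of that shape in Python (the language and names, like DialogueEntry, are mine for illustration, and the lines are made up; this isn’t ChatMapper’s actual format):

```python
from dataclasses import dataclass, field

# A rough sketch of the kind of record each ChatMapper box holds:
# who is speaking, who is listening, the line itself, and links to the
# player's possible responses. (Names and lines are invented for illustration.)
@dataclass
class DialogueEntry:
    actor: str                     # who is "speaking" in this box
    conversant: str                # who is "listening"
    text: str                      # the line of dialogue
    responses: list["DialogueEntry"] = field(default_factory=list)

# One robot line with two player responses branching off it.
robot_opening = DialogueEntry(
    actor="Clunkybot",
    conversant="Player",
    text="The rover is waiting for your command.",
    responses=[
        DialogueEntry("Player", "Clunkybot", "Just drive it forward already."),
        DialogueEntry("Player", "Clunkybot", "Whatever you think is best, robot."),
    ],
)
```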

But then I checked out the Unity plugins that worked with ChatMapper, and the options didn’t look that good. For some reason, I ended up on a ChatMapper page that featured a plugin that wouldn’t suit us at all. So we jumped to trying out a free dialogue plugin a developer in Germany was playing with. Because the programmes don’t talk to each other, I had to enter all the dialogue into the new software again. The editor was easy to use, so I edited straight in Unity. As you can see in the image below, it has a similar visual system.

[Image: the dialogue branches in the free Unity plugin’s editor]

Looking at these visual representations of the dialogue interaction, I noticed something interesting. In terms of interactive narrative structures, my script is fairly straightforward. It is a single-session interaction between one player-character (PC) and one non-player-character (NPC). So we don’t have lots of characters, and we’re not figuring out how a person’s choices affect different play sessions. It is a branching structure in the sense that the user has choices to select, and these lead to more choices and so on. However, there is not always a correlation between the visual structure of a branching narrative and the actual narrative. Let me explain. Here is a screenshot of some of the scripted conversations between the player and the robot. This image is the editor view within Unity, using the free dialogue plugin (which we’re not going to end up using; more on that shortly).

[Image: close-up of the scripted conversations in the Unity editor]
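
To make the shape of that branching concrete, here is a toy walk through that kind of tree, continuing the DialogueEntry sketch from earlier (purely illustrative; this is not how the Unity plugin actually runs the conversation):

```python
# A toy walk through a branching dialogue: the robot speaks, the player picks
# one of the offered responses, and that choice decides which robot line comes
# next. This is just to show the shape of the structure.
def run_conversation(entry):
    current = entry
    while current is not None:
        print(f"{current.actor}: {current.text}")
        if not current.responses:
            break  # no more choices: this thread of the conversation ends
        for i, option in enumerate(current.responses):
            print(f"  [{i}] {option.text}")
        pick = int(input("Choose a response: "))
        chosen = current.responses[pick]
        print(f"{chosen.actor}: {chosen.text}")
        # Each player response leads on to the robot's next line; here I just
        # follow the first follow-up, if there is one.
        current = chosen.responses[0] if chosen.responses else None

run_conversation(robot_opening)
```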

Now, if we take Ian Schreiber’s helpful diagrams from his archived course on game design, we see three types of narrative structure that relate to my design: branching, parallel, and threaded. Visually, my dialogue has elements of all three.

[Image: Ian Schreiber’s branching story structure diagram]

[Image: Ian Schreiber’s parallel story structure diagram]

[Image: Ian Schreiber’s threaded story structure diagram]

So, what I realised is that these visuals are often employed to show people how plot can operate in interactive narratives. Each branch, each choice, leads to different plot points, and ultimately to a different ending. But mine isn’t operating exactly like that. Sure, there are slightly different endings, but the essential difference in the threads is about the tone of the interaction rather than the plot. In one thread the player can be a rude SOB, in another a bit too accommodating perhaps. So for me, I have my high-level arcs or moments that need to happen. I then script the interaction so that the user makes choices about how they’ll enact them.
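
If I sketched that idea in code, it would look less like a branching plot and more like a fixed set of beats coloured by a tone value. The beats, tones and lines below are invented for illustration, not taken from the actual script:

```python
# The beats of the mission are fixed; what changes between threads is the tone
# the player has set, which colours how the robot plays each beat.
BEATS = ["wake_rover", "drive_to_crater", "collect_sample", "send_results"]

def robot_says(beat, tone):
    lines = {
        "rude": {
            "wake_rover": "Fine. Waking the rover. No need to shout.",
            "send_results": "Results sent. You're welcome, I suppose.",
        },
        "accommodating": {
            "wake_rover": "Of course! Waking the rover right away for you.",
            "send_results": "All sent! Thank you so much for your patience!",
        },
    }
    return lines[tone].get(beat, f"({beat} happens, played in a {tone} register)")

def play(tone):
    # Every player goes through the same arc; only the colour of it differs.
    for beat in BEATS:
        print(robot_says(beat, tone))

play("rude")
play("accommodating")
```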

I’m spending more time playing with the different ways I can write the player. I’m not trying to give the player every action they may want to do at any given moment. Once they make choices about how to behave at the beginning, their character is fairly set. I do give them little moments to change their mind, but ultimately they have sealed their own fate from the beginning. So unlike another dialogue approach where you give the player the option to respond in a number of different ways at each choice (for instance: ask boy about his mother; ask boy about the gold; hit boy), here they keep being who they’ve already become. We’ll see how it goes. But I must admit, I do find scripting the player a lot of fun. I guess it’s because they’ll react in some way to how they’ve been written, whereas my non-player-characters don’t have the complication of a pre-existing person tying into the character experience.
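
Here’s a rough sketch of that “sealing their own fate” idea: the opening choice sets a persona, and later menus mostly offer responses that fit it, with the odd escape hatch. Again, the persona names and lines are invented for illustration:

```python
# The player's opening choice fixes a persona, and later choice menus only
# offer responses that fit it, plus one small chance to change course.
def set_persona(opening_choice):
    return "rude" if opening_choice == "bark_orders" else "accommodating"

def options_for(beat, persona):
    all_options = {
        "drive_to_crater": {
            "rude": ["Hurry it up.", "Why is this taking so long?"],
            "accommodating": ["Take your time.", "You're doing great, rover."],
        },
    }
    offered = list(all_options.get(beat, {}).get(persona, []))
    offered.append("(pause and reconsider how you're treating the robot)")
    return offered

persona = set_persona("bark_orders")
print(options_for("drive_to_crater", persona))
```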

We’ll be testing tomorrow, so I’ll get to see what the dialogue feels like on the big screen. I’m guessing I may need to shorten the experience in some places. We’ll see. As for the Unity plugin we ended up using: Adam came across a great one by Pixel Crushers, the Dialogue System. It gives us the flexibility to customise the user interface with our own art fairly easily, and it is designed to work with ChatMapper. They say you can edit within Unity. You can. But they’re not putting much effort into the editor because ChatMapper works smoothly with it. So I’m tweaking the dialogue in ChatMapper, and Adam is tweaking its look in the Dialogue System in Unity. So far, so good…
