Coder Update 4 – Input

Good morning, day, evening or night, friends. My last two posts discussed the rather unique issues of setting up a camera in Unity for a project spanning an entire wall of The Cube. The last post stated that the camera was solved. That isn’t quite the case. There is still another issue to solve, but let’s take a break from cameras. This week we got input working, so we’ll talk about that.

The touch panels at The Cube use the TUIO protocol. This is “an open framework that defines a common protocol and API for tangible touch surfaces” (http://www.tuio.org/). What this means for us is that Unity’s Input API isn’t going to help us here; it doesn’t, by default, receive TUIO touches.

Continue reading
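To make that concrete, here’s a minimal sketch of the kind of listener this implies. TUIO messages arrive as OSC bundles over UDP, conventionally on port 3333, so the first step is simply opening a socket Unity knows nothing about. The class name and structure below are illustrative assumptions, not the project’s actual implementation, and the OSC decoding itself is left as a stub:

    using System.Net;
    using System.Net.Sockets;
    using System.Threading;
    using UnityEngine;

    // Sketch only: receive raw TUIO/OSC bundles on a background thread,
    // since Unity's Input API never sees them. A real implementation would
    // decode each bundle into /tuio/2Dcur cursor events.
    public class TuioListener : MonoBehaviour
    {
        const int TuioPort = 3333; // TUIO's conventional UDP port

        UdpClient client;
        Thread receiveThread;
        volatile bool running;

        void Start()
        {
            client = new UdpClient(TuioPort);
            running = true;
            receiveThread = new Thread(ReceiveLoop) { IsBackground = true };
            receiveThread.Start();
        }

        void ReceiveLoop()
        {
            var remote = new IPEndPoint(IPAddress.Any, 0);
            while (running)
            {
                byte[] packet;
                try { packet = client.Receive(ref remote); } // blocks until a bundle arrives
                catch (SocketException) { break; }            // socket closed on shutdown
                catch (System.ObjectDisposedException) { break; }

                // Decode the OSC bundle here and queue touch events for the
                // main thread to consume in Update().
                Debug.Log("TUIO bundle: " + packet.Length + " bytes");
            }
        }

        void OnDestroy()
        {
            running = false;
            client.Close();
        }
    }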

Coder Update 3 – Camera Setup [Final]

Good morning, day, evening or night. This post is quite overdue, but as with any indie dev, I’ve been pretty busy. There are more than a few projects on the table at the moment, and Robot University has been throwing some challenges at us. I know, we knew that when we started. But it’s one of those projects where you can never identify the challenges before they smack you in the face.

The plan was to get the camera system sorted out in the first week or two, then nail down touch input and tie it into the 2D Toolkit UI, and then I could sit back and wait for content to come my way. Sure, there’s more to it than that. I have a dialog tree to implement. And I haven’t used 2D Toolkit before, so all the UI, with its various transitions and functions, will be new to me and will take time to figure out and bug test. But essentially the core should be a few weeks’ work. Well, we’re nearly halfway, and I think I’ve just got the camera system sorted out. Touch input is still toying with me. So I’m going to have plenty of late nights in the months to come.

Let’s talk cameras.

Continue reading

Our first technical test!

On Wednesday 18th September, we conducted our first technical test on the screens at The Cube. With projects that have never been done before, tests not only give you an insight into what works and what doesn’t, but importantly also tell you things you didn’t know would happen. The team had been preparing for this test, getting the 3D models and code ready so they could be displayed on the screens. The process isn’t a simple click-to-display option. The screens in the space are run by different computers, so there are complex calculations that have to take into account rendering the imagery across different screens…which, it turns out, have different frame widths.
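To give a flavour of those calculations (with made-up numbers, not The Cube’s real measurements): for the content to read as continuous, whatever would fall behind a bezel has to be skipped, so each panel’s slice of the image is offset by the accumulated frame widths to its left. A rough sketch:

    // Hedged sketch: mapping a horizontal run of panels onto slices of one
    // large image, skipping the frames (bezels) between them. Every
    // measurement here is a placeholder, not a real figure from The Cube.
    using System;

    class BezelMath
    {
        const float PanelWidthMm = 510f;  // visible width of one panel (assumed)
        const float BezelWidthMm = 20f;   // frame between two panels (assumed)
        const float PixelsPerMm = 2.0f;   // virtual-image pixels per physical mm (assumed)

        // Left edge, in virtual-image pixels, of the slice panel i should show.
        static float SliceLeftPx(int panelIndex)
        {
            // Every panel to the left contributes its visible width plus one bezel.
            float physicalLeftMm = panelIndex * (PanelWidthMm + BezelWidthMm);
            return physicalLeftMm * PixelsPerMm;
        }

        static void Main()
        {
            for (int i = 0; i < 12; i++)
                Console.WriteLine($"panel {i}: slice starts at {SliceLeftPx(i)} px");
            // Without the bezel term, the image would ignore the physical gaps:
            // lines crossing a join would look kinked and moving objects would
            // appear to skip. With it, content simply hides behind the frame,
            // as if seen through a window mullion.
        }
    }

And so to be honest, I wasn’t sure if anything would work on this first test.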

[Image: Adam and Brian]

Continue reading

Domestic bot aka ButlerCat

[Image: ButlerCat thumbnail refinements]

The Robot University project is coming along, with progress on all robot designs. This is the domestic robot: the housekeeper of the group, which will challenge the idea of robots being subservient to humans.

This image shows the refinement process of the bottom left silhouette in my previous thumbnail exploration.

I really liked the idea of creating a robot that was animalistic and cute. The domestic robots are effectively slaves, and since people expect their pets to love them unconditionally, I thought designing the robot like a pet would give it some interesting depth to explore. We wouldn’t make a slave in our own image, but it’s going to be hanging out in your home when you are and aren’t there. That’s a very personal space! You need to trust it. It needs to be relatable, and maybe even vulnerable. So I thought, why not make it look like a cat? Or some other cute animal.

The result looks like super advanced Japanese-derived tech. Like the evolution of a Tamagotchi, but now it’s taking care of you, instead of the other way around.

The final row includes some accessories, after Christy and Paul suggested giving it butler cues as well, like a bowtie or collar. It was tough to add bits without spoiling the slick, minimal styling, but we’ve ended up running with option F, a very Hello Kitty-reminiscent design. Our writer, Christy, was excited about adding the stereotyped gender you’d get from a pink bow, and it certainly raises some interesting questions.

Do you think people will always assign a gender to artificial intelligence?

How would it affect our interactions?

Coder Update 2 – Camera Setup

So in my last Robot University update I promised a post describing the camera rig setup once I had it working. That was a few weeks ago. I’ve only just got it working. Almost. It has turned out to be a greater challenge than I anticipated.

Here’s a recap of the situation. The wall at The Cube that will be displaying Robot University is made up of 12 touch screen panels, set up as linked pairs, each pair controlled by a single PC called a node. The panels are in portrait orientation, side by side along the bottom of the wall. The top of the wall is covered by three projectors handled by another node. This node handles the image compositing, so thankfully I don’t need to worry about combining and overlapping three projectors; I can simply treat this as a good old-fashioned single machine with some sort of crazy monitor running a 5360 x 1114 resolution.

Continue reading
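For the touch-panel nodes the same idea applies in reverse: each machine renders only its own two-panel slice of the shared scene. Here’s a minimal sketch of that, assuming an orthographic camera and inventing a nodeIndex value configured per machine – none of these names or numbers come from the actual rig:

    using UnityEngine;

    // Sketch only: point each node's camera at its own slice of the wall.
    [RequireComponent(typeof(Camera))]
    public class WallSliceCamera : MonoBehaviour
    {
        public int nodeIndex;          // 0..5, one per panel-pair PC (hypothetical)
        public int panelsPerNode = 2;
        public int totalPanels = 12;
        public float wallWidth = 24f;  // whole wall, world units (placeholder)
        public float wallHeight = 4f;  // panel height, world units (placeholder)

        void Start()
        {
            var cam = GetComponent<Camera>();
            cam.orthographic = true;

            float sliceWidth = wallWidth * panelsPerNode / totalPanels;
            cam.orthographicSize = wallHeight / 2f;  // half the view height in world units
            cam.aspect = sliceWidth / wallHeight;    // lock the slice's aspect ratio

            // Centre the camera on this node's slice, measured from the wall's left edge.
            float sliceCentreX = -wallWidth / 2f + sliceWidth * (nodeIndex + 0.5f);
            transform.position = new Vector3(sliceCentreX, 0f, transform.position.z);
        }
    }

Because every node runs the same scene with only nodeIndex differing, the six machines render adjoining views that line up into one continuous image across the panels.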

Leading with the Face

In my previous post about the embodied playtest (where we tested the first designs with actors), I commented on how I want to vary the emotional end-point of the experience. What I found during the playtest is that the testers could feel a bit bad about themselves, or at least be shocked, at the end of their session. I want to ensure there is at least one positive end-point to a session. So last night I flicked open Katherine Isbister’s book “Better Game Characters by Design: A Psychological Approach,” and funnily enough I opened the book at a section on emotional feedback.

The section is about how the design of faces can influence the player’s emotional experience. Specifically, the section talks about mirroring and how people involuntarily mirror other people’s facial expressions, which then influences their own emotional state. What this means is that the design of a face in your game can directly influence the emotional state of the player. Indeed, what stood out for me was Isbister’s comment:

Good character designers direct player emotions by using player-characters to underscore desirable feelings (such as triumph or suspense) and to minimize undesirable ones (such as fear or frustration). [page 151]

Isbister refers to the design of Link from The Legend of Zelda: The Wind Waker to illustrate the point. As you can see in this moment in The Wind Waker, the designers guide the player’s emotions through the emotions of Link (the player-character). He is about to be shot out of a cannon, but his emotions shift from fear to determination (see at 1:53).

What happened during the embodied playtest with the actors is that a robot session or “scene” I had designed to facilitate player-empathy was more disturbing than intended. A large part of that was the delivery by the actor. She did a great job of feeling, and communicating with her face, the trauma the robot is feeling. This, coupled with the text I had written, sent the scene into a much more negatively-intense experience than I had planned.

This is why I work in directing as well as writing, so I can be a part of all the elements that shape the experience. So while I can temper the experience with my writing, I will also work with Simon (concept artist) and Paul (modeller and animator) to temper the experience via facial expression. This means it isn’t a case of the facial emotions duplicating the emotion of the dialogue (interface text), but of adding more complexity to it. And importantly, as Isbister notes, accenting the moments I want the installation visitor to be led to. The lead emotions, rather than all of them.

Demolition robot

Fleshing out the demolition bot. The interactions require it to have at least 5 weapons. These designs include one or more in the chest compartments.

Below are a ton of thumbnails fleshing out different parts, pushing ever closer to a final design.

[Images: demolition bot thumbnail sheets V1–V3, 14–18 August 2013]
Continue reading

Getting started with the concept art

I’ve been brought on to design the robots, the scene, and generally guide the look and feel. I’m pretty pumped to be involved in any opportunity to design robots, and will also be blogging my artventures on Tumblr.

Here’s the general layout we’re shooting for, with some WIP robots:

[Image: wall layout]

On the left will be a clunky, mismatched robot.

The middle is home to a partially constructed demolition robot.

The floating droid on the right is a futuristic domestic assistant.

Here are some initial thumbnails for each:

Continue reading

The embodied playtest

On Tuesday 6th and Wednesday 7th August, the whole team came together for our first group onsite meet. Jacek, Simon, and I flew from Melbourne; Adam and Paul came from outer Brisbane. Over the two days, The Cube team gave us an induction to the site – showing us around the space and facilities. The Cube curator Lubi Thomas and UX designer Sherwin Huang shared things they’ve learned from previous installations, and we went over the game plan with The Cube technical team. We also visited the QUT Robotics Department to talk about their robot research projects, and to see a Nao dance! Adam shares a video of it below.

The rest of the team was very excited about the space – they’re as keen on the possibilities as I am. We talked about the design of the project in light of seeing the space, and debated different approaches. An important part for me was also running an embodied playtest.

Continue reading