For our playtest yesterday there weren’t many changes to the appearance and animations, but we were able to test some of the responsive aspects of the installation. One of the best parts of having the installation on the screens is being able to watch people interact with it. A mom pointed out the big robot to her son, so Adam set the weapons in action to show the kid.
When the kid saw the robot hands open up to fire the laser, he yelled “nippers!”. Fun. 🙂
A visitor to the Robot University installation can interact with the robots in different ways. One of the robots offers a dialogue interaction: the user selects messages to send to the robot, and it responds. The interaction casts the user as remotely running a mission on Mars, commanding the Mars rover through the clunky and sometimes anxious robot in front of them.
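The select-a-message-and-get-a-response loop is essentially a branching dialogue tree. Here's a minimal sketch of that structure in Python; the node names and message text are purely illustrative, not the installation's actual script.

```python
# Minimal branching-dialogue sketch: each node holds what the robot
# says and which player messages lead to which next node.
from dataclasses import dataclass, field

@dataclass
class DialogueNode:
    robot_line: str                              # robot's response on arrival
    choices: dict = field(default_factory=dict)  # player message -> next node id

# Illustrative tree, not the real Robot University script.
TREE = {
    "start": DialogueNode(
        "Rover link established. Awaiting your command.",
        {"Drive forward": "drive", "Run diagnostics": "diag"},
    ),
    "drive": DialogueNode("Moving... I hope there are no rocks."),
    "diag": DialogueNode("All systems nominal. Mostly."),
}

def step(node_id, player_choice):
    """Advance the conversation: return the next node id for a choice."""
    return TREE[node_id].choices[player_choice]
```

The nice thing about this shape is that the writing tool only needs to export nodes and transitions; the runtime just walks the table.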
I have been playtest-driven with this project, and because editing tools are not always publishing tools I’ve had to keep changing what I use. For the first playtest I conducted with actors, I used the free online interactive-narrative publishing tool Inkle Writer. I had begun writing the dialogue in Word, but switched to this online service.
This morning we had a long wait (an hour and a half?) before all the elements were working. Adam and Brian kept making changes and uploading new builds to check whether the changes fixed things. For a long time, Destructo, our trusty giant robot, couldn’t be seen (see pic below). But we got it working in the end.
Robot University Update 4 – Input
Good morning, day, evening or night, friends. My last two posts discussed the rather unusual issues of setting up a camera in Unity for a project spanning an entire wall of the Cube. The last post stated that the camera was solved. That isn’t quite the case. There is still another issue to solve, but let’s take a break from cameras. This week we got input working, so we’ll talk about that.
The touch panels at The Cube use the TUIO protocol. This is “an open framework that defines a common protocol and API for tangible touch surfaces” (http://www.tuio.org/). What this means for us is that Unity’s Input API isn’t going to help us here; it doesn’t, by default, receive TUIO touches.
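TUIO messages ride on OSC, usually over UDP port 3333, so listening for touches means decoding OSC blobs yourself (or via a library). As a rough illustration of what that involves, here's a sketch in Python of decoding a single OSC message of the kind TUIO sends; real trackers send OSC *bundles*, which this sketch deliberately doesn't handle, and the exact argument layout depends on the TUIO profile.

```python
# Minimal OSC message decoder: address string, typetag string, then
# big-endian arguments. OSC strings are null-terminated and padded to
# a 4-byte boundary.
import struct

def _read_string(data, pos):
    end = data.index(b"\x00", pos)
    s = data[pos:end].decode("ascii")
    pos = (end + 4) & ~3  # skip null + padding to next 4-byte boundary
    return s, pos

def parse_osc_message(data):
    """Return (address, args) for one raw OSC message blob."""
    address, pos = _read_string(data, 0)
    typetags, pos = _read_string(data, pos)
    args = []
    for tag in typetags.lstrip(","):
        if tag == "i":                                   # int32
            args.append(struct.unpack_from(">i", data, pos)[0]); pos += 4
        elif tag == "f":                                 # float32
            args.append(struct.unpack_from(">f", data, pos)[0]); pos += 4
        elif tag == "s":                                 # padded string
            s, pos = _read_string(data, pos); args.append(s)
    return address, args
```

In practice you'd feed these decoded `/tuio/2Dcur` messages into your own touch manager and translate them into the coordinates your Unity scene expects.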
We just did our third test on the zone screens this morning. Great! More and more people are coming along to the tests, so we have lots of excited smiles and discussions. In this photo, we have key local team members Paul and Adam (and myself), as well as Senior Curator Lubi Thomas and most of The Cube technical team.
Good morning, day, evening or night. This is quite overdue, but as with any indie dev, I’ve been pretty busy. More than a few projects are on the table at the moment, and Robot University has been throwing some challenges at us. I know, we knew that when we started. But it’s one of those projects where you can never identify the challenges before they smack you in the face.
The plan was to get the camera system sorted out in the first week or two, then nail down touch input, tie it into the 2D Toolkit UI, and then sit back and wait for content to come my way. Sure, there’s more to it than that. I have a dialogue tree to implement. And I haven’t used 2D Toolkit before, so all the UI, and its various transitions and functions, will be new to me and take time to figure out and bug test, but essentially the core should be a few weeks’ work. Well, we’re nearly halfway, and I think I just got the camera system sorted out. Touch input is still toying with me. So I’m going to have plenty of late nights in the months to come.
Let’s talk cameras.
On Wednesday 18th September, we conducted our first technical test on the screens at The Cube. With projects that have never been done before, tests not only give you an insight into what works and what doesn’t, but importantly also tell you things you didn’t know would happen. The team had been preparing for this test, getting the 3D models and code ready so they could be displayed on the screens. The process isn’t a simple click-to-display option. The screens in the space are run by different computers, so there are complex calculations that have to take into account rendering the imagery across different screens…which, it turns out, have different frame widths. So, to be honest, I wasn’t sure if anything would work on this first test.
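Those differing frame widths matter because an object sliding from one screen to the next should appear to pass *behind* the bezel, not jump across it. The usual fix is to lay the panels out on one continuous virtual canvas that includes the physical gaps. A rough sketch of that bookkeeping, with made-up measurements (these are not The Cube's real dimensions):

```python
# Map each panel's image area onto one continuous physical canvas,
# accounting for the frame (bezel) width between consecutive panels.
def panel_offsets_mm(panel_width_mm, bezel_gaps_mm):
    """Left edge of each panel's image area along the wall, in mm.
    bezel_gaps_mm lists the frame width between each consecutive pair
    of panels -- which, we discovered, are not all the same."""
    offsets = [0.0]
    for gap in bezel_gaps_mm:
        offsets.append(offsets[-1] + panel_width_mm + gap)
    return offsets
```

Each screen's camera then renders the slice of the scene starting at its own offset, so imagery stays aligned even though pixels are "lost" in the gaps.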
In my last Robot University update I promised a post describing the camera rig setup once I had it working. That was a few weeks ago. I only just got it working. Almost. It has turned out to be a greater challenge than I anticipated.
Here’s a recap of the situation. The wall at The Cube that will display Robot University is made up of 12 touch-screen panels, set up as linked pairs, each pair controlled by a single PC called a node. The panels are in portrait orientation, side by side along the bottom of the wall. The top of the wall is covered by three projectors handled by another node. This node handles the image compositing, so thankfully I don’t need to worry about combining and overlapping three projectors; I can simply treat it as a good old-fashioned single machine with some sort of crazy monitor running at 5360 x 1114 resolution.
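The touch-panel side is the fiddly part: with each node driving a pair of panels, each node's camera has to render only its horizontal slice of the shared scene. A sketch of carving the bottom wall into per-node slices, assuming the 12 panels in 6 pairs described above; the panel pixel width here is illustrative, not The Cube's real number:

```python
# Split the full bottom-wall image into normalized horizontal slices,
# one slice per two-panel node.
def node_viewports(pairs, panel_px_w):
    """Return a list of (x, width) slices in 0..1 of the full bottom-wall
    image; each two-panel node renders one slice."""
    node_w = 2 * panel_px_w          # pixels covered by one node's pair
    total_w = node_w * pairs         # pixels across the whole wall
    return [(n * node_w / total_w, node_w / total_w) for n in range(pairs)]
```

In Unity terms, each slice would become a camera's normalized viewport (or an off-center projection matrix), so all six nodes agree on where the shared world sits.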
This is a repost from my personal blog site, AdamSingle.com.
This week the creative seas of our little collective met and lashed the mighty coastline that is The Cube. Probing for weaknesses, seeking pathways to new ideas.
This was the first time we had all been together in one place, and for many of us, the first time we had met each other in person. This in itself is a wonderful step forward. Being able to hold an impromptu discussion with someone, on anything at all, is incredibly important when it comes to building your impression of, and trust and confidence in, a person you are going to be working with. I’m delighted at the professionalism, creativity and motivation that Christy, Jacek, Simon and Paul showed. I am very much looking forward to this project.