On Wednesday 18th September, we conducted our first technical test on the screens at The Cube. With projects that have never been done before, tests not only give you insight into what works and what doesn’t but, importantly, also tell you things you didn’t know would happen. The team had been preparing for this test, getting the 3D models and code ready so they could be displayed on the screens. The process isn’t a simple click-to-display option. The screens in the space are run by different computers, so there are complex calculations that have to take into account rendering the imagery across different screens…which, it turns out, have different frame widths. So to be honest, I wasn’t sure if anything would work on this first test.
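To make the multi-screen problem concrete, here is a minimal sketch (in Python, not our actual Unity code, and using made-up pixel widths) of the kind of calculation involved: each machine needs to know which horizontal slice of the shared scene its screen covers, and when the screens have different widths those slices are uneven:

```python
def viewport_slices(screen_widths):
    """Given per-screen pixel widths (left to right), return for each screen
    a normalized (x_offset, width) pair describing its slice of one shared
    scene, so each machine can render only its portion."""
    total = sum(screen_widths)
    slices = []
    x = 0
    for w in screen_widths:
        slices.append((x / total, w / total))
        x += w
    return slices

# Hypothetical example: three screens of differing widths.
print(viewport_slices([1920, 1080, 1920]))
```

In a real distributed setup each render node would use its slice to offset its camera frustum, but the principle is the same: unequal frame widths mean unequal, carefully computed slices rather than a simple equal split.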
Brian and Murray from The Cube (the image above is of Brian and Adam) spent about 30 minutes figuring out how to get our files viewable on the screens. They had to work out the right resolution for our project. Lots of different projects screen in the space, but we’re the only one using Unity for a whole scene distributed across different computers. We kept ourselves busy talking about the space and ideas until it seemed the moment had arrived…
Woohoo! And there it is: the first glimpse of our placeholder robot image. As you heard in the video, the first thing that stood out for me was the effect of the bright space and the shine on the screens. The Cube space is open to the public 10am to 4pm daily, so people will always be viewing the work during the day. Most digital projects have the benefit of being experienced indoors, so you have some control over the final result. But in this space, I think we’ll need to play with the colour and lighting of the scene to contend with the brightness of the venue.
Gradually the whole scene loaded, and we could see the setting and robot positioning across the zone. We reset the positioning of the camera three times to try to emulate the angle of the layout image Simon had sent us. As you can see in the two images below, the size of the robot in relation to the rest of the space isn’t working as intended. This is because of a few issues that we’re dealing with. [Adam can talk about this.]
Paul was very excited to see his giant robot on the big screen! We all were! The “Destructo” robot is the one we’re concentrating on at the moment because it is the most complicated of all the robot models. The robot is also positioned inside a big hangar, so we’re playing with perspective to get the position right.
The robot we affectionately call “ButlerCat” looked very cute (though in this version it is a little stretched). Indeed, the best part of the test was seeing people walk by and attempt to interact with our robots. They wanted to play with our bots! That was a great feeling.
One of the items we discussed was positioning. The robots were too close to the edges of the screens, so they will be moved further in for the next test. I also wanted to see what effect the UI screens have on the scene. I had one of those moments where something that wasn’t clear to me on sketches was suddenly obvious: I had placed two UIs (user-interface screens) beside each other on the left. I don’t want that, so now I’ll move the UI for Destructo to the right. Adam also suggested using the actual screen for the UI (à la Minority Report). This would work well for the EmoBot/SpaceBot on the left, as that interaction doesn’t need to have a screen always present. Instead, the visitor can touch the robot and a screen appears beside it. But the Destructo robot needs to have a screen always present, as the words on the screen and the act of initiating them are crucial to the experience of the session.
One thing I also wanted to see with this test was the effect of the doors. The idea is to have the doors in front of Destructo closed, with Destructo peeking over the top. Only when the visitor presses the button to open the doors is the robot revealed and then activated. But once in the space it was clear that the doors weren’t having that effect. They weren’t big enough, and it wouldn’t work to have them taller. So instead, what I came up with is having the giant robot in almost darkness. When the visitor presses the button, the doors open with a hangar-awakening sequence of lights coming on along the path to the giant robot to reveal its full self. As a teaser, it would be great to have a sound that evokes thoughts of something breathing (like they did with the King Kong musical). The robot doesn’t need to breathe, but we can use sounds of hydraulics or something to emulate breathing in an industrial setting.
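As a rough illustration of how that reveal might be timed (a hypothetical Python sketch, not our installation code; the light count and interval are made up), the lights along the path can simply be given staggered switch-on times measured from the button press:

```python
def light_schedule(num_lights, interval_s, start_delay_s=0.0):
    """Return the switch-on time (in seconds after the button press) for each
    light along the path, so the reveal travels toward the giant robot."""
    return [start_delay_s + i * interval_s for i in range(num_lights)]

# Hypothetical example: five path lights, half a second apart.
print(light_schedule(5, 0.5))  # [0.0, 0.5, 1.0, 1.5, 2.0]
```

In the actual scene this kind of schedule would drive the in-engine light objects (and could also cue the breathing/hydraulics sound), with the last light revealing the robot itself.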
Another item on my list of things to think about after seeing the scale of the scene was casual interaction. All of the current Cube installations have activities dispersed across all the screens. Robot University, on the other hand, involves deep interaction with three robots. Now, we already know it will be interesting to watch (from the previous playtest), but I’m also concerned that there needs to be something else for others. The current context of the space means visitors will expect to be able to touch the screens and do something. We could just make the screens reactive in some way, but I want to integrate all the responses with the fictional setting as much as possible. There is some room at the front of the Destructo robot, and I thought perhaps there could be some little rat bots running around or something. But when we chatted about Destructo being in darkness, the idea came up of having little flying bots with little lights. They would give activity to the upper space and also give teasing glimpses of the giant robot when it is in the dark. I’m just thinking about how I can create an activity at the front that gives the visitor some agency with the flying bot lights.
We also had Jacek’s atmospheric music playing on one of the computers. It sounded great. It had an awesome bass that evoked the industrial space. The bass is positioned on the far left of the screens, so it is clearest there, but it travels across the zone quite well. It was clear, however, that the bass could annoy the students trying to study in the area. We have been told to design the installation to work without sound, and the sound will be switched off during exam times. But I think I’ll also need to get Jacek to create a version without bass, so that, at times when sound is allowed, the installation can keep using sound even if the bass is drawing complaints.
Since Jacek and Simon were in Melbourne, I sent them the above photo of Adam and Paul so they could share in the team excitement. These are just some of the learnings that came from this first technical test in the space. It really was so exciting to see our robot world up there, and even more so to see people walk up and want to interact with the robots. Soon the robots will respond…