For today’s playtest we did more tinkering to solve technical problems. Thankfully we worked out what was stopping the Kinect from working, and also found the culprit behind the dwindling frame rate. Once these were sorted, we were able to enjoy the scene at close to its proper composition.
The middle section of the doors has changed, and seems different from the rest of the doors. But the detail in the rest of the door area looks fantastic. We can’t wait to have sounds with them. The doors now open more slowly than they close, and this works really well for the session experience. Now, because we have added separate (internal) cameras for the two foreground robots, some tinkering is needed to get their size and positioning right. Today, though, they first turned up as tiny versions of themselves:
Once Adam managed to position the camera so the robots look bigger, we were able to see some of the new detail on ButlerCat. This robot is now looking very schmick with its white and black detail, pink bow, glowing blue eyes, and shiny finish.
Paul has also put in the first iteration of the floating objects around ButlerCat. He has put ButlerCat’s skeleton in, so he’ll be able to animate the robot now. I sent through a long list of animations for ButlerCat to do, but I made sure they were repeated actions as much as possible. The idea is that once a visitor taps an object, ButlerCat grabs it. When they select another object, ButlerCat grabs that object and then does something with both objects. For instance, if you tap the spray bottle and the sponge, ButlerCat sprays and wipes the glass in front of you. So there is a basic puzzle and consequence there, but there is also another layer: when certain objects are combined, they reveal the story of the robot — how she is a child robot taking care of a human child for a family.
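The two-tap interaction above can be sketched as a small lookup over object pairs. This is only an illustrative model — the object names, the combination tables, and the `ButlerCat` class are all assumptions, not the project’s actual asset list or code:

```python
# Combinations that trigger a cleaning-style action (names are illustrative)
ACTIONS = {
    frozenset(["spray_bottle", "sponge"]): "sprays and wipes the glass",
}

# Combinations that reveal part of ButlerCat's backstory (illustrative)
STORY = {
    frozenset(["toy_block", "blanket"]): "reveals she cares for a human child",
}

class ButlerCat:
    def __init__(self):
        self.held = []  # objects currently grabbed, at most two

    def tap(self, obj):
        """Visitor taps an object; ButlerCat grabs it.

        On the second tap, the pair resolves to a story beat if one
        exists, otherwise to a plain action, otherwise to a fallback.
        """
        self.held.append(obj)
        if len(self.held) < 2:
            return f"grabs {obj}"
        pair = frozenset(self.held)
        self.held = []
        if pair in STORY:          # story combinations take priority
            return STORY[pair]
        return ACTIONS.get(pair, "puts both objects back")
```

A session might then run `bot.tap("spray_bottle")` followed by `bot.tap("sponge")`, with the second tap resolving the pair. Using `frozenset` makes the combinations order-independent, so tapping sponge-then-spray-bottle works too.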
At the end of the testing session we managed to get the robot swarm operating. There are a couple of generations of flying robots in the scene: baby ones in the foreground, behaving like a swarm; and the larger adult ones in the background hangar, flying by themselves. Because we need to give people a reference for what size the adult ones actually are and what they look like, there are some adult ones in the foreground. These are the Nanny bots. They all look great, but we will need to scale down the baby bots so they don’t interfere too much with the rest of the scene.
The way they work is that they swarm together when they detect a person nearby. They track that person, but when the person moves too close, they scatter. Today they scattered at different distances for different people, which meant we all played favourites. The play emerged immediately — whether it was trying to get as close as you can, or having someone walk through and scatter your bots, or figuring out how they go from tracking you to tracking someone else.
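The track-and-scatter behaviour reads as a simple distance-based state machine. Here is a minimal sketch of that idea, assuming two radii per bot — the thresholds, function names, and units are all made up for illustration, not taken from the actual build:

```python
import math

# Illustrative thresholds (arbitrary units). In today's test the
# effective scatter distance evidently varied per bot, which is why
# everyone ended up playing favourites.
TRACK_RADIUS = 5.0    # a bot starts tracking a person inside this range
SCATTER_RADIUS = 1.5  # a bot scatters if the person gets this close

def nearest_person(bot_pos, people):
    """Return (position, distance) of the closest detected person."""
    return min(
        ((p, math.dist(bot_pos, p)) for p in people),
        key=lambda pd: pd[1],
    )

def swarm_state(bot_pos, people):
    """Decide a baby bot's behaviour from the nearest person's distance."""
    if not people:
        return "idle"
    _, d = nearest_person(bot_pos, people)
    if d < SCATTER_RADIUS:
        return "scatter"
    if d < TRACK_RADIUS:
        return "track"
    return "idle"
```

Because the decision always uses the *nearest* person, a bot naturally switches from tracking you to tracking someone who walks in closer — the handover behaviour people were probing during the test.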
During the test we also had Jacek’s environmental sound tests playing. He included environmental sound for when the doors are closed and then when they are open, for instance. Since they are all in one file, I asked Jacek to put an announcement between each one. Jacek used a digitised lady announcer and she sounded great. At first I couldn’t hear what she was saying, so we turned it up and the announcements were clear. Those announcements suddenly gave the scene an extra immersive dimension: the scene felt more alive, the robot world seemed to exist beyond what we see, and that world included ours. So I definitely want to have some “robot university” announcements playing infrequently — things like “training in progress” and even “danger, human approaching”. We’re well aware that the sound might need to be turned off if students studying in the space find it annoying. So we’ll work out a command upon initialisation to load a voice or non-voice version of the sound.
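That initialisation switch could be as simple as a startup flag that selects which sound bank to load. A minimal sketch, assuming a command-line launch — the flag name and bank names are hypothetical:

```python
import argparse

def build_parser():
    """Startup options for the scene; only the audio flag is sketched."""
    parser = argparse.ArgumentParser(description="Scene launcher (sketch)")
    parser.add_argument(
        "--voice",
        choices=["on", "off"],
        default="on",
        help="load announcements ('on') or the ambient-only mix ('off')",
    )
    return parser

def soundbank(voice):
    """Pick the sound set to load at initialisation."""
    return "ambience_with_announcements" if voice == "on" else "ambience_only"
```

Launching with `--voice off` would then load the quiet mix for study hours, with the announcement version as the default.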