Testing Kinect, object response, and the dialogue plugin

For yesterday's playtest there weren't many changes to the appearance or animations, but we were able to test some of the responsive aspects of the installation. One of the best things about having the installation up on the screens is being able to watch people interact with it. A mom pointed out the big robot to her son, so Adam set the weapons in action to show the kid.

[image: happykid]

When the kid saw the robot hands open up to fire the laser, he yelled "nippers!" Fun. 🙂

[image: nippers]

ButlerCat is all textured and now has textured objects around it. They look great. I'm glad you can't read or really see the photos here, as these need to be revealed through the animation. After some figuring out, Adam was able to get the objects to respond to touch, which means he can go ahead with the animations.

[image: BCcu]
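For anyone curious what "respond to touch" involves on the Unity side, here is a very rough sketch of the general idea. This isn't Adam's code (his version hangs off the Kinect tracking, and the tag and trigger names below are made up), but it shows the shape of it: an object waits for the tracked "hand" to reach it, then fires off its reveal animation.

using UnityEngine;

// Hypothetical sketch only: an object starts its reveal animation when the
// tracked "hand" collider reaches it. Tag and trigger names are made up.
public class TouchReveal : MonoBehaviour
{
    public Animator animator;   // Animator with a "Reveal" trigger parameter

    void OnTriggerEnter(Collider other)
    {
        // This object's collider is set as a trigger; "Hand" is a stand-in tag
        // for whatever object the Kinect tracking drives around the scene.
        // (One of the two objects also needs a Rigidbody for the trigger to fire.)
        if (other.CompareTag("Hand"))
        {
            animator.SetTrigger("Reveal");
        }
    }
}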

However, we realised that unfortunately we have to re-position the robot so it is on the far right (see below). We did have the robot positioned so the main body is on one screen and the arms and objects spill over onto the two screens on either side of it. But the way the screens are networked (one computer per two screens) means the coding (and rendering) would take too long to do in time. So we're moving the robot to the end two screens, with the animation happening to the bottom side rather than the bottom center. This also means Simon needs to put some detail on the background where the screen is now blank.

[image: BC]

Simon has been working on the center area of the doors too, but he is keen to do some more tweaking.

[image: doors]

We moved Destructo back up again, which not only makes Destructo the right size for the scene but also ensures all the arm animations happen over the top of the projector line.

[image: Destructo]

All of the Kinects are now installed, so we can develop the swarm interactions. Some of the problems we had last week were resolved; now we just need to get the tracking happening across all the nodes. The Kinects are mounted at slightly different heights, and for some reason one of them sits at a different angle even though the tilt angle is set in the code. So Adam is doing some more tweaking there.

[image: left]
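Setting the tilt from code is normally simple enough, which makes the odd one out all the more puzzling. As a rough illustration (not our installation code, and the target angle is just an example), going through the Kinect for Windows SDK directly it looks something like this:

using Microsoft.Kinect;

// Rough sketch (not our installation code): push every connected Kinect to the
// same tilt angle. ElevationAngle is in degrees and the SDK clamps it to about
// -27..+27; the target value here is just an example.
class KinectTilt
{
    static void Main()
    {
        const int targetAngle = -5;

        foreach (KinectSensor sensor in KinectSensor.KinectSensors)
        {
            if (sensor.Status != KinectStatus.Connected)
                continue;

            sensor.Start();   // the sensor must be running before it can be tilted

            if (sensor.ElevationAngle != targetAngle)
                sensor.ElevationAngle = targetAngle;
        }
    }
}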

As I mentioned in the previous post about the dialogue system, we now have a working plugin to run the dialogue interaction in the game engine we're using: Unity. We found some errors, though: a subtitle message keeps coming up, and all the single-button choices (eg: "OK") just ran through automatically. Adam contacted the dev, who told us how to circumvent the autoplay option and is getting back to us about the subtitle message. I also noticed how different the text reads on the big screen, so I will be making some changes. I always find it fascinating how different the same text can feel on different screens and in different contexts. It reads differently online and on the iPad than it does on this installation. This is another reason why testing is so important.
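For anyone curious, the behaviour we want from those single buttons is roughly the following. This is a made-up, self-contained sketch and not the plugin's actual API; it just illustrates "don't auto-advance a single-button node, wait for the click".

using System;
using System.Collections.Generic;

// Stand-in types only -- not the plugin's API, just an illustration of holding
// a single-button node until the player actually presses "OK".
class DialogueNode
{
    public string Text;
    public List<string> Options = new List<string>();
    public Action Advance;               // moves the dialogue to the next node
}

class SingleButtonGate
{
    DialogueNode current;
    bool waitingForClick;

    // Called when a node is presented on screen.
    public void OnNodeShown(DialogueNode node)
    {
        current = node;
        if (node.Options.Count == 1)
        {
            // Hold here: show the lone "OK" button and wait for OnOkPressed,
            // rather than running straight through.
            waitingForClick = true;
        }
        else if (node.Options.Count == 0 && node.Advance != null)
        {
            node.Advance();              // nothing to choose, carry on
        }
    }

    // Wired up to the button's click event in the UI.
    public void OnOkPressed()
    {
        if (!waitingForClick || current == null) return;
        waitingForClick = false;
        if (current.Advance != null)
            current.Advance();
    }
}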

Next week we have two tests, so the action is ramping up!
