This morning we had a long stretch (an hour and a half?) before all the elements were working. So Adam and Brian kept making changes and uploading new builds to check whether each change fixed the problem. For a long time, Destructo, our trusty giant robot, couldn’t be seen (see pic below). But we finally got it working.
We met today for our last playtest in October. Brian (a Cube staff member) told us that the equipment stopped working yesterday. But after working on it all day and night, it was fine today. Except for one thing: the sound has started crackling and popping, and isn’t playing at all in some areas. So the team are working on that. But as a back-up I’ve arranged for us to use another zone which has the same setup as ours. That way we can still go ahead with our audio testing schedule.
One thing I noticed with this build is that Destructo looks the right size for the environment now. We’ve been tweaking the positioning of Destructo and the environment around it for a few weeks. Today Destructo does look as if the size and position match. I realise now, though, that if we had designed a giant robot with smaller arms, we wouldn’t have needed to convey the sense of distance as much. One of the problems has been that the long arms rise up towards the visitor, so there needs to be enough space for that movement. If the robot were built differently, it could be closer to the visitor. But then it would also be harder to do a good pull-back weapon-firing sequence.
We just did our third test on the zone screens this morning. Great! More and more people are coming along to the tests, so we have lots of excited smiles and discussions. In this photo, we have local key members Paul and Adam (and myself), as well as Senior Curator Lubi Thomas and most of The Cube technical team.
On Wednesday 18th September, we conducted our first technical test on the screens at The Cube. With projects that have never been done before, tests not only give you an insight into what works and what doesn’t, but, importantly, also tell you things you didn’t know would happen. The team had been preparing for this test, getting the 3D models and code ready so they could be displayed on the screens. The process isn’t a simple click-to-display option. The screens in the space are run by different computers, so there are complex calculations that have to take into account rendering the imagery across different screens… which, it turns out, have different frame widths. So, to be honest, I wasn’t sure if anything would work on this first test.
This image shows the refinement process of the bottom left silhouette in my previous thumbnail exploration.
I really liked the idea of creating a robot that was animalistic and cute. The domestic robots are effectively slaves, and since people expect their pets to love them unconditionally, I thought designing the robot like a pet would give it some interesting depth to explore. We wouldn’t make a slave in our own image, but it’s going to be hanging out in your home whether you’re there or not. That’s a very personal space! You need to trust it. It needs to be relatable, and maybe even vulnerable. So I thought: why not make it look like a cat? Or some other cute animal.
The result looks like super advanced Japanese-derived tech. Like the evolution of a Tamagotchi, but now it’s taking care of you, instead of the other way around.
The final row includes some accessories, after Christy and Paul suggested giving it butler cues as well, like a bowtie or collar. It was tough to add bits without spoiling the slick, minimal styling, but we’ve ended up running with option F, a design very reminiscent of Hello Kitty. Our writer, Christy, was excited about the gender stereotyping you’d get from a pink bow, and it certainly raises some interesting questions.
Do you think people will always assign a gender to artificial intelligence?
How would it affect our interactions?