Kinetica Panel Report
By: Steve Boxer
The ongoing Robots & Avatars programme resumed in 2010: its second instalment popped up at the Kinetica Art Fair in London, in the guise of a panel discussion on Saturday 6 February, subtitled Collaborative Futures.
Click to jump directly to a particular section:
- Robots And Avatars – an introduction
- Peter McOwan: living with robots
- Anna Hill: bringing space to Earth
- Michael Takeo Magruder: avatar adventures
- Ron Edwards: virtual worlds, practical uses
- Noel Sharkey: rise of the Robatars
Robots And Avatars – an introduction
Ghislaine Boddington, Creative Director at body>data>space, kicked off proceedings by treating the packed house to an introduction to the Robots And Avatars programme, explaining that its primary focus is to: “Look at robots and avatars in the future, and examine how young people will work and play with representative forms in both the virtual and physical worlds.”
Boddington outlined some of the programme’s key points: “We’re looking at avatars, cyborgs and robotic culture, which are all representations of ourselves in some way, as we move into a world which is more multi-identity.” She illustrated those points with a short presentation, showing the likes of Orla Ray (b>d>s’s performing avatar), virtual dance performances in Second Life, and Milo, games developer Lionhead’s virtual boy created to showcase its Project Natal system. Then it was time for the first panellist presentation.
Peter McOwan: living with robots
First up was Peter McOwan, from the Computer Science department at Queen Mary University, London. McOwan began by saying: “I discovered as a researcher into subjects like artificial intelligence that reality as we see it is something constructed in our heads, rather than something that can be measured.” He demonstrated the point with an illustration, apparently of a cylinder on a chequerboard: when he removed the cylinder, it was obvious that two apparently black and white squares were the same shade of grey.
He then revealed details of his work on a European project called LIREC: Living with Robots and Interactive Companions. He said: “We’re trying to go beyond the novelty effect with robots: people find them fun for the first few days then get bored with them. We’re looking at how we can build into them the ability to engage people’s interest for longer periods. We’ve actually been using dogs – they are socially connected with us, so understanding how dogs and humans interact will help us understand human-robot interactions.”
McOwan then highlighted some ongoing research into the subject, showing a study in which iCat cat-shaped robots learned to play chess with children, “So the child and robot build a relationship – the iCat can recognise and reflect emotions. One important aspect was migration at the end of the game – the entity in the robot moves to a handheld device, so the robot will continue to learn. And when the child is near to the robot again, it will migrate back.”
He also showed clips of a University of Hertfordshire project investigating social robotics, consisting of a flat occupied by people as well as robots: “Robots in such situations can, for example, be useful if people are forgetting things – they can remind them.” And he showed a project he has been involved in with Fiddian Warman (soda), also designed to improve humans’ ability to relate to robots: a set of three robots which had been designed to pogo. McOwan reported that the robots, installed at several gigs by ‘punk bands’, had been well received.
In closing, McOwan highlighted what he saw as the key issues for robots and avatars in the future: “Perception is critically important: with robots and avatars, you have the issue of the uncanny valley. The more a robot looks like a human but doesn’t act like one, the more we feel uncomfortable with it. It can be like a zombie – something which isn’t quite right. For me, the future for robots and avatars is very much to do with their ability to be beneficial to, for example, blind people, or people who aren’t able to do everything.”
Anna Hill: bringing space to Earth
Next up to address the group was Anna Hill, CEO and Creative Director of Space Synapse. She explained that Space Synapse is a company set up to explore interaction with space: “We’re working on systems to get from space to Earth, and offer some sort of collaboration between the two.”
She offered some examples, including Remote Suit, a wearable system designed to share the experience of being in space with people on Earth, and the Symbiotic Sphere – a pod which gathers inspirational space data including images, videos, sound and haptics from space, the idea being to give those who sit in it an idea of what it is like to be in space.
Hill outlined her vision of the future: “I can envisage a feminising of technology. I’m very interested in augmented learning and collective and systemic thinking – there will be fewer top-down organisations. And there’s a need for robots not to replace humans.”
Michael Takeo Magruder: avatar adventures
Michael Takeo Magruder, b>d>s Associate and Artist/Researcher from King’s Visualisation Lab, King’s College London, was next to take the stage. The resident avatar-guru at Robots and Avatars, Magruder explained: “I’m going to concentrate on talking about the virtual body: I’ve worked with avatars in virtual worlds for the better part of a decade.”
Magruder proceeded to show details of avatar projects he has worked on, including Theatron III, which reconstructed ancient theatres; 2007’s Rhythmic Spaces; and the Vitruvian World, which took place within Second Life. The latter, Magruder explained, “showed that you could bridge the networked, physical and virtual worlds.”
Magruder mused about supercomputers: “I find them interesting.” He added: “The technology is rapidly filtering down – now we have the PlayStation 3, which is a supercomputer which sits in our home. But you have to explore the notion of the character before you get to the avatar.” He explained that his first experience in that realm was not with avatars, but playing as a space marine in Doom, and contended that, typically, we get our first experiences of virtual bodies from playing videogames.
He then explained: “Our best notions of avatars are not new: looking back into history you can, for example, find the Ten Avatars of Vishnu.” But, he added: “We now live in a world in which avatars are ubiquitous.”
He continued: “In the future, it will be about mobility versus immersion. We will have to think about redefining conventions, such as how we choose to represent ourselves, and the relationship between our virtual forms and our one physical self. There are issues of ethics – trust, privacy, agency and exchange. For example, in Second Life, we often let our colleagues use our virtual bodies.”
Ron Edwards: virtual worlds, practical uses
Ron Edwards explained how his organisation, Ambient Performance, operates in the commercial computer-training world, bringing virtual world technologies into the workplace: “Our primary focus is enterprise-grade virtual worlds. And we’re extending avatars to mobiles using augmented reality.” He explained how live data can be fed into Ambient Performance’s virtual worlds, and avatars within them can even call mobile phones.
Looking into the future, he showed an application in which avatars could walk around representations of data, and continued to look at emerging avatar/virtual world technologies, such as Infoterra’s Shape, a 3D city-visualisation tool being used for response-drill training. He showed a traffic simulator designed to model what would happen if the hard shoulder of the M42 was opened up to drivers: “We had to look at how drivers would behave, not just model the behaviour of vehicles. That’s a good example of what you could call robot-avatars.”
Edwards signed off by introducing the audience to Emotion AI, software designed to help people create avatars that move and act in a believable manner: “For example, when in conversation, you can tell a lot from subtle shoulder movements – how do you automate that?”
Noel Sharkey: rise of the Robatars
Robots And Avatars stalwart and project champion Noel Sharkey, Professor of Artificial Intelligence and Robotics at the University of Sheffield, was next to take the stand. He concentrated on his specialist subject of robots, and opened by suggesting that people tend to think: “Robots will kill us, heal us or have sex with us.” He proceeded to coin the phrase “Robatars”, citing the example of physical military drones operating in war-zones, yet controlled by operators in the Nevada desert. “Virtual Reality is coming into play in a new way, which you could call “Real Virtuality” – you’re looking at VR in a cocoon, where you can smell, touch and so on.”
He demonstrated a practical application of robots – as remote doctors and nurses, in the form of robots with TV screens as heads, through which real doctors and nurses in remote locations could do their rounds. He contended that the rise of robots and avatars, and the blurring of boundaries between the two, “raise possibilities for discussion about whether they should be more or less remote-controlled. For example, I can send my virtual self as a robot to meet people. Or sex-robots could have affairs without actually meeting anyone.”