A two-day collaborative workshop exploring performance, the Kinect, and movement-based games took place at Lincoln University on 25–26 March 2014. The event was organised by Dr Patrick Dickinson and hosted by the Performance and Games Network.
The first day consisted of talks by:
- Ida Toft and Sabine Harrer of the Copenhagen Game Collective;
- Nick Burton, New Technology Lead at Rare Gaming;
- David Renton, Microsoft MVP (Kinect for Windows technical communities);
- Matt Watkins of Mudlark.
This was followed by group discussions in preparation for the collaborative “hack” day exploring five themes:
- Interfaces for Performance (Leaders: Duncan Rowland, Kate Sicchio)
- Mobility Impaired Performance (Leader: Kathrin Gerling)
- Physical Games in Playgrounds (Leader: Grethe Mitchell)
- Performative interfaces to seed social encounters (Leader: John Shearer)
- Audience and Movement Games (Leader: Patrick Dickinson)
I joined the Interfaces for Performance group, where we had a lively discussion on notions of interface, HCI, and human-human interfaces, with the idea of creating challenging, embarrassing and awkward interactive acts and interfaces (inspired by Sabine Harrer and her work on awkward games).
The large group split into sub-groups to develop individual and group sub-projects. I worked with artist/performer/dancer Ruth Gibson of Igloo, exploring motion capture (Cinema Mocap) as a tool for improvised performance.
Playing on the idea of awkwardness, the hack demo was conceptualised as a game in which one person records a short awkward, challenging or embarrassing performance, which a second person then tries to copy or improvise upon.
Ruth’s initial performance involved rapid and complex movements that challenged the mocap system’s ability to record correctly, resulting in distorted limbs and inhuman movements. The glitches, however, inspired Ruth to produce a motion capture of an inhuman-looking movement:
In the discussion after the demo it was suggested that the prototype resembled a motion-capture version of the game of Exquisite Corpse, leading to discussion of how it could be developed into a game with scoring, and how it might find application in serious games such as dance training.
The ability to capture and replay motion within the Unity game engine offers scope for further performance experiments, and scripting opportunities for developing an improvisation or practice tool.
The following video illustrates how expressive actions can be captured and re-represented by a male and a female Unity character.
Further research will investigate the difference between possessing a Unity character, where it copies you, and being possessed by it, where you try to copy it. A convolution-like algorithm could be used to generate a ‘coherence value’ indicating the closeness of the movements, which could be used to give real-time user feedback or to generate a score. Presenting the coherence value in real time via colour or sound should help the performer learn to copy and move in time with the character’s movements. Applications of coherence feedback might include “serious games” such as dance practice, sports exercise and tai chi.
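A minimal sketch of how such a coherence value might be computed, assuming (for illustration only) that each performance is reduced to a single joint trace sampled per frame: a normalised cross-correlation, taking the best match over a small range of lags so the copier is not penalised for being a few frames behind the performer. The function names and the single-joint simplification are my own assumptions, not part of the workshop prototype.

```python
import math

def coherence(a, b, max_lag=5):
    """Return a coherence value in [0, 1] between two equal-length
    movement traces (e.g. one joint coordinate per frame), taking the
    best normalised correlation over a small range of frame lags."""
    def norm_corr(x, y):
        # Pearson correlation between two aligned traces.
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
        den = (sum((xi - mx) ** 2 for xi in x) *
               sum((yi - my) ** 2 for yi in y)) ** 0.5
        return num / den if den else 0.0

    best = 0.0
    for lag in range(-max_lag, max_lag + 1):
        # Shift one trace against the other and compare the overlap.
        if lag >= 0:
            x, y = a[lag:], b[:len(b) - lag]
        else:
            x, y = a[:len(a) + lag], b[-lag:]
        if len(x) > 1:
            best = max(best, norm_corr(x, y))
    return best  # 1.0 = movements perfectly coherent

# Example: a copier reproducing a sine-like arm movement two frames late.
performer = [math.sin(t / 5.0) for t in range(60)]
copier = [math.sin((t - 2) / 5.0) for t in range(60)]
print(round(coherence(performer, copier), 2))  # → 1.0
```

In a real prototype the same score could be averaged over all tracked joints, and mapped to colour or pitch each frame to give the performer live feedback.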