Practice Research

The combination of practice, the art of doing, with academic research has been described as ‘Practice as Research in the Arts’ (Nelson 2013) or ‘Practice Based or Practice Led Research’ (Edmonds and Candy 2006). Whilst there is some debate over the terminology, during a symposium at Plymouth University Rachel Hann described the process simply as Practice Research (Hann 2016).

“…many practice-led researchers do not commence a research project with a sense of ‘a problem’. Indeed they may be led by what is best described as ‘an enthusiasm of practice’:” (Haseman 2006: 3)

In my research, practice has been used as the basis of a research methodology investigating how augmented reality theatre might be realised and its impact on participants and observers. The practice research methodology has been informed by the literature, websites and practitioners; see Practice Research Resources.

The process and development of the research is documented in the form of a diary-like research blog, with key words forming a tag cloud, enabling both an overview of key topics and offering a mechanism for browsing the blog according to content rather than as a linear, time-based journey.

The following represents an overview of the research practice, describing key moments, influences, new directions and research outcomes.


A prototype system, “iMorphia”, based on the Unity Games Engine, has been developed as a test vehicle for the research. Characters can be loaded into the engine and then “puppeteered” by a performer whose position and movements are sensed by the Microsoft Kinect.

iMorphia projects a computer-generated character onto the performer, who wears a white costume. The performer sees their transformed self from the perspective of the audience via video glasses connected to a video camera.

iMorphia System Diagram
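The puppeteering described above can be pictured as a per-frame retargeting of Kinect joint positions onto the projected character. The following is a minimal sketch of that idea only, not the actual iMorphia implementation (which runs inside Unity); the joint names, scale and offset values are illustrative assumptions.

```python
# Illustrative sketch of a puppeteering loop: Kinect joint positions
# drive the projected character each frame. Joint names and the
# calibration scale/offset below are hypothetical, not iMorphia's values.
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float
    y: float
    z: float

def kinect_to_stage(joint: Vec3, scale: float, offset: Vec3) -> Vec3:
    """Map a Kinect sensor-space joint into projected stage space."""
    return Vec3(joint.x * scale + offset.x,
                joint.y * scale + offset.y,
                joint.z * scale + offset.z)

def puppeteer(skeleton, scale=1.2, offset=Vec3(0.0, -0.1, 0.0)):
    """Per-frame update: retarget every tracked joint onto the character."""
    return {name: kinect_to_stage(pos, scale, offset)
            for name, pos in skeleton.items()}

# One frame of (invented) tracked joints:
frame = {"head": Vec3(0.0, 1.7, 2.5), "hand_right": Vec3(0.4, 1.1, 2.3)}
rig = puppeteer(frame)
```

In the real system this retargeting happens inside Unity's update loop, with the Kinect SDK supplying the skeleton data.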

The first prototype, created in November 2013, utilised the Japanese MikuMikuDance software; though very effective in creating a highly responsive real-time character, it was later abandoned due to the inherent limitation of its cartoon-like anime aesthetic.

MikuMiku Character

The Unity Games Engine enables characters to be imported from Daz 3D Studio and MakeHuman, or downloaded as pre-existing ready-made characters.

Daz3D female, male and MakeHuman female

The research blog documents a series of enactments illustrating the intermingling of practice and theory (praxis) and the generation of potential avenues for further research. A snapshot of this process is described in the paper “iMorphia: A Prototype Performance System”, presented at the Fascinate conference in August 2014.

The period following the paper involved explorations of improvisation with two players and improvisation with props. Theory on intermediality (Chapple and Kattenbelt 2006) also impacted on the practice, providing a firmer foundation on which to ground it. One of the resulting outcomes was a deeper investigation into the relationship between performative interaction with technology and improvisation.

PopUpPlay, an alternative augmented reality performance system developed at De Montfort University, Leicester, enabled a comparative study between iMorphia and a system that required a third party, a director, to control the action. In contrast to PopUpPlay, iMorphia was envisaged as a standalone system where performers would interact with and control the outcomes and the emergence of narratives and improvisation.

The performative notions described by Steve Dixon (Dixon 2007) – Navigation, Participation, Conversation, Collaboration – became a key driver in shaping a series of PaR enactments from August 2015 to March 2016, documented in the following links:

August 2015: Navigation enactment
October 2015: Participation, Conversation, Collaboration
November 2015: Props enactment
December 2015: Props and Physics enactment
March 2016: Participation and Navigation workshop

During the March 2016 workshop we discussed notions of embodiment; this, combined with feedback from my supervisors, led me to investigate embodiment in the context of gaming, examining avatar representation and embodiment through interaction.

One of the main differences between iMorphia and traditional gaming, and indeed most forms of computer interaction, is that the latter are screen-based: the user interacts with a screen in front of them.

My investigations into Dixon’s notions of performative interaction made me realise iMorphia was substantially different: the performer interacts away from the screen, the image is behind them, and they perform towards an audience. This realisation was encapsulated in a concept I coined the “Embodied Performative Turn”. Between February and March 2016 I produced a document outlining a potential argument and structure centred around the practice research, “The Embodied Performative Turn and its relationship to gaming”. Word document: iMorphia PaR

I also submitted an extended abstract based on this concept to the Computer Human Interaction Conference (CHI 2016), which was subsequently accepted and published: “IMorphia: An Embodied Performance System”, doi: 10.1145/2851581.2891087

The Embodied Performative Turn or not?
The enactments investigating participation identified the difficulty of locating and interacting with a 3D object in space. This was due to a combination of a lack of depth perception in the video glasses and discrepancies between the Kinect’s tracking of real physical space and the projected virtual space of the Unity scene. I labelled this an issue of colocation, and to resolve it carried out an enactment with myself and one of the previous participants using the Oculus Rift VR headset, so as to provide depth perception through stereoscopic vision of the Unity scene. This also enabled me to try out perspectives other than that of the audience, such as first person.

May 2016: Colocation and the Oculus Rift
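The colocation issue can be framed as finding a transform between the Kinect’s physical coordinate frame and the Unity scene’s virtual frame. As a minimal sketch of my own, assuming a simple per-axis scale and offset suffices (the real discrepancy may well be more complex), the mapping can be fitted from two measured correspondence points; all point values below are invented for illustration.

```python
# Hypothetical colocation calibration: derive a per-axis scale and offset
# mapping Kinect-tracked physical coordinates onto the virtual scene,
# fitted from two pairs of corresponding points (values invented).

def fit_axis(p0, p1, v0, v1):
    """Scale/offset so physical points p0, p1 land on virtual v0, v1."""
    scale = (v1 - v0) / (p1 - p0)
    offset = v0 - scale * p0
    return scale, offset

def calibrate(physical, virtual):
    """Fit each of the three axes independently."""
    return [fit_axis(physical[0][a], physical[1][a],
                     virtual[0][a], virtual[1][a])
            for a in range(3)]

def to_virtual(point, params):
    """Apply the fitted transform to a physical-space point."""
    return tuple(s * point[a] + o for a, (s, o) in enumerate(params))

# Two measured correspondences (invented values):
physical = [(0.0, 0.0, 1.0), (1.0, 2.0, 3.0)]
virtual  = [(0.5, 0.0, 0.0), (2.5, 4.0, 4.0)]
params = calibrate(physical, virtual)

# A reach towards physical (0.5, 1.0, 2.0) now lands at the matching
# virtual location:
mapped = to_virtual((0.5, 1.0, 2.0), params)  # → (1.5, 2.0, 2.0)
```

A fuller calibration would fit a general affine transform from more than two points, but even this simple version shows why a performer’s reach can miss a virtual object when the two frames are misaligned.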

The first-person perspective of the scene enabled objects to be easily located and acted upon, far more easily than from the removed second-person audience perspective inherent in the concept of the Embodied Performative Turn. I began to wonder whether this novel concept was quite so useful and wonderful after all.

Might it be better that the performer sees the virtual stage from a first-person perspective, whilst the audience retains their projected second-person perspective of the scene? Interaction is certainly better enabled, though the illusion of seeing oneself transformed is lost.

After discussions with my supervisors, and many fraught days thinking I had hoodwinked myself, I realised that both perspectives have validity. My second supervisor suggested I look at acting techniques, such as those proposed by Stanislavsky, who had once advocated the use of a mirror but then rejected it as enabling only a superficial means of getting into character.

I realised I had both an argument and a counter-argument for the use and value of the Embodied Performative Turn, and that other perspectives might also have value. These pros and cons might then become the basis of an argument, as found in a traditional thesis. Whether I attempt to document my research using a traditional linear structure or a series of complementary writings (Nelson 2013: 36-37) arranged in the form of a hyperlinked structure is yet to be decided.

Performative Interaction and Embodiment on an Augmented Stage