The new version enabled a character to be switched instantly between male and female at the press of a mouse button, with tattoos added or removed. Rather than costuming the characters, I chose to create naked characters using the latest edition of MakeHuman. The idea of a person donning a white boiler suit over their clothes and then appearing virtually naked, I felt, added an element of risk and surreal drama to the occasion.
Five visitors to the exhibition chose to experience iMorphia whilst a small audience watched the proceedings. Positive feedback from the participants and audience confirmed the effectiveness of the illusion in producing a strange and disturbing, unworldly, ghost-like character. One person commented that from a distance they thought they were watching a film, until they came closer and were surprised to realise the character was being projected onto, and controlled in real time by, a performer.
Recorded footage of iMorphia once again demonstrated how participants creatively exploited glitches produced by Kinect tracking errors; laughter ensued when one of the participants broke the tracking entirely by squatting down:
Inspired by the intermedial performance work of Jo Scott, I am beginning to formulate an outline for a series of experimental live performances as a means of testing the hypothesis that the uncanny can be evoked through intermedial performance. "Intermedial" is used here to highlight the mutually dependent relationship between the performer and the media being used.
Jo Scott improvises with her technology as a means of being present and evoking liveness, giving her the ability to move her performance in any direction at any time, responding to feedback from her system, the generated media and the audience. In comparison, the iMorphia system as it currently stands does not support this type of live improvisation: characters are selected via a computer interface to the Unity engine and, once chosen, are fixed.
How might a system be created that supports the type of live improvisation offered by Jo Scott's system? How might different aspects of the uncanny be evoked and changed rapidly and easily? What form might the performance take? What does the performance space look like? What is the content, and what types of technology might be used to deliver a live interactive uncanny performance?
How does the performance work of Jo Scott compare to other intermedial performances – such as the work of Rose English, Forced Entertainment and Forkbeard Fantasy? Are there other examples that might be used to compare and contrast?
I am beginning to imagine a palette of possibilities, a space where objects, screens and devices can be moved around and changed. An intimate space with one participant, myself as performer/medium and the intermedial technology of interactive multimedia devices, props, screens and projectors – a playful and experimental space where work might be continually created, developed and trialled over a period of weeks.
The envisaged performance will require the development of iMorphia to extend body mapping and interaction in order to address some of the areas of future research mapped out following the workshops – such as face mapping, live body swapping and a mutual interactive relationship between performer, participant and the technology:
Face projection, interactive objects and the heightened inter-relationship between performer and virtual projections are seen as key areas where the uncanny might be evoked.
There will need to be a balance between content creation and technical developments in order that the research can be contained and released.
Live interactive face mapping is a relatively new phenomenon and is incredibly impressive, with suggestions of the uncanny, as the Omote project demonstrates (video August 2014):
Omote used bespoke software written by the artist Nobumichi Asai. It is highly computationally intensive (two systems run in parallel) and involves complex, labour-intensive procedures of special make-up and reflective tracking dots for the effect to work correctly.
Producing such an effect may not be possible within the technical and time limitations of the research; however, there are off-the-shelf alternatives that achieve a less robust and accurate face mapping effect, including a face tracking plugin for Unity and a number of webcam-based software applications such as Faceshift.
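Whatever tracking tool is used, the geometric step they all share is warping content from tracked camera coordinates into projector coordinates via a planar homography. The sketch below illustrates that step in pure Python; the function names and point values are my own illustrative assumptions, not taken from Omote, the Unity plugin or Faceshift:

```python
# Minimal sketch of projection mapping's geometric core: a four-point
# planar homography from camera space to projector space.

def solve(a, b):
    """Solve the square linear system a.x = b by Gaussian elimination."""
    n = len(a)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

def homography(src, dst):
    """3x3 matrix mapping four src points onto four dst points (h22 = 1)."""
    a, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        a.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        a.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = solve(a, b) + [1.0]
    return [h[0:3], h[3:6], h[6:9]]

def warp(h, x, y):
    """Map a camera-space point into projector space."""
    w = h[2][0] * x + h[2][1] * y + h[2][2]
    return ((h[0][0] * x + h[0][1] * y + h[0][2]) / w,
            (h[1][0] * x + h[1][1] * y + h[1][2]) / w)

# Hypothetical face bounding box as tracked in the camera image, and the
# corresponding corners in the projector's frame (calibration points).
cam_box = [(0, 0), (100, 0), (100, 100), (0, 100)]
proj_box = [(50, 20), (250, 20), (250, 220), (50, 220)]
H = homography(cam_box, proj_box)
print(warp(H, 50, 50))  # centre of the face box, roughly (150.0, 120.0)
```

In a live setting the four camera-space corners would be refreshed every frame from the face tracker, and the homography recomputed, so the projected face stays registered to the moving performer.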
The ability to change the face is also being added to mobile devices and tools for face-to-face communication such as Skype, as the recently launched (December 2014) software Looksery demonstrates:
Alternatively, rather than attempting to create an accurate face tracking system, choreographed actors and crafted content can produce a similar effect:
This is a recording of an experimental live performance in which a gesture-responsive MikuMiku Japanese anime dance figure is projected onto the body of the performer, while the video of this projected body image is simultaneously seen by the performer through a pair of video glasses.
The resultant effect is that of a simultaneous Other and Double: the Double resulting from the co-existence of the male body superimposed upon, and transformed by, the projected female Other.
The immersive effect of seeing the body transformed into a female Other had a strange, uncanny effect on me as the performer, in that I began to play with and adopt the movements and characteristics accorded to the projected Other. For example, the avatar's long hair encouraged movements that produced more expressive, flowing motions of the hair.
The visual feedback of an alternative body through projection has a transforming effect on the performer's behaviour, creating a sense of immersion in an "alter body".
The documentation is a recording, and further tests need to be done to determine whether witnessing a live projection can convey the same sense of the uncanny. It has been pointed out that the video recording of a solitary act of performance constitutes an equally valid form of mediated performance, and is associated with notions of voyeurism and secrecy that would not be present if performed live.
It is envisaged that alternative avatars and backdrop scenes will be created using Unity 3D to explore the effects and potentials of other characterisations. These might include archetypes from Fairy Tales (old man, wizard, prince, old lady, witch, princess, monster) or classic gaming characters such as Lara Croft and the Prince of Persia.
Technical details — Hardware: Microsoft Kinect, Vuzix video glasses, i5 Windows PC, video projector, video camera. Software: MikuMiku with OpenNI plugin.
Performative Interaction and Embodiment on an Augmented Stage