Tag Archives: mikumiku

Unity 3D and Kinect tests

Overview
It has been some time since the MikuMorphia experimental performance and the dubious delights of being transformed into a female Japanese anime character. Since then I have cogitated and ruminated on following up the experiment with new work as well as reading up on texts by Sigmund Freud and Ernst Jentsch on the nature of the uncanny, with the view of writing a positional statement on how these ideas relate to my investigations in performance and technology.

In January I moved into a bay in the Mixed Reality Lab and began to develop a more user-friendly version of the original experimental performance, one in which other people could easily experience the transformation and its subsequent sense of uncanniness without having to don a white skin-tight lycra suit. Additionally, I wanted to move away from the loaded and restrictive designs of the MikuMiku prefab anime characters. I investigated importing other anime characters and ran a few tests that included the projection of backdrops, but these experiments did not break any new ground. Further, the MikuMiku software was closed and did not allow me to get under the hood to alter the dynamics and interactive capabilities of the software.

MikuMorpha as spectator
Rather than abandoning the MikuMiku experience altogether, I carried out some basic “user testing” with a few willing volunteers in the MR Lab. Rather than having to undress and squeeze into a tight lycra body suit, participants don a white boiler suit over their normal clothes. This baggy outfit, with its creases and folds, does not produce an ideal body surface for projection, but it enables people to try out the experience easily.
Observing participants trying out the MikuMiku transformation as a spectator rather than a performer made clear to me that watching the illusion and the behaviour of a participant is a very different experience from being immersed in it as a performer.
The subjective experience of seeing oneself as other is completely different from objectively watching a participant – the sense of the uncanny as a spectator appears to be lost.

Rachel Jacobs, an artist and performer, likened the experience to having the performer’s internal vision of their performance character made visually explicit, rather than internalised and visualised “in the mind’s eye”. The concept of the performer’s character visualisation being made explicit through the visual feedback of the projected image is one that deserves further investigation with other performers who are experienced in character visualisation.

Video of Rachel experiencing the MikuMiku effect:

Unity 3D
My first choice of an alternative to MikuMiku is the games engine Unity 3D, which enables bespoke coding, has plugins for the Kinect, and offers an asset store from which characters, demos and scripts can be downloaded and modded. In addition, the Unity Community, with its forums and experts, provides a platform for problem solving and includes examples of a wide range of experimental work using the Kinect.

Over the last few days, with support from fellow MRL PhD student Dimitrios, I experimented with various Kinect interfaces and drivers of differing and incompatible versions. The original drivers that enabled MikuMiku to work with the Kinect used an old version of OpenNI (1.0.0.0) and NITE, with special non-Microsoft Kinect drivers. The Unity examples used later versions of the drivers and OpenNI that were incompatible with MikuMiku, which meant that I had to abandon running MikuMiku on the same machine. I managed to get a Unity demo running using OpenNI 2.0, but this version no longer supported the T-pose I had used to calibrate the figure and the projection; calibration happened automatically as soon as you entered the performance space, with the result that the projected figure was not co-located on the body.
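The co-location problem is essentially one of mapping the Kinect’s sensor coordinates into projector pixels so that the rendered figure lands on the performer’s body. The T-pose calibration supplies corresponding point pairs from which that mapping can be estimated. As a rough illustration of the idea only – a minimal Python sketch, per axis and with made-up numbers, not the actual MikuMiku or Unity calibration code:

```python
def fit_axis(sensor_vals, pixel_vals):
    """Least-squares fit of pixel = scale * sensor + offset for one axis."""
    n = len(sensor_vals)
    mean_s = sum(sensor_vals) / n
    mean_p = sum(pixel_vals) / n
    num = sum((s - mean_s) * (p - mean_p) for s, p in zip(sensor_vals, pixel_vals))
    den = sum((s - mean_s) ** 2 for s in sensor_vals)
    scale = num / den
    offset = mean_p - scale * mean_s
    return scale, offset

# Hypothetical calibration pairs captured during the T-pose:
# Kinect x positions (metres) of wrists and shoulders, and where those
# points appear in the projector image (pixels).
sensor_x = [-0.8, -0.2, 0.2, 0.8]
pixel_x = [160.0, 400.0, 560.0, 800.0]
scale, offset = fit_axis(sensor_x, pixel_x)

# Any tracked joint can then be mapped into projector space:
x_pix = scale * 0.5 + offset
```

The same fit would be repeated for the vertical axis; once both mappings are known, the skeleton-driven avatar can be drawn at the pixel positions where the body actually is.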

Technical issues are tedious, frustrating, time consuming and an unavoidable element of using technology as a creative medium.

Yesterday I produced a number of new tests using Unity and the Microsoft Kinect SDK, which offers options in Unity to control the calibration: automatic, or activated when the performer adopts a specific pose.
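The pose-activated option works by watching the tracked skeleton until the performer holds the calibration pose. As an illustration of the principle only – a minimal Python sketch with hypothetical joint names and coordinates, not the actual Kinect SDK or Unity code – a T-pose check might look like this:

```python
def is_t_pose(joints, tolerance=0.1):
    """Return True if both arms are outstretched at shoulder height,
    i.e. the performer is holding the T-pose used to trigger calibration.
    `joints` maps joint names to (x, y) positions in metres, y upwards."""
    for side in ("left", "right"):
        shoulder = joints[side + "_shoulder"]
        wrist = joints[side + "_wrist"]
        # Arm must be roughly horizontal: wrist level with the shoulder...
        if abs(wrist[1] - shoulder[1]) > tolerance:
            return False
        # ...and extended outwards, not hanging by the side.
        if abs(wrist[0] - shoulder[0]) < 0.3:
            return False
    return True

# A performer standing with arms outstretched:
t_pose = {
    "left_shoulder": (-0.2, 1.4), "left_wrist": (-0.8, 1.42),
    "right_shoulder": (0.2, 1.4), "right_wrist": (0.8, 1.38),
}
# The same performer with arms hanging at the sides:
standing = {
    "left_shoulder": (-0.2, 1.4), "left_wrist": (-0.25, 0.8),
    "right_shoulder": (0.2, 1.4), "right_wrist": (0.25, 0.8),
}
```

The tolerance keeps the check forgiving enough for a human to hold the pose, while the outward-extension test prevents calibration from firing on someone simply standing in the space – exactly the problem the automatic mode produced.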

Below are three examples of these experiments, illustrating the somewhat more realistic, human-like avatars as opposed to the cartoon anime figures of MikuMiku:

Male Avatar:

Female Avatar:

Male Avatar, performer without head mask:

This last video exhibits a touch of the uncanny, where the human face of the performer alternately blends and dislocates with the face of the projected avatar, the human and the artificial other being simultaneously juxtaposed.

Miku Morphia – Experimental Performance

This is a recording of an experimental live performance in which a gesture-responsive MikuMiku Japanese animé dance figure is projected onto the body of the performer, while the video of this projected body image is simultaneously seen by the performer through a pair of video glasses.

Observations
The resultant effect is that of a simultaneous Other and The Double – The Double resulting from the simultaneous co-existence of the male body superimposed and transformed by the projected female Other.

The immersive effect of seeing the body transformed into a female Other had a strange, uncanny effect on me as the performer, in that I began to play with and adopt the movements and characteristics accorded to the projected Other. For example, the physical characteristic of the avatar’s long hair encouraged movements that resulted in greater expressions of flowing hair.

The visual feedback of an alternative body through projection has a transforming effect on the performer’s behaviour, creating a sense of immersion into an “alter body”.

The documentation is a recording, and further tests need to be done to determine whether witnessing a live projection can convey the same sense of the uncanny. It has been pointed out that the video recording of a solitary act of performance constitutes an equally valid form of mediated performance, and is associated with notions of voyeurism and secrecy that would not be present if performed live.

Further work
It is envisaged that alternative avatars and backdrop scenes will be created using Unity 3D to explore the effects and potentials of other characterisations. These might include archetypes from fairy tales (old man, wizard, prince, old lady, witch, princess, monster) or classic gaming characters such as Lara Croft and the Prince of Persia.

Technical details:
Hardware: Microsoft Kinect, Vuzix video glasses, i5 Windows PC, video projector, video camera.
Software: MikuMiku with OpenNI plugin.

System Diagram:

mikusystem