KINECTIC


20. Colocation and the Oculus Rift

May 16, 2016 admin

In post 18. Interactive Props and Physics it was noted that “Colocation issues are the result of the difficulty in perceiving where the character is in three dimensional space due to the lack of depth perception.”

In this enactment the Oculus Rift VR headset is used to ascertain whether the added depth perception of the stereoscopic rendering of the Unity scene might enable a performer to locate the virtual props in 3D space.

Three enactments were carried out: two with the viewpoint rendered from a camera placed at the audience perspective, and one from the first-person perspective typically used in VR and gaming.

The video below is a mobile phone recording of a computer monitor rendering the Unity scene in real time. The computer uses an i7 processor and a relatively powerful Nvidia GT 720 graphics card to deliver the stereoscopic rendering to the Oculus Rift. Though the system is able to support the newer Kinect v2, the older Kinect was used in order to maintain continuity with previous enactments.

In the first enactment, one of the previous performers and I carried out the task of knocking the book off the table. We both felt that the task was much easier to accomplish with stereoscopic depth, which made it easy to judge the position of the avatar's hand in relation to the virtual book.

Kinect tracking errors made bending the arm and precise control of the hand a little problematic. Nevertheless, the task felt much easier to achieve than in previous enactments using the monoscopic video camera perspective, as it was possible to see clearly where the virtual hand was, even when it was ‘misbehaving’.

However, the added depth perception highlighted a new issue that had previously gone unnoticed: difficulty in telling front from back. When one moves one's hand forward it moves away from the body, whilst viewed from the camera perspective the hand moves nearer to the camera, the opposite of the direction one is used to. This effect parallels the left-right reversal of a mirror in comparison to the camera view.

In both cases it is possible, through practice, to become accustomed to the depth reversal and the lack of mirror reversal, though at first one finds oneself moving in the opposite direction, or using the opposite limb. It is technically possible to produce a mirror reversal, but a depth reversal was felt to be more problematic. A simpler solution, easily achievable using VR, was to give the performer the familiar first-person perspective: seeing the scene from the viewpoint of the avatar. In the video, the third enactment, carried out by myself, demonstrates this perspective.
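The three viewpoints can be captured as simple coordinate mappings. The sketch below is my own illustration, not code from the iMorphia system, and the axis conventions are assumptions: in the performer's frame, +x is the performer's right and +z is forward (away from the performer's body); in the displayed view, +x is the viewer's right on screen and +z is away from the viewer.

```python
# Illustrative mapping of performer-frame motion (x, z) to displayed motion
# for the three viewpoints discussed above. Axis conventions are assumed:
# performer frame: +x = performer's right, +z = forward (away from body);
# display frame:   +x = viewer's right,    +z = into the screen.

def first_person(x, z):
    """Avatar's-eye (head-mounted) view: motion maps directly."""
    return (x, z)

def mirror(x, z):
    """A mirror the performer faces: left/right stays on the same side of
    the frame, but motion toward the mirror comes toward the viewer."""
    return (x, -z)

def audience_camera(x, z):
    """A camera in front of the performer (the audience perspective):
    left/right is swapped relative to a mirror, and depth is reversed
    relative to the first-person view."""
    return (-x, -z)

# Moving the right hand forward, i.e. (x=1, z=1), under each viewpoint:
for name, view in [("first person", first_person),
                   ("mirror", mirror),
                   ("audience camera", audience_camera)]:
    print(name, view(1, 1))
```

Note that the audience camera is simply the performer's frame rotated 180° about the vertical axis, whereas the mirror is a reflection (depth negated only); a reflection cannot be produced by repositioning a camera, which is consistent with the observation above that a mirror reversal is technically achievable while the depth reversal is harder to remove.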

Due to time constraints it was not possible to test this enactment with the external participant. However, despite the incredibly immersive qualities of the first-person perspective, I felt there were some serious problems resulting from this viewpoint.

Firstly, I felt a very strange out-of-body sensation looking down at a virtual body that was not mine; in addition, my virtual limbs and height were completely different from my own, which produced a strong sense of disorientation. Perhaps a male body of similar height and dimensions to my own might have felt more familiar.

The task of knocking the book over felt extremely easy, as I could see my virtual hand in relation to the book from a familiar first-person perspective. Despite Kinect tracking issues it was possible to correct the position of the hand, and ultimately knocking the book over was easy to achieve. Both the depth reversal and the mirror reversal were removed by this perspective.

However, walking and moving in the scene resulted in a strong degree of vertigo and dizziness. For the first time I experienced “VR motion sickness” and nearly fell over. It was extremely unpleasant!

Further, after taking the headset off I still felt disorientated for some minutes, somewhat dizzy and a little out of touch with reality.

Although the first-person perspective should have felt the most natural, it also produced disturbing side effects which, if not rectified, would make the first-person VR perspective unusable, if not hazardous, in a live performance context.

The feelings of vertigo and motion sickness may well have been exacerbated by Kinect tracking issues: the avatar body moving haphazardly resulted in a disconnect between the viewpoint rendered from the avatar's perspective and where my real head thought it was.

Two further practical considerations are: i) the VR headset is tethered by two cables, making it difficult to move freely and safely, and ii) the enclosed headset felt somewhat hot after a short period of time. Light, ‘breathable’ wireless VR headsets may solve these problems, but the vertigo resulting from moving in 3D space from a first-person perspective, and the feeling of being in another body, are perhaps more problematic.

The simplest solution, though it retains the depth reversal issue, is to remove the VR head tracking and create a fixed virtual camera giving the audience perspective, parallel to the previous methodology of relaying the audience perspective through a video camera mounted on a tripod.

Before dismissing the first-person VR perspective as the sole cause of motion sickness, a further test is planned using the more accurate Kinect v2 with a virtual body of proportions similar to my own. It is envisaged that the Kinect v2 would produce a more stable first-person perspective, with a viewpoint closer to the one I am used to with my natural body.

In addition, other game-like perspectives might also be tried, for instance the third-person perspective, with a virtual camera located just above and behind the avatar.

A key realisation is that the performer's perspective need not necessarily be that of the audience: the iMorphia system might render two (or possibly more) perspectives, one for the audience (the projected scene) and one for the performer. The projected scene would be designed to produce the appropriate suspension of disbelief for the audience, whilst the performer's perspective would be designed to enable the performer to perform efficiently, such that the audience believes the performer to be immersed and present in the virtual scene.
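The dual-perspective idea can be sketched as one shared scene state rendered through two independent view transforms. This is a minimal Python illustration of the architecture only; all names are hypothetical, and in the actual Unity-based iMorphia system this would correspond to two Camera objects viewing the same scene, one sent to the projector and one to the performer's display. Rotation is omitted and only the translation into each camera's local space is shown.

```python
# Sketch: one scene state, two viewpoints. Hypothetical names; iMorphia
# itself is built in Unity, where this would be two Camera components
# rendering the same scene to separate displays (projector vs performer).

from dataclasses import dataclass

@dataclass
class SceneState:
    avatar_head: tuple  # avatar head position in world space (x, y, z)
    prop: tuple         # e.g. the book's position in world space

def to_view_space(point, camera_pos):
    """Translate a world-space point into a camera-local offset
    (rotation omitted for brevity)."""
    return tuple(p - c for p, c in zip(point, camera_pos))

def render_frames(state, audience_cam=(0.0, 2.0, -4.0)):
    """Return the prop's position as each viewpoint sees it: the
    audience's fixed camera, and the performer's first-person camera
    attached to the avatar's head."""
    return {
        "audience": to_view_space(state.prop, audience_cam),
        "performer": to_view_space(state.prop, state.avatar_head),
    }

state = SceneState(avatar_head=(0.0, 1.7, 0.0), prop=(0.5, 1.0, 1.0))
print(render_frames(state))
```

The point of the separation is that each view can be tuned independently: the audience camera stays fixed for the projection, while the performer's camera follows the tracked head, so neither perspective has to compromise for the other.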

 

Tags: colocation, immersion, Oculus Rift, participation, perspective, props, tracking, visual feedback, VR



Performative Interaction and Embodiment on an Augmented Stage

