Come over here: How illusion can bring the future into the present


Virtual Reality (VR) offers a space in which new affordances, beyond those experienced in physical reality, can be explored and encoded. This paper and its accompanying artefact explore how the perceived near and far spaces represented in virtual worlds can be manipulated to encode new affordances for how we interact spatially within these environments.

Through the implementation of an independent visual horizon and motion controllers, an interaction of grasping distant spaces will be presented. This new affordance for traversing a virtual world seeks to integrate far, or distant, spaces into near peripersonal space through tool use.


This paper will explore a new interpretation of movement in a virtual space, experimenting with the possible extension of peripersonal intention into extrapersonal space. This extension, however, is achieved as an optical illusion that triggers a cognitive response to distant space aligned with how objects within our immediate space are interpreted. The approach recognises the importance of the hands in assigning meaning and action to objects, and relies on the mind's ability to incorporate tools into the body schema (Maravita & Iriki, 2004) in order to perform tasks that would otherwise be out of reach.

This experimental approach is designed around the advantages of Virtual Reality and relies on the availability of precision motion controllers. For this examination we will first consider the currently accepted affordances for traversing space in video games.

Computer games have long held the replication of reality as a hallmark of the advanced state of the technology (Low, 2001). Over the years graphics have become increasingly photorealistic, working to provide the player with the most immersive experience possible. In keeping with this, the traversal of space is also realistic: the player guides their character around the environment by walking, running, ducking and jumping. These real-world movements, abstracted into the virtual environment, are easily understood by players as affordances within that space.

With the emergence of cheap, position-tracked, high-quality head-mounted displays, another leap in visual acuity has occurred, and along with other developments in commercial technology, such as motion controllers and hand tracking, we now have two of the major human sensory systems contributing to the final immersive experience.


Figure 1- HTC Motion Controllers


Figure 2- Leap Motion hand tracking

Review of traversal

Currently, traversal of virtual environments is, for most users, limited and dictated by the technology and its space requirements. Universities are actively researching physical movement in medium-sized spaces using techniques like redirected walking and overlapping rooms (Suma et al, 2012) as a way of reusing a physical space almost infinitely.

Traversal in consumer-grade VR experiences is currently being handled in several different ways. Beyond the movement afforded by a room-scale setup, another method is required to cover larger distances, and movement from one place to the next has to be done in a way that does not subject the user to induced motion sickness. Movement in virtual vehicles is one approach taken by many games: the user is placed inside a vehicle in which they can then traverse large distances.


Figure 3- Cockpit view from Eve: Valkyrie

The key feature is that the cockpit, or the visible parts of the vehicle, makes up what is called an artificial horizon, allowing the user to synchronise their orientation and position with the distant visual field. Often the field of view need only be partially obstructed for this to be effective. CCP Games' (2016) Eve: Valkyrie is a great example of the cockpit approach, and also floods the distant visual field with other ocular cues like debris and dust clouds.


Figure 4- Stereoscopic cockpit view from Vox Machinae

Space Bullet Dynamics Corporation's Vox Machinae is another example of how effective the cockpit can be, in this case that of being inside a bipedal robot that 'walks'. The developers took a novel approach to minimising roll and pitch by suspending the cockpit within the robot design as if it were a hanging cradle.

Independent Visual Backgrounds

The use of a visual background that is not locked to, i.e. is independent of, the player's movement has been noted to reduce the onset of simulation sickness (Duh et al, 2004). The concept works by giving the vestibular and visual systems a stable shared reference, disrupting the conflicting cues of a moving environment. Initial experiments worked around freezing or locking the background skybox relative to the user's viewport. However, this approach is only really useful in environments that are predominantly outdoors and does not carry over indoors (Oculus Best Practices, n.d.).

Oculus research has expanded on this concept by rendering a grid over all virtual environments as a semi-transparent overlay; however, they noted that this implementation negates the user's ability to 'suspend their disbelief', which is a critical element of maintaining immersion (Oculus Best Practices, n.d.). For this experiment we will forgo the need to compel the user to suspend their disbelief in order to test the premise of independent visual backgrounds.


The idea of teleportation to traverse space in virtual environments has only been discussed in passing in the literature (Slater & Usoh, 1994; Bowman et al, 1998), but it is becoming increasingly popular as a way of getting around inside modern VR experiences. The typical implementation is 'blink teleportation', characterised by mimicking the effect of a blink: the rendered view transitions instantly to black before fading back in.

The player uses the motion controllers to point at the location they wish to 'travel' to and then activates the teleportation with a button press on the controller. The player immediately moves to the new location and so never sees a transition, which in VR causes a discontinuity effect: the brain visually 'sees' movement but the body's proprioceptive system does not feel it.
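The blink sequence can be sketched as a small per-frame state machine. This is a minimal illustration only: the class name, fade duration, and update loop below are assumptions made for the sketch, not taken from any particular engine's API.

```python
class BlinkTeleporter:
    """Illustrative blink-teleport state machine: fade the view to
    black, move the player while the screen is black, then fade back."""

    def __init__(self, fade_duration=0.1):
        self.fade_duration = fade_duration  # seconds to reach full black
        self.alpha = 0.0                    # 0 = clear view, 1 = fully black
        self.position = (0.0, 0.0, 0.0)
        self._target = None
        self.state = "idle"                 # idle -> fading_out -> fading_in

    def request_teleport(self, target):
        # Called when the user points at a location and presses the button.
        self._target = target
        self.state = "fading_out"

    def update(self, dt):
        # Called once per rendered frame with the frame delta time.
        if self.state == "fading_out":
            self.alpha = min(1.0, self.alpha + dt / self.fade_duration)
            if self.alpha >= 1.0:
                # View is fully black: move the player instantly.
                self.position = self._target
                self.state = "fading_in"
        elif self.state == "fading_in":
            self.alpha = max(0.0, self.alpha - dt / self.fade_duration)
            if self.alpha <= 0.0:
                self.state = "idle"
```

Because the translation happens only while the screen is fully black, the visual system is never shown motion that the vestibular and proprioceptive systems cannot confirm.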


Figure 5- Vive Teleporter for Unity


Game-engine technology like Unreal Engine 4 uses a partial implementation of the independent visual background in its VR editor, where the scale of the grid is the only transform considered and locked (Fricker & Donaldson, 2016). This works to great effect when the user scales the world so that they can look around in the miniaturised environment. The illusion is that the world appears to have been scaled down, when in fact it is the user who was scaled up.
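The equivalence behind this illusion can be checked numerically: scaling the world by a factor s about the user's head produces the same head-relative geometry as leaving the world alone and scaling the user by 1/s. The helper names below are hypothetical, chosen for the sketch:

```python
def scale_world_about(point, pivot, s):
    """Scale a world-space point by factor s about a pivot (the head)."""
    return tuple(pv + s * (p - pv) for p, pv in zip(point, pivot))

def apparent_offset(point, head, user_scale=1.0):
    """Offset of a point from the head, measured in the user's own
    (scaled) body units."""
    return tuple((p - h) / user_scale for p, h in zip(point, head))

head = (0.0, 1.7, 0.0)   # user's eye position
rock = (10.0, 0.0, 5.0)  # some distant scenery

# Case A: world shrunk to one tenth around the head, user unchanged.
shrunk = apparent_offset(scale_world_about(rock, head, 0.1), head)
# Case B: world unchanged, user scaled up ten times.
grown = apparent_offset(rock, head, user_scale=10.0)
```

The two offsets agree, so from inside the headset the user cannot distinguish "world scaled down" from "user scaled up".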


Figure 6- Unreal’s VR editor

Taking this further, we know that tools are easily and seamlessly incorporated into the body schema, serving to extend our bodies beyond what would be physically impossible otherwise (Maravita & Iriki, 2004). Experiments with tools such as a laser pointer, however, indicate that the same incorporation into the body schema does not occur (Holmes & Spence, 2004). Distant objects present themselves in our future; our intention toward them remains something beyond our physical reach, and they are consequently subject to different egocentric cognitive processes (Trope & Liberman, 2010).

This is where virtual space can deviate from how distant objects are represented, and where we can extend upon built-in affordances. While in a simulated virtual space it may seem prudent to keep within real-world limitations on traversing space in order to maintain the suspension of disbelief, doing so fails to explore other opportunities for interacting with distant spaces.

At a glance, it is entirely possible to treat distant spaces as something within our grasp, as extended by tool use. Once a tool has been utilised, it is effectively incorporated without further thought and can then be used to complete a task that was previously beyond reach (Maravita & Iriki, 2004). In VR we can use tools not to move ourselves closer to an action possibility but instead to 'drag' that distant space into our immediate personal space, much as if our arms had reached out a great distance and grasped it.
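This 'drag' interaction can be sketched as follows: while the grasp is held, the world is translated by the hand's motion (optionally amplified by a gain), so the grasped distant point follows the hand rather than the viewpoint accelerating through space. The class and gain parameter are illustrative assumptions, not the artefact's actual implementation:

```python
class GraspTraversal:
    """Illustrative grasp-and-drag locomotion: while grasping, the world
    is translated with the hand's motion, so the grasped distant point
    is pulled toward the user rather than the user flying toward it."""

    def __init__(self):
        self.world_offset = (0.0, 0.0, 0.0)  # translation applied to the world
        self._last_hand = None
        self.grasping = False

    def begin_grasp(self, hand_pos):
        # Trigger pressed: remember where the hand was.
        self.grasping = True
        self._last_hand = hand_pos

    def end_grasp(self):
        # Trigger released: the world stays where it was dragged to.
        self.grasping = False
        self._last_hand = None

    def update(self, hand_pos, gain=1.0):
        # gain > 1 lets a small hand movement drag a large distance.
        if not self.grasping:
            return
        delta = tuple(h - l for h, l in zip(hand_pos, self._last_hand))
        self.world_offset = tuple(o + gain * d
                                  for o, d in zip(self.world_offset, delta))
        self._last_hand = hand_pos

    def world_to_view(self, point):
        # Where a world-space point appears after the drag offset.
        return tuple(p + o for p, o in zip(point, self.world_offset))
```

Pulling the hand back 30 cm with a gain of 10 drags a point twenty metres away three metres closer, while the user's own viewpoint never translates.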

This requires visually disrupting the cues we use to gauge objects in the distance, in a way that implies they are within our immediate possible action space. One way of achieving this disruption is to overlay the reference frame of the space with an absolute grid when the user intends to interact with distant space. This absolute grid is 'free': it does not move relative to the local reference frame.


Figure 7- Early development Prototype

This artificial horizon, applied over the visual cues, provides the optical illusion that the distant object is being dragged to the user rather than the user traversing space. Combined with the abstraction of the hand through a motion controller, which keeps the hand's orientation and position accurately synced with the proprioceptive system relative to the eyes, this illusion of an independent visual horizon may activate the centres of cognition that deal with objects in our immediate action space rather than those for distant objects; that is to say, it may incorporate extrapersonal space within our peripersonal action space (Higuchi et al, 2006).


The artefact is designed to demonstrate and test a concept of distant grasping as applied to traversal in a virtual environment. The design revolves around a fixed, independent visual field that is decoupled from the user's movement in the virtual space.


Figure 8- Current prototype

The layout of the space is deliberate in that the majority of the landscape is flat. The large vertical structures (rocks) seen in Figure 8 provide vertical perspective and larger cues for visually understanding the world's scale.

Controlling the experience

For abstracting the hand, the PlayStation Move (PSMove) controller will be implemented. This controller provides accurate positional and orientation information and works well for this experiment.

Circle of interaction: This is a sphere of possible movement that the player can make from their position.

The direction vector of the motion controller is mapped directly onto the ground, with its origin centred on the player. As the player angles the motion controller toward the direction of intended traversal, the motion cursor moves out from the player.
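This mapping amounts to intersecting the controller's forward ray with the ground plane; the sketch below is an assumed reconstruction of that behaviour, not the artefact's source code:

```python
def ground_cursor(controller_pos, forward, ground_y=0.0):
    """Intersect the controller's forward ray with the ground plane
    y = ground_y. Returns the cursor position, or None if the ray
    points at or above the horizon (no intersection ahead)."""
    px, py, pz = controller_pos
    fx, fy, fz = forward
    if fy >= 0.0 or py <= ground_y:
        return None
    t = (ground_y - py) / fy          # ray parameter at the plane
    return (px + t * fx, ground_y, pz + t * fz)
```

Tilting the controller further downward brings the cursor closer to the player; raising it toward the horizontal sends the cursor further out, matching the interaction described above.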

The following interactions are available in the simulation.

Reset Controller Orientation

This allows the orientation of the PSMove in the simulation to be reset.

Toggle Grid

This toggles the grid overlay in the simulation off or on; by default the grid is on.


Teleport

Using this will teleport the user to the desired location in the simulation world.


Grasp

Grasping is the primary interaction method in this artefact. The grasp mechanic utilises the trigger on the PSMove.


Figure 9 – Buttons and control layout for artefact

Figure 9 summarises the interactions as mapped to the PlayStation Move controller.


Unity Technologies – Unity Engine –

Guido Sanchez – PSMove plugin for Unity –

– Stone Texture –

Hisense Criss – Wavy Sand Texture –

David Zerba – Rough Sand Normal –

Hhh316 – Beach Sand Texture –

Special Thanks

Michael Vermeulen – For crash course in Unity

Retrospective / post mortem

Retrospectively, there were several problems during the production of this artefact. They will be discussed in detail for the benefit of the reader, to illustrate the complexities of developing with industry- and community-maintained software.

Changing engines

First and foremost was the significant change of engine from Epic's Unreal Engine 4.11.2 to Unity 5.3.5p1, which is in itself highly irregular and undesirable as I had little to no experience with Unity. The reasons for the decision are as follows.

Beginning with my initial tests of VR as a plugin for Unreal Engine, I had no reason to expect any problems implementing the proposed idea. However, two later changes to third-party hardware drivers required an engine version upgrade: first, Oculus released driver version 1.3 for the consumer version of their head-mounted display (HMD), the Oculus CV1; second, Unreal Engine overhauled its support for the new HMD hardware.

These two upstream changes broke the compatibility of the community-maintained drivers that supported Sony's PlayStation Move controller and EyeToy accessory on the Windows platform with Unreal Engine. In short, the Move controller and associated hardware no longer worked with the current Unreal build that supported the Oculus Rift. It was quickly determined, however, that the community version of the drivers for the Unity engine was compiled against the latest driver target and promised to work as advertised.

This is the main reason for shifting to a different game-engine middleware.

Effectiveness of Independent Visual Backgrounds

Early prototypes showed that the independent horizon grid was very effective in a virtual environment sparsely populated with objects and devoid of textures. Colleagues who tested the experience noted that they could sense the difference between having the independent horizon grid and not.

In later stages, as the prototype proceeded deeper into production, the visual fidelity of the experience began to ramp up. With the inclusion of terrain displaced from Digital Elevation Model (DEM) data and the application of textures, it became clear that the grid was becoming less effective at disrupting the visual background.

Future thoughts and discussion

Throughout the development of this artefact a large body of research into simulator sickness, or motion sickness, was encountered. The idea of independent visual backgrounds arose in response to experiments done to counteract simulator sickness; this wider research discusses many of the potential roots of motion sickness and ways to counteract them.

It is this further research, beyond my initial dive into the subject, that is perhaps of greater importance for my research thesis as a whole. For those who have used VR for extended periods it is clear that motion sickness, and the methods to counteract it, have to be built into the experience as a foundational scaffold.

There are many possible solutions and a great deal of research underway on the topic, and this experiment has inadvertently tested one such solution. As a continuation in this area, it is proposed to study the problem in detail and test many of the current and theoretical solutions.


The idea of using VR to extend peripersonal space into extrapersonal space is an interesting field of research. First and foremost in this example is the reinterpretation of virtual space as one that does not have to replicate the physical world. Understanding this, we can go about introducing and extending upon the affordances users already have for interacting with and traversing the virtual environment.

In immersive VR the idea of embodiment becomes very real indeed, and consequently these experiences become lived; as part of how we naturally cognise the real world, we in turn try to incorporate the virtual environment into our immediate body schema. This is curious in that we do not have to treat anything in the virtual world as we do in the real world, and while existing ideas and affordances work well as an introduction to these spaces, they ultimately hinder us, of which simulation sickness is an obvious side effect.

As this experiment demonstrates, we can indeed allow the user to reach out into extrapersonal space via a virtual tool and grasp it. This is important because it allows for a reinterpretation of how we process these spaces cognitively. Certainly, as more methods are tried, we will build up a new understanding of this medium and the affordances associated with it.


Suma, E. A., Lipps, Z., Finkelstein, S., Krum, D. M., & Bolas, M. (2012). Impossible spaces: Maximizing natural walking in virtual environments with self-overlapping architecture. IEEE Transactions on Visualization and Computer Graphics, 18(4), 555-564. doi:10.1109/tvcg.2012.47

Low, G. S. (2001). Understanding realism in computer games through phenomenology. Stanford Computer Science. Retrieved June 01, 2016, from

Duh, H. B., Parker, D. E., & Furness, T. A. (2004). An Independent Visual Background Reduced Simulator Sickness in a Driving Simulator. Presence: Teleoperators and Virtual Environments, 13(5), 578-588. doi:10.1162/1054746042545283

CCP Games. (2016) Eve: Valkyrie. CCP Games. Retrieved June 01, 2016, From

Space Bullet Dynamics Corporation. Vox Machinae. (n.d.). Retrieved June 01, 2016, From

High and Mighty – An Immersively Seated Experience. (2015, January 19). Retrieved June 01, 2016, from

Maravita, A., & Iriki, A. (2004). Tools for the body (schema). Trends in Cognitive Sciences, 79-98.

Holmes, N. P., & Spence, C. (2004). The body schema and multisensory representation(s) of peripersonal space. Cognitive Processing, 5(2), 94-105. doi:10.1007/s10339-004-0013-3

Trope, Y., & Liberman, N. (2010). Construal-level theory of psychological distance. Psychological Review, 117(2), 440-463. doi:10.1037/a0020319

Prothero, J.D., Draper, M.H., Furness, T.A., Parker, D.E., and Wells, M.J. (1999). The use of an independent visual background to reduce simulator side-effects. Aviation, Space, and Environmental Medicine, 70(3), 135-187.

Oculus Best Practices. (n.d.). Retrieved June 01, 2016, from

Slater, M., & Usoh, M. (1994). Body centred interaction in immersive virtual environments. Artificial life and virtual reality, 1, 125-148.

Bowman, D. A., Koller, D., & Hodges, L. F. (1998). A methodology for the evaluation of travel techniques for immersive virtual environments. Virtual Reality, 3(2), 120–131. doi:10.1007/bf01417673

Fricker, M., and Donaldson, N. (2016). Unreal Editor in VR on Oculus Touch. Game Developers Conference. Retrieved June 05, 2016, from

Biagioli, A. (2016). Vive-Teleporter. Retrieved June 05, 2016, from

Higuchi, T., Imanaka, K., & Patla, A. E. (2006). Action-oriented representation of peripersonal and extrapersonal space: Insights from manual and locomotor actions. Japanese Psychological Research, 48(3), 126-140. doi:10.1111/j.1468-5884.2006.00314.x