
The Snap Visualisation Lab, funded by a grant from the Spiegel Family Fund, provides a multimodal, interactive and immersive space capable of modelling and simulating complex built and natural environments. With double-height, three-sided projection and zero light and sound pollution, the lab can display large-scale visuals and interactive data analytics, use motion capture technology, and support VR development and immersive design reviews.

The lab was equipped with support from Research England quality-related research funding, and it supports RCA researchers and PhD students across the RCA’s four Schools and research centres, in fields including inclusive design, intelligent mobility, materials science, computer science and robotics.

The Snap Visualisation Lab is an independent, academically rigorous and industry-facing facility supporting the RCA’s impact mission through collaborative research, consultancy and capacity building.

The Snap Visualisation Lab is open for bookings, so get in touch if you would like more information.

A projector projecting pink and yellow images onto the walls of a big room

More information

Measuring approximately 100 square metres, the Lab can be used for:

  • Multimedia (Video, Image, Audio, Presentation, HDMI input)
  • VR Experience with TechViz
  • CAD Conversion & VR Experience with SkyReal
  • Unreal Engine 5 with nDisplay
  • Data Visualisation
  • VR Training
  • VR Development
  • Simulation
  • Immersive Design Review

At the core of the Lab are six high-powered computers, each responsible for rendering a segment of the immersive display. Five of these computers are connected to projectors capable of stereoscopic 4K projection, which together create a stunning visual environment across three double-height walls.

For interactive experiences, the Lab is equipped with ten motion capture cameras, enabling precise tracking of head movements, handheld wands, or other markers, enhancing user engagement. Immersive audio is delivered through an advanced sound system featuring eleven NEXO speakers and three NEXO subwoofers, ensuring high-fidelity sound that complements the visual elements.

The Lab also supports seamless multimedia playback using Vertex, allowing for the integration of video, images, audio and live HDMI feeds. For streamlined VR solutions, tools like TechViz and SkyReal facilitate the display of Unity projects, Blender models or CAD files with minimal setup. Additionally, bespoke interactive and immersive experiences can be developed and displayed using Unreal Engine 5 with nDisplay, enabling full utilisation of the Lab's state-of-the-art capabilities.
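To give a sense of how a multi-node setup like this is described to nDisplay, the sketch below shows a minimal cluster configuration in the style of Unreal Engine's legacy nDisplay .cfg format, with one render node per wall. This is an illustrative sketch only, not the Lab's actual configuration: all node names, IP addresses, resolutions and screen dimensions are invented placeholder values.

```
# Hypothetical nDisplay cluster sketch (legacy .cfg style) -- illustrative only.
# One primary node plus two render nodes, each driving one wall segment.
[info] version=23

# Cluster nodes: IDs and addresses are invented examples.
[cluster_node] id=node_primary addr=192.168.1.10 window=wnd_left  master=true
[cluster_node] id=node_front   addr=192.168.1.11 window=wnd_front
[cluster_node] id=node_right   addr=192.168.1.12 window=wnd_right

# One fullscreen window per projector output (resolutions are assumptions).
[window] id=wnd_left  viewports=vp_left  fullscreen=true WinX=0 WinY=0 ResX=3840 ResY=2160
[window] id=wnd_front viewports=vp_front fullscreen=true WinX=0 WinY=0 ResX=3840 ResY=2160
[window] id=wnd_right viewports=vp_right fullscreen=true WinX=0 WinY=0 ResX=3840 ResY=2160

# Each viewport renders through a projection onto a physical screen (a wall).
[viewport] id=vp_left  x=0 y=0 width=3840 height=2160 projection=proj_left
[viewport] id=vp_front x=0 y=0 width=3840 height=2160 projection=proj_front
[viewport] id=vp_right x=0 y=0 width=3840 height=2160 projection=proj_right

[projection] id=proj_left  type=simple screen=scr_left
[projection] id=proj_front type=simple screen=scr_front
[projection] id=proj_right type=simple screen=scr_right

# Three walls arranged as a U; positions/sizes in metres are placeholders.
[screen] id=scr_left  loc="X=0,Y=-2.5,Z=1.5" rot="P=0,Y=-90,R=0" size="X=5,Y=3"
[screen] id=scr_front loc="X=2.5,Y=0,Z=1.5"  rot="P=0,Y=0,R=0"   size="X=5,Y=3"
[screen] id=scr_right loc="X=0,Y=2.5,Z=1.5"  rot="P=0,Y=90,R=0"  size="X=5,Y=3"

[camera] id=cam_default loc="X=0,Y=0,Z=1.7"
```

In a configuration of this shape, each render node draws only the viewport mapped to its own wall, while the primary node keeps the cluster in sync, which is what allows several machines to present a single continuous image across the three projection surfaces.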

See detailed Technology Overview

People sitting on the floor of a dark room with images of a forest being projected onto the walls