Hyper Real Immersion
Background to the project
Until recently, the only way to train cost-effectively for realistic battlefield scenarios, such as large-scale contested combat operations, was to use a simulator. A simulator can transport the trainee into a realistic situation with all the considerations absent from peacetime live training: contested, cluttered, and congested environments, and crucially the presence of non-combatants. A key disadvantage of traditional simulation, however, is that it usually takes place indoors in a benign and sterile environment, which can detract from immersion and training transfer.
CAS are deep into a project designed to bring all the advantages of simulation into the live environment, using Mixed Reality to take the simulation outside. The project is called HYPER REAL IMMERSION (HRI).
HRI has been funded by the UK MOD through the Defence and Security Accelerator (DASA, formerly CDE).
About the project
Blend live and synthetic training using Mixed Reality for training in the live environment:
“to put imaginary things in the world that look like they are truly real.”
Hyper Real Immersion (HRI) is a project to bring all the advantages of simulation into the live environment, using cutting-edge technology to deliver immersive, outdoor mixed reality training. The aim is to provide users with a convincing view of a correlated synthetic world without disrupting their view of the real world (Mixed Reality). The key is to place imaginary things (photo-realistic synthetic entities) in the real environment and have them look like they belong there, i.e. they appear to obey the laws of physics.
Phase One of this project ran for the six months ending in March 2016, and focused on providing a low technology readiness level (TRL) proof of concept (POC) for outdoor mixed reality military training.
The Phase One demonstration successfully highlighted the potential for the technology, and illustrated the applicability of outdoor mixed reality to JTAC and Joint Fires training.
After the successful completion of the first phase, the project was awarded a further 12 months of funding to progress from TRL 3 towards TRL 6. Starting in June 2016, the overall objective for Phase Two was to reach TRL 5-6, with emphasis placed on the following key areas:
- Marker-less pose estimation techniques with improved accuracy and robustness, to support the demanding requirements needed to provide convincing outdoor mixed reality.
- Fully un-tethered operation, with all mobile equipment in a man-wearable configuration.
- Creation of an extremely high-fidelity terrain database, to support the accurate placement of synthetic objects and the correct depth masking around fixed objects such as trees and buildings.
- Integration with either emulated or issued military equipment, as required to support the context of JTAC and Joint Fires training.
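The depth-masking idea above, hiding synthetic entities behind real trees and buildings, amounts to a per-pixel depth comparison against the pre-scanned terrain database: a synthetic object is drawn only where it is nearer to the viewer than the real-world surface. The sketch below is a minimal illustration using NumPy; the array names and depth values are hypothetical and do not represent the project's actual rendering pipeline.

```python
import numpy as np

# Hypothetical per-pixel depth maps (metres from the viewer) for a 2x3 view.
# terrain_depth comes from the pre-built terrain database; entity_depth is
# the rendered depth of a synthetic vehicle (inf where no entity is drawn).
terrain_depth = np.array([[50.0, 50.0, 12.0],
                          [50.0, 50.0, 12.0]])
entity_depth = np.array([[np.inf, 30.0, 30.0],
                         [np.inf, 30.0, 30.0]])

camera_rgb = np.zeros((2, 3, 3))   # live pass-through video frame (all black)
entity_rgb = np.ones((2, 3, 3))    # rendered synthetic entity (all white)

# Show the entity only where it is closer than the real-world surface, so a
# vehicle 30 m away is masked out behind a building whose facade is 12 m away.
mask = (entity_depth < terrain_depth)[..., np.newaxis]
composite = np.where(mask, entity_rgb, camera_rgb)
```

In the rightmost column the entity (30 m) falls behind the real surface (12 m), so the live video shows through; this is why the terrain database must be extremely high fidelity, since any depth error becomes a visible masking error at occlusion boundaries.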
Phase Two culminated in a demonstration at the Spadeadam Training Area, and consisted of:
1. Fixed observation post and tactical operations centre using pass-through cameras and overlaying synthetic entities in a fixed frustum.
2. Mobile Untethered Worn Rig allowing movement around the training area.
Concept Demonstration Video
The video that you see here is based on a 360-degree video from the trial location, combined with the HTC Vive tracking system rather than the tracking system used in the field. We use it to demonstrate what the mixed reality perspective is supposed to look like. This is particularly useful when we can't take people to the trial location, RAF Spadeadam: we bring the location to them in the form of this pre-recorded 360-degree background video. We still use the same occlusion and superimposition techniques as the live system to overlay the synthetic entities (rendered by MetaVR's Virtual Reality Scene Generator) and control them in real time (using Battlespace Simulations' Modern Air Combat Environment), albeit this time over a video of the live environment rather than the live environment itself.
What's in the video
The video shows a demonstration of what occurred during our Phase Two trial: a mixed reality perspective from the roof of Berry Hill, at RAF Spadeadam. Photo-realistic synthetic entities, including buildings, vehicles and personnel, are controlled using MACE (BSI) and rendered in VRSG (MetaVR) on top of a highly accurate synthetic representation of the physical terrain. The synthetic and physical scenes are then combined to provide a truly compelling mixed reality experience.
The physical environment at Berry Hill is populated with friendly synthetic vehicles and personnel. Synthetic enemy artillery then fire on the compound, causing synthetic detonations and smoke effects on the physical terrain. Two friendly AH-64s then take off in response, flying over the head of the user before circling around to engage nearby synthetic targets.
The video is at approximately the same resolution as the actual display system, which used a prototype of the Vrvana Totem Mixed Reality Headset. However, the head tracking seen in this video is better than that achieved in the live environment, because the video was filmed using the HTC Vive's indoor, marker-based tracking system.
One of the critical components in an outdoor mixed reality system such as this is pose estimation and tracking. The aim for the final product is to be completely marker-less, relying on no components that are not worn. This is to be achieved using techniques such as visual simultaneous localisation and mapping (SLAM), optimised for outdoor use and combined with high-accuracy, synchronised position and orientation sensors.
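One common way to combine a drift-free but lower-rate visual estimate with a smooth but drifting inertial one is a complementary filter. The sketch below is a generic single-axis illustration of that fusion idea, not the project's actual tracker; the function, parameters, and sample values are hypothetical.

```python
def complementary_heading(gyro_rates, visual_headings, dt=0.01, alpha=0.98):
    """Fuse high-rate gyro yaw rates (rad/s) with low-rate visual headings.

    visual_headings[i] is the visual-SLAM heading at step i, or None when
    no visual fix is available that step. alpha weights the inertial path:
    the gyro is trusted short-term, the visual estimate long-term.
    """
    heading = visual_headings[0]  # initialise from the first visual fix
    out = []
    for rate, visual in zip(gyro_rates, visual_headings):
        predicted = heading + rate * dt      # inertial prediction
        if visual is None:
            heading = predicted              # coast on the gyro alone
        else:
            # blend: short-term gyro smoothness, long-term visual correction
            heading = alpha * predicted + (1 - alpha) * visual
        out.append(heading)
    return out

# A biased gyro (constant 0.1 rad/s error) would drift without bound on its
# own; with visual corrections the heading error stays bounded instead.
fused = complementary_heading([0.1] * 100, [0.0] * 100)
```

The design choice here is the essence of the hybrid approach described above: the worn inertial sensors carry the pose between visual fixes, while the visual estimate prevents long-term drift.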
These marker-less pose estimation techniques, combined with rapid terrain scanning and ingestion processes, are the main focus of the next phase of the project, due to begin in March 2018.