VR, AR, and Advanced Interfaces

A hand being recorded on a cell phone, with a sensor on the back of the hand.

Medical AR

FTL has developed visualization technology for medical education using real-time, adaptive anatomical AR. FTL’s “SkinDeep AR” is a user-friendly, graphically accurate, and extensible application for smartphones (Android or iOS), tablets, or PCs. It applies rapid fiducial acquisition and tracking along with whole-body anatomical modeling to overlay sub-dermal anatomy graphics on either a physical patient or a training mannequin, enabling a 180-degree, dynamic viewing cone in real time.

While many smartphone apps can roughly insert computer graphics into live video imagery, anatomical AR requires precise registration of imagery with the patient’s body, accurate scaling of imagery to match unique patient body geometry (such as a large adult versus a small child), and rapid updating of the resulting composited imagery to provide a realistic and intuitive fusion of real and virtual components. SkinDeep AR applies fiducial registration via markers, along with whole-body modeling and parametric 3D model shaping, to create a convincing AR glimpse under the patient’s skin.
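The patient-scaling step can be sketched in a few lines. The landmark names, measurements, and uniform-scale approach below are illustrative assumptions for this sketch, not a description of SkinDeep AR’s actual parametric fitting pipeline.

```python
import numpy as np

def scale_model_to_patient(vertices, model_landmarks, patient_landmarks):
    """Uniformly scale and translate a generic anatomy model so the
    distance between two of its landmarks matches the same landmarks
    measured on the patient (a toy stand-in for full parametric
    body-shape fitting)."""
    model_span = np.linalg.norm(model_landmarks[1] - model_landmarks[0])
    patient_span = np.linalg.norm(patient_landmarks[1] - patient_landmarks[0])
    s = patient_span / model_span  # per-patient scale factor
    # Scale about the first model landmark, then move it onto the
    # corresponding patient landmark.
    return (vertices - model_landmarks[0]) * s + patient_landmarks[0]

# Generic adult model: sternum at the origin, navel 0.30 units below.
model_lm = np.array([[0.0, 0.0, 0.0], [0.0, -0.30, 0.0]])
verts = np.array([[0.0, -0.15, 0.02]])  # one sub-dermal model point
# Small child: the same two landmarks measured 0.18 m apart in world space.
patient_lm = np.array([[0.1, 1.2, 0.5], [0.1, 1.02, 0.5]])
print(scale_model_to_patient(verts, model_lm, patient_lm))
```

In a full system the scale would be estimated from many landmarks (or a statistical body model) rather than a single pair, but the core idea is the same: fit the model to the measured patient geometry before compositing.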

A render of a carved, lacy sphere.

FTL VR Portal

Virtual Reality (VR) represents a new, compelling, and informative way to experience 3D graphics. With the steady expansion of VR hardware systems such as Oculus Rift and HTC Vive, there are opportunities for the first time for teams of engineers, scientists, and medical researchers to experience critical data together in VR. FTL’s VR Portal was designed to let users establish an account through a web browser and upload models in standard formats for access publicly or by a selected group. Group members with a VR headset could then interact with the models in their profile. The interface placed models in a variety of lighting and background scenes, allowing users to explore the models at any scale, including inside the models as if they were themselves detailed landscapes.

Two images combined to show how facial recognition works on screen.

Facial Recognition and Tracking

FTL has developed a camera-based head tracking tool as part of the SonoLoop Audiometry System. FTL’s capture and tracking system is robust, repeatable, and quantitative, using a rendered 3D graphical sphere to indicate small head motions and allowing the user to self-adjust head position.
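The self-adjustment loop can be sketched as a simple mapping from measured head offset to a feedback-sphere state. The gain and tolerance values below are illustrative assumptions for this sketch, not SonoLoop’s actual parameters.

```python
def sphere_cue(head_offset_mm, tolerance_mm=5.0):
    """Map a measured head offset (x, y in mm from the calibrated
    position) to a feedback-sphere state: the sphere is displaced in
    the direction the head has drifted, and is flagged in-position
    once the head is back within tolerance."""
    gain = 0.02  # screen units per mm of head motion (assumed value)
    x, y = head_offset_mm
    dist = (x * x + y * y) ** 0.5  # radial drift from calibration
    return {
        "sphere_pos": (x * gain, y * gain),
        "in_position": dist <= tolerance_mm,
    }

print(sphere_cue((12.0, -9.0)))  # drifted 15 mm: sphere offset, not in position
print(sphere_cue((1.0, 2.0)))    # within the 5 mm tolerance
```

Rendering the cue as a 3D sphere rather than a raw number gives users an intuitive target to steer toward, which is the point of the visual-feedback design described above.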

Renders of people and kids playing on a STEM table.

Playsurface and STEM Table

FTL developed large-format “Playsurface” touch tables, as well as the application-specific classroom technology “STEM-Table” and a rehabilitative surface for stroke or traumatic brain injury patients. Playsurface is a large-format (32”), collaborative, multi-touch interactive display table with object recognition and tracking capabilities. Playsurface exploits open-software and open-hardware architectures that are ideal for developing novel, engaging educational programs, with machine learning progress-tracking capabilities. STEM-Table applications include collaborative learning of math and mechanics concepts, cross-curriculum science visualization, and augmented reality, in addition to rehabilitative exercises such as cross-midline contralateral motor tasks.

These applications capitalize on the natural fascination students and patients exhibit with touch devices like the iPad, and channel it into basic STEM educational goals and multi-touch-enabled modules. Further, with a large, multi-person form factor, Playsurface enables teamwork and detailed assessment opportunities impossible with small iPhone or even iPad interfaces. The multi-person, intuitive interface enables new approaches to team learning and quantitative assessment of both overall subject mastery and the progress of individuals within collaborating groups.

The table augments the reality of a teaching lab experiment, allowing students to explore the equipment, graphical control panels, real-time data being generated, and multimedia analysis of what is going on. It is a powerful and fun learning tool. Playsurface technology was demonstrated at CES in Las Vegas and featured by Make and TechCrunch.

Render of a group of people holding their faces on signs, showing that “people” are participating.

Virtual Reality Simulation for Social Research

Predicting likely social outcomes of real-life scenarios that are hard to observe can be difficult or impossible. To address this challenge for researchers at Smith College, FTL created a virtual reality drinking game designed to test the likelihood of social predation as inebriation increases. FTL’s Kings Virtual Reality Drinking Game uses the HTC Vive VR system along with tracking devices that let players select cards, choose other players, take a “drink,” and progress through the game. Additionally, the game tracks several statistics for each player and each AI non-player character (NPC) competitor.

Image of an aerial view of a flight deck in the middle of the teal ocean.

Flightdeck Carrier Logistics Tool

To address the Navy’s Aircraft Carrier Analysis Laboratory need for a system to track and maximize deck space for aircraft, support equipment, and cargo, FTL personnel developed “MaxSpot”, a physics-based carrier layout and asset-tracking tool. Physical interactions between the ship and equipment are modeled by a physics engine that prevents overlapping components and invalid placements. Asset tracking and positioning are continuously updated as the mission plan is recorded.
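The non-overlap constraint can be sketched with a simple axis-aligned bounding-box test standing in for the full physics engine. The footprint coordinates and the rejection logic below are illustrative assumptions, not MaxSpot’s actual implementation.

```python
def overlaps(a, b):
    """Axis-aligned bounding-box test: True if deck footprints a and b
    (x, y, width, height) intersect on the deck plane."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def try_place(placed, candidate):
    """Reject a placement that would overlap any already-placed asset,
    mimicking the engine's non-overlap constraint."""
    if any(overlaps(candidate, p) for p in placed):
        return False
    placed.append(candidate)
    return True

deck = []
print(try_place(deck, (0, 0, 18, 12)))  # first aircraft footprint: accepted
print(try_place(deck, (10, 5, 6, 6)))   # overlaps the first: rejected
print(try_place(deck, (20, 0, 6, 6)))   # clear spot: accepted
```

A physics engine additionally resolves rotated footprints, contact forces, and towing paths, but the core layout invariant is the same: no two tracked assets may occupy the same deck space.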

Two people sitting at a computer screen, pointing to understand what is going on.

Telemachus Telemaintenance Tool

Onboard diagnosis of aircraft carrier auxiliary equipment by shipyard Subject Matter Experts (SMEs) has been pivotal in correcting mission-critical deficiencies. However, the costs and logistics associated with deploying SMEs to at-sea carrier locations are prohibitive. FTL personnel developed “Telemachus” for shipboard telemaintenance, providing intuitive capabilities for document sharing; text, voice, and video chat; and sharing of hand-held measurements. Additionally, a novel optical remote-diagramming tool projects SME instructional figures directly onto shipboard equipment.

Kids playing on a virtual game table

Zoo Atlanta Game

Working with the National Science Foundation to explore multi-player, motion-based education, FTL developed a new Playsurface game that was deployed at Zoo Atlanta, one of the top zoos in the US. The game was themed for the zoo’s “World of Reptiles,” with four different Georgia-native animals that emerge from a small puddle in the middle of the game screen. Any player may touch the animals to make them hop in a particular direction. The game environment looks like a Georgia forest floor with a goal in each of the four corners of the board. Flies and crickets also wander the forest floor and can be eaten by the animals. As the animals grow fat and flies and crickets run low, animated arrows indicate that the animals have to reach the corner goals to supply more food. Instructional information about what each animal eats and which predators to avoid pops up, giving clues on how to win the game.

For more information on how we can find the solution for you, get in touch today.