SuitAR's design objective is to provide motion capture (MOCAP) without cameras, tracking markers, or tags, giving the end user wearing this interactive suit a more realistic virtual reality experience.

Motion is captured by means of microelectromechanical systems (MEMS) sensors that track human movement, converting analog sensor measurements into digital information a computer can interpret. This digitized, readable format allows the movements to be further analyzed, recorded, processed, and used to control other applications.
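The analog-to-digital step can be illustrated with a minimal sketch. The resolution, full-scale range, and function names below are assumptions for illustration (a ±2 g, 16-bit accelerometer channel), not a description of SuitAR's actual converter:

```python
def quantize(value, full_scale=4.0, bits=16):
    """Map an analog reading (in g, spanning a hypothetical +/- full_scale/2
    accelerometer range) to a signed integer ADC count."""
    levels = 2 ** bits
    lsb = full_scale / levels          # smallest step the converter can resolve
    count = round(value / lsb)
    # Clamp to the signed range of the converter.
    return max(-(levels // 2), min(levels // 2 - 1, count))

# A 1 g reading on a +/- 2 g, 16-bit channel:
print(quantize(1.0))  # 16384
```

Once motion is represented as integer counts like these, it can be logged, filtered, or streamed like any other digital data.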

The design comprises an array of 36 embedded nine-degrees-of-freedom (9-DoF) IMU MEMS sensors that capture motion data from the person wearing the suit through a process called sensor fusion. This data can be live-streamed over WiFi, or recorded for input into software such as Unity and Unreal Engine 4.
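Sensor fusion combines the strengths of the different sensors inside each 9-DoF IMU. As a hedged sketch, a complementary filter is one common fusion technique: the gyroscope is accurate over short intervals but drifts, while the accelerometer's gravity reading is noisy but drift-free, so the filter blends the two. The source does not specify SuitAR's actual fusion algorithm; the function below estimates a single pitch angle and is illustrative only:

```python
import math

def complementary_filter(angle, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """One update step of a complementary filter for pitch (radians).

    gyro_rate: angular velocity about the pitch axis (rad/s)
    accel_x, accel_z: accelerometer components used to estimate tilt
    alpha: trust placed in the gyro; (1 - alpha) corrects long-term drift
    """
    gyro_angle = angle + gyro_rate * dt         # short-term: integrate the gyro
    accel_angle = math.atan2(accel_x, accel_z)  # long-term: gravity direction
    return alpha * gyro_angle + (1 - alpha) * accel_angle
```

Running this update for each of the 36 IMUs yields per-segment orientations that, chained along the body, reconstruct the wearer's pose for streaming or recording.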

SuitAR integrates with standard VR game engines, such as Unity and Unreal, to give the wearer a deeply immersive video game experience.

Information from the virtual world is translated into the real world, simulating effects observed in the virtual space; this technique is known as haptic feedback. It lets the wearer experience sensations such as 3D surround sound, touch, force, and even temperature from the virtual environment through the suit, gloves, and head-mounted display.
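The haptic path can be sketched as a mapping from game-engine events to actuator drive levels. The zone names, force units, and function names below are hypothetical, a minimal illustration of the idea rather than SuitAR's actual protocol:

```python
def haptic_intensity(impact_force, max_force=50.0):
    """Map a collision force from the game engine (newtons, hypothetical
    scale) to an actuator duty cycle in [0.0, 1.0]."""
    return max(0.0, min(1.0, impact_force / max_force))

def route_event(zone, impact_force, actuators):
    """Drive the suit actuator covering the body zone hit in the virtual
    world. `actuators` maps zone name -> current duty cycle."""
    actuators[zone] = haptic_intensity(impact_force)
    return actuators

# A 25 N virtual impact on the left forearm drives that zone at half strength:
suit = {"left_forearm": 0.0, "chest": 0.0}
route_event("left_forearm", 25.0, suit)  # suit["left_forearm"] == 0.5
```

In a real system the same event stream would also carry temperature and audio cues, each routed to the matching output device on the suit, gloves, or headset.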

SuitAR integrates with the HTC VIVE PRO and with haptic gloves already available on the market.


In the near future, we will add to our product offerings a new type of haptic glove that lets the user experience the sensation of weight and mass, and that provides greater force feedback when objects are grasped in both virtual and augmented reality. These features will complement the haptic gloves already on the market, giving the user an even richer sense of the virtual world.