Tangible ViewCube

Carson Smuts

A Handheld Device for Real-Time Spatial Movement, Remote Control, and Perspectival Orientation between Physical Objects and Digital 3-Dimensional Environments.

The Tangible ViewCube is a handheld interface device designed to enable naturalistic interaction with architectural models and other 3D spaces in which a first-person perspective must be mapped onto an analog reference plan.

This invention detects the Cartesian (X/Y) coordinates of a handheld device as it is moved over a model, plan, or mapped representation of a physical space. It then performs a real-time digital reconstruction of the handheld device's rotation and bearing, and communicates with software to create a real-time 3D visualization of the physical space or object from the handheld device's point of reference. The resulting virtual perspective or video is then shown on a display screen.
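
At a high level, this pipeline can be pictured as a loop that reads an IR camera frame and the device's orientation, fuses them into a pose, and hands that pose to the visualization software. The Python sketch below is a minimal illustration of that data flow only; the helper callables (read_frame, detect_position, read_orientation, send_pose) are hypothetical placeholders, not the actual RemoteScope API.

    # Minimal sketch of the ViewCube data flow, with hypothetical helpers
    # passed in as callables (not the real RemoteScope interfaces).
    from dataclasses import dataclass
    from typing import Callable, Tuple

    @dataclass
    class Pose:
        x: float                                  # planar position over the model (from the IR camera)
        y: float
        quat: Tuple[float, float, float, float]   # IMU orientation as (w, x, y, z)

    def run_loop(read_frame: Callable,
                 detect_position: Callable,
                 read_orientation: Callable,
                 send_pose: Callable) -> None:
        """Fuse camera-derived position and IMU orientation into poses for the viewer."""
        while True:
            frame = read_frame()               # grab an IR camera image
            x, y = detect_position(frame)      # Cartesian X/Y of the device over the plan
            quat = read_orientation()          # latest quaternion from the IMU
            send_pose(Pose(x, y, quat))        # drive the virtual camera in the 3D viewer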

Collaboration: When paired with software for the visualization of 3D spaces, the Tangible ViewCube system enables the direct manipulation of camera views and allows for easy handoff between multiple users collaborating on the review of a digital model.

Camera control of this kind is normally achieved by manipulating a pointer with a mouse and keyboard on a display screen. The device aims to alleviate the difficulty of hand-eye and spatial co-ordination involved in positioning a virtual camera to match a location on a physical model.

Device: Orientation and position are calculated in real time, allowing for seamless hand-eye co-ordination between the user's hand gestures and the display screen.

The system leverages computer vision, wireless protocols, and Inertial Measurement Unit (IMU) hardware to create an interactive and tangible experience between digital and physical models, whereby the user can alter the 3-dimensional perspective on a digital screen by moving an object over a replica physical model or plan.

As designed, the Tangible ViewCube is a small, wireless, battery-powered device that can be picked up and handled by persons of all ages and dexterity levels. When paired with a Bluetooth-enabled receiver, the device's location in Cartesian space relative to the receiver is recorded, along with information about its orientation and motion.
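
As one illustration of this pairing step, the sketch below subscribes to IMU notifications from a device using the Python bleak library. The device address, characteristic UUID, and the packet layout (four little-endian floats forming a w/x/y/z quaternion) are assumptions for illustration, not the ViewCube's actual firmware interface.

    import asyncio
    import struct
    from bleak import BleakClient

    DEVICE_ADDRESS = "AA:BB:CC:DD:EE:FF"                    # placeholder device address
    IMU_CHAR_UUID = "0000aaaa-0000-1000-8000-00805f9b34fb"  # assumed notification characteristic

    def handle_imu(_sender, data: bytearray) -> None:
        # Assumed packet layout: four little-endian floats (w, x, y, z).
        w, x, y, z = struct.unpack("<4f", data)
        print(f"quaternion: w={w:.3f} x={x:.3f} y={y:.3f} z={z:.3f}")

    async def main() -> None:
        async with BleakClient(DEVICE_ADDRESS) as client:
            await client.start_notify(IMU_CHAR_UUID, handle_imu)
            await asyncio.sleep(30)                         # stream orientation for 30 seconds
            await client.stop_notify(IMU_CHAR_UUID)

    asyncio.run(main())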

System: The diagram above illustrates the ViewCube system and the steps involved in detection, communication, and digital reconstruction.

Whenever the Tangible ViewCube is repositioned, the device emits an infrared (IR) light to communicate its position to an infrared camera located above and to the side of the physical model. The camera's image stream is sent to a central computer over a wired USB connection.
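
A minimal sketch of this detection step is shown below, using OpenCV to threshold each IR frame and take the centroid of the brightest blob as the device's image-space position. The camera index, threshold value, and assumption of a single bright marker are illustrative; RemoteScope's own motion-detection algorithms are described in the next section.

    import cv2

    cap = cv2.VideoCapture(0)                    # assumed index of the USB-connected IR camera

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        _, mask = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)   # keep only bright IR pixels
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if contours:
            blob = max(contours, key=cv2.contourArea)                # largest bright region
            m = cv2.moments(blob)
            if m["m00"] > 0:
                cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]    # blob centroid in pixels
                print(f"IR marker at x={cx:.1f}, y={cy:.1f}")
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    cap.release()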

RemoteScope: The central computer runs software called RemoteScope, designed to be used with the Tangible ViewCube. The software is cross-platform and runs on multiple operating systems. RemoteScope applies a series of motion-detection algorithms to the IR image stream to detect the position of the Tangible ViewCube, producing Cartesian X/Y co-ordinates. These co-ordinates are combined with the quaternion values received from the ViewCube's IMU to produce both translation and rotation values.
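
This combination step can be illustrated as building a single 4x4 camera transform from the image-space X/Y position and the IMU quaternion. The sketch below uses NumPy and SciPy; the pixel-to-model scale factor and fixed camera height are assumptions standing in for whatever calibration RemoteScope actually applies.

    import numpy as np
    from scipy.spatial.transform import Rotation

    PIXELS_PER_UNIT = 100.0   # assumed calibration: image pixels per model unit
    CAMERA_HEIGHT = 1.5       # assumed fixed height of the virtual camera above the plan

    def camera_transform(px: float, py: float, quat_wxyz) -> np.ndarray:
        """Build a 4x4 transform: rotation from the IMU quaternion, translation from the IR position."""
        w, x, y, z = quat_wxyz
        rot = Rotation.from_quat([x, y, z, w]).as_matrix()   # SciPy expects scalar-last order
        t = np.array([px / PIXELS_PER_UNIT, py / PIXELS_PER_UNIT, CAMERA_HEIGHT])
        m = np.eye(4)
        m[:3, :3] = rot
        m[:3, 3] = t
        return m

    # Example: IR marker at pixel (320, 240) with identity orientation.
    print(camera_transform(320.0, 240.0, (1.0, 0.0, 0.0, 0.0)))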