Our 3D virtual reality workstations are complete desktop virtual reality systems for easy and intuitive analysis of 3D and 4D (3D + time) data. PS-Tech’s 3D workstations are the C-Station™ and the PSS™.
The principle of the PSS and C-Station is illustrated on the right. A user sits behind the system and looks at a 3D display via a mirror; in effect, the user perceives the display as being located behind the mirror, just above the table, at the virtual focal plane (VFP). PS-Tech’s proprietary optical tracking technology presents the user with an intuitive interface to the displayed 3D data. Using simple wireless devices, the user can hold 3D objects in their hands and perform complex 3D manipulations, such as cropping the data in any direction, drawing, erasing, and taking 3D measurements.
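The geometry behind the perceived image location can be sketched with a simple plane reflection: each point of the physical display appears at its mirror image behind the mirror. The snippet below is an illustrative sketch only (not PS-Tech's implementation); the mirror plane, point names, and coordinates are assumptions chosen for the example.

```python
# Sketch (illustrative, not PS-Tech's implementation): where does the
# user perceive a display point? At its reflection across the mirror.
# The mirror is modelled as a plane through point `p0` with unit normal `n`.

def reflect_point(x, p0, n):
    """Reflect point x across the plane through p0 with unit normal n."""
    d = sum((xi - pi) * ni for xi, pi, ni in zip(x, p0, n))  # signed distance to plane
    return tuple(xi - 2 * d * ni for xi, ni in zip(x, n))

# Example: a horizontal mirror in the plane z = 0, normal pointing up.
mirror_point = (0.0, 0.0, 0.0)
mirror_normal = (0.0, 0.0, 1.0)

# A display point 0.3 m above the mirror is perceived 0.3 m below it,
# i.e. on the virtual focal plane behind the mirror.
display_point = (0.1, 0.2, 0.3)
virtual_point = reflect_point(display_point, mirror_point, mirror_normal)
print(virtual_point)  # (0.1, 0.2, -0.3)
```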
A standard keyboard and mouse are used to solve tasks that are more effectively performed in 2D.
‘Can I see that?’ That question often means: can I hold the object in my hands, rotate it, look inside it, and examine it from all angles?
The analysis of 3D images should be no different. Examining objects with your hands is the most natural way to inspect them, and a workstation for the analysis of digital 3D information should recreate that experience. This means that the interaction should be as intuitive as picking up an apple and slicing it, while the visualization has to look real throughout the interaction. Because traditional interfaces and applications have serious shortcomings on both counts, PS-Tech has developed 3D virtual reality workstations, together with software, to provide two-handed interaction and realistic imaging. Within this domain, PS-Tech focuses on the most difficult data of all: 3D volumetric data.
Interaction where the visualization is optimal

Conventional 3D display: Optimal visualization is around the plane of the screen, but the physical screen prevents optimal interaction there; the only space available for interaction is in front of the screen, and pulling the image out of the screen reduces its quality.

PS-Tech workstation: The data is at the location where the hands are, which allows for the most intuitive, lifelike experience – freedom to move and touch the data without physical barriers, with visual and physical interaction at the same location, around the plane of the virtual screen.
Use the optimal viewing zone of a 3D display

Conventional 3D display: The user’s freedom of movement is much larger than the optimal viewing zone permits.

PS-Tech workstation: Specific use of the viewing zone – the user is automatically invited to work in the optimal viewing zone.
Unobstructed view while "touching" the data

PS-Tech workstation: Unobstructed view while holding the data in the hands.
Ability to interact with BOTH hands
(The difference between peeling an apple with two hands and with one hand behind your back)

Conventional applications: This logical addition of using both hands in a 3D interface is unfortunately not supported by the overwhelming majority of 3D applications; where 3D navigation is possible at all, it works for only one device at a time.

PS-Tech workstation: Seamless integration of 2D and 3D interfaces; two-handed interaction is fully supported.
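To make the two-handed idea concrete, here is a generic sketch of a basic two-handed "grab" gesture as commonly used in VR interfaces (not PS-Tech's actual algorithm): moving the tracked hands apart scales the object, and moving their midpoint translates it. All function and parameter names are assumptions for illustration.

```python
# Generic two-handed manipulation sketch (not PS-Tech's algorithm).
# Each tracked device reports a 3D position; the distance between the
# hands drives scaling, and the midpoint drives translation.
import math

def two_hand_gesture(left_prev, right_prev, left_now, right_now):
    """Return (scale_factor, translation) from two tracked hand positions."""
    def midpoint(a, b):
        return tuple((ai + bi) / 2 for ai, bi in zip(a, b))

    scale = math.dist(left_now, right_now) / math.dist(left_prev, right_prev)
    mid_prev = midpoint(left_prev, right_prev)
    mid_now = midpoint(left_now, right_now)
    translation = tuple(n - p for n, p in zip(mid_now, mid_prev))
    return scale, translation

# Hands move apart symmetrically: the object scales up without moving.
scale, translation = two_hand_gesture(
    left_prev=(-0.1, 0.0, 0.0), right_prev=(0.1, 0.0, 0.0),
    left_now=(-0.2, 0.0, 0.0), right_now=(0.2, 0.0, 0.0),
)
print(scale, translation)
```

Note the design point this illustrates: the gesture only exists when two devices are tracked simultaneously, which is exactly what single-device 3D navigation cannot express.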
Live 3D rendering

Live, interactive 3D rendering of volumetric images without perceived loss in quality.

Conventional applications: During interaction, the frame rate and resolution often drop.

PS-Tech workstation: A 3D rendering engine was specifically developed that renders volumetric data in 3D, in real time. The engine maintains a high frame rate during live 3D interaction without sacrificing quality.
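A common generic strategy for keeping volume rendering interactive (sketched here as an illustration, not as PS-Tech's proprietary engine) is to render at reduced sampling density while the user interacts and refine to full quality when the scene is static. In the toy example below, a maximum-intensity projection (MIP) along one axis stands in for a full ray caster; the function names, volume size, and downsampling factor are all assumptions.

```python
# Toy illustration of interactive-vs-final quality volume rendering
# (a generic technique, not PS-Tech's engine): downsample the image
# plane during interaction, render at full resolution when idle.
import numpy as np

def render_mip(volume, downsample=1):
    """Project a 3D volume to 2D via maximum intensity along axis 0.

    downsample > 1 skips voxels in the image plane, trading image
    quality for speed during interaction."""
    sub = volume[:, ::downsample, ::downsample]
    return sub.max(axis=0)

rng = np.random.default_rng(0)
volume = rng.random((64, 128, 128))  # depth x height x width

interactive_frame = render_mip(volume, downsample=4)  # fast, coarse
final_frame = render_mip(volume, downsample=1)        # full quality

print(interactive_frame.shape)  # (32, 32)
print(final_frame.shape)        # (128, 128)
```

The trade-off the brochure claims to avoid is visible here: the coarse frame is cheap but loses resolution, which is why a purpose-built engine that keeps full quality during interaction is the harder problem.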