
4 Development of Remote Laboratories in the Virtual Theatre

The described setup of the Virtual Theatre can be used to immerse the user in a virtual reality scenario not only for demonstration purposes, but especially for applications in which a distinct interaction between the user and the simulation is required. One such application is the realization of remote laboratories, which represent a first step towards bringing real-world demonstrators, such as a factory or a nuclear power plant, into virtual reality.

Fig. 2. Two cooperating ABB IRB 120 six-axis robots

The virtual remote laboratory described in this paper consists of a virtual representation of two cooperating robot arms that are set up within our laboratory environment (see figure 2). These robots are placed on a table in such a way that they can perform tasks by executing collaborative actions. For our information and communication infrastructure, it does not matter whether the robots are located in the same laboratory as our Virtual Theatre or in a distant, i.e. remote, laboratory. In a first step, our aim was to visualize the actual robot movements in a virtual representation; in a second step, we want to control and navigate the robots.

In order to visualize the movements of the robot arms in virtual reality, we first had to design three-dimensional models of the robots. The robot arms installed in our laboratory setup are ABB IRB 120 six-axis robotic arms [37]. For modeling the robots, we used the 3-D modeling and rendering software Blender [38]. After modeling the individual sections of the robot, which are connected by the joints of the six rotation axes, the full robot arm model was merged using a bone structure. Using the PhysX engine, the resulting mesh can move its joints together with the corresponding bones in the same fashion as a real robot arm. This realistic modeling enables the six-axis robot model to move in virtual reality according to the movements of the real robot. The virtual environment that contains the embedded robot arms was designed using the WorldViz Vizard framework [39], a toolkit for setting up virtual reality scenarios. After creating the virtual representation of the robots, an information and communication infrastructure had to be set up to enable the exchange of information between the real laboratory and the simulation. The concept of this intercommunication as well as its practical realization is depicted in figure 3.
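To illustrate this workflow, the following minimal Vizard sketch shows how such a segmented robot model could be loaded and how its six joints could be addressed for later animation. The model file name, the joint node names, the placement values and the rotation convention are illustrative assumptions only, not the actual assets of our setup.

import viz

viz.go()

# Load the robot mesh exported from Blender; the file name is an assumption.
robot = viz.addChild('irb120.osgb')
robot.setPosition([0, 0.8, 1.5])            # place the model on the virtual table

# Assumed names of the joint nodes that correspond to the six rotation axes.
JOINT_NAMES = ['axis_1', 'axis_2', 'axis_3', 'axis_4', 'axis_5', 'axis_6']
joints = [robot.getChild(name) for name in JOINT_NAMES]

def set_joint_angles(angles_deg):
    """Rotate each joint node; for simplicity every joint is turned about its yaw axis."""
    for joint, angle in zip(joints, angles_deg):
        joint.setEuler([angle, 0, 0])

# Example pose: move the first three axes of the virtual robot.
set_joint_angles([30, -15, 45, 0, 0, 0])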

Fig. 3. Information and Communication Infrastructure of the remote laboratory setup

As shown in the figure, the hardware of the remote laboratory setup is connected through an internal network. On the left side of the figure, a user is shown who manually operates the movements of the real robot arms through the control interface of the ABB IRB 120 robots. This data is processed by a computer running Linux with the Robot Operating System (ROS). The interconnection between the real laboratory and the virtual remote laboratory demonstrator is realized using the Protocol Buffers (Protobuf) serialization method for structured data. This interface description language, developed by Google [40], allows data to be exchanged between different applications in a structured form.
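The following sketch illustrates what such a bridge on the Linux/ROS side could look like. The Protobuf message type (robot_state_pb2.RobotJointState), the host name and the port are hypothetical; the sketch only shows the principle of reading the robots' joint states from ROS and forwarding them as length-prefixed Protobuf messages to the visualization host.

import socket
import struct

import rospy
from sensor_msgs.msg import JointState      # standard ROS joint state message
import robot_state_pb2                      # hypothetical generated Protobuf module

VIZARD_HOST = ('vizard-pc.local', 5005)     # assumed address/port of the visualization PC

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.connect(VIZARD_HOST)

def on_joint_state(msg):
    # Copy the joint angles reported by the robot controller into a Protobuf
    # message and send it, prefixed with its length, to the Vizard host.
    pb = robot_state_pb2.RobotJointState()  # hypothetical message type
    pb.joint_angles.extend(msg.position)
    data = pb.SerializeToString()
    sock.sendall(struct.pack('>I', len(data)) + data)

rospy.init_node('vt_protobuf_bridge')
rospy.Subscriber('/joint_states', JointState, on_joint_state)
rospy.spin()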

After the robots' position data is sent through the network interface, the information is interpreted by the WorldViz Vizard engine to visualize the movements of the actual robots in virtual reality. After initial test phases and a technical optimization of the network configuration, the offset between the robot arm motion in reality and in virtual reality was reduced to 0.2 seconds. Since the network infrastructure relies on internet-based communication methods, this value would not increase significantly if the remote laboratory were located in a distant place, for example in another city or on the other side of the globe.
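On the receiving side, a Vizard script could accept these messages in a background thread and apply the latest joint angles to the virtual robot at a fixed rate, so the visualization stays responsive while waiting for data. Again, the message type, port and framing mirror the assumptions of the bridge sketch above, and set_joint_angles() refers to the model-loading sketch.

import socket
import struct
import threading

import viz
import vizact
import robot_state_pb2                      # hypothetical generated Protobuf module

latest_angles = [0.0] * 6                   # most recent joint angles received from the bridge

def receive_loop():
    """Accept the bridge connection and keep reading length-prefixed Protobuf messages."""
    global latest_angles
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind(('', 5005))                 # port must match the bridge sketch
    server.listen(1)
    conn, _ = server.accept()

    def recv_exact(n):
        buf = b''
        while len(buf) < n:
            chunk = conn.recv(n - len(buf))
            if not chunk:
                raise ConnectionError('bridge disconnected')
            buf += chunk
        return buf

    while True:
        length = struct.unpack('>I', recv_exact(4))[0]
        pb = robot_state_pb2.RobotJointState()
        pb.ParseFromString(recv_exact(length))
        latest_angles = list(pb.joint_angles)

threading.Thread(target=receive_loop, daemon=True).start()

# Apply the most recently received pose to the virtual robot every 10 ms,
# reusing set_joint_angles() from the model-loading sketch above.
vizact.ontimer(0.01, lambda: set_joint_angles(latest_angles))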

The second user, who is depicted in the upper right part of figure 3 and who is located in the Virtual Theatre, is immersed in the virtual reality scenario and can observe the positions and motions of the real robots in the virtual environment. Figure 4 illustrates the full setup of the real and the remote laboratory.

Fig. 4. Manual control of the robots and visual representation in the Virtual Theatre

In the foreground of the figure, two users control the movements of the actual robots in the real laboratory using manual control panels. In the background on the right side of the picture, the virtual representation of the two ABB IRB 120 robot arms is depicted. The image on the right wall is generated by two digital projectors, which create a realistic 3-D image by overlapping their two projections. The image shown above the robot arms' table represents what the user in the VR simulator actually sees during the simulation; it was inserted into figure 4 for illustration purposes.

This virtual remote laboratory demonstrator convincingly shows that it is already possible to create an interconnection between the real world and virtual reality.

 