Virtual Reality gives you a glimpse of the future in the production environment.
(Source: Public Domain / Pixabay)

Virtual Reality (VR) shows the world of tomorrow - already today

Author / Editor: Kurt Lehmann, Dr. Karsten Michels / Florian Richert

What will the driverless shuttle of tomorrow look like? With Virtual Reality, possible concepts can be visualized quickly, which helps in the development of new vehicle concepts. In the cockpit, related technologies such as Augmented Reality (AR) already help today.

Soon, new types of vehicles will dominate the road scene. A mixture of private cars and driverless electric vehicles will bring more efficiency, less inner-city exhaust, and ease the notorious shortage of parking space, especially in conurbations.
In this new mix, individual vehicles based on today's basic pattern can be controlled manually by humans as well as drive automatically. This creates a variety of possible new traffic situations that require new technical foundations. A significant role will be played, for example, by the driver's communication with the vehicle, because future vehicles will be able to "do" much more and thus assume much more responsibility. Automated vehicles need to be able to communicate with each other and with pedestrians without human intervention. Ideally, every road user should know what the others are currently doing or planning. The responsibility for this sometimes lies with the person and sometimes with the machine.

Given these new capabilities, the car is undergoing significant changes to meet the requirements of future interaction concepts and to ensure secure communication between automated vehicles and vulnerable road users - a genuine dialogue between man and machine emerges. Passengers can make optimal use of the interior of a self-driving vehicle only if the interior concept supports meaningful activities during an automated journey. For example, passengers should be able to turn toward each other to chat in a relaxed manner. The change in the interior concept is even more significant in a completely driverless vehicle. Here, the current seating concept of a passenger car will be completely revised, resulting in something resembling either a compact minibus or a small autonomous vehicle with swarm behavior - as in Continental's BEE mobility concept.

VR makes mobility concepts a tangible experience

BEE (Balanced Economy and Ecology Mobility Concept) is an autonomous, electrically powered vehicle for up to two adults that can also transport loads as required. The BEE is called via a smartphone app. To illustrate the completely new possibilities of mobility with the "industrious city bee," Continental has used virtual reality technology to let the BEE drive today in a city of tomorrow. VR helps you to imagine and experience what will make everyday life easier tomorrow: The window panes can serve as information screens inside and outside, so that you can talk from BEE to BEE as if you were sitting next to each other. Or the windows are simply left entirely transparent to look through. For a wheelchair user, BEE opens the front door wide and also kneels so low that he or she can get on board barrier-free. For older passengers, BEE positions the seat in the doorway so they can sit down easily. Thanks to the ability to swarm, friends can drive into town in a convoy of digitally networked BEEs. The windows, transformed into 3D screens, make you forget that your friends are traveling in different BEEs while you chat with each other. Such concepts still need explanation today. VR helps to understand their potential and to plan their use.

You only see what is explained to you

Augmented Reality (AR) methods, on the other hand, add a layer of explanation to what the driver sees, going beyond today's uncommented view of the environment. According to the motto "What you see is what you need (to see)," the driver only sees what is explained to them. Optical highlights in a display system can show the driver the following distance set for adaptive cruise control, or place navigation instructions directly on the road. Augmented reality can also help in automated driving by telling the user which maneuvers the automated vehicle is planning and performing. The special thing about these optical insertions into reality is that they are immediately understandable and prompt unambiguous action. An AR head-up display (AR-HUD) can project the stopping distance into the environment, making clear to the driver that on a wet road the stopping distance will be longer and that it would be safer to slow down.
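The stopping distance such a projection would visualize can be estimated with elementary physics. The following is a minimal illustrative sketch, not Continental's implementation; the friction coefficients and reaction time are assumed textbook values:

```python
# Illustrative sketch: estimate the stopping distance an AR-HUD might
# project, using a simple model of reaction distance plus braking
# distance d = v^2 / (2 * mu * g). Friction values are assumptions.

G = 9.81  # gravitational acceleration, m/s^2

def stopping_distance(speed_kmh: float, friction: float, reaction_s: float = 1.0) -> float:
    """Total stopping distance in meters: reaction distance + braking distance."""
    v = speed_kmh / 3.6                      # convert km/h to m/s
    reaction_distance = v * reaction_s       # distance covered before braking starts
    braking_distance = v ** 2 / (2 * friction * G)
    return reaction_distance + braking_distance

# Dry asphalt (mu ~ 0.8) vs. wet asphalt (mu ~ 0.4) at 100 km/h:
dry = stopping_distance(100.0, 0.8)
wet = stopping_distance(100.0, 0.4)
print(f"dry: {dry:.0f} m, wet: {wet:.0f} m")  # → dry: 77 m, wet: 126 m
```

Halving the friction coefficient roughly doubles the braking portion of the distance, which is exactly the kind of non-obvious relationship an in-view projection can make immediately tangible.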

The AR-HUD thus opens up new possibilities that can make the design of the vehicle cockpit more flexible and efficient. At the same time, these new communication channels and forms must be tested. Drivers need the opportunity to try out such concepts, as do the technical decision-makers who develop them. And an automobile manufacturer who wants to buy them for a vehicle model needs certainty that real added value is created here. This also involves researching the "human factor." For new vehicle, mobility, and interaction concepts, it is crucial to find out which ones optimally meet users' requirements, taking into account the user's state and the context of use - for example, by building trust in automated driving or increasing situational awareness.
From a scientific point of view, the aim is to support human performance at the interface to the vehicle, to enhance the user experience, and to reduce operating and decision-making errors without causing additional stress. On the contrary: the driver should be able to understand more easily why it is advantageous to make individual decisions about the longitudinal and lateral dynamics of the vehicle - or why the automated vehicle makes these decisions autonomously. After all, nothing is worse for drivers than giving up control of their vehicle without sufficient confidence in its capabilities. AR, with its explanatory notes, confirms to the driver that the vehicle "knows" what it is doing. The same applies here: visual cues in the real world turn information into understanding.

Humans enter the world of simulation

Given the immensely growing design possibilities and the new ways of interaction between human and machine, it makes sense to find out at an early stage what works best. Augmented reality offers new ideas for this interaction and makes driving easier for the driver. At the same time, developers can design AR flexibly by developing new forms of integrating helpful hints into the real world. VR, on the other hand, has its strength where such new concepts need to be tested in a completely realistic 360° environment. Continental is therefore developing a VR ergonomics laboratory (VLab) in which traffic situations, including interaction with pedestrians, can be tested flexibly. In contrast to the modern driving simulator with vehicle mock-up that remains in use, the scenarios in the VLab can be designed more realistically and immersively, allowing users to "immerse" themselves in the virtual world. In addition, several people can be in the VR environment at the same time and interact there.

Thanks to VR, new vehicle and cockpit concepts with modern interaction elements can be programmed long before prototypes exist and tested in a realistic world. This increases the maturity of the developed products despite the enormous increase in variants and alternatives. People gain access to the simulation and can move in it as in the real world - but free of risk. If a need for change is detected in a technical system, it can be adapted relatively quickly. This early concentration on the user's experience - the User Experience (UX) - leads to new products maturing in early, fast, and user-oriented iterations. The underlying principle is that of agile development, in which products still in development are tested at an early stage. The knowledge gained in this way flows immediately back into the development process and allows individual components of the virtual world to be reprogrammed at a time when changes are still inexpensive.

From vision to (augmented) reality

This also applies to the developers themselves, who want to break new ground here and make their unique ways and means understandable and experienceable. With VR methods, this works much better than with abstract descriptions or static images. The division of labor between VR and AR is therefore a successive, cyclically interacting one: VR serves to test and further develop new designs, products, and forms of communication at an early stage. AR in the automobile is part of the visible result of this testing in the virtual world.
In production, for example, AR has its place wherever conversion or maintenance work has to be carried out on machines and systems. Here, workers can be guided step by step through an activity that otherwise - for example, due to a lack of routine or high complexity - entails an increased risk of error. The contents displayed in the AR glasses show workers when they have to do what, and which tool or spare part they need.

At Continental, this worldwide support option is implemented in two pilot systems. By securely connecting smart glasses with a desktop application, an expert can see the field of work through the eyes of a colleague and support the work on-site by uploading content into the colleague's field of vision. In addition to augmentation, communication with the desktop client also includes a voice and video connection. Images, circuit diagrams, or messages can be displayed in the wearer's field of vision via the AR glasses. The correlation of these insertions with the real world, i.e., the anchoring of the virtual information object in the representation of the actual space, is currently being piloted.
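Anchoring a virtual object in real space boils down to re-projecting a fixed world point into the glasses' display as the wearer's head pose changes. The sketch below illustrates that core step with a standard pinhole camera model; all numbers (focal lengths, display resolution, poses) are hypothetical and not taken from Continental's pilot systems:

```python
# Illustrative sketch of AR anchoring: a virtual annotation is fixed at a
# world-space point and re-projected into display pixels for the current
# head pose (rotation R, position t). All parameters are assumptions.
import numpy as np

def project_anchor(point_world, R, t, fx, fy, cx, cy):
    """Project a world point into pixel coordinates for a camera at pose (R, t)."""
    p_cam = R @ (np.asarray(point_world) - np.asarray(t))  # world -> camera frame
    x, y, z = p_cam
    if z <= 0:
        return None  # anchor is behind the wearer; nothing to draw
    # Pinhole projection with focal lengths (fx, fy) and principal point (cx, cy)
    return (float(fx * x / z + cx), float(fy * y / z + cy))

anchor = [0.0, 0.0, 2.0]   # annotation fixed 2 m in front of the world origin
pose_R = np.eye(3)          # wearer looking straight down the world z-axis
pose_t = [0.0, 0.0, 0.0]
print(project_anchor(anchor, pose_R, pose_t, fx=500, fy=500, cx=320, cy=240))
# → (320.0, 240.0): the annotation sits at the display center
```

When the wearer steps sideways, the same world point projects to a different pixel, so the annotation appears to stay attached to the machine rather than to the screen - which is precisely the effect being piloted.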
At the Continental plant in Regensburg, virtual reality is being evaluated in parallel in a so-called VR cave, a room for projecting a 3D world that could be used for line planning and ergonomics studies. As early as 2016, a production area was captured three-dimensionally for the first time, creating a point cloud and all-round photos. The result is a virtual, walk-through map of a production hall. This makes virtual production tours possible, which can be used, for example, to illustrate training courses.
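The basic step behind such 3D capture is back-projecting depth measurements into a point cloud. A minimal sketch of that operation, assuming a pinhole camera model with hypothetical intrinsic parameters (the actual Regensburg pipeline is not described in detail here):

```python
# Illustrative sketch: convert a depth image (meters per pixel) into a 3D
# point cloud via the inverse pinhole model. Camera intrinsics (fx, fy,
# cx, cy) are assumed values, not those of the real capture system.
import numpy as np

def depth_to_point_cloud(depth: np.ndarray, fx: float, fy: float,
                         cx: float, cy: float) -> np.ndarray:
    """Convert an H x W depth map into an (H*W, 3) array of XYZ points."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    x = (u - cx) * depth / fx   # pixel column -> lateral offset in meters
    y = (v - cy) * depth / fy   # pixel row -> vertical offset in meters
    z = depth                   # depth along the optical axis
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

# Toy 2x2 depth map, 1 m everywhere, principal point at the image center:
depth = np.ones((2, 2))
cloud = depth_to_point_cloud(depth, fx=1.0, fy=1.0, cx=0.5, cy=0.5)
print(cloud.shape)  # → (4, 3)
```

Registering many such clouds from different viewpoints, together with the all-round photos, yields the walk-through map of the hall described above.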

This article was initially published in German in our partner magazine, Next Industry.

* Kurt Lehmann is Corporate Technology Officer at Continental and is responsible for the company's technological orientation, including a focus on automated driving. Dr. Karsten Michels is Head of Systems & Technology in the Interior Division at Continental and responsible for product and pre-development.