When Volkswagen AG arrived at Red Hat Open Innovation Labs two years ago, the company was looking for help building autonomous cars. The venerable German automaker had all the internal pieces required to build those cars and write that software, but it faced trouble with the last step through which all software must pass: testing.
Michael Denecke, head of Test Technology at Volkswagen AG, had a testing problem to solve, and a short time to solve it. On top of that, he also had a culture problem; the company had been charged with shifting to a less top-down cultural model.
“My job at Volkswagen is to make sure all the electronic control units we have in our cars work together,” said Denecke, speaking in the Red Hat Summit 2019 keynote. “Due to the new challenge of having autonomous driving connected cars and the many new functions for driver systems, we noticed that the normal way we do the tests with our dedicated hardware would not be enough. So we got the idea to have all these things--[what if we moved] all these tests we do with hardware on to virtual test environments? That’s why we came to containers in OpenShift: to put on these virtual test environments in containers in OpenShift environments.”
Marcus Greul is an IT project manager at Volkswagen AG. He works in testing and simulation research and development in Wolfsburg, Lower Saxony, Germany. He’s on Denecke’s team.
“Integration testing is one of the most complex tasks in automotive development,” said Greul, speaking at a recent OpenShift Commons Gathering. “Why? Electronic systems in the car consist of several components: sensors, activators, and control units. Those control units contain one or more software components, and those components need to integrate with each other... Integration testing means that such electronic systems in the car need to pass the integration testing process for each combination of components. For each model, and line, and every software version in the car... The more capabilities those systems gain in terms of taking over control of the car, the more test cases are needed to prove these can pass the testing process.”
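Greul's point about combinations can be sketched numerically. The figures below are purely illustrative, not Volkswagen's actual numbers: the test matrix grows multiplicatively with every model, trim line, and software version that must be covered.

```python
from itertools import product

# Hypothetical illustration: integration test demand grows
# multiplicatively with each dimension of variation.
models = ["model_a", "model_b", "model_c"]       # car models
lines = ["base", "comfort", "sport"]             # trim lines
software_versions = ["v1.0", "v1.1", "v2.0"]     # ECU software versions
test_cases_per_combination = 50                  # illustrative figure

# Every model/line/version combination must pass integration testing.
combinations = list(product(models, lines, software_versions))
total_runs = len(combinations) * test_cases_per_combination

print(len(combinations))  # 27 combinations of model x line x version
print(total_runs)         # 1350 test runs at just 50 cases each
```

Even with these modest inputs, the run count multiplies quickly, which is the exponential pressure Greul describes below.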
And what if the output of those tests is meaningless to a human without further processing? What if those tests aren’t just pushing long strings into a text input box, but rather, stomping on the virtual gas pedal on a virtual freeway? Just setting up such a scenario requires an entire stack of virtuality upon which to test and derive results, said Greul.
“First, we need the time to build a virtual environment,” said Greul. “We would not test in the real world, so additionally, we need the virtual car... To simulate the customers’ interaction with the system, a virtual driver is put in the virtual car... The system under test, or the test object could be a software component, the control unit itself, or the complete system, or a couple of systems which represent the function.”
Such a testing system requires output that a human can visually assess: a video. And because these tests run in a virtual 3D environment with virtual physics, GPUs could, conceivably, let the entire process finish in far less time on a large cluster. Those output videos would show just what happened when a test failed: did the car go off the road? Did it hit another car? Did it not move at all?
“What does all of this have to do with OpenShift? Let’s assume there’s a test bench where all the necessary components are just software running on some hardware. Components, such as virtual environments, data analytics, other tools, test cases, test case execution, or simulations of traffic, and the software components which go to a control unit. When all those are instantiated, the test bench might be set up and ready in just a few minutes, compared to let’s say, a few weeks. But in terms of exponential growth, this won’t be good enough. What we really need is massive scalability and automation. Fully automated test benches..., executing hundreds of thousands of test cases without manual effort: this is where we need to go,” said Greul.
But it wasn’t just the way Volkswagen tested its software that changed: its culture is beginning to change as well. That cultural shift, along with the tight timeframe in which the company needed a solution, was a big part of the appeal of working with Red Hat Open Innovation Labs.
Said Denecke, “We didn’t have much time to prove our idea to top management... When I talked to my colleagues from Red Hat they said, ‘We’ve got a product called the Open Innovation Lab, and it’s the only chance to meet that goal in that time frame.’ The other thing that made the Innovation Lab appealing to me is that changing culture is a big thing at Volkswagen. Working together in the Open Innovation Lab way says: ‘Don’t be so top-down management driven, give the responsibility to the team,’ and we want to do that at Volkswagen.”
Denecke said that the open development model was the key to the successful development of their new testing model. “We proved we can have virtual test environments. First, we proved that we can mix virtual test environments with real test environments--I think that was the new thing in this idea--and we showed that for quality and success, the main factor is changing the culture and working together... The question from the manager should not be, ‘What’s the status?’ But, ‘How can I as a manager help you?’”
Two years ago, when Volkswagen’s team arrived at Red Hat Open Innovation Labs, it knew its existing testing systems would not stand up to the coming influx of demand. At the time, there was no off-the-shelf way to support GPUs in containers with Kubernetes and OpenShift. Together, Red Hat and Volkswagen built that system.