
Interview

New features, more complexity: Shifts in automotive testing

Software complexity makes automotive testing a challenge. Experts Philip Potkowski and David Wenzel explain how to deal with it efficiently.

As cars go electric, the number of mechanical components decreases. At the same time, software complexity is on the rise. As experts in testing and validation – is your job getting easier or harder right now?

Philip Potkowski: We are facing exponential growth in testing requirements because of EVs, connected cars and autonomous driving. We have more than 20 years of expertise in the automotive industry, but the transformation over the past couple of years has been tremendous.

David Wenzel: EVs alone add to the testing challenges right now. There are, for example, multiple competing standards for charging protocols, which will hopefully consolidate at some point. The main issue, though, is in the connected vehicle space, especially software complexity. There is demand for a growing number of features in the car, some with major impact, like updates. And there is a need for more and more integrations: with users’ devices, for example, but also with the manufacturers’ maintenance and diagnostics systems.

“Our platform is thousands of pounds set in motion”

If we talk about features like OTA updates – how does this affect testing?

David Wenzel: An update can be a bugfix for the infotainment system, but also a performance upgrade for an engine ECU. There are more than 50 modules in a car that you can potentially update, many of which are interdependent. This leads to a high number of possible permutations and therefore a high volume of required testing. The update process itself is complex, too. You must make sure everything works every time, including fallbacks if it does not, because our platform is not a smartphone in your pocket where an app might crash. It is thousands of pounds set in motion.
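
To make the interdependence and fallback problem concrete, here is a minimal Python sketch of a dependency-aware update campaign with rollback. The module names, the dependency table and the flash/restore callables are hypothetical stand-ins, not umlaut's or any OEM's actual update manager.

```python
# Toy model of a dependency-aware OTA campaign with rollback.
# Module names and the flash/restore callables are hypothetical.

# Each module lists the modules that must be updated before it.
DEPENDENCIES = {
    "gateway": [],
    "engine_ecu": ["gateway"],
    "infotainment": ["gateway"],
    "cluster": ["infotainment"],
}

def update_order(deps):
    """Topologically sort modules so dependencies are flashed first."""
    order, seen = [], set()
    def visit(module):
        if module in seen:
            return
        seen.add(module)
        for dep in deps[module]:
            visit(dep)
        order.append(module)
    for module in deps:
        visit(module)
    return order

def apply_campaign(deps, flash, restore):
    """Flash modules in order; on the first failure, fall back by
    restoring everything already flashed, in reverse order."""
    done = []
    for module in update_order(deps):
        if flash(module):              # hypothetical: True on success
            done.append(module)
        else:
            for previous in reversed(done):
                restore(previous)      # the fallback path described above
            return False
    return True
```

Even this toy table admits several valid orderings; with 50-plus interdependent modules across model lines and trims, the space of configurations to verify grows combinatorially, which is the testing volume described above.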

When you start working with a new customer, what is your process to build up this safety net?

David Wenzel: For a feature like OTA, we start at the component level and check every function in isolation. Once the software works well on its own, we start integrating it with other features of the vehicle. There is often a lot of variety here because of different model lines and different trims. Another important step is performance testing, which usually means speed of response as well as failure rate. Basically: what happens if I run 100 iterations of a test?
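
As an illustration of that performance step, this short Python sketch runs N iterations of one test case and reports failure rate and response times. run_case is a hypothetical stand-in for whatever triggers the feature under test.

```python
import statistics
import time

def measure(run_case, iterations=100):
    """Run one test case repeatedly; report failure rate and latency.
    run_case is a hypothetical callable that triggers the feature
    under test and raises an exception on failure."""
    latencies, failures = [], 0
    for _ in range(iterations):
        start = time.monotonic()
        try:
            run_case()
        except Exception:
            failures += 1
            continue
        latencies.append(time.monotonic() - start)
    return {
        "iterations": iterations,
        "failure_rate": failures / iterations,
        "median_s": statistics.median(latencies) if latencies else None,
        "worst_s": max(latencies) if latencies else None,
    }
```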

Philip Potkowski: These days, finding a problem on the test bench only gets you halfway. Many components are provided by third parties, like tier-1 suppliers or software companies. They might produce the head units, backend services or infotainment systems. If you find a problem, there can be five components in the chain from vehicle to backend, each developed by a different company. That is why the so-called defect manager has become quite a prominent role at umlaut: an engineer with a lot of experience and communication skills, as well as good relationships with the different companies involved. He or she makes sure issues are not only found but also resolved.

“You can have a remote team working 24/7 on the same component”

With growing testing complexity and the need for rigorous verification – how can OEMs keep testing costs in check?

Philip Potkowski: This is where simulation and automation come into play. Our goal is to reduce the expense of testing environments and to maximize the utilization of a component on the test stand, meaning more iterations of a test in a given time. We use our own customized test benches, for example, to simulate signals to an ECU, removing the need to have access to many components or even a whole vehicle. There are also larger solutions like our umlaut device farm, which lets us share a piece of hardware with multiple developers in a cloud-connected setup. You can have a remote team working 24/7 in three different time zones on the same components.
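
To illustrate the signal-simulation idea: umlaut's benches are proprietary, so this sketch instead uses the open-source python-can library with a virtual bus to inject a fabricated wheel-speed frame. The arbitration ID and payload scaling are invented for the example.

```python
# Illustrative only: feeds a simulated signal to a device under test
# via python-can. The frame layout here is made up for the example.
import can

def inject_wheel_speed(bus, speed_kph):
    """Send a fabricated wheel-speed frame so the ECU under test
    behaves as if the vehicle were moving, without a whole car."""
    payload = int(speed_kph * 100).to_bytes(2, "big")  # hypothetical scaling
    frame = can.Message(arbitration_id=0x1A0, data=payload,
                        is_extended_id=False)
    bus.send(frame)

if __name__ == "__main__":
    # A virtual channel keeps the example self-contained; on a real
    # bench this would be e.g. a SocketCAN or vendor CAN interface.
    with can.Bus(interface="virtual", channel="bench") as bus:
        inject_wheel_speed(bus, 57.5)
```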

David Wenzel: There are also automated testing solutions like the umlaut iHTS, which can execute a pre-programmed set of instructions on physical hardware. If you want to test the Bluetooth function of a car, it will select the phone application on the head unit, click pair on the phone, record the changes on the screen and use image recognition to see if everything is correct. This can happen with many devices, many times, without human interaction. The engineer just has to check the report.
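
The iHTS itself is not public, so the sketch below only shows the general pattern described here: a scripted sequence of UI actions followed by an image-recognition check, repeated across devices without human interaction. The head_unit and phone drivers and matches_reference are hypothetical stand-ins for the real bench interfaces.

```python
# Pattern sketch only; all drivers below are hypothetical stand-ins.

def test_bluetooth_pairing(head_unit, phone, matches_reference):
    head_unit.tap("phone_app_icon")    # select the phone application
    phone.tap("pair")                  # confirm pairing on the handset
    head_unit.wait(seconds=5)          # allow the handshake to finish
    screenshot = head_unit.capture_screen()
    # Image recognition decides pass/fail; the engineer reads the report.
    return {
        "case": "bluetooth_pairing",
        "device": phone.name,
        "passed": matches_reference(screenshot, "paired_screen.png"),
    }

def run_unattended(head_unit, phones, matcher, repeats=20):
    """Repeat the case across many devices without human interaction."""
    return [test_bluetooth_pairing(head_unit, phone, matcher)
            for phone in phones
            for _ in range(repeats)]
```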

For the customer, seamless pairing of a phone or wireless updates might not seem like such a big deal. How do you make sure people care about the features included in new vehicles?

Philip Potkowski: Since we spend so much time on our smartphones, they are often seen as the benchmark for digital user experience. Users have high expectations when it comes to access to data and information. They want updates overnight and major improvements every one or two years. They also want their applications to be intuitive and fun to use. That is why it is so hard to make people care about a wireless update for their cars – even though it might be extremely complex to implement. What we can do is ensure the UX needs are met through testing, and evaluate the performance of a feature by benchmarking it, which we already do in our Connected Vehicles Benchmark, for example. I do not think there is any turning back for OEMs. With autonomous driving and smart cities on the horizon, this is only going forward.

Philip Potkowski
Managing Director - Automotive & Aerospace, Americas
Phone: +1 248 854 0474
Mail: Philip.Potkowski@umlaut.com

David Wenzel
Business Unit Lead - Test and Validation
Mail: david.wenzel@umlaut.com