We're excited to launch Viaduct Visionaries, our new video series featuring conversations with automotive industry leaders who are driving innovation in quality, manufacturing, and connected vehicle data. For our inaugural episode, Viaduct COO Matej Drev sat down with Scott Garberding, former Chief Manufacturing Officer and Chief Quality Officer at FCA (now part of Stellantis).
During his nearly 30-year tenure at FCA, Scott played a pivotal role in transforming the company's approach to quality and manufacturing. His deep experience across senior leadership roles in manufacturing, purchasing, and quality provides unique insights into how automotive OEMs can leverage data analytics and AI to drive quality improvements.
Watch the full interview here.
Here are the key highlights from their conversation.
Matej: You've been in the vanguard of using connected vehicle data to improve quality and manufacturing. How did you recognize the potential before it became an industry-wide focus?
Scott: In manufacturing, it was pretty simple because we have a lot of people and a lot of data – people making cars, and data coming in from dealers, from the field, from warranty systems, and more recently from the vehicles themselves. The challenge is taking that data and getting exactly the right, specific problem to the right person in the manufacturing facility so they can understand and correct it as quickly as possible.
I had a boss who once told me that the trick to improving quality was to fix problems faster than you make new ones. That sounds funny, but it's actually very much the case because you have supplier changes, product changes, and other things happening that can be the source of quality problems. You've got to make sure that you stop those at the source as quickly as possible.
Matej: Looking back at your career, what were some of the biggest challenges in shifting to a more data-driven quality or manufacturing culture?
Scott: Although we had a lot of people making cars, we had fewer people we could assign to set priorities, understand the key problems, and make sure the right technical resources were assigned. Understanding the root cause of a problem based on just a warranty report or dealer report could sometimes be very difficult.
We were hungry for information. Any large automaker has millions of vehicles in the field, creating a huge data flow, and it was really difficult to set priorities without the right tools. We were swimming in data, but our very skilled people were working with very manually intensive software tools to try to set those priorities. I spent hours watching them work and pretty quickly realized that we were going to overwhelm these people if we didn't have better tools.
Matej: What are the biggest obstacles preventing automotive OEMs from fully leveraging their data to improve quality and reduce warranty costs?
Scott: Common occurrences were problems due to component interactions rather than simple component failures. Sometimes what we thought were two separate problems turned out to be one, or we'd think replacing a component would solve a problem, only to learn months later that it wasn't an effective corrective action because of an interaction in the vehicle. If you spend several months learning this, you've produced tens of thousands of cars in the meantime.
One of the things that I think is absolutely terrific about Viaduct is that it can make those links. It can often identify related situations and point a user toward the fact that there seems to be an interaction among components. That is super valuable for a problem-solving team, because they can take that information, look more broadly at the components involved, and understand what's really happening. That ability can be a huge money saver, and it can keep issues from reaching customers in the field.
Scott: There was a specific problem with a controller in the vehicle that we were replacing in fairly large volume – a few percent of cars would fail every month. The dealer would replace the component and it would seem okay for a period, but the problem continued. When we sent these components back to the supplier for evaluation, they would almost uniformly say the components were fine.
For a period of time, the chosen solution was a daily or weekly argument between our people and the supplier's people about whether the components were in fact defective or not. Eventually, we put a team together – the supplier contributed a few people – and did some very deep data analysis. We found that there was a third component and some software involved in this failure. The supplier was, to some degree, right – the component was performing as they expected, but not as the whole system expected it to perform.
We eventually reached a solution and eliminated the problem, but unfortunately it went on for several months. We replaced a lot of components in error and disappointed customers because we didn't understand the issue. I'm confident that had we had Viaduct at that time, we could have much more quickly seen what was happening.
Matej: Where is the industry right now in terms of the maturity of its adoption of data analytics and AI, specifically for quality?
Scott: Every automaker and the large suppliers have made a lot of progress on business intelligence – being able to organize data more quickly, pull the right data set together somewhat faster, and link data sets from multiple systems inside the company.
In my former company, we had worked on that as well, but we were still reliant on people to link the right data sets, look at the correct dates, and make all the decisions about where to focus. We didn't really have tools that could help us look more broadly across data sets, make associations between problems, and define the population of vehicles likely affected by a specific problem.
Although I feel like we had highly skilled people and we made big improvements – we were getting better – we were not at the point of seeing problems as fast, or as completely, as was possible. Viaduct can point out, almost instantly, where repairs are being made multiple times. That kind of data, when linked with the other pieces of a problem, can be super helpful.
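To make the repeat-repair signal Scott mentions concrete, here is a minimal sketch of how one might flag it from raw repair records. This is illustrative only, not Viaduct's implementation: the table and its columns (vin, component, repair_date) are hypothetical, and the 90-day window is an arbitrary assumption.

```python
# Illustrative sketch: flag vehicles repaired more than once for the same
# component within a given window, from a hypothetical repair-record table.
import pandas as pd

def flag_repeat_repairs(repairs: pd.DataFrame, window_days: int = 90) -> pd.DataFrame:
    """Return (vin, component, repair_date) rows where a repeat repair
    of the same component on the same vehicle falls inside the window."""
    df = repairs.sort_values(["vin", "component", "repair_date"])
    # Days since the previous repair of the same component on the same vehicle
    gap = df.groupby(["vin", "component"])["repair_date"].diff().dt.days
    repeats = df[gap.notna() & (gap <= window_days)]
    return repeats[["vin", "component", "repair_date"]].drop_duplicates()

if __name__ == "__main__":
    records = pd.DataFrame({
        "vin": ["V1", "V1", "V2", "V2"],
        "component": ["controller", "controller", "controller", "sensor"],
        "repair_date": pd.to_datetime(
            ["2024-01-05", "2024-02-10", "2024-01-07", "2024-03-01"]),
    })
    print(flag_repeat_repairs(records))  # flags V1's controller, repaired twice within 90 days
```

Grouping by vehicle and component keeps the logic simple; a production system would of course need to normalize claim and labor-operation coding and handle fleet-scale data volumes.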
Matej: What promising applications of AI or advanced analytics do you see coming over the next 5-10 years?
Scott: There are two areas that I think would be interesting. First, on the plant floor – we collect a lot of data from many devices that can be used for problem analysis. We could apply more intelligence to understand, down at the workstation level, where problems are emanating from, or, if the problem is with an incoming component, where that component is added to the vehicle.
If the OEMs could work out data sharing with suppliers, we could understand even more deeply where problems may have come from. We talked earlier about reactive capabilities, where we're looking at warranty data or something similar after the event. Obviously, automakers want to push toward prevention: not simply identifying something quickly, but not having the problem at all.
Second, in the process of developing vehicles. Companies build relatively small fleets for testing – tens or hundreds of vehicles depending on the type of launch. There's a lot of data taken from those vehicles, from specific monitors, and from observations by drivers and mechanics. You have to treat every piece of information like gold because all of it is important, but it can be difficult and frustrating to work one's way back to what really happened. This area is really ripe for AI-type tools that can take all that information in and help identify what's signal and what's noise, so we can prioritize the work.
Another area every automaker would be interested in is software quality – that's also a huge challenge, and there's more and more software in vehicles year over year. It's an area that needs focus because, from what I saw in the past and certainly what I read now, every automaker has had software issues, some of which have delayed launches significantly.
Scott: Over my 30-year career in the auto industry, I've seen different approaches to acting on available data – some helpful, some not. Fast forward to today: we have a lot more data, which makes it even tougher to convert that data into information one can act upon.
If you want to be effective with reactive quality, it's all about finding and solving problems quickly. That is the entire game, and from doing that well you work back upstream in the process. But that is still a matter of turning data into information and then acting on the information. In my recent experience, even with very skilled, hardworking people, we still spent a lot of time grinding through data and trying to understand what really happened. The key is freeing people from grinding through information to find something, and instead notifying them of risks based on facts.
Learn more about how Viaduct can help you harness connected vehicle data to improve quality and reduce downtime.