There are mistakes and there are MISTAKES. In the automotive sector, they tend to be extremely expensive. A case in point is the famous recalls that automakers issue to review hundreds, thousands, or, in some cases, millions of vehicles with a given fault. Recalls are generally caused by quality defects in the assembly process or in a component, and they cause problems in the operation of the vehicle.
Most automakers have suffered them. The largest was Toyota's in 2009, triggered by an accelerator-pedal failure that caused sudden unintended acceleration and ended up involving the recall of some 9 million units. The total cost? $1.2 billion, on top of a brutal hit to the brand's reputation.
How much do quality defects cost companies? According to the American Society for Quality (ASQ), they represent between 15% and 20% of revenue.
Various surveys rank quality as the single most important factor when choosing or switching suppliers, so the cost of a failure goes beyond the purely financial.
Every company intends to take maximum care of this aspect, but, to be honest, they don't always manage it flawlessly. That is why I want to address the importance of artificial intelligence as an ideal tool for reducing the risk of quality errors in products. And I want to address two specific use cases for this.
The first relies on computer vision, a technology flexible enough to build use cases as specific as you need. BMW uses it at its Regensburg plant to verify that the sealing process of a screw is done correctly; when the system detects an anomaly, it sends an alert.
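The core idea behind this kind of visual inspection can be sketched very simply: compare each inspected image against a known-good reference and raise an alert when the deviation is too large. The snippet below is a minimal illustration with synthetic data, not BMW's actual system; the function name, threshold, and toy images are assumptions for the example.

```python
import numpy as np

def inspect_seal(image: np.ndarray, reference: np.ndarray,
                 threshold: float = 0.02) -> bool:
    """Flag an anomaly when the inspected image deviates too far from a
    known-good reference (grayscale arrays scaled to [0, 1])."""
    diff = np.abs(image.astype(float) - reference.astype(float))
    return diff.mean() > threshold

# Toy data: a "good" seal and one with a defective region
reference = np.full((32, 32), 0.5)
good = reference + np.random.default_rng(0).normal(0, 0.01, (32, 32))
defective = reference.copy()
defective[10:20, 10:20] = 1.0          # simulated missing sealant

print(inspect_seal(good, reference))       # within tolerance
print(inspect_seal(defective, reference))  # anomaly: alert operators
```

A production system would replace the pixel-difference check with a trained vision model robust to lighting and part positioning, but the decision logic, deviation score versus alert threshold, stays the same.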
Which part of your process is so critical and sensitive that it must happen flawlessly, and whose failure you don't even want to imagine? How much money do you lose each time you correct a problem of this type? Answering both questions can lead you to an initial definition of the problem and encourage you to try this emerging technology as a potential solution.
The second use case applies machine learning to predict quality deviations using a digital twin. An industrial company has incorporated it into its flour production process: the digital tool predicts potential deviations in humidity, allowing operations personnel to adjust critical process variables before the deviation occurs.
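At its simplest, the predictive core of such a twin is a model that maps process variables to the quality metric and warns when the prediction drifts out of spec. Here is a hedged sketch with simulated data; the variables (drying temperature, feed rate), coefficients, and spec limit are all hypothetical, chosen only to illustrate the mechanism.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical process log: drying temperature (°C) and feed rate (t/h)
temp = rng.uniform(60, 90, 200)
feed = rng.uniform(8, 12, 200)
# Simulated humidity (%): falls with temperature, rises with feed rate
humidity = 14.0 - 0.05 * temp + 0.3 * feed + rng.normal(0, 0.1, 200)

# Fit a linear model, a stand-in for the twin's predictive component
X = np.column_stack([np.ones_like(temp), temp, feed])
coef, *_ = np.linalg.lstsq(X, humidity, rcond=None)

# Predict humidity for the next planned batch and warn before it runs
next_batch = np.array([1.0, 60.0, 12.0])   # intercept, temp, feed
predicted = next_batch @ coef
SPEC_MAX = 14.5                            # illustrative spec limit
if predicted > SPEC_MAX:
    print(f"Predicted humidity {predicted:.2f}% exceeds spec; "
          f"raise drying temperature or reduce feed rate")
```

The value of the twin is exactly this timing: the model flags the deviation before the batch is produced, so operators adjust the setpoints instead of scrapping product.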
How could you apply something similar in discrete manufacturing? Think of a machining company that must keep its equipment calibrated to avoid millimeter-scale deviations in manufactured parts. One use case would be a predictive maintenance solution that eliminates unscheduled downtime and lets you act on an anticipated, efficient, cost-calculated schedule.
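A minimal version of that predictive maintenance idea is to trend a wear indicator, such as spindle vibration, and extrapolate when it will cross an alarm level, so service can be scheduled before the machine drifts out of calibration. The readings, trend, and alarm limit below are simulated assumptions for illustration.

```python
import numpy as np

# Hypothetical daily spindle-vibration readings (mm/s RMS),
# trending upward as bearings wear
days = np.arange(30)
rng = np.random.default_rng(1)
vibration = 2.0 + 0.05 * days + rng.normal(0, 0.05, 30)

# Fit a linear wear trend and extrapolate to the alarm limit
slope, intercept = np.polyfit(days, vibration, 1)
ALARM = 4.5                    # illustrative vibration alarm level
days_to_alarm = (ALARM - intercept) / slope

print(f"Estimated days until alarm threshold: {days_to_alarm:.0f}")
# Schedule bearing service and recalibration comfortably before that date
```

Real deployments use richer models and multiple sensors, but even this simple trend-and-threshold logic turns an unscheduled breakdown into a planned, costed intervention.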
Isn't it every COO's dream to eliminate unscheduled stops and long equipment downtime?
As you can see, building a use case is not that complicated; on the contrary, the benefits become visible almost immediately. I dare you to try.