The Quality Amplification Challenge: Throughput Time's Effect on Quality Variability and Quality Loss

Zhen Li, Pamela Rogers, Wai Kwan Lau
Copyright © 2020 | Pages: 21
DOI: 10.4018/IJSDS.2020040102

Abstract

In dynamic production environments, positive business outcomes often require pursuing high quality, short throughput time, and low cost simultaneously. This research tracks the timing effect of undetected defects continuing to progress through the supply chain. The purpose is to demonstrate the influence of throughput time on quality variance amplification and the effect of quality loss on manufacturing performance. A time series model is developed, and numerical analyses are conducted to understand how various parameters affect the quality variance fluctuation and quality loss amplification phenomena. The results show that the variance in quality increases as materials move down a supply chain, away from a supplier. In addition, the quality loss experienced by a manufacturer is greater than that faced by a supplier. Finally, both quality variability and quality loss amplification phenomena are larger when throughput time is longer.

1. Introduction

In today’s dynamic production environments, business success often requires pursuing high quality, short throughput time, and low cost simultaneously. These goals are identified as competitive capabilities and used to measure manufacturing performance (Machuca et al., 2011; Yang and Pan, 2004; Cua et al., 2001). The pursuit of ongoing quality improvement has been studied heavily since the quality revolution of the 1980s (Crosby, 1979). Ferdows and De Meyer (1990) propose that manufacturing capabilities build upon each other and that quality is the foundation required for sustainable business performance. The study of variability amplification has been of interest to supply chain management scholars since the first study on demand amplification appeared in the late 1950s. Forrester (1958) proposes that decisions made by a company depend on the flow of information, materials, money, manpower, and capital equipment among its upstream and downstream partners. He provides an initial explanation and observational evidence showing that a small, sudden change in retail sales results in increasing variability of order rates, factory output, warehouse inventory, and unfilled orders throughout the supply chain. The most influential studies are by Lee et al. (1997a, b), who coin the term “bullwhip effect,” referring to the phenomenon of increasing order quantity variance moving upstream from end customers to suppliers in a supply chain. They identify four main operational causes of the bullwhip effect and discuss options for alleviating their effects.

Since the 1990s, a considerable body of literature related to the bullwhip effect has emerged, with the majority focusing on quantifying the demand-order process through different types of supply chains (Agrawal et al., 2009). Such variability amplification, however, is not unique to demand-order management. Quality changes in a supply chain exhibit similar behavior. In any manufacturing system, the quality of outputs is significantly influenced by the quality of inputs and by the activities conducted at each stage of the production process (Baiman et al., 2000; Fernandes et al., 2017). As suggested by Taguchi’s quality model, quality generally does not drop suddenly; instead, a loss of quality occurs progressively as variation increases within specification limits (Upadhayay and Vrat, 2016). These processing noises gradually degrade production over time and, ultimately, can increase the variance in final product quality. As a result, the variance in quality increases as materials move downstream from the supplier through successive stages of the supply chain. Without stable quality as a foundation of the manufacturing and supply chain systems, performance will gradually deteriorate, likely resulting in additional production and quality issues emerging over time (Ferdows and De Meyer, 1990; Zu and Cui, 2013).
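
The accumulation argument above can be illustrated with a small Monte Carlo sketch. This is not the paper's time series model; it simply assumes each stage adds an independent, zero-mean noise increment to the quality deviation, so the variance observed at stage k is the sum of the upstream noise variances. The function name and parameter values are illustrative.

```python
import random
import statistics

def simulate_stage_quality(n_stages=4, n_runs=10000, noise_sd=0.5, seed=42):
    """Monte Carlo sketch: each stage adds independent processing noise,
    so quality variance grows as material moves downstream."""
    rng = random.Random(seed)
    # quality deviation from target observed at each stage, across all runs
    deviations = [[] for _ in range(n_stages)]
    for _ in range(n_runs):
        dev = 0.0  # incoming material assumed on target
        for stage in range(n_stages):
            dev += rng.gauss(0, noise_sd)  # this stage's processing noise
            deviations[stage].append(dev)
    return [statistics.variance(d) for d in deviations]

variances = simulate_stage_quality()
# variance increases monotonically moving downstream
assert all(variances[i] < variances[i + 1] for i in range(len(variances) - 1))
```

With independent increments, the variance at stage k is roughly k times the per-stage noise variance, matching the downstream amplification the paragraph describes.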

This amplification is magnified as throughput time lengthens. Throughput time is the total time required for a product to pass through a manufacturing process, including inspection time, moving time, waiting/storage time, and processing time. With longer throughput time, uncertainty at each stage of a supply chain increases; consequently, the variance in quality also increases. Therefore, the objective of this paper is to demonstrate the influence of throughput time on quality variance amplification and on the quality loss borne by supply chain members, which ultimately affects business performance. To do so, we consider a two-echelon supply chain with a manufacturer and a supplier. A time series model is developed to identify the relationship between the variance in quality incurred by the manufacturer and the variance in quality experienced by its supplier. Additionally, the supply chain members’ expected quality costs are compared.
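
A minimal sketch of the two-echelon setting can make the claimed orderings concrete. Assuming (hypothetically, not from the paper's model) that each unit of throughput time contributes an independent noise increment to the quality deviation, and that quality loss follows a quadratic Taguchi loss L = k·d², the expected loss equals k times the deviation variance, so the manufacturer's loss exceeds the supplier's and both grow with throughput time.

```python
import random
import statistics

def expected_quality_loss(throughput_time, k=1.0, noise_sd=0.3,
                          n_runs=20000, seed=7):
    """Sketch of a two-echelon chain: the supplier ships material whose
    quality deviation reflects its own throughput time; the manufacturer
    then adds further noise over its throughput time. Expected Taguchi
    loss E[k*d^2] = k*Var(d) for a zero-mean deviation d."""
    rng = random.Random(seed)
    supplier_loss, manufacturer_loss = [], []
    for _ in range(n_runs):
        # each time unit in process contributes an independent noise increment
        d_supplier = sum(rng.gauss(0, noise_sd) for _ in range(throughput_time))
        d_manufacturer = d_supplier + sum(
            rng.gauss(0, noise_sd) for _ in range(throughput_time))
        supplier_loss.append(k * d_supplier ** 2)
        manufacturer_loss.append(k * d_manufacturer ** 2)
    return statistics.mean(supplier_loss), statistics.mean(manufacturer_loss)

short = expected_quality_loss(throughput_time=2)
long_tt = expected_quality_loss(throughput_time=6)
# manufacturer's expected loss exceeds the supplier's at either throughput time,
# and the manufacturer's loss grows as throughput time lengthens
assert short[0] < short[1] and long_tt[0] < long_tt[1]
assert short[1] < long_tt[1]
```

The additive-noise assumption is deliberately simple; the paper's time series model captures the same orderings through its own parameters, which this sketch does not attempt to reproduce.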
