Recurrent Reinforcement Learning for Predictive Overall Equipment Effectiveness
Journal
e-Manufacturing and Design Collaboration Symposium 2018, eMDC 2018 - Proceedings
Date Issued
2018
Author(s)
Abstract
With the huge amount of real-time data now being collected in modern manufacturing systems, the conventional indices defined to evaluate productivity, quality, and performance are becoming less effective. Compared to the conventional Overall Equipment Effectiveness (OEE), the predictive OEE (POEE) evaluates and monitors the forthcoming effectiveness of a single tool. Its prediction is based on the extra production time caused by anomalous tool conditions and undesired product quality. This research develops a recurrent reinforcement learning model to predict the predictable elements in calculating the POEE. Our model combines supervised Long Short-Term Memory (LSTM) and reinforcement-learning Deep Q-Network (DQN) techniques to predict stochastic dynamics in production and quality. A Chemical Vapor Deposition (CVD) tool is taken as an exemplary case to illustrate the calculation of POEE. © 2018 Taiwan Semiconductor Industry Association.
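The abstract contrasts conventional OEE with a predictive variant that accounts for predicted extra production time and predicted quality loss. As a minimal sketch, the following uses the standard OEE factor definitions (availability, performance, quality); the `predictive_oee` function and its way of folding the predictions into the availability and quality factors are hypothetical illustrations, not the paper's actual POEE formulation:

```python
def oee(planned_time, run_time, ideal_cycle_time, total_count, good_count):
    """Conventional OEE: availability x performance x quality."""
    availability = run_time / planned_time
    performance = (ideal_cycle_time * total_count) / run_time
    quality = good_count / total_count
    return availability * performance * quality


def predictive_oee(planned_time, run_time, ideal_cycle_time, total_count,
                   good_count, predicted_extra_time, predicted_bad_count):
    """Hypothetical POEE sketch: subtract predicted extra production time
    (anomalous tool conditions) from productive time and predicted defects
    (undesired product quality) from the good count before forming the
    standard OEE factors. This is an assumed formulation for illustration."""
    availability = (run_time - predicted_extra_time) / planned_time
    performance = (ideal_cycle_time * total_count) / run_time
    quality = (good_count - predicted_bad_count) / total_count
    return availability * performance * quality


# Toy numbers for a single tool over one shift (minutes / wafer counts).
current = oee(planned_time=480, run_time=400, ideal_cycle_time=1.0,
              total_count=360, good_count=350)
forecast = predictive_oee(planned_time=480, run_time=400, ideal_cycle_time=1.0,
                          total_count=360, good_count=350,
                          predicted_extra_time=20, predicted_bad_count=10)
```

In the paper's setting, `predicted_extra_time` and `predicted_bad_count` would be outputs of the LSTM/DQN model rather than fixed inputs; the sketch only shows where such predictions enter the effectiveness calculation.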
Subjects
Chemical vapor deposition; Long short-term memory; Quality control; Real time systems; Recurrent neural networks; Stochastic models; Stochastic systems; Chemical vapor depositions (CVD); Overall equipment effectiveness; Predictive; Production time; Real-time data; Recurrent reinforcement learning; Stochastic dynamics; Tool condition; Reinforcement learning
Type
conference paper