The semiconductor industry is a highly complex manufacturing field involving more than 500 processing steps. Due to the complexity of the production process, the harsh operating environment, and fluctuations in demand in the electronics market, companies have to adapt and meet market demand in the short term (Sun & Rose, 2015). The production process of the semiconductor industry mainly consists of the following steps:
• Raw wafer production from pure silicon
• Applying integrated circuits to raw wafers
• Implementation of the received design into integrated circuits
• Packaging the integrated circuits to create final products
• Product testing (Munirathinam & Ramadoss, 2016).
Further steps apply different layers to the wafer surface during production; together these are called the wafer fabrication process. During these processes, process defects can render a wafer unusable. Detecting faults at an early stage is therefore highly important, saving money and time by isolating defective parts from further processing (Sait & Patel, 2013). Sensor solutions that generate valuable data are used to manage, control, and evaluate these complex processes in order to improve the system and optimize the manufacturing process (Kerdprasop & Kerdprasop, 2011). Detecting faulty wafers during production is difficult because of the large scale of the data gathered from the sensors. However, it is possible to apply machine learning algorithms that learn from existing data sets to detect faulty wafers. Machine Learning (ML) is the application of algorithms that learn, adapt, and improve from existing data without being explicitly programmed, in order to analyze and make inferences from patterns in the data (Munirathinam & Ramadoss, 2016).
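As a minimal illustration of the approach described above (not the study's actual pipeline), the sketch below trains a classifier on synthetic wafer sensor readings to flag faulty wafers. The number of sensors, the fault-generating rule, and the choice of a random forest are all assumptions made here for demonstration only; the key idea is that the model learns the fault pattern from labeled sensor data, with class weighting to compensate for the rarity of faulty wafers.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import recall_score

# Synthetic stand-in for wafer sensor data (assumption: 1000 wafers, 20 sensors).
rng = np.random.default_rng(42)
n_wafers, n_sensors = 1000, 20
X = rng.normal(0.0, 1.0, size=(n_wafers, n_sensors))

# Hypothetical fault rule: wafers with an extreme reading on the first
# sensor are labeled faulty, making positives rare, as in real fab data.
y = (X[:, 0] > 1.5).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0
)

# class_weight="balanced" upweights the rare faulty class during training.
clf = RandomForestClassifier(
    n_estimators=100, class_weight="balanced", random_state=0
)
clf.fit(X_train, y_train)

# Recall on the faulty class: the fraction of defective wafers caught early.
faulty_recall = recall_score(y_test, clf.predict(X_test))
print(f"recall on faulty wafers: {faulty_recall:.2f}")
```

On real fab data the fault signal is far subtler than this synthetic rule, but the workflow (labeled sensor matrix in, per-wafer fault prediction out) is the same one the cited studies describe.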
This study aims to investigate the optimization of early fault detection processes using ML algorithms in silicon wafer production, based on the dataset