As semiconductor scaling continues from one technology node to the next in line with Moore's law, computational jobs such as big-data handling and data processing have become increasingly intensive, driven by the growing volume of data and models needed to predict chip manufacturability. 'Big Data' usually refers to data volumes so large that traditional data processing applications are inadequate, a description that fits current computational lithography exactly. As a consequence, an increased ramp-up time from research to high-volume manufacturing has been identified as a key risk.
The student will learn the conventional computational lithography flow and work toward developing and applying machine learning and optimization algorithms, with the goals of 1) reducing the number of iterations, 2) reducing computational resources, and 3) optimizing/minimizing the DOE (design of experiments), in order to shorten turn-around time (TAT) in early development and yield ramp-up in semiconductor manufacturing. The machine learning and optimization work may include study of:
- Pattern recognition (layout pixelation), pattern extraction, classification
- Machine learning modeling: training using optimization algorithms. Optimization incorporates mathematical and statistical concepts (e.g., effect analysis, global/local minimum search algorithms) or a new concept
- OPC: running the optimized set across the full chip; DFM: designing the DOE based on the output
- Data analytics
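To make the first study item concrete, here is a minimal illustrative sketch (not imec's actual flow, and all data is made up): a small layout clip is pixelated into a binary grid, fixed-size windows are extracted from it, and identical windows are grouped into pattern classes — a toy version of pattern recognition, extraction, and classification.

```python
from collections import defaultdict

# Hypothetical 6x6 pixelated layout clip: 1 = drawn feature, 0 = empty.
# This is invented example data, not a real design layout.
layout = [
    [1, 1, 0, 0, 1, 1],
    [1, 1, 0, 0, 1, 1],
    [0, 0, 0, 0, 0, 0],
    [1, 1, 0, 0, 1, 1],
    [1, 1, 0, 0, 1, 1],
    [0, 0, 0, 0, 0, 0],
]

def extract_patterns(grid, window=3):
    """Slide a window over the pixelated layout and collect each clip.

    Returns a dict mapping each distinct window (the pattern class)
    to the list of (row, col) positions where it occurs.
    """
    rows, cols = len(grid), len(grid[0])
    patterns = defaultdict(list)
    for r in range(rows - window + 1):
        for c in range(cols - window + 1):
            clip = tuple(tuple(grid[r + i][c + j] for j in range(window))
                         for i in range(window))
            patterns[clip].append((r, c))
    return patterns

classes = extract_patterns(layout)
# Each distinct window is one pattern class; repeated windows share a class.
print(f"{len(classes)} distinct 3x3 pattern classes found")
```

In a real computational lithography setting the classes would feed a trained model (e.g., hotspot prediction) rather than a simple exact-match grouping, but the pixelate-extract-classify structure is the same.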
Required background: Knowledge of a programming language and UNIX (or Linux) is a plus. Interest in machine learning and optimization theory is required. This project suits students from electrical engineering, computer science, or mathematics.
Type of work: 50% for preparation and execution of experiments, 30% for data analysis, 20% for literature study.
Type of Project: Combination of internship and thesis
Master's degree: Master of Science
Master program: Electrotechnics/Electrical Engineering
Supervising scientist: For further information or for application, please contact Ryan Ryoung-han Kim (Ryan.Ryoung.han.Kim@imec.be).