Building a Wafer Map Visualization Dashboard
😛
Oops we are not talking about this kind of wafer..
 

We are talking about the kind of wafers that eventually become chips and can most likely be found in the device you are using to read this article right now.

 
👌🏽
Now this looks more like it..
 

A standard wafer size is 200mm or 8 inches (like the one seen above), which means there are roughly 270 dies on a wafer. Notice those small gold squares: each of those squares is referred to as a die.

Each of those dies is tested and assigned a bin value depending on the test results. Each bin is mapped to a different color, making it easy for engineers to visualize test results and recognize failure patterns at a glance. Engineers find it extremely useful to visualize the performance of a wafer on what is called a wafer map.

 
💡
Hmm so what’s the problem with the wafer map?
 

Wafer sizes started to increase.. more data to process.. more dies to visualize..

To save costs and increase yield, the fabrication plants, or fabs as they are commonly known, started to make larger wafers. This means there are far more dies on a wafer, and each batch, or lot, usually consists of 25 of these wafers.

 

Let’s consider a scenario like this:

1 lot = 25 (200mm wafers) * 250 (dies per wafer) * 1000 (test records) = 6,250,000

1 lot = 25 (300mm wafers) * 640 (dies per wafer) * 1000 (test records) = 16,000,000

That’s a 156% increase in test records! (and there are plans to make 17 inch wafers in the future 😮)
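
For a quick sanity check of those numbers (the per-wafer die counts are approximate):

```python
# Rough sanity check of the record counts above (die counts are approximate).
WAFERS_PER_LOT = 25
TEST_RECORDS_PER_DIE = 1000

records_200mm = WAFERS_PER_LOT * 250 * TEST_RECORDS_PER_DIE   # 6,250,000
records_300mm = WAFERS_PER_LOT * 640 * TEST_RECORDS_PER_DIE   # 16,000,000

increase = (records_300mm - records_200mm) / records_200mm * 100
print(f"{records_200mm:,} -> {records_300mm:,} ({increase:.0f}% increase)")  # 156% increase
```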

 
 

The performance of the existing frontend application began to degrade: users started to see loading times of a minute or more, and eventually page timeouts. The frontend was built on a traditional synchronous monolithic architecture that simply could not support this change.

 

This is the reason for the non-blocking architecture explained below.. 😎


We went with Python for the backend of this project mainly to take advantage of its data processing libraries for transforming wafer test data.
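
As a rough illustration of the kind of transformation involved (not the project's actual code; the column names wafer, die_x, die_y, and bin are hypothetical), the raw per-die test records can be collapsed with Pandas into a single bin value per die for the wafer map to color:

```python
import pandas as pd

# Hypothetical raw test records: many rows per die, one bin per test result.
records = pd.DataFrame({
    "wafer": [1, 1, 1, 1],
    "die_x": [10, 10, 11, 11],
    "die_y": [5, 5, 5, 5],
    "bin":   [1, 1, 1, 7],   # 1 = pass, anything else = a failure bin
})

# Collapse to one row per die. Here the highest bin wins, so a single failing
# test marks the whole die with that failure bin on the wafer map.
die_bins = (
    records.groupby(["wafer", "die_x", "die_y"], as_index=False)["bin"].max()
)
print(die_bins)
```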

Using Redis Queue (RQ), wafer and chart data is broken into chunks and pre-processed in background jobs. Job results are kept in Redis, which allows RQ to serve as a caching mechanism as well.
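
A minimal sketch of how that could look with RQ, assuming a hypothetical worker function process_wafer_chunk and a job id derived from the lot and wafer (the queue name and TTL are made up as well):

```python
from redis import Redis
from rq import Queue
from rq.job import Job
from rq.exceptions import NoSuchJobError

redis_conn = Redis()
queue = Queue("wafer-maps", connection=redis_conn)

def get_or_enqueue(lot_id, wafer_id):
    """Reuse a cached job result if present, otherwise enqueue pre-processing."""
    job_id = f"{lot_id}:{wafer_id}"
    try:
        job = Job.fetch(job_id, connection=redis_conn)
        if job.is_finished:
            return job.result          # served straight from the Redis cache
    except NoSuchJobError:
        pass                           # no cached job for this wafer yet

    # process_wafer_chunk is a hypothetical worker function that transforms
    # one wafer's test records into chart-ready JSON.
    return queue.enqueue(
        "tasks.process_wafer_chunk", lot_id, wafer_id,
        job_id=job_id,
        result_ttl=3600,               # keep results in Redis for an hour
    )
```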

On the frontend, Oboe.js consumes the job results stored as JSON and streams them in, allowing the visualizations to be drawn incrementally. This improves perceived performance because the user does not have to wait until all the data is available to start seeing the visualizations.
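
On the server side, that works because the JSON can be streamed out a piece at a time instead of being built into one giant response. Here is a hedged Flask sketch of such an endpoint (the route and the die_records generator are hypothetical):

```python
import json
from flask import Flask, Response

app = Flask(__name__)

def die_records(lot_id, wafer_id):
    """Hypothetical generator yielding pre-processed die dicts from the cache."""
    for i in range(640):
        yield {"die_x": i % 32, "die_y": i // 32, "bin": 1}

@app.route("/api/wafer/<lot_id>/<wafer_id>")
def wafer_json(lot_id, wafer_id):
    def generate():
        # Stream a JSON array element by element so Oboe.js on the frontend
        # can start drawing dies before the full payload has arrived.
        yield '{"dies": ['
        for i, die in enumerate(die_records(lot_id, wafer_id)):
            yield ("," if i else "") + json.dumps(die)
        yield "]}"
    return Response(generate(), mimetype="application/json")
```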

D3.js is used primarily to build the wafer map visualization itself. There were some pretty complicated calculations that went into generating the wafer map from the wafer dimension data.
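
The heart of that calculation is working out which die positions actually land on the circular wafer, given the wafer diameter and die dimensions. A rough Python sketch of the idea (the real implementation lives in D3.js, and the dimensions here are made up):

```python
import math

def die_grid(wafer_diameter=300.0, die_width=10.0, die_height=10.0):
    """Return (col, row) positions whose die rectangle fits inside the wafer."""
    radius = wafer_diameter / 2
    cols = int(wafer_diameter // die_width)
    rows = int(wafer_diameter // die_height)
    dies = []
    for col in range(cols):
        for row in range(rows):
            # Center of this die relative to the wafer center.
            cx = (col + 0.5) * die_width - radius
            cy = (row + 0.5) * die_height - radius
            # Keep the die only if its farthest corner is still on the wafer
            # (real maps also subtract an edge exclusion zone).
            corner = math.hypot(abs(cx) + die_width / 2, abs(cy) + die_height / 2)
            if corner <= radius:
                dies.append((col, row))
    return dies

print(len(die_grid()))  # roughly in line with the ~640 dies quoted above
```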

 

Tech stack:

  • Pandas
  • Matplotlib
  • D3.js
  • Highcharts
  • Redis Queue (Python RQ)
  • Flask/Gunicorn
  • NGINX
 

This cache setup worked particularly well because, during an ongoing project, certain lots tended to be accessed repeatedly by different users, so their data was readily available. Users were pretty impressed with the performance improvements!

Note: Sadly, Oboe.js no longer seems to be maintained, so it would probably be better to switch the frontend streaming to an option that is better supported.