A cleanroom is essentially a room or area dedicated to a particular process that must be carried out in an ultra-clean environment. Traditionally, these "rooms" are kept clean by high-efficiency air filtration, custom HVAC systems, and constant air changes, and are monitored around the clock: expensive particle counters track particulate counts while temperature, relative humidity, and differential room air pressure are logged in parallel. Particulates are a cleanroom's worst nightmare. They include dust, dirt, viruses, bacteria, mold, allergens, and a host of other contaminants, all of which can increase if any one part of the control or monitoring system fails.
Keeping cleanrooms clean is an ongoing exercise in futility; after all, the Earth is a dirty place, and keeping a clean microcosm within it is a very difficult job. The integration of artificial intelligence (AI), however, may one day soon take this task head-on with an ease and reliability never before seen, opening new doors to breakthroughs in biotechnology, nanotechnology, and computer processors.
Artificial intelligence, collectively known as "AI," is not comparable to the Terminator, nor is it created in the likeness of JARVIS, Tony Stark's AI companion in the popular Iron Man movies. In fact, AI is, in most cases, not even a physical entity. Sure, AI-based algorithms can be embedded in hardware, robots, and even your refrigerator, but most commonly AI is a specially written algorithm (computer code) created to solve a particular task. In some cases, as with deep learning, it is programmed to learn from its own mistakes and to develop autonomous ways of accomplishing tasks with greater efficiency.
Keeping cleanrooms cleaner is one such possible task.
To translate theory into practice and create our "cleanroom bot," we must first learn how AI "learns": a process in which an algorithm is fed known, labeled variables and is then tasked with identifying those variables with ever-greater precision and speed. In this case, we would create the code (most likely in the Python and R programming languages), then add peripheral "senses": an ultra-high-resolution camera and a particle sensor/counter to detect particulates, plus a system for identifying particulates by size and type. We would also integrate trusted temperature, relative humidity (RH), and differential pressure sensors, and perhaps even a biological sensor system for detecting live particulates.
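As a minimal sketch of that "fed known variables, then tasked to identify them" process, the toy example below labels a new sensor reading by its nearest labeled example (1-nearest-neighbor). All sensor values and labels here are illustrative assumptions, not real cleanroom data:

```python
import math

# Hypothetical labeled training data: (particles per m^3, relative humidity %)
# readings paired with a human-assigned label. Values are illustrative only.
TRAINING_DATA = [
    ((120.0, 42.0), "clean"),
    ((150.0, 45.0), "clean"),
    ((9800.0, 61.0), "dirty"),
    ((12500.0, 58.0), "dirty"),
]

def classify(reading, training=TRAINING_DATA):
    """Label a new reading by its closest labeled example: the simplest
    possible form of 'learning' from known variables."""
    _, label = min(training, key=lambda pair: math.dist(pair[0], reading))
    return label

print(classify((135.0, 44.0)))    # near the "clean" examples
print(classify((11000.0, 60.0)))  # near the "dirty" examples
```

A real system would scale the features first (raw particle counts dwarf humidity percentages in the distance calculation) and use far richer models, but the train-on-labels, predict-on-new-data loop is the same.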
Then we would do something quite novel: add sensors and learning variables for conditions outside the controlled environment. We would monitor atmospheric pressure, relative humidity, differential pressure in the containing facility, geographic conditions, human biometrics, biofeedback, body odor, pheromone levels, and so on. Essentially, our AI-based cleanroom bot would constantly pull data from every conceivable source inside and outside the cleanroom.
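One way to picture this constant pulling of inside and outside data is as a single flattened record per snapshot that the learning algorithm can consume. Every field name below is a hypothetical assumption; a real deployment would use whatever channels its sensor network actually exposes:

```python
from dataclasses import dataclass, asdict

@dataclass
class RoomReading:
    """In-room sensors (hypothetical channel names)."""
    particles_per_m3: float          # particle counter
    temperature_c: float
    relative_humidity_pct: float
    differential_pressure_pa: float  # room vs. adjacent space

@dataclass
class OutsideReading:
    """Sensors outside the controlled environment (hypothetical)."""
    barometric_pressure_hpa: float
    outdoor_humidity_pct: float
    facility_differential_pressure_pa: float
    occupants_in_facility: int

def feature_vector(inside: RoomReading, outside: OutsideReading) -> dict:
    """Flatten one snapshot of inside and outside conditions into a
    single record, prefixing keys so the two sources stay distinct."""
    record = {f"in_{k}": v for k, v in asdict(inside).items()}
    record.update({f"out_{k}": v for k, v in asdict(outside).items()})
    return record
```

Each such record, paired with a clean/dirty label or a particle count measured later, becomes one training example for the bot.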
We would then begin training our cleanroom bot to identify and understand an unfathomable variety of particulates. This is not as difficult as it may seem: cameras and algorithms are already used daily to identify faces and license plates, and even for early detection of genetic predispositions to disorders. Our cleanroom bot would need two comparative baselines to start, an ultra-clean ISO 1 cleanroom and a baseline ISO 9 cleanroom, with the intermediate cleanroom classes added afterward. In its simplest form, we could accomplish this with the TensorFlow platform: customize TensorFlow's image training/retraining code, add differential analysis for conditions inside and outside the room, develop baseline "clean" and "dirty" classifications, and voilà, we have a working (infant-level) cleanroom bot.
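The ISO classes referenced above come from the ISO 14644-1 standard, which caps the allowed concentration of airborne particles of size D µm and larger in a class-N room at C_N = 10^N × (0.1/D)^2.08 particles per m³. That limit, and a classifier built on it, fit in a few lines (note the real standard also restricts which particle sizes are meaningful for each class, which this sketch ignores):

```python
def iso_class_limit(iso_class: int, particle_size_um: float) -> float:
    """Maximum particles/m^3 of the given size and larger permitted in an
    ISO 14644-1 class-N room: C_N = 10**N * (0.1 / D)**2.08."""
    return 10 ** iso_class * (0.1 / particle_size_um) ** 2.08

def classify_iso(count_per_m3: float, particle_size_um: float = 0.5):
    """Smallest ISO class (1 = cleanest, 9 = dirtiest) whose limit the
    measured count does not exceed; None if dirtier than ISO 9."""
    for n in range(1, 10):
        if count_per_m3 <= iso_class_limit(n, particle_size_um):
            return n
    return None
```

For example, the computed ISO 5 limit at 0.5 µm is about 3,517 particles/m³, matching the familiar figure of 3,520 in the standard's table (which rounds to three significant digits).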
Up to this point, we have focused on the whys and hows of creating a cleanroom bot, or more precisely, an AI-based algorithm for determining whether a particular controlled environment is clean or dirty, and if so, just how clean or dirty it is. This alone would be valuable, since it would cut the time it currently takes to verify such data by orders of magnitude. However, it is predictive analysis we are really after.
Knowing is only half the battle. It is far better to predict conditions with precision and accuracy than merely to know how a controlled environment classifies at a particular moment in time. To explain further, imagine a scenario in which our cleanroom bot can predict with near-100% accuracy the most ideal times and conditions for carrying out specific processes in that cleanroom. Imagine further that the data collected by our cleanroom bot tells us not only when a cleanroom will be "dirty" but also exactly what is making it dirty, and why. One more step: our cleanroom bot could make recommendations for building cleanrooms at the ISO 9 level and beyond, keeping them at that level, and even taking them to cleaner classes, all at a fraction of the cost of operating a current ISO 9 cleanroom.
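As a deliberately trivial stand-in for that predictive model, the sketch below fits a least-squares trend line to recent particle counts and extrapolates when a contamination limit will be crossed; a real cleanroom bot would learn from the full inside-and-outside feature history instead of a single channel:

```python
def fit_line(ys):
    """Ordinary least-squares fit of ys against sample index 0..n-1;
    returns (slope, intercept)."""
    n = len(ys)
    mean_x = (n - 1) / 2
    mean_y = sum(ys) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(ys))
    den = sum((x - mean_x) ** 2 for x in range(n))
    slope = num / den
    return slope, mean_y - slope * mean_x

def samples_until_limit(counts, limit):
    """Extrapolate the fitted trend to the sample index at which the
    particle count reaches `limit`; None if counts are not rising."""
    slope, intercept = fit_line(counts)
    if slope <= 0:
        return None
    return (limit - intercept) / slope
```

On a perfectly linear series such as counts of 100, 120, 140, 160 with a limit of 200, the fit recovers the trend exactly and predicts the limit is reached at sample index 5, i.e., one sampling interval after the last measurement.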
This is no longer science fiction. It is reality, and it is only a matter of time before the expansive capabilities of AI-based deep learning infiltrate and enhance almost every aspect of our daily lives. Cleaner cleanrooms and controlled environments mean better vaccines, better medications, more advances in biotechnology, greater capacity for maximizing processing power in chipsets, the potential to identify causes of (and cures for) disease, and much more. None of this would be possible without the integration of AI-based algorithms in cleanrooms and controlled environments.