Demystifying Artificial Intelligence

Have you familiarized yourself with the Artificial Intelligence lingo? Here are some AI terms every manufacturing person should know.

Imagine walking into your next continuous improvement meeting and announcing to your teammates that you have your best idea yet.  

“I think,” you say, “that we should use a neural network-driven subcategory of artificial intelligence — machine learning, for example — to improve our machining processes. Relying on the huge amount of data available to us — speeds, feeds, tool wear, torque, machine load, throughput and much more — we could leverage deep learning and a combination of classification learning, supervised learning, unsupervised learning, reinforcement learning, computer vision and image recognition to drastically lower cost and improve lead times. No, I’m serious! There are developers right now writing code in Python and C++, and using application frameworks like TensorFlow and PyTorch to create AI solutions that would enable us to do this today! I think by the time we’re done our machine operators could run 100% conforming parts, every time, using nothing more than NLP!”

Confused yet? Let’s demystify artificial intelligence (AI).

I have written in this space previously about how artificial intelligence is revolutionizing manufacturing: autonomous vehicles, robot vision systems, predictive analytics, rejects driven to zero, smart technology embedded on manufacturing equipment and so on. It’s all coming. Have you familiarized yourself with the AI lingo? Here are some AI terms every manufacturing person should know:

Artificial Intelligence. First, thanks to my friends Murtaza Bohra and Paul Karam, my go-to AI experts at Quanser in Markham, Ontario, for this definition and for much of what I have learned about AI. Murtaza defines AI as the mathematical and algorithmic implementation of human experience (how we connect and link ideas) and learning (the process of increasing our knowledge). What’s an Algorithm? Simply a set of instructions and rules that a computer uses to complete a task.

Neural Networks are sets of these instructions and rules designed and influenced to operate like the biological systems that comprise human brains.

Machine Learning (ML) is a branch of AI built on the idea that technology can learn from information and data, identify patterns and correlations within that data, and use this information to make human-level decisions largely independent of human involvement. There are several types of machine learning.

Supervised Learning is a subcategory of ML wherein an ML platform is provided with structured data that pairs inputs with known outcomes and with a targeted result — for instance, meeting a production line cycle time or yield goal. We might feed it process variable data gathered by smart sensors and devices alongside the yield and cycle time each combination of settings actually produced. From there, the ML platform identifies correlations between process variables and quality or productivity. Once these correlations are understood, the process variables that would otherwise produce unfavorable yield or cycle time results can be adjusted, often automatically.
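If you are curious what that looks like in code, here is a minimal sketch in Python using the scikit-learn library. The process variables, yield figures and column meanings below are invented purely for illustration; real inputs would come from your own sensors and quality records.

```python
# Supervised learning sketch: predict first-pass yield from process variables.
# All numbers here are hypothetical, for illustration only.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Hypothetical process variables: spindle speed (rpm), feed rate (mm/min), tool wear (mm)
X = np.array([
    [8000, 1200, 0.05],
    [8500, 1400, 0.12],
    [7800, 1100, 0.30],
    [9000, 1500, 0.08],
    [8200, 1300, 0.22],
    [8700, 1250, 0.15],
])
# The labeled outcome for each run: first-pass yield (%)
y = np.array([99.1, 98.4, 94.2, 98.9, 95.8, 97.6])

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.33, random_state=0)

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X_train, y_train)            # learn correlations between variables and yield

print(model.predict(X_test))           # predicted yield for settings the model hasn't seen
print(model.feature_importances_)      # which variables matter most to yield
```

In practice the feature importances are often the most valuable output: they suggest which knobs actually move yield and which are just noise.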

In similar fashion we can suggest a desired outcome to a Reinforcement Learning algorithm (for instance, improve parts per million defects to 2 ppm) and give the algorithm little rewards or reinforcements as it gets closer and closer to the results we want.
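To make the reward idea concrete, here is a toy sketch using tabular Q-learning, one of the simplest reinforcement learning methods. The eleven "process settings," the hidden best setting and the reward scheme are all invented for illustration.

```python
# Toy reinforcement learning (tabular Q-learning) on an invented process with
# 11 discrete settings. Rewards grow as the chosen setting nears the (hidden) best one.
import numpy as np

n_settings = 11         # states: process setting 0..10
actions = [-1, 0, 1]    # nudge the setting down, hold, or nudge it up
target = 7              # hypothetical setting that minimizes defects

Q = np.zeros((n_settings, len(actions)))
alpha, gamma, epsilon = 0.5, 0.9, 0.2
rng = np.random.default_rng(0)

state = 0
for _ in range(2000):
    # epsilon-greedy: mostly exploit what we've learned, sometimes explore
    a = rng.integers(len(actions)) if rng.random() < epsilon else int(np.argmax(Q[state]))
    next_state = min(max(state + actions[a], 0), n_settings - 1)
    reward = -abs(next_state - target)   # the closer to the target, the bigger the reward
    Q[state, a] += alpha * (reward + gamma * Q[next_state].max() - Q[state, a])
    state = next_state

# Follow the learned (greedy) policy from setting 0 and see where it settles.
state = 0
for _ in range(20):
    state = min(max(state + actions[int(np.argmax(Q[state]))], 0), n_settings - 1)
print("Greedy policy settles at setting:", state)   # should land on the target setting
```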

Unsupervised Learning consumes seas of raw, often unlabeled data and finds correlations that humans might not even know exist, enabling us, for example, to adjust process variables that we didn’t realize were producing nonconforming parts.
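One common unsupervised technique is clustering. The sketch below runs k-means from scikit-learn on a handful of invented, unlabeled production runs, letting similar runs group themselves without being told what "good" or "bad" looks like.

```python
# Unsupervised learning sketch: k-means clustering on unlabeled process data.
# Column meanings and values are invented for illustration.
import numpy as np
from sklearn.cluster import KMeans

# Rows: production runs. Columns: spindle load (%), coolant temp (deg C), vibration (mm/s)
runs = np.array([
    [62, 24, 1.1], [64, 25, 1.0], [63, 24, 1.2],   # one family of runs
    [81, 31, 3.4], [83, 30, 3.6], [80, 32, 3.2],   # a second, quite different family
])

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(runs)
print(kmeans.labels_)            # which cluster each run falls into
print(kmeans.cluster_centers_)   # "typical" conditions for each cluster
# If one cluster lines up with the runs that produced scrap, we have found a lead.
```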

Classification Learning (another ML subcategory) employs technologies such as Image Recognition and Computer Vision (an AI field that uses highly advanced cameras and computers to discern information from images and videos). Classification learning systems rely on huge data sets and neural networks to identify objects and conditions, essentially seeing and thinking as humans see and think. Consider teaching a classification learning algorithm the difference between a conforming and a nonconforming production part, then using that system to perform final and in-process inspection with superhuman accuracy.
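As a rough illustration, here is a tiny convolutional network built with TensorFlow/Keras that labels a part image conforming or nonconforming. A real system would train on thousands of labeled photos; random arrays stand in for images here only so the sketch runs end to end.

```python
# Classification learning sketch: a small image classifier for part inspection.
# Random arrays stand in for real labeled photos of parts.
import numpy as np
import tensorflow as tf

images = np.random.rand(32, 64, 64, 1).astype("float32")   # stand-ins for 64x64 grayscale photos
labels = np.random.randint(0, 2, size=(32,))                # 0 = conforming, 1 = nonconforming

model = tf.keras.Sequential([
    tf.keras.Input(shape=(64, 64, 1)),
    tf.keras.layers.Conv2D(8, 3, activation="relu"),         # learns visual features
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(1, activation="sigmoid"),           # probability the part is nonconforming
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(images, labels, epochs=2, verbose=0)

print(model.predict(images[:1]))   # the inspection verdict for one "part"
```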

Deep Learning is an ML approach that teaches computers to think and act like humans. Consider an autonomous mobile robot (AMR) or “driverless forklift” moving material about a manufacturing operation in the same way a forklift and driver do. The driver of a human-operated forklift gathers information about the environment (material locations and destinations, aisles, obstructions, locations and paths of other forklifts and workers, and so on) and uses this data to determine the safest and most efficient transportation route. Deep Learning enables an AMR to do exactly the same thing, often exceeding human performance.

Not only can we use AI to recognize and analyze images and videos, but voice and text as well. If you’ve ever asked a question of or delivered a command to Siri or Alexa, you used Natural Language Processing (NLP). Machine tools that follow voice commands in the same way are not far off. 
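As a simple illustration of NLP, the sketch below uses scikit-learn to classify the intent behind a typed (or transcribed) machine command. The phrases and intent labels are invented examples.

```python
# NLP sketch: classify the intent of a machine command from its words.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

commands = [
    "start the spindle", "spin up the spindle", "run the spindle",
    "stop the spindle", "shut the spindle down", "halt the spindle",
    "increase the feed rate", "speed up the feed", "raise the feed rate",
]
intents = ["START", "START", "START",
           "STOP", "STOP", "STOP",
           "FEED_UP", "FEED_UP", "FEED_UP"]

model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(commands, intents)                        # learn which words signal which intent

print(model.predict(["please stop the spindle"]))   # expected: ['STOP']
```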

AI necessitates the use of Programming Languages, which AI developers use to write computer programs. Some of the more common languages used in AI include Python, MATLAB, Java, C++ and JavaScript.

Developers also use Application Frameworks that facilitate and speed the development of AI solutions. In the same way that manufacturers build and improve manufacturing processes developed by other manufacturers, frameworks enable developers to build AI applications using code and components previously developed by others. Examples include PyTorch, TensorFlow and Microsoft Cognitive Toolkit.
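To see why frameworks matter, consider this minimal PyTorch sketch. The layers, loss function, optimizer and gradient math are all pre-built components, so a working model takes only a handful of lines; the data is random, purely to show the moving parts.

```python
# Framework sketch: PyTorch supplies the layers, loss, optimizer and gradients.
import torch
import torch.nn as nn

x = torch.rand(16, 3)    # 16 samples of 3 hypothetical process variables
y = torch.rand(16, 1)    # 16 hypothetical yield figures

model = nn.Sequential(nn.Linear(3, 8), nn.ReLU(), nn.Linear(8, 1))   # pre-built layers
loss_fn = nn.MSELoss()                                                # pre-built loss function
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)               # pre-built optimizer

for _ in range(100):                 # the training loop is the only "custom" code we write
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()                  # autograd computes every gradient for us
    optimizer.step()

print(loss.item())                   # loss after training on the random data
```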

The list of AI tools and terms is virtually limitless — but the key ones appear above. Armed with your new knowledge of AI, go back and read the opening of this column, amazed at what you have learned. Then head to your next team meeting with just enough AI knowledge to be the “expert” and launch your journey.
