Advancements in machine vision technology have played a significant role in bringing machine vision to cobots. Unlike robots of the past, it is now quite difficult to find a collaborative robot that does not have a vision system.
Machine vision is used to increase efficiency in collaborative robot applications. Such applications include:
- Picking and Placing: Guiding a pick-and-place robot and locating the items it ought to pick.
- Assembly: Locating the pieces to assemble and checking whether the assembly is complete.
- Labeling: Locating the item to label and reading the label to verify it is correct.
With a cobot’s vision system handling such critical functions, it is essential to learn how to get the most out of your robotic vision system to ensure the best results. Here are some tips.
1. Lighting Is Key
A simple consideration but arguably the most important. For a second, think about the differences observed whenever you take photos in the dark and in good lighting.
A photo taken in the dark is often blurry, dark and unclear. The opposite is true for a photo taken in good lighting. Similarly, if your cobot’s vision system is operating under bad light, it follows that errors will increase.
Ensuring good lighting is especially crucial for collaborative robots that work overnight. Imagine waking up after a night of production to find that the collaborative robot has mislabeled the products because bad lighting made it impossible for the robot to differentiate one product from another.
In such a situation, you not only miss the benefits of cobots, such as speed and accuracy, but you also end up wasting a lot of time.
Solution? Always make sure the cobot’s work environment has excellent lighting.
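One practical way to act on this tip is to have the vision pipeline reject frames that are too dark to process reliably, rather than letting the cobot mislabel products overnight. The sketch below is illustrative: the frame format and the brightness threshold are assumptions, not values from any particular cobot vendor.

```python
# Minimal sketch: flag under-lit frames before running vision tasks.
# Assumes frames arrive as 2D lists of 8-bit grayscale values (0-255);
# the threshold of 60 is an illustrative value to tune for your setup.

MIN_MEAN_BRIGHTNESS = 60  # tune for your camera and workspace lighting

def mean_brightness(frame):
    """Average pixel intensity of a grayscale frame."""
    pixels = [p for row in frame for p in row]
    return sum(pixels) / len(pixels)

def frame_is_usable(frame):
    """Reject frames too dark for reliable object identification."""
    return mean_brightness(frame) >= MIN_MEAN_BRIGHTNESS

# A well-lit frame vs. a frame captured overnight with poor lighting.
bright_frame = [[200, 180], [190, 210]]
dark_frame = [[10, 15], [12, 8]]
```

In practice a rejected frame would trigger an alert or pause the task, so bad lighting produces a visible stoppage instead of a night of silent mislabeling.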
2. When Sourcing Collaborative Robots, Ensure They Incorporate the Latest Technologies and Most Accurate Cameras
Such a consideration is particularly critical if you have a sensitive work environment. What does sensitive mean?
Here is a scenario to consider. Company X manufactures balls. Balls are easy to detect because of their consistent shape: most vision systems identify objects through the outline of a given shape. What happens, however, when one or several balls are squashed?
If your robot does not employ a high definition vision system, then identification becomes impossible.
A sensitive workplace, therefore, is a work environment where deformed products and products with non-uniform contours are the norm, yet the robot's vision system is still required to perform optimally and recognize each product correctly, whether for labeling, picking and placing, or packaging.
The goal, therefore, should be to source the best: 3D cameras if possible, and the most advanced technologies that incorporate machine learning for pattern recognition.
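The ball scenario above can be made concrete with a common outline metric. A detector that matches shapes by circularity (4πA/P², which is 1.0 for a perfect circle) will score an intact ball near 1.0 but a squashed one much lower, so a naive threshold misses the deformed product. This is a hedged sketch: the 0.90 threshold and the outline generator are illustrative, not any specific vision system's method.

```python
import math

# Sketch: why simple outline matching misses deformed products.
# Circularity = 4*pi*area / perimeter^2; 1.0 for a perfect circle,
# lower as the outline flattens. The 0.90 cutoff is illustrative.

def polygon_area(pts):
    """Shoelace formula for the area of a closed polygon."""
    n = len(pts)
    return abs(sum(pts[i][0] * pts[(i + 1) % n][1]
                   - pts[(i + 1) % n][0] * pts[i][1]
                   for i in range(n))) / 2

def perimeter(pts):
    """Total edge length of a closed polygon."""
    n = len(pts)
    return sum(math.dist(pts[i], pts[(i + 1) % n]) for i in range(n))

def circularity(pts):
    p = perimeter(pts)
    return 4 * math.pi * polygon_area(pts) / (p * p)

def outline(rx, ry, n=360):
    """Elliptical outline approximating a (possibly squashed) ball."""
    return [(rx * math.cos(2 * math.pi * i / n),
             ry * math.sin(2 * math.pi * i / n)) for i in range(n)]

intact_ball = outline(10, 10)   # consistent round shape
squashed_ball = outline(10, 4)  # deformed: flattened outline

IS_BALL_THRESHOLD = 0.90  # naive detector's cutoff (illustrative)
```

A system relying on this rule alone rejects the squashed ball entirely, which is why sensitive workplaces call for learned pattern recognition rather than fixed geometric templates.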
3. Keep The Background of Your Work Environment Simple
A key goal is ensuring that whenever your cobot focuses its vision system on the object, it can quickly identify the object. The background plays a vital role in this.
Consider a background composed of patterns and shapes. When a vision system focuses on it, especially if the product is also not of one solid color, it becomes almost impossible to differentiate between the background and the object.
Such confusion results either in errors or in the cobot stopping because it cannot identify which object to pick, assemble, or package.
The consensus is that, where possible, you should keep the background of your work environment as simple as you can, especially in a factory setting.
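The effect of a busy background is easy to see with even the simplest segmentation, thresholding on brightness. In the sketch below, a plain background yields a clean object mask, while a striped background produces false detections at every bright patch. The tiny images and the threshold are illustrative assumptions.

```python
# Sketch: how a patterned background confuses naive segmentation.
# Images are 2D grayscale lists; the object is bright (value 200) and
# the detector labels any pixel >= 128 as "object" (illustrative rule).

OBJECT_THRESHOLD = 128

def object_mask(img):
    """Naive segmentation: bright pixels are assumed to be the object."""
    return [[p >= OBJECT_THRESHOLD for p in row] for row in img]

def false_object_pixels(img, true_mask):
    """Count background pixels wrongly labeled as object."""
    mask = object_mask(img)
    return sum(mask[r][c] and not true_mask[r][c]
               for r in range(len(img)) for c in range(len(img[0])))

# Ground truth: the object occupies the two centre pixels.
true_mask = [[False] * 4,
             [False, True, True, False],
             [False] * 4]

plain_bg = [[30, 30, 30, 30],
            [30, 200, 200, 30],
            [30, 30, 30, 30]]

striped_bg = [[30, 180, 30, 180],
              [180, 200, 200, 180],
              [30, 180, 30, 180]]
```

With the plain background the detector makes zero false detections; the bright stripes of the patterned background are each mistaken for the object, which is exactly the confusion described above.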
The above three tips seem simple and even obvious, but it is quite common to find a collaborative robot working in an environment where the business has not considered any of these tips.
While the cobot will still work, chances are the company or business is not getting the maximum possible work rate from it. Such a situation results in a lower return on investment, which defeats the ultimate goal of incorporating robots in the first place.
If you are already using collaborative robots, it is time to assess if you are getting the most out of your robotic vision system. If you are not, then it would behoove you to adjust.