Finally, to start the application, the robot can either use its end effector to push the start button (just as a human would) or the PLC can send a signal to the CNC to start the program automatically. We don’t recommend trying to coordinate this step with timers; that is a recipe for disaster. Timers don’t adjust to varying cycle times and don’t respond to exceptions, so you can end up with the robot trying to grab a part while the CNC door is still locked for milling.
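The timer problem above can be sketched in code. This is a minimal illustration, not a real robot or PLC API: the signal names and the `read_input` stub are assumptions, standing in for however your controller exposes digital inputs. The point is that the robot blocks on actual CNC status bits instead of a fixed delay, and raises an error rather than reaching into a locked machine.

```python
import time

# Hypothetical signal names -- map these to your PLC's actual I/O addresses.
CNC_CYCLE_COMPLETE = "cnc_cycle_complete"
CNC_DOOR_UNLOCKED = "cnc_door_unlocked"

def read_input(signals, name):
    """Read one digital input (stubbed here with a plain dict)."""
    return signals[name]

def wait_for_part_ready(signals, poll_interval=0.1, timeout=60.0):
    """Block until the CNC reports both 'cycle complete' and 'door unlocked'.

    Unlike a fixed timer, this adapts to varying cycle times, and it raises
    an exception on timeout instead of letting the robot proceed blindly.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if (read_input(signals, CNC_CYCLE_COMPLETE)
                and read_input(signals, CNC_DOOR_UNLOCKED)):
            return True
        time.sleep(poll_interval)
    raise TimeoutError("CNC never signaled part-ready; aborting pick")
```

The robot program would call `wait_for_part_ready()` immediately before moving to the pick position, so the pick is gated by machine state rather than by elapsed time.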
In the end, you should not expect your robot integration to have the same dexterity and flexibility as a human worker.
Your gripper selection will depend on the parts you will handle. If you are handling parts from 5 mm to 80 mm, you should choose a gripper flexible enough to adapt to this entire range. Robotiq’s 2-Finger 85 Gripper can accommodate parts within these dimensions without any additional programming.
If you are only handling a single 27.5 mm part all year long, you may prefer a more rigid custom gripper that fits that part perfectly.
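A quick way to sanity-check gripper selection is to verify that every part size falls within the gripper’s stroke. This is a simple sketch; the default 0–85 mm stroke reflects a 2-finger gripper with an 85 mm opening (such as the 2-Finger 85 mentioned above), but you should substitute the figures from your own gripper’s datasheet.

```python
def gripper_covers(part_sizes_mm, stroke_min_mm=0.0, stroke_max_mm=85.0):
    """Return True if every part size fits within the gripper's stroke.

    Stroke limits default to a generic 85 mm two-finger gripper;
    adjust them to match your gripper's datasheet.
    """
    return all(stroke_min_mm <= size <= stroke_max_mm
               for size in part_sizes_mm)
```

For example, the 5–80 mm part family above passes, while a 90 mm part would flag the need for a different gripper or fingertip design.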
A robot will only do what it has been taught. In fact, very few logical decisions can be made entirely by a robot. In this section, you will see what types of signals or inputs can be given to the robot to allow it to achieve its task.
Most robotics newbies talk about their future robots like this: “The robot will do this and that according to situation A or B.” In reality, the robot will not do or decide anything; it will simply execute its programming. The few decisions a robot appears to make are, in fact, made according to specific data.
For example, a robot might weigh parts and then “decide” whether each part goes into the right bin or the left bin. In this case, the condition is not “Is the part heavier than the previous part?” It is rather: “Is the part under 10 oz? If yes, drop it in the right bin; if not, drop it in the left bin.”
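The weighing example reduces to a single comparison against a fixed threshold, which is all the “decision” really is. A minimal sketch, assuming the scale reports weight in ounces:

```python
def choose_bin(weight_oz, threshold_oz=10.0):
    """The robot's 'decision': a comparison against a fixed threshold.

    Parts under the threshold go to the right bin; everything else
    goes to the left bin, exactly as stated in the rule above.
    """
    return "right" if weight_oz < threshold_oz else "left"
```

Note that the rule is fully determined in advance: the robot never evaluates the situation, it just evaluates the inequality.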
Not so long ago, it was really complicated to set up a part recognition system; robotic vision systems are relatively new to the market. With built-in solutions such as the Robotiq Wrist Camera or the vision system included in YuMi, it is now quite easy to locate a part and act on it.
This type of technology is growing so fast that even Google has robots learning to pick objects using their vision systems. By pooling all of this data, the robots build up their own vision library and can grasp parts with increasing ease.
That being said, if your process needs to use a vision system, it is now easier than ever to introduce it.
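To make “locate a part and do something with it” concrete, here is a minimal sketch of the downstream logic. The `PartPose` structure and the approach-point calculation are illustrative assumptions, not any vendor’s API: the vision system (a wrist camera, for instance) is assumed to return the part’s position and rotation already transformed into the robot’s base frame.

```python
from dataclasses import dataclass

@dataclass
class PartPose:
    """A part location as a vision system might report it (base frame)."""
    x_mm: float
    y_mm: float
    theta_deg: float  # rotation about the vertical axis

def approach_point(pose, approach_height_mm=50.0):
    """Compute a waypoint directly above the detected part.

    The robot moves here first, aligns the gripper to the part's
    rotation, then descends to pick.
    """
    return (pose.x_mm, pose.y_mm, approach_height_mm, pose.theta_deg)
```

The vision system handles the hard part (finding the pose); everything after that is the same fixed, pre-programmed logic discussed above.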
Communication with the CNC
Since there are many signals that can be sent from the CNC or from the robot, and a limited number of I/Os that can be processed by the PLC, it is a good idea to limit the data you access. Even if you would like to read a lot of information about either machine’s status, you need to set priorities.
As this discussion on DoF explains, you can use a simple signal from the CNC machine, such as a light signal, to start the robot program, but you can hardly extract enough precise data from the CNC to trigger your robot on anything more detailed. Keep communications between the robot and the CNC machine as simple as possible; this will simplify your integration and keep you away from complex logical decisions.
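With a limited number of PLC I/O points, it helps to write down exactly which bits you will exchange and ignore everything else. The mapping below is purely illustrative, the names and bit positions are assumptions, but it shows the discipline: two named bits, unpacked from one input word, and nothing more.

```python
# Illustrative I/O map: only the signals you truly need get a PLC bit.
# Names and bit positions are examples, not from any real controller.
IO_MAP = {
    "cnc_cycle_done": 0,  # input bit 0: e.g. wired from the CNC stack light
    "robot_clear":    1,  # input bit 1: robot is out of the work envelope
}

def decode_inputs(word, io_map=IO_MAP):
    """Unpack a raw PLC input word into named boolean signals."""
    return {name: bool((word >> bit) & 1) for name, bit in io_map.items()}
```

Everything else, such as spindle load, tool life, and detailed alarms, stays inside the CNC or the robot program, which keeps the integration logic small and easy to debug.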