I am currently trying to teach the current position by adjusting the d1 parameter of the gripper robot and setting it as a target, but when I simulate it by clicking Move to Target, the gripper fingers do not move as expected. Can someone help me with how to use this robotic gripper in tandem with a UR3e arm to generate simulation programs?

Hi. Thank you for the reply. That video helped a lot. Also, how do I create custom objects in RoboDK? For example, I need the robot to move a wooden block of specific dimensions from point A to point B along the X direction.
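For reference, here is a minimal sketch of one way to do this with the RoboDK Python API. The station item names "UR3e" and "Wooden Block" are assumptions (e.g. a box model from the local library, scaled to your dimensions, and renamed), and the 200 mm travel is arbitrary:

```python
# Sketch: move the TCP from point A to point B along X with the RoboDK API.
# Assumes a station already containing a robot item named "UR3e" and an
# object named "Wooden Block" (both names are placeholders).

def x_offsets(distance_mm, steps):
    """Pure helper: evenly spaced X offsets from 0 to distance_mm."""
    return [distance_mm * i / steps for i in range(steps + 1)]

def main():
    from robodk.robolink import Robolink, ITEM_TYPE_ROBOT, ITEM_TYPE_OBJECT
    from robodk.robomath import transl

    RDK = Robolink()  # connect to the running RoboDK instance
    robot = RDK.Item('UR3e', ITEM_TYPE_ROBOT)
    block = RDK.Item('Wooden Block', ITEM_TYPE_OBJECT)

    pose_a = robot.Pose()                # current TCP pose is point A
    for dx in x_offsets(200.0, 10):      # 200 mm along X in 10 linear steps
        robot.MoveL(transl(dx, 0, 0) * pose_a)
    # To carry the block, attach it to the tool before moving, e.g. with
    # block.setParentStatic(tool_item), so it follows the gripper.

# main()  # uncomment to run against a live RoboDK session
```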

Thank you very much for your response. Actually, I am currently evaluating RoboDK under the direction of my advisor, who is contemplating getting the software, as we work a lot on industrial and assistive robots (UR3s, UR5s, KUKA, Jaco). I want to know if we can generate feedback controllers in the simulation based on information from the built-in force sensor or external cameras. If yes, could you point me towards a tutorial where I can start?

This is not an off-the-shelf integration; the API will let you interface the robot with RoboDK if, and only if, the robot controller lets you export the force sensor and camera data to an external PC.

But a good starting point could be the "UR_ActivateMonitoring" macro. You can find this macro in your RoboDK local library.
(Top left -> Blue folder -> Macros)
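That macro relies on the UR controller streaming its state over the real-time client interface (TCP port 30003). A hedged sketch of that transport is below: each packet starts with a 4-byte big-endian length, followed by big-endian doubles. The exact offsets of individual fields (joints, TCP force, etc.) depend on the controller software version, so check Universal Robots' client-interface documentation before relying on them; the IP address shown is a placeholder.

```python
# Sketch of the kind of connection UR_ActivateMonitoring uses: reading one
# state packet from the UR real-time interface on port 30003.
import socket
import struct

def parse_header(packet):
    """Pure helper: declared total packet length from the first 4 bytes."""
    (length,) = struct.unpack('>i', packet[:4])
    return length

def read_one_packet(robot_ip, port=30003):
    with socket.create_connection((robot_ip, port), timeout=2.0) as s:
        header = s.recv(4)
        length = parse_header(header)
        body = b''
        while len(body) < length - 4:
            body += s.recv(length - 4 - len(body))
        # body is a sequence of big-endian doubles; the first one is the
        # controller timestamp. Force/joint fields sit at version-dependent
        # offsets further in.
        timestamp = struct.unpack('>d', body[:8])[0]
        return timestamp

# read_one_packet('192.168.1.5')  # replace with your robot's IP
```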

Simply put, the robot needs a way to communicate with the PC running RoboDK.
More than just communication, the robot controller needs to be able to send the force sensor data in real time. This is the integration part of the project: you need to find a way to retrieve the data from the controller and push it to the Python script that manages RoboDK.
Once you find a way to send the data in real time from the controller to the PC running RoboDK, the API we provide will let you manage this data and make your robot react accordingly.
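Once that data pipeline exists, the RoboDK side of the loop can be quite small. Below is a hedged sketch of a proportional force-feedback loop: `get_force_z` stands in for whatever transport you implement, and the gain, force target, and robot item name "UR3e" are all assumptions for illustration:

```python
# Sketch of a simple proportional force-feedback loop driving RoboDK.
# get_force_z() is a placeholder for your own data source (socket, RTDE, ...).

def force_correction(measured_force, target_force, kp=0.05):
    """Pure helper: proportional correction (mm) from a force error (N)."""
    return kp * (target_force - measured_force)

def control_loop(get_force_z, steps=100, target_force=5.0):
    from robodk.robolink import Robolink, ITEM_TYPE_ROBOT
    from robodk.robomath import transl

    RDK = Robolink()
    robot = RDK.Item('UR3e', ITEM_TYPE_ROBOT)  # item name is an assumption
    for _ in range(steps):
        dz = force_correction(get_force_z(), target_force)
        # nudge the TCP along Z until the measured force reaches the target
        robot.MoveL(transl(0, 0, dz) * robot.Pose())
```

In a real controller you would also clamp `dz`, filter the force signal, and run the loop at a fixed rate, but the structure stays the same: read, compute a correction, command a small move.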