Technical session talks from ICRA 2012


A conference registration code is required to access these videos and can be obtained by visiting this link: PaperPlaza. Step-by-step instructions for accessing the videos are here: step-by-step process.
Why are some of the videos missing? If you provided a consent form for your video to be published and it is still missing, please contact support@techtalks.tv

Industrial Robotics

A sensor fusion method for state estimation of a flexible industrial robot is presented. By measuring the acceleration at the end-effector, the accuracy of the arm angular position is improved significantly when these measurements are fused with motor angle observations. The problem is formulated in a Bayesian estimation framework and two solutions are proposed: one using the extended Kalman filter (EKF) and one using the particle filter (PF). The technique is verified in experiments on the ABB IRB4600 robot, where the accelerometer method shows significantly better dynamic performance, even when model errors are present.
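To illustrate the particle-filter half of the Bayesian fusion idea (this is a generic bootstrap particle filter sketch, not the authors' implementation; the state model, noise levels, and particle count are assumptions), one can propagate particles with the end-effector acceleration as an input and weight them by the motor-angle likelihood:

```python
import numpy as np

def particle_filter(motor_angles, accels, dt, n=500, q=0.01, r=0.05, seed=0):
    """Bootstrap particle filter for a state [angle, angular velocity]:
    propagate particles with the accelerometer as input, weight them by
    the motor-angle measurement likelihood, then resample.
    All parameters are illustrative."""
    rng = np.random.default_rng(seed)
    particles = np.zeros((n, 2))               # [angle, angular velocity]
    estimates = []
    for z, a in zip(motor_angles, accels):
        # propagate with the acceleration input plus process noise
        particles[:, 0] += particles[:, 1] * dt + 0.5 * a * dt**2
        particles[:, 1] += a * dt
        particles += rng.normal(0, q, size=particles.shape)
        # weight by a Gaussian likelihood of the motor-angle measurement
        w = np.exp(-0.5 * ((z - particles[:, 0]) / r) ** 2)
        w /= w.sum()
        estimates.append(w @ particles[:, 0])  # weighted-mean angle estimate
        # multinomial resampling
        idx = rng.choice(n, size=n, p=w)
        particles = particles[idx]
    return np.array(estimates)
```

The EKF variant replaces the particle propagation/weighting with a linearized predict/update pair over the same state.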

Industrial robotic manipulators have excellent repeatability, while their accuracy is significantly poorer. Numerous error sources in the robotic workcell contribute to the accuracy problem, and modeling and identifying all of them to achieve the required levels of accuracy may be difficult. To resolve the accuracy issues, a sensor-based indirect error compensation approach is proposed in this paper, in which the errors are compensated online via measurements of the work object. The sensor captures a point cloud of the work object and, with the CAD model of the work object, the actual relative pose of the sensor frame and the work object frame can be established via point cloud registration. Once this relationship has been established, the robot is able to move the tool accurately relative to the work object frame near the point of compensation. A data pre-processing technique is proposed to reduce computation time and prevent local-minimum solutions during point cloud registration. A simulation study is presented to illustrate the effectiveness of the proposed solution.
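The core of point cloud registration is estimating a rigid transform between two point sets. As a minimal sketch of that step (the Kabsch/SVD solution for the case of known correspondences; the paper's full registration and pre-processing pipeline is not reproduced here), one can solve for rotation and translation in closed form:

```python
import numpy as np

def rigid_register(source, target):
    """Least-squares rigid alignment (Kabsch/SVD) of two Nx3 point sets
    with known correspondences; a full ICP-style registration would
    iterate this step after re-estimating correspondences.
    Returns R, t such that target ~ source @ R.T + t."""
    src_c = source.mean(axis=0)
    tgt_c = target.mean(axis=0)
    H = (source - src_c).T @ (target - tgt_c)            # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])   # guard against reflections
    R = Vt.T @ D @ U.T
    t = tgt_c - R @ src_c
    return R, t
```

Pre-processing such as downsampling and a good initial guess matter in practice because iterative correspondence re-estimation can otherwise settle in a local minimum, which is the failure mode the abstract's pre-processing technique targets.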

For the motion control of industrial robots, the end-effector performance is of ultimate interest. However, industrial robots are generally equipped only with motor-side encoders, so accurate estimation of the end-effector position and velocity is difficult due to complex joint dynamics. To overcome this problem, this paper presents an optical sensor based on a position sensitive detector (PSD), referred to as a PSD camera, for direct end-effector position sensing. The PSD features high precision and fast response while being cost-effective, and is thus favorable for real-time feedback applications. In addition, to acquire good velocity estimation, a kinematic Kalman filter (KKF) is applied to fuse the measurements from the PSD camera with those from inertial sensors mounted on the end-effector. The performance of the developed PSD camera and the application of the KKF sensor fusion scheme have been validated through experiments on an industrial robot.
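The kinematic-Kalman-filter idea can be sketched in one dimension: integrate the inertial measurement in the prediction step and correct with the camera position fix. This is a generic illustration under an assumed constant-velocity-plus-input model; the state layout and noise values are not taken from the paper:

```python
import numpy as np

def kkf(positions, accels, dt, q=1e-4, r=1e-4):
    """Kinematic Kalman filter sketch for one axis.
    State: [position, velocity]; accelerometer enters as a control
    input, camera position fixes enter as measurements."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # kinematic state transition
    B = np.array([0.5 * dt**2, dt])         # acceleration input matrix
    H = np.array([[1.0, 0.0]])              # only position is measured
    Q = q * np.eye(2)                       # process noise covariance
    R = np.array([[r]])                     # measurement noise covariance
    x = np.zeros(2)
    P = np.eye(2)
    out = []
    for z, a in zip(positions, accels):
        x = F @ x + B * a                   # predict with the inertial input
        P = F @ P @ F.T + Q
        K = P @ H.T / (H @ P @ H.T + R)     # scalar innovation -> simple gain
        x = x + (K * (z - x[0])).ravel()    # correct with the camera fix
        P = (np.eye(2) - K @ H) @ P
        out.append(x.copy())                # [position, velocity] estimate
    return np.array(out)
```

The appeal of the kinematic formulation is that it needs no dynamic model of the joints, only the kinematic relation between acceleration, velocity, and position.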

In this paper a concept for automated multi-robot-aided sewing is presented. The objective of the work is to demonstrate automatic sewing of 3D-shaped covers for recliners by assembling two hide parts of different shapes, using two robots to align the parts during sewing. The system consists of an industrial sewing machine and two real-time-controlled Universal Robots 6-axis industrial manipulators. A force feedback system combined with optical edge sensors is evaluated for control of the sewing process. The force sensors are used to synchronize the velocity and feed rate between the robots and the sewing machine. A test cell was built to determine the feasibility of the force feedback control and velocity synchronization. Experiments are presented which investigate the ability of the robot to feed a hide part into the sewing machine using a force sensor, along with different strategies for velocity synchronization.
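A simple way to picture force-based velocity synchronization is a proportional law that adjusts the robot feed velocity around the sewing-machine feed rate so that the measured pulling force tracks a reference. This sketch is hypothetical: the gains, reference force, and velocity limits are invented for illustration, not taken from the paper:

```python
def synchronized_feed(v_sewing, force, force_ref=2.0, kp=0.05,
                      v_min=0.0, v_max=0.5):
    """One proportional control step for force-based feed synchronization.
    If the measured pulling force exceeds the reference, the hide is
    lagging the machine, so the robot feed velocity is increased;
    the command is clamped to the robot's velocity limits.
    All numeric values are illustrative."""
    error = force - force_ref          # N; positive means too much tension
    v_cmd = v_sewing + kp * error      # m/s; adjust around machine feed rate
    return min(max(v_cmd, v_min), v_max)
```

A real controller would typically add integral action and filtering of the force signal, but the proportional term alone already shows how force error couples the two feed rates.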

A new approach for transportation of objects within production systems by automated throwing and capturing is investigated. This paper presents an implementation consisting of a throwing robot and a capturing robot. The throwing robot uses a linear axis and the capturing robot a rotary axis. The throwing robot is capable of throwing cylinder-shaped objects onto a target point with high precision, and the capturing robot then smoothly grips the cylinders during flight by means of a rotational movement. In order to synchronize the capturing robot with the cylinder's pose and velocity, the cylinder's trajectory has to be modeled, as well as the motion sequences of both robots. The throwing and capturing tasks are performed by the robots automatically without the use of any external sensor system.
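Because no external sensors are used, the synchronization rests entirely on a flight model. As a minimal sketch (a drag-free point-mass model, which is an assumption; the paper's trajectory model may be richer), the launch velocity needed to reach a capture point in a given flight time follows directly from the ballistic equations:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def launch_velocity(dx, dz, t_flight, g=G):
    """Initial velocity components for a drag-free point mass to cover
    horizontal distance dx and vertical offset dz in t_flight seconds."""
    vx = dx / t_flight
    vz = (dz + 0.5 * g * t_flight**2) / t_flight
    return vx, vz

def position(vx, vz, t, g=G):
    """Ballistic position (x, z) at time t for the same model; the
    capturing robot's motion sequence would be timed against this."""
    return vx * t, vz * t - 0.5 * g * t**2
```

With the trajectory known in closed form, both robots' motion sequences can be precomputed and executed open-loop, which is what makes sensorless capture feasible.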