Academic Commons Search Results
Query: https://academiccommons.columbia.edu/catalog?action=index&controller=catalog&f%5Bauthor_facet%5D%5B%5D=Timcenko%2C+Aleksandar&f%5Bdepartment_facet%5D%5B%5D=Computer+Science&format=rss&fq%5B%5D=has_model_ssim%3A%22info%3Afedora%2Fldpd%3AContentAggregator%22&q=&rows=500&sort=record_creation_date+desc
Title: Hand-eye coordination for grasping moving objects
Link: https://academiccommons.columbia.edu/catalog/ac:154465
Authors: Allen, Peter K.; Timcenko, Aleksandar; Yoshimi, Billibon; Michelman, Paul
Handle: http://hdl.handle.net/10022/AC:P:15264
Date: Mon, 12 Nov 2012 14:34:48 +0000
Abstract: Most robotic grasping tasks assume a stationary or fixed object. In this paper, we explore the requirements for grasping a moving object. This task requires proper coordination among at least three separate subsystems: dynamic vision sensing, real-time arm control, and grasp control. As with humans, our system first visually tracks the object's 3-D position. Because the object is in motion, this must be done dynamically to coordinate the motion of the robotic arm as it tracks the object. The dynamic vision system feeds a real-time arm control algorithm that plans a trajectory. The arm control algorithm is implemented in two steps: (1) filtering and prediction, and (2) kinematic transformation computation. Once the trajectory of the object is tracked, the hand must intercept the object to actually grasp it. We present three different strategies for intercepting the object, along with results from the tracking algorithm.
Facets: Robotics; pka1; Computer Science; Articles

Title: Real-time visual servoing
Link: https://academiccommons.columbia.edu/catalog/ac:154443
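The "Hand-eye coordination for grasping moving objects" abstract mentions three strategies for intercepting the object. One standard interception computation (an illustrative sketch only, not necessarily one of the paper's three strategies) solves for the earliest time at which an arm moving in a straight line at constant speed can meet a constant-velocity target:

```python
import math

def intercept(p_target, v_target, p_arm, s_arm):
    """Earliest time tau at which an arm moving straight at speed s_arm can
    meet a constant-velocity target, plus the meeting point.
    Solves |p_target + v_target*tau - p_arm| = s_arm*tau for tau > 0."""
    r = [pt - pa for pt, pa in zip(p_target, p_arm)]      # target relative to arm
    a = sum(v * v for v in v_target) - s_arm ** 2
    b = 2.0 * sum(rv * vv for rv, vv in zip(r, v_target))
    c = sum(rv * rv for rv in r)
    if abs(a) < 1e-12:                                    # arm speed == target speed
        if b >= 0:
            return None                                   # target never gets closer
        tau = -c / b
    else:
        disc = b * b - 4.0 * a * c
        if disc < 0:
            return None                                   # target can outrun the arm
        roots = [(-b - math.sqrt(disc)) / (2 * a),
                 (-b + math.sqrt(disc)) / (2 * a)]
        taus = [t for t in roots if t > 0]
        if not taus:
            return None
        tau = min(taus)                                   # earliest feasible meeting
    point = [pt + vv * tau for pt, vv in zip(p_target, v_target)]
    return tau, point
```

For a target at (2, 0) moving at (0, 1) and an arm at the origin with speed 2, this yields the intercept time 2/√3 ≈ 1.155 s at point (2, 1.155).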
Authors: Allen, Peter K.; Yoshimi, Billibon; Timcenko, Aleksandar
Handle: http://hdl.handle.net/10022/AC:P:15245
Date: Mon, 12 Nov 2012 11:04:35 +0000
Abstract: A real-time tracking algorithm, used in conjunction with a predictive filter, allows real-time visual servoing of a robotic arm that is tracking a moving object. The system consists of two calibrated (but unregistered) cameras that provide images to a real-time, pipeline-parallel optic-flow algorithm that can robustly compute optic flow and calculate the 3-D position of a moving object at approximately 5-Hz rates. These 3-D positions serve as input to a predictive kinematic control algorithm that uses an α-β-γ filter to update the position of a robotic arm tracking the moving object. Experimental results are presented for the tracking of a moving model train along a variety of trajectories.
Facets: Robotics; pka1; Computer Science; Articles

Title: Trajectory filtering and prediction for automated tracking and grasping of a moving object
Link: https://academiccommons.columbia.edu/catalog/ac:154294
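The "Real-time visual servoing" abstract relies on an α-β-γ (g-h-k) filter to smooth noisy 3-D positions and lead the arm to where the target will be. A minimal one-dimensional sketch, using standard fading-memory gains derived from a single memory factor θ (the paper's actual gain choices are not stated in the abstract):

```python
class AlphaBetaGammaFilter:
    """Fading-memory alpha-beta-gamma (g-h-k) filter estimating position,
    velocity, and acceleration from noisy position measurements."""
    def __init__(self, dt, theta=0.8):
        self.dt = dt
        # standard fading-memory gains for a memory factor theta in (0, 1)
        self.alpha = 1.0 - theta ** 3
        self.beta = 1.5 * (1.0 - theta) ** 2 * (1.0 + theta)
        self.gamma = 0.5 * (1.0 - theta) ** 3
        self.x = self.v = self.a = 0.0

    def update(self, z):
        dt = self.dt
        # predict one step ahead with a constant-acceleration model
        xp = self.x + self.v * dt + 0.5 * self.a * dt * dt
        vp = self.v + self.a * dt
        r = z - xp                                  # measurement residual
        # correct each state with its fading-memory gain
        self.x = xp + self.alpha * r
        self.v = vp + self.beta * r / dt
        self.a = self.a + 2.0 * self.gamma * r / (dt * dt)
        return self.x

    def predict(self, horizon):
        # extrapolate ahead, e.g. to command the arm toward the future position
        return self.x + self.v * horizon + 0.5 * self.a * horizon * horizon
```

Fed a constant-acceleration trajectory, this filter converges to zero steady-state error, which is what makes it usable for leading a moving target rather than lagging behind it.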
Authors: Allen, Peter K.; Timcenko, Aleksandar; Yoshimi, Billibon; Michelman, Paul
Handle: http://hdl.handle.net/10022/AC:P:15243
Date: Thu, 08 Nov 2012 16:06:35 +0000
Abstract: The authors explore the requirements for grasping a moving object. This task requires proper coordination among at least three separate subsystems: real-time vision sensing, trajectory planning/arm control, and grasp planning. As with humans, the system first visually tracks the object's 3-D position. Because the object is in motion, this must be done in real time to coordinate the motion of the robotic arm as it tracks the object. The vision system feeds an arm control algorithm that plans a trajectory. The arm control algorithm is implemented in two steps: (1) filtering and prediction, and (2) kinematic transformation computation. Once the trajectory of the object is tracked, the hand must intercept the object to actually grasp it. Experimental results are presented in which a moving model train was tracked, stably grasped, and picked up by the system.
Facets: Robotics; pka1; Computer Science; Articles

Title: Planning velocity profiles from task-level constraints and environment uncertainties
Link: https://academiccommons.columbia.edu/catalog/ac:154291
Authors: Timcenko, Aleksandar; Allen, Peter K.
Handle: http://hdl.handle.net/10022/AC:P:15242
Date: Thu, 08 Nov 2012 15:58:39 +0000
Abstract: A method for parameterizing robot trajectories in the presence of uncertainties is presented. The planning process is posed as a problem of constrained optimization, with the concept of a task's difficulty used as the optimization criterion. The task difficulty, as defined by the authors, comprises the combined effects of velocity and uncertainty, mimicking human perception of difficulty in positioning tasks. The success probability is used as a constraint, which is necessary for planning tasks with contradictory requirements. This planning paradigm is demonstrated with an experiment that contains opposing requirements: reaching the obstacle in a given time without exceeding a certain maximal impact force. The planner is implemented on a real system.
Facets: Robotics; pka1; Computer Science; Articles

Title: Modeling dynamic uncertainty in robot motions
Link: https://academiccommons.columbia.edu/catalog/ac:154288
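The constrained-optimization framing in the "Planning velocity profiles" abstract can be illustrated with a toy planner. Every constant and functional form below is an invented assumption, not taken from the paper; the point is only the structure of the trade-off (deadline pushes velocity up, the success-probability constraint and the velocity-uncertainty difficulty push it down):

```python
import math

DIST, DEADLINE = 1.0, 2.0          # metres, seconds (hypothetical task)
P_MIN = 0.6                        # required probability of a gentle contact

def success_prob(v):
    # assumed form: faster approach -> harder to avoid excessive impact force
    return math.exp(-0.5 * v)

def difficulty(v):
    # assumed form: positional uncertainty grows with speed; difficulty is the
    # product of velocity and uncertainty, echoing the paper's criterion
    sigma = 0.02 + 0.05 * v
    return v * sigma

def plan_velocity():
    """Grid search for the travel speed minimizing difficulty subject to
    (a) reaching the goal by the deadline and (b) the success-probability floor."""
    best = None
    v = 0.1
    while v <= 2.0:
        feasible = (v >= DIST / DEADLINE) and (success_prob(v) >= P_MIN)
        if feasible and (best is None or difficulty(v) < difficulty(best)):
            best = v
        v = round(v + 0.01, 10)
    return best
```

With these (made-up) numbers the deadline forces v ≥ 0.5 m/s, difficulty is increasing in v, and the planner settles on the slowest feasible speed, 0.5 m/s.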
Authors: Timcenko, Aleksandar; Allen, Peter K.
Handle: http://hdl.handle.net/10022/AC:P:15241
Date: Thu, 08 Nov 2012 15:54:15 +0000
Abstract: A method for modeling the uncertainties that exist in a robotic system, based on stochastic differential equations, is presented. Such a model captures, in a single analytical structure, both the uncertainty within the motion descriptions and the dynamic, changing nature of the task and its constraints. Reflecting the dynamic nature of robotic motion tasks, the proposed model of environment uncertainty is dynamic rather than static: the amount of knowledge about the environment is allowed to change as the robot moves. These results suggest that computational models traditionally found at the lower levels of robot systems may have application at the upper planning levels as well. Some experimental results using the model are presented.
Facets: Robotics; pka1; Computer Science; Articles

Title: Probability-driven motion planning for mobile robots
Link: https://academiccommons.columbia.edu/catalog/ac:154240
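A minimal illustration of the stochastic-differential-equation idea from the "Modeling dynamic uncertainty" abstract: Euler-Maruyama integration of dx = v dt + σ dW, a nominal motion command plus a Brownian disturbance, whose position variance grows linearly in time (σ²t). The drift, noise level, and step size below are assumptions for the sketch, not the paper's parameters:

```python
import math
import random

def simulate_paths(n_paths=2000, t_end=1.0, dt=0.01, v=0.5, sigma=0.1, seed=0):
    """Euler-Maruyama integration of dx = v dt + sigma dW over many sample
    paths; returns the empirical mean and variance of the final position."""
    rng = random.Random(seed)
    steps = int(round(t_end / dt))
    finals = []
    for _ in range(n_paths):
        x = 0.0
        for _ in range(steps):
            # dW over one step is Gaussian with standard deviation sqrt(dt)
            x += v * dt + sigma * rng.gauss(0.0, math.sqrt(dt))
        finals.append(x)
    mean = sum(finals) / n_paths
    var = sum((f - mean) ** 2 for f in finals) / (n_paths - 1)
    return mean, var
```

After one second the mean settles near the commanded displacement v·t = 0.5 while the variance approaches σ²t = 0.01, which is the "expansion in space and time" a planner can budget against.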
Authors: Timcenko, Aleksandar; Allen, Peter K.
Handle: http://hdl.handle.net/10022/AC:P:15228
Date: Wed, 07 Nov 2012 16:21:47 +0000
Abstract: This paper proposes a path-planning method for mobile robots in the presence of uncertainty. We analyze environment and control uncertainty and propose methods for incorporating each into the planning algorithm. We model the environment using a pyramid structure that encodes occupancy probabilities for each pixel as well as partial information on conditional probabilities among different pixels. This structure allows efficient and accurate computation of collision probabilities in the presence of environment uncertainty. The control uncertainty is mainly characterized by its expansion in space and time and is accordingly modeled by a stochastic differential equation that captures this phenomenon mathematically. The models we develop are inevitably approximate, but experiments confirm that they serve as a reasonable basis for motion planning. We have conducted a series of experiments on the mobile platform, some of whose results are presented.
Facets: Robotics; pka1; Computer Science; Articles

Title: Automated tracking and grasping of a moving object with a robotic hand-eye system
Link: https://academiccommons.columbia.edu/catalog/ac:153826
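The occupancy pyramid in the "Probability-driven motion planning" abstract can be sketched as repeated max-pooling of an occupancy-probability grid, so a coarse cell conservatively bounds every pixel below it and lets the planner reject regions cheaply. The independence assumption in the collision bound below is a deliberate simplification; the paper goes beyond it by also encoding conditional probabilities between pixels:

```python
def build_pyramid(grid):
    """Occupancy pyramid: each level halves the resolution, storing the MAX
    occupancy probability of its 2x2 block, a conservative summary for quick
    coarse-to-fine path rejection. Expects a square grid of side 2^k."""
    levels = [grid]
    while len(levels[-1]) > 1:
        prev = levels[-1]
        n = len(prev) // 2
        levels.append([[max(prev[2 * i][2 * j],     prev[2 * i][2 * j + 1],
                            prev[2 * i + 1][2 * j], prev[2 * i + 1][2 * j + 1])
                        for j in range(n)] for i in range(n)])
    return levels

def collision_bound(grid, path):
    """Collision probability along a path of (row, col) cells, assuming
    independent cell occupancies: 1 - prod(1 - p)."""
    q = 1.0
    for i, j in path:
        q *= 1.0 - grid[i][j]
    return 1.0 - q
```

A planner would first query the coarse levels: if the max probability over a region is already below threshold, every path through it is safe and the fine grid never needs to be touched.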
Authors: Allen, Peter K.; Timcenko, Aleksandar; Yoshimi, Billibon; Michelman, Paul
Handle: http://hdl.handle.net/10022/AC:P:15076
Date: Thu, 25 Oct 2012 12:22:46 +0000
Abstract: An attempt to achieve a high level of interaction between a real-time vision system capable of tracking moving objects in 3-D and a robot arm with a gripper that can pick up a moving object is described. The interplay of hand-eye coordination in dynamic grasping tasks, such as grasping parts on a moving conveyor, assembling articulated parts, or grasping from a mobile robotic system, is explored. The goal is to build an integrated sensing and actuation system that can operate in dynamic, as opposed to static, environments. The system addresses three distinct problems in using robotic hand-eye coordination for grasping moving objects: fast computation of 3-D motion parameters from vision, predictive control of a moving robotic arm to track a moving object, and interception and grasping. The system operates at approximately human arm-movement rates. Experimental results are presented in which a moving model train is tracked, stably grasped, and picked up by the system. The algorithms developed to relate sensing to actuation are quite general and applicable to a variety of complex robotic tasks.
Facets: Robotics; pka1; Computer Science; Articles

Title: Real-Time Visual Servoing
Link: https://academiccommons.columbia.edu/catalog/ac:145194
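The "Automated tracking and grasping" abstract names fast computation of 3-D position from two cameras as the first of its three problems. One textbook way to recover a 3-D point from two calibrated viewing rays (assumed here as an illustration; the system's actual pipeline is optic-flow based and may triangulate differently) is the midpoint of the shortest segment between the rays:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def triangulate(p1, d1, p2, d2):
    """Midpoint triangulation: the 3-D point nearest to two viewing rays,
    each given by a camera centre p and a direction d (need not be unit).
    Computes the closest points on the two rays and averages them."""
    w0 = [x - y for x, y in zip(p1, p2)]
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b
    if abs(denom) < 1e-12:
        raise ValueError("rays are parallel")
    # closest-point parameters on ray 1 and ray 2 (standard line-line formula)
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    q1 = [p + t1 * v for p, v in zip(p1, d1)]
    q2 = [p + t2 * v for p, v in zip(p2, d2)]
    return [(x + y) / 2.0 for x, y in zip(q1, q2)]
```

For cameras at (±1, 0, 0) both sighting a target at (0, 0, 5), the rays intersect exactly and the midpoint recovers the target; with noisy, skew rays the midpoint is the least-squares compromise.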
Authors: Allen, Peter K.; Yoshimi, Billibon; Timcenko, Aleksandar
Handle: http://hdl.handle.net/10022/AC:P:12762
Date: Thu, 08 Mar 2012 11:00:18 +0000
Abstract: This paper describes a new real-time tracking algorithm used in conjunction with a predictive filter to allow real-time visual servoing of a robotic arm following a moving object. The system consists of two calibrated (but unregistered) cameras that provide images to a real-time, pipelined-parallel optic-flow algorithm that can robustly compute optic flow and calculate the 3-D position of a moving object at approximately 5-Hz rates. These 3-D positions serve as input to a predictive kinematic control algorithm that uses an α-β-γ filter to update the position of a robotic arm tracking the moving object. Experimental results are presented for the tracking of a moving model train along a variety of trajectories.
Facets: Computer science; pka1; Computer Science; Technical reports