Technical session talks from ICRA 2012
TechTalks from event: Technical session talks from ICRA 2012
The conference registration code needed to access these videos can be obtained by visiting this link: PaperPlaza. Step-by-step instructions for accessing the videos are here: step-by-step process.
Why are some of the videos missing? If you provided a consent form for your video to be published and it is still missing, please contact email@example.com
Simulation and Search in Grasping
Simulating Robot Handling of Large Scale Deformable Objects: Manufacturing of Unique Concrete Reinforcement Structures
Automatic offline programming of industrial robotic systems is becoming increasingly important as a growing share of low-volume tasks is targeted for automation. Often, such tasks involve handling items that can undergo rather large deflections, which must be taken into account during offline programming. In this paper such a problem is presented, namely robotic assembly of unique concrete reinforcement structures. Reinforcement bars of 3 meters may deflect by up to around 50 cm. We illustrate experimentally how a reinforcement bar can be precisely modelled by a structure consisting of rigid parts connected by "deflection joints". Such a model can be directly integrated into existing physics simulation engines such as the Open Dynamics Engine (ODE). Finally, we discuss how the simulation will be used for automatic offline programming and present a video with a dynamic simulation of the reinforcement assembly process.
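The rigid-parts-plus-deflection-joints idea above can be illustrated with a minimal static sketch: discretise the bar into rigid segments coupled by linear torsional springs, then accumulate the small-angle deflection under gravity. All parameters here (segment count, spring stiffness, bar mass) are hypothetical illustration values, not the paper's; a real setup would instead build the segments and joints inside a physics engine such as ODE.

```python
def tip_deflection(length=3.0, mass=3.0, n=10, k=3000.0, g=9.81):
    """Static tip deflection of a cantilevered bar modelled as `n` rigid
    segments joined by linear torsional springs of stiffness `k` (N*m/rad).
    Small-angle approximation; all parameter values are illustrative."""
    seg = length / n          # length of one rigid segment
    seg_m = mass / n          # mass of one rigid segment
    angles = []
    for i in range(n):
        # Gravity torque at joint i comes from the mass distal to the joint,
        # acting at the centroid of the distal portion of the bar.
        distal_m = (n - i) * seg_m
        arm = (n - i) * seg / 2.0
        angles.append(distal_m * g * arm / k)   # linear spring: theta = tau / k
    # Accumulate vertical drop segment by segment (angles add up along the bar).
    defl, cum_angle = 0.0, 0.0
    for a in angles:
        cum_angle += a
        defl += seg * cum_angle
    return defl
```

Because the springs are linear, the model predicts deflection proportional to the load, which gives a quick sanity check on an implementation.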
Hybrid Physics Simulation of Multi-Fingered Hands for Dexterous In-Hand Manipulation
Dexterous object manipulation with multi-fingered robot hands remains one of the key challenges of service robotics. So far, most theoretical approaches and simulators have concentrated on the search for and evaluation of static stable grasps, but with neither a model of the full hand-arm system nor the system dynamics. GraspIt! is probably the best-known simulator of this kind. In this work we present a simulator that uses the JBullet physics engine to realistically model grasps with multi-fingered hands. It supports manipulation tasks based on a complete arm and hand system, with full calculation of hand and object dynamics. A hybrid dynamics and kinematics approach avoids the oscillations introduced by the different size scales of the arm and hand, so that force-closure grasps are possible in addition to form-closure grasps. The software includes detailed models of our 24-DOF Shadow Dextrous hand and the 6-DOF Mitsubishi PA-10 robot arm. A real-time interface allows us to prepare, or to replay and analyze, grasp experiments performed on our real robots.
Search-Based Planning for Dual-Arm Manipulation with Upright Orientation Constraints
Dual-arm manipulation is an increasingly important skill for robots operating in home, retail and industrial environments. Dual-arm manipulation is especially essential for tasks involving large objects which are harder to grasp and manipulate using a single arm. In this work, we address dual-arm manipulation of objects in indoor environments. We are particularly focused on tasks that involve an upright orientation constraint on the grasped object. Such constraints are often present in human environments, e.g. when manipulating a tray of food or a container with fluids. In this paper, we present a search-based approach that is capable of planning dual-arm motions, often within one second, in cluttered environments while adhering to the orientation constraints. Our approach systematically constructs a graph in task space and generates motions that are low-cost and consistent across runs with similar start/goal configurations. These motions come with guarantees on completeness and bounds on the suboptimality with respect to the graph that encodes the planning problem. For many problems, the consistency of the generated motions is important as it helps make the actions of the robot more predictable for a human interacting with the robot.
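The consistency property claimed above follows from using deterministic graph search rather than sampling. As a rough illustration (not the paper's planner), a weighted A* style search over a task-space grid always returns the same path for the same start/goal, and constraints such as "keep the object upright" can be enforced simply by pruning successors during expansion. The grid, cost, and heuristic below are all hypothetical.

```python
import heapq

def astar(start, goal, neighbors, cost, heuristic):
    """Deterministic A* over an implicit graph: same inputs, same path."""
    open_set = [(heuristic(start), 0.0, start)]
    came, g = {start: None}, {start: 0.0}
    while open_set:
        _, gc, cur = heapq.heappop(open_set)
        if cur == goal:
            path = []
            while cur is not None:
                path.append(cur)
                cur = came[cur]
            return path[::-1]
        if gc > g[cur]:
            continue  # stale heap entry
        for nb in neighbors(cur):
            ng = g[cur] + cost(cur, nb)
            if ng < g.get(nb, float("inf")):
                g[nb], came[nb] = ng, cur
                heapq.heappush(open_set, (ng + heuristic(nb), ng, nb))
    return None  # no path

# Toy 5x5 task-space grid; blocked cells stand in for states that would
# violate a constraint (e.g. collisions or a tilted-object orientation).
blocked = {(1, 0), (1, 1), (1, 2), (1, 3)}

def grid_neighbors(p):
    x, y = p
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nx, ny = x + dx, y + dy
        if 0 <= nx < 5 and 0 <= ny < 5 and (nx, ny) not in blocked:
            yield (nx, ny)

path = astar((0, 0), (3, 3), grid_neighbors,
             lambda a, b: 1.0,
             lambda p: abs(p[0] - 3) + abs(p[1] - 3))  # admissible Manhattan heuristic
```

Running the same query twice yields an identical path, which is the kind of run-to-run predictability the abstract emphasises.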
Generalizing Grasps across Partly Similar Objects
The paper starts by reviewing the challenges associated with grasp planning, and previous work on robot grasping. Our review emphasizes the importance of agents that generalize grasping strategies across objects, and that are able to transfer these strategies to novel objects. In the rest of the paper, we then devise a novel approach to the grasp transfer problem, where generalization is achieved by <i>learning</i>, from a set of grasp examples, a dictionary of object parts by which objects are often grasped. We detail the application of dimensionality reduction and unsupervised clustering algorithms with the aim of identifying the size and shape of parts that often predict the application of a grasp. The learned dictionary allows our agent to grasp novel objects which share a part with previously seen objects, by matching the learned parts to the current view of the new object, and selecting the grasp associated with the best-fitting part. We present and discuss a proof-of-concept experiment in which a dictionary is learned from a set of synthetic grasp examples. While prior work in this area focused primarily on shape analysis (parts identified, e.g., through visual clustering, or salient structure analysis), the key aspect of this work is the emergence of parts from <i>both</i> object shape <i>and</i> grasp examples. As a result, parts intrinsically encode the intention of executing a grasp.
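The dimensionality-reduction-plus-clustering pipeline described above can be sketched generically: project part descriptors with PCA, then cluster the projections to obtain a small dictionary of prototype parts. This is a minimal stand-in, not the paper's method; the descriptor format, component count, and dictionary size are all assumptions, and the k-means here uses a simple deterministic farthest-point initialisation.

```python
import numpy as np

def learn_part_dictionary(descriptors, n_components=2, n_parts=2, iters=20):
    """PCA (via SVD) followed by k-means on the projected descriptors.
    Returns cluster centers (the 'dictionary'), per-example labels, and
    the principal components. A generic sketch, not the paper's algorithm."""
    X = descriptors - descriptors.mean(axis=0)
    _, _, Vt = np.linalg.svd(X, full_matrices=False)   # PCA directions
    Z = X @ Vt[:n_components].T                        # low-dim projection
    # Deterministic farthest-point initialisation for k-means.
    chosen = [0]
    for _ in range(n_parts - 1):
        d = np.min(np.linalg.norm(Z[:, None] - Z[chosen][None], axis=2), axis=1)
        chosen.append(int(d.argmax()))
    centers = Z[chosen].copy()
    for _ in range(iters):
        d = np.linalg.norm(Z[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        for j in range(n_parts):
            if (labels == j).any():
                centers[j] = Z[labels == j].mean(axis=0)
    return centers, labels, Vt[:n_components]
```

A new object view would then be matched against the learned centers, and the grasp associated with the nearest prototype selected.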
A Grasp Strategy with the Geometric Centroid of a Groped Object Shape Derived from Contact Spots
This paper proposes a strategy for grasp and manipulation of unknown objects. In order to derive force relations of the fingers, the groped shape sensed by soft fingers is introduced. The groped shape is not equal to the real object shape, but is tightly related to the force equilibrium of the fingers. By considering contact forces on the groped shape, simple control parameters can be derived. The manipulation of an object is easily accomplished by embedding the concept of a virtual centroid into the grasp control to redistribute internal forces. With these concepts, an object can be easily translated, rotated, and manipulated by relocating fingers. The proposed method is verified with experiments.
The Application of Particle Filtering to Grasping Acquisition with Visual Occlusion and Tactile Sensing
Advanced grasp control algorithms could benefit greatly from accurate tracking of the object as well as accurate all-around knowledge of the system when the robot attempts a grasp. This motivates our study of the G-SL(AM)2 problem, in which two goals are simultaneously pursued: object tracking relative to the hand and estimation of parameters of the dynamic model. We view the G-SL(AM)2 problem as a filtering problem. Because of stick-slip friction and collisions between the object and hand, suitable dynamic models exhibit strong nonlinearities and jump discontinuities. This fact makes Kalman filters (which assume linearity) and extended Kalman filters (which assume differentiability) inapplicable, and leads us to develop a particle filter. An important practical problem that arises during grasping is occlusion of the view of the object by the robot's hand. To combat the resulting loss of visual tracking fidelity, we designed a particle filter that incorporates tactile sensor data. The filter is evaluated off-line with data gathered in advance from grasp acquisition experiments conducted with a planar test rig. The results show that our particle filter performs quite well, especially during periods of visual occlusion, in which it is much better than the same filter without tactile data.
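The core idea of fusing visual and tactile measurements in a particle filter can be sketched as a bootstrap filter over a one-dimensional toy pose: during occlusion the visual likelihood drops out, but tactile measurements keep the particle cloud concentrated. This is a generic illustration under assumed Gaussian likelihoods and noise scales, not the paper's filter or dynamic model.

```python
import numpy as np

def pf_step(particles, weights, motion_noise, visual_z, tactile_z, occluded, rng,
            visual_sigma=0.05, tactile_sigma=0.02):
    """One bootstrap-particle-filter step on a scalar object pose.
    Visual evidence is ignored when `occluded`; tactile evidence (if any)
    is always fused. Noise scales are illustrative assumptions."""
    # Propagate particles through a trivial random-walk motion model.
    particles = particles + rng.normal(0.0, motion_noise, size=particles.shape)
    # Weight by the product of the available measurement likelihoods.
    w = np.ones_like(weights)
    if visual_z is not None and not occluded:
        w *= np.exp(-0.5 * ((particles - visual_z) / visual_sigma) ** 2)
    if tactile_z is not None:
        w *= np.exp(-0.5 * ((particles - tactile_z) / tactile_sigma) ** 2)
    weights = weights * w
    weights /= weights.sum()
    # Systematic resampling to avoid particle degeneracy.
    n = len(particles)
    positions = (np.arange(n) + rng.random()) / n
    idx = np.minimum(np.searchsorted(np.cumsum(weights), positions), n - 1)
    return particles[idx], np.full(n, 1.0 / n)
```

The gain from tactile data is easy to reproduce in this toy: with `occluded=True` and only tactile measurements, the particle mean still converges to the true pose.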