Articulated Object Tracking from Visual Sensory Data for Robotic Manipulation

Authors

  • Anastasia Bolotnikova
  • Gholamreza Anbarjafari
  • Abderrahmane Kheddar
  • Iiris Lüsi
  • Antonio Paolillo
  • Kévin Chappellet
Abstract

In order for a robot to manipulate an articulated object, it needs to know the object's state, i.e. its pose: where it is and in which configuration. The estimated state is fed back to the controller, which computes the appropriate robot motion to achieve the desired manipulation outcome. This is the main topic of this thesis, in which articulated object state estimation is solved using visual feedback. Visual servoing is implemented in a Quadratic Programming task-space control framework to enable a humanoid robot to manipulate articulated objects. On these bases, we develop a methodology for vision-based articulated object state estimation and demonstrate its efficiency in several real experiments involving the HRP-4 humanoid robot. We also propose to combine machine learning and edge extraction techniques to achieve markerless, real-time and robust visual feedback for articulated object manipulation.
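
As a rough illustration of the setup the abstract describes (these are the standard image-based visual servoing and weighted task-space QP formulations, with generic symbols, not necessarily the exact ones used in the thesis): a feature error e = s - s* between measured and desired image features evolves with the camera velocity v_c through the interaction matrix L_s,

    \dot{e} = L_s\, v_c, \qquad v_c = -\lambda\, \hat{L}_s^{+}\, e,

and this servoing objective can be embedded as one weighted task of a whole-body QP controller that resolves joint accelerations at each control step:

    \min_{\ddot{q}} \; \sum_i w_i \left\| J_i \ddot{q} + \dot{J}_i \dot{q} - \ddot{x}_i^{\mathrm{des}} \right\|^2 \quad \text{s.t. robot dynamics, joint and contact constraints,}

where each task i (visual servoing, posture, balance, ...) contributes one quadratic term with weight w_i and task Jacobian J_i.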

Similar Articles

Visual Articulated Tracking in the Presence of Occlusions

This paper focuses on visual tracking of a robotic manipulator during manipulation. In this situation, tracking is prone to failure when visual distractions are created by the object being manipulated and the clutter in the environment. Current state-of-the-art approaches, which typically rely on model-fitting using Iterative Closest Point (ICP), fail in the presence of distracting data points ...
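
As a minimal sketch of why plain ICP is fragile in this setting (a generic point-to-point ICP in Python with NumPy/SciPy, not the method of this or any paper above; the function name is hypothetical): every source point is matched to its nearest neighbour, so distractor points in the scene get matched too and bias the alignment unless matches are gated or downweighted.

    import numpy as np
    from scipy.spatial import cKDTree

    def icp(source, target, iters=30):
        """Align source (N,3) points to target (M,3); return a 4x4 rigid transform."""
        T = np.eye(4)
        src = source.copy()
        tree = cKDTree(target)
        for _ in range(iters):
            # Nearest-neighbour correspondences: clutter in `target` gets
            # matched too, which is what drags plain ICP off the object.
            dist, idx = tree.query(src)
            # Crude robustness: gate out far matches. Without this step,
            # distractor points dominate the least-squares fit.
            keep = dist < 3.0 * np.median(dist)
            p, q = src[keep], target[idx[keep]]
            # Closed-form rigid alignment of the matched pairs (Kabsch / SVD).
            p0, q0 = p - p.mean(0), q - q.mean(0)
            U, _, Vt = np.linalg.svd(p0.T @ q0)
            D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
            R = Vt.T @ D @ U.T
            t = q.mean(0) - R @ p.mean(0)
            src = src @ R.T + t          # apply the increment to the source cloud
            Ti = np.eye(4)
            Ti[:3, :3], Ti[:3, 3] = R, t
            T = Ti @ T                   # accumulate the total transform
        return T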

Sensor Integration and Task Planning for Mobile Manipulation

Robotic mobile manipulation in unstructured environments requires the integration of a number of key research areas such as localization, navigation, object recognition, visual tracking/servoing, grasping and object manipulation. It has been demonstrated that, given the above, a robust system can be designed through simple sequencing of basic skills [19]. In order to provide the robustness a...

Interactive Segmentation, Tracking, and Kinematic Modeling of Unknown Articulated Objects

We present an interactive perceptual skill for segmenting, tracking, and kinematic modeling of 3D articulated objects. This skill is a prerequisite for general manipulation in unstructured environments. Robot-environment interaction is used to move an unknown object, creating a perceptual signal that reveals the kinematic properties of the object. The resulting perceptual information can then i...
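
A tiny sketch of the kind of kinematic inference such interaction enables (a generic textbook approach, not the paper's algorithm; the function name is hypothetical): for a revolute joint, a tracked point on the moved part sweeps an arc in a plane orthogonal to the hinge axis, so a plane fit to its trajectory recovers the axis direction.

    import numpy as np

    def hinge_axis_from_trajectory(pts):
        """pts: (N,3) tracked 3D positions of one feature while the robot
        pushes the part. Returns a unit vector along the estimated hinge axis."""
        centered = pts - pts.mean(axis=0)
        # Points on a revolute part move in a plane orthogonal to the axis;
        # the right singular vector with smallest singular value is that normal.
        _, _, Vt = np.linalg.svd(centered, full_matrices=False)
        return Vt[-1]  # axis direction, up to sign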

Markerless Self-Recognition and Segmentation of Robotic Manipulator in Still Images

Vision is a crucial capability for enabling robots to perceive and interact with their environment, e.g. by manipulating or grasping objects. A current trend brings interaction and perception closer together: on the one hand, by integrating visual information directly into the control process; on the other, by using interaction itself to aid perception, allowing robots to explore th...

Visual Perception and Robotic Manipulation - 3D Object Recognition, Tracking and Hand-Eye Coordination

Journal title:

Volume:   Issue:

Pages:

Publication date: 2017