Functional Gradient Motion Planning in Reproducing Kernel Hilbert Spaces

Authors

  • Zita Marinho
  • Byron Boots
  • Anca D. Dragan
  • Arunkumar Byravan
  • Geoffrey J. Gordon
  • Siddhartha S. Srinivasa
Abstract

We introduce a functional gradient descent trajectory optimization algorithm for robot motion planning in Reproducing Kernel Hilbert Spaces (RKHSs). Functional gradient algorithms are a popular choice for motion planning in complex many-degree-of-freedom robots, since they (in theory) work by directly optimizing within a space of continuous trajectories to avoid obstacles while maintaining geometric properties such as smoothness. However, in practice, functional gradient algorithms typically commit to a fixed, finite parameterization of trajectories, often as a list of waypoints. Such a parameterization can lose much of the benefit of reasoning in a continuous trajectory space: e.g., it can require taking an inconveniently small step size and a large number of iterations to maintain smoothness. Our work generalizes functional gradient trajectory optimization by formulating it as minimization of a cost functional in an RKHS. This generalization lets us represent trajectories as linear combinations of kernel functions, without any need for waypoints. As a result, we are able to take larger steps and achieve a locally optimal trajectory in just a few iterations. Depending on the choice of kernel, we can directly optimize in spaces of trajectories that are inherently smooth in velocity, jerk, curvature, etc., and that have a low-dimensional, adaptively chosen parameterization. Our experiments illustrate the effectiveness of the planner for different kernels, including Gaussian RBFs, Laplacian RBFs, and B-splines, as compared to the standard discretized waypoint representation.
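The core idea, representing a trajectory as a finite linear combination of kernel functions and updating it with functional gradient steps that stay inside the RKHS, can be sketched in a few lines. The sketch below is illustrative only, not the paper's implementation: it assumes a 1-D configuration space, a Gaussian RBF kernel, and a toy smooth obstacle cost; endpoint constraints and the paper's exact cost functional are omitted, and all names (rbf, dcost, q_obs, eta, lam) are hypothetical.

```python
import numpy as np

# Minimal sketch (assumptions, not the paper's formulation): a 1-D
# trajectory xi(t) = sum_i a_i * k(t, t_i) in the RKHS of a Gaussian
# RBF kernel, optimized by functional gradient descent on a toy
# obstacle cost plus an RKHS-norm regularizer.

def rbf(t, s, gamma=40.0):
    """Gaussian RBF kernel matrix between time samples t and centers s."""
    return np.exp(-gamma * np.subtract.outer(t, s) ** 2)

centers = np.linspace(0.0, 1.0, 10)   # kernel centers t_i in [0, 1]
coeffs = np.zeros_like(centers)       # trajectory coefficients a_i

def xi(t):
    """Evaluate the trajectory xi(t) = sum_i a_i k(t, t_i)."""
    return rbf(np.atleast_1d(t), centers) @ coeffs

# Toy smooth obstacle cost c(q) = exp(-(q - q_obs)^2 / (2 sigma^2)),
# high near the obstacle at q_obs; dcost is its derivative c'(q).
q_obs, sigma = 0.3, 0.1
def dcost(q):
    return -(q - q_obs) / sigma**2 * np.exp(-(q - q_obs)**2 / (2 * sigma**2))

# Functional gradient step: by the reproducing property, the RKHS
# gradient of sum_j c(xi(t_j)) is sum_j c'(xi(t_j)) k(., t_j), itself a
# combination of kernels, so each step only updates coefficients. The
# regularizer (lam/2)||xi||^2 contributes the shrinkage factor below.
eta, lam = 0.05, 0.5
for _ in range(10):                   # a few large steps suffice
    coeffs = (1 - eta * lam) * coeffs - eta * dcost(xi(centers))
```

In this toy version the gradient samples reuse the existing kernel centers only to keep the code short; more generally, each functional gradient step may introduce kernels at new sample times, which is what allows the low-dimensional parameterization to be chosen adaptively.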

Journal: CoRR
Volume: abs/1601.03648 (arXiv:1601.03648v1 [cs.RO], 14 Jan 2016)
Published: 2016