Bayesian optimization with adaptive kernels for robot control
Abstract: Active policy search combines the trial-and-error methodology of policy search with Bayesian optimization to actively find the optimal policy. First, policy search is a type of reinforcement learning that has become very popular for robot control because of its ability to deal with complex continuous state and action spaces. Second, Bayesian optimization is a sample-efficient global optimization method that uses a surrogate model, such as a Gaussian process, and optimal decision making to carefully select each sample during the optimization process. Sample efficiency is of paramount importance when each trial involves the real robot, expensive Monte Carlo runs, or a complex simulator. Black-box Bayesian optimization generally assumes a cost function from a stationary process, because nonstationary modeling usually requires prior knowledge. However, many control problems are inherently nonstationary due to their failure conditions, terminal states, and other abrupt effects. In this paper, we present a kernel function specially designed for Bayesian optimization that allows nonstationary modeling without prior knowledge by using an adaptive local region. The new kernel results in improved local search (exploitation) without penalizing the global search (exploration), as shown experimentally on well-known optimization benchmarks and robot control scenarios. Finally, we show its potential for the design of the wing shape of a UAV.
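The paper's kernel itself is not reproduced in this record. As a rough, hypothetical sketch of the general idea only, the Python snippet below runs Bayesian optimization on a toy one-dimensional cost using a Gibbs-style nonstationary squared-exponential kernel whose lengthscale shrinks inside a local region around the current best sample. All function names, hyperparameter values, and the toy objective are illustrative assumptions, not the kernel or experiments from the paper.

# Illustrative sketch only: Bayesian optimization with a Gaussian-process surrogate
# whose lengthscale is shorter near the incumbent (an "adaptive local region"),
# giving nonstationary modeling without prior knowledge of where the abrupt change is.
# This is NOT the kernel proposed in the paper; it is a generic Gibbs-style variant.
import numpy as np
from scipy.stats import norm

def adaptive_kernel(XA, XB, x_best, global_ls=0.5, local_ls=0.05, region=0.2):
    # Per-point lengthscale: local_ls close to the incumbent x_best, global_ls far away.
    def ls(X):
        d = np.linalg.norm(X - x_best, axis=1)
        w = np.exp(-0.5 * (d / region) ** 2)
        return local_ls * w + global_ls * (1.0 - w)
    la, lb = ls(XA)[:, None], ls(XB)[None, :]
    sq = np.sum((XA[:, None, :] - XB[None, :, :]) ** 2, axis=-1)
    denom = la ** 2 + lb ** 2
    # Gibbs nonstationary squared-exponential kernel (written here for 1-D inputs).
    return np.sqrt(2.0 * la * lb / denom) * np.exp(-sq / denom)

def gp_posterior(X, y, Xs, x_best, noise=1e-6):
    # Standard Gaussian-process regression with the adaptive kernel.
    K = adaptive_kernel(X, X, x_best) + noise * np.eye(len(X))
    Ks = adaptive_kernel(X, Xs, x_best)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = 1.0 - np.sum(v ** 2, axis=0)   # prior variance of this kernel is 1 on the diagonal
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, y_best):
    # Expected improvement acquisition function for minimization.
    z = (y_best - mu) / sigma
    return (y_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

def cost(x):
    # Toy nonstationary cost: flat everywhere except a narrow valley.
    return np.where(np.abs(x - 0.7) < 0.1, -np.cos(10 * np.pi * (x - 0.7)), 1.0).ravel()

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, (5, 1))          # initial design
y = cost(X)
grid = np.linspace(0.0, 1.0, 500)[:, None]
for _ in range(20):                          # sequential, sample-efficient queries
    x_best = X[np.argmin(y)]
    mu, sigma = gp_posterior(X, y, grid, x_best)
    x_next = grid[np.argmax(expected_improvement(mu, sigma, y.min()))]
    X = np.vstack([X, x_next])
    y = np.append(y, cost(x_next[None, :]))
print("best x:", X[np.argmin(y)].item(), "best cost:", y.min())

In this toy setting, shrinking the lengthscale around the incumbent lets the surrogate capture the abrupt change locally while keeping a smooth global model, which is the exploitation/exploration trade-off the abstract refers to; with a single global lengthscale the same loop tends to either oversmooth the narrow valley or keep exploring the flat region.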
Language: English
DOI: 10.1109/ICRA.2017.7989380
Year: 2017
Published in: Proceedings - IEEE International Conference on Robotics and Automation 17057742 (2017), 3350-3356
ISSN: 1050-4729

Funding: info:eu-repo/grantAgreement/ES/MINECO/DPI2015-65962-R
Funding: info:eu-repo/grantAgreement/ES/UZ/CUD2013-05
Funding: info:eu-repo/grantAgreement/ES/UZ/CUD2016-17
Type and form: Article (Postprint)

Rights: All rights reserved by the journal publisher

