Please use this identifier to cite or link to this item: http://hdl.handle.net/11531/100530
Full metadata record
DC Field    Value    Language
dc.contributor.author    Garrido Merchán, Eduardo César    es-ES
dc.contributor.author    Hernández Lobato, Daniel    es-ES
dc.date.accessioned    2025-07-10T14:19:02Z    -
dc.date.available    2025-07-10T14:19:02Z    -
dc.date.issued    2025-07-08    es_ES
dc.identifier.issn    0950-7051    es_ES
dc.identifier.uri    https://doi.org/10.1016/j.knosys.2025.113612    es_ES
dc.identifier.uri    http://hdl.handle.net/11531/100530    -
dc.description    Journal articles    es_ES
dc.description.abstract    Bayesian optimization (BO) methods based on information theory have obtained state-of-the-art results in several tasks. These techniques rely on the Kullback–Leibler (KL) divergence to compute the acquisition function. We introduce a novel information-based class of acquisition functions for BO called Alpha Entropy Search (AES). AES is based on the alpha-divergence, which generalizes the KL-divergence. Iteratively, AES selects the next evaluation point as the one whose associated target value has the highest level of dependency with respect to the location and associated value of the global maximum of the optimization problem. Dependency is measured in terms of the alpha-divergence, as an alternative to the KL-divergence. Intuitively, this favors evaluating the objective function at the most informative points about the global maximum. The alpha-divergence has a free parameter α, which determines the behavior of the divergence, balancing local and global differences. Therefore, different values of α result in different acquisition functions. AES acquisition lacks a closed-form expression. However, we propose an efficient and accurate approximation using a truncated Gaussian distribution. In practice, the value of α can be chosen by the practitioner, but here we suggest using a combination of acquisition functions obtained by simultaneously considering a range of α values. We provide an implementation of AES in BOTorch and we evaluate its performance in synthetic, benchmark, and real-world experiments involving the tuning of the hyper-parameters of a deep neural network. These experiments show that AES performance is competitive with other information-based acquisition functions such as JES, MES, or PES.    en-GB
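The abstract states that AES replaces the KL-divergence with the alpha-divergence, whose free parameter α balances local and global differences and recovers KL in the limit. As a minimal illustration of that relationship (not the paper's AES acquisition itself, which has no closed form), the sketch below computes Amari's alpha-divergence for discrete distributions and checks numerically that it approaches KL(p‖q) as α → 1; the function names are illustrative, not from the paper's BOTorch implementation.

```python
import numpy as np

def alpha_divergence(p, q, alpha):
    """Amari alpha-divergence D_alpha(p || q) for discrete distributions.

    D_alpha(p || q) = (1 - sum_i p_i^alpha * q_i^(1-alpha)) / (alpha * (1 - alpha))
    As alpha -> 1 it converges to KL(p || q); as alpha -> 0, to KL(q || p).
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return (1.0 - np.sum(p**alpha * q**(1.0 - alpha))) / (alpha * (1.0 - alpha))

def kl_divergence(p, q):
    """Standard Kullback-Leibler divergence KL(p || q)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return np.sum(p * np.log(p / q))

# Two example discrete distributions over three outcomes.
p = np.array([0.6, 0.3, 0.1])
q = np.array([0.4, 0.4, 0.2])

# Near alpha = 1 the alpha-divergence is numerically close to KL(p || q);
# other alpha values give different, but still non-negative, divergences.
print(alpha_divergence(p, q, 0.999), kl_divergence(p, q))
print(alpha_divergence(p, q, 0.5))
```

Varying α here changes how heavily large local ratios p/q are penalized, which is the "local vs. global differences" trade-off the abstract attributes to the free parameter.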
dc.language.iso    en-GB    es_ES
dc.source    Journal: Knowledge-Based Systems, Period: 1, Volume: online, Issue: , Initial page: 113612-1, Final page: 113612-24    es_ES
dc.subject.other    Instituto de Investigación Tecnológica (IIT)    es_ES
dc.title    Alpha entropy search for new information-based Bayesian optimization    es_ES
dc.type    info:eu-repo/semantics/article    es_ES
dc.description.version    info:eu-repo/semantics/publishedVersion    es_ES
dc.rights.accessRights    info:eu-repo/semantics/openAccess    es_ES
dc.keywords    Bayesian optimization; Information theory; Entropy search; Alpha-divergence    en-GB
Appears in collections: Articles

Files in this item:
File    Description    Size    Format
IIT-25-147R    6.34 MB    Unknown    View/Open
IIT-25-147R_preview    3.66 kB    Unknown    View/Open


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.