Preprint / Working Paper. Year: 2024

Tighter Generalisation Bounds via Interpolation

Abstract

This paper provides a recipe for deriving new PAC-Bayes generalisation bounds based on the $(f, \Gamma)$-divergence, and presents PAC-Bayes generalisation bounds that interpolate between a series of probability divergences (including, but not limited to, KL, Wasserstein, and total variation), making the best of many worlds depending on the properties of the posterior distribution. We explore the tightness of these bounds and connect them to earlier results from statistical learning, which arise as special cases. We also instantiate our bounds as training objectives, yielding non-trivial guarantees and practical performance.
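As standard background only (this is the classical KL-based result that such interpolated bounds refine, not one of the bounds derived in this preprint): for any prior $P$ chosen independently of the data, with probability at least $1-\delta$ over an i.i.d. sample $S$ of size $n$, simultaneously for all posteriors $Q$,
$$\mathbb{E}_{h \sim Q}\big[R(h)\big] \;\le\; \mathbb{E}_{h \sim Q}\big[\widehat{R}_S(h)\big] + \sqrt{\frac{\mathrm{KL}(Q \,\|\, P) + \ln\frac{2\sqrt{n}}{\delta}}{2n}},$$
where $R$ and $\widehat{R}_S$ denote the population and empirical risks of a $[0,1]$-valued loss. Wasserstein- and total-variation-based PAC-Bayes bounds replace the $\mathrm{KL}(Q \,\|\, P)$ term with the corresponding divergence; as described in the abstract, the $(f, \Gamma)$-divergence framework allows interpolating between these regimes.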

Dates and versions

hal-04456925, version 1 (14-02-2024)

Licence

Attribution (CC BY)

Identifiers

HAL Id: hal-04456925

Cite

Paul Viallard, Maxime Haddouche, Umut Şimşekli, Benjamin Guedj. Tighter Generalisation Bounds via Interpolation. 2024. ⟨hal-04456925⟩