ESAIM: ProcS
Volume 71, 2021
FGS'2019 - 19th French-German-Swiss conference on Optimization
Page(s): 101-113
DOI: https://doi.org/10.1051/proc/202171109
Published online: 01 September 2021
Box-constrained optimization for minimax supervised learning*,**
1 University of Côte d'Azur, CNRS, I3S laboratory, Sophia-Antipolis, France
2 University of Côte d'Azur, CNRS, IPMC laboratory, Sophia-Antipolis, France
3 University of Côte d'Azur, CNRS, I3S laboratory, Sophia-Antipolis, France
In this paper, we present the optimization procedure for computing the discrete box-constrained minimax classifier introduced in [1, 2]. Our approach processes discrete, or beforehand-discretized, features. A box-constrained region defines bounds on each class proportion independently. The box-constrained minimax classifier is obtained by computing the least favorable prior, i.e., the prior which maximizes the minimum empirical risk of error over the box-constrained region. After studying the discrete empirical Bayes risk over the probabilistic simplex, we consider a projected subgradient algorithm which computes the prior maximizing this concave, multivariate, piecewise-affine function over a polyhedral domain. The convergence of our algorithm is established.
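The core computation described above — maximizing a concave piecewise-affine function over the box-constrained simplex by projected subgradient ascent — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: it assumes the minimum empirical risk can be written as V(π) = min_k (A[k]·π + b[k]) over finitely many affine pieces, and all names (`project_box_simplex`, `maximize_piecewise_affine`, the 1/√t step size) are assumptions introduced here for illustration.

```python
import numpy as np

def project_box_simplex(y, lo, hi, tol=1e-10):
    # Euclidean projection onto {pi : sum(pi) = 1, lo <= pi <= hi}.
    # The projection has the form pi_i(tau) = clip(y_i - tau, lo_i, hi_i),
    # whose sum is nonincreasing in tau, so tau is found by bisection.
    assert lo.sum() <= 1.0 <= hi.sum(), "box must intersect the simplex"
    a, b = (y - hi).min(), (y - lo).max()   # sum >= 1 at a, sum <= 1 at b
    while b - a > tol:
        tau = 0.5 * (a + b)
        if np.clip(y - tau, lo, hi).sum() > 1.0:
            a = tau
        else:
            b = tau
    return np.clip(y - 0.5 * (a + b), lo, hi)

def maximize_piecewise_affine(A, b, lo, hi, iters=500):
    # Projected subgradient ascent for V(pi) = min_k (A[k] @ pi + b[k]),
    # a concave piecewise-affine function: a subgradient at pi is the
    # gradient A[k*] of any active (minimizing) piece k*.
    n = A.shape[1]
    pi = project_box_simplex(np.full(n, 1.0 / n), lo, hi)
    best_pi, best_v = pi, np.min(A @ pi + b)
    for t in range(1, iters + 1):
        k = np.argmin(A @ pi + b)                 # active affine piece
        pi = project_box_simplex(pi + A[k] / np.sqrt(t), lo, hi)
        v = np.min(A @ pi + b)
        if v > best_v:                            # track the best iterate
            best_pi, best_v = pi, v
    return best_pi, best_v
```

For example, with V(π) = min(π₁, π₂) and the box 0.1 ≤ π₁ ≤ 0.3, 0.6 ≤ π₂ ≤ 0.9, the maximizer is π = (0.3, 0.7) with value 0.3, the upper bound on π₁ being the binding constraint.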
Résumé
We present in this article the optimization problem involved in computing a box-constrained minimax classifier, together with the algorithm for calibrating this classifier, which we introduced in [1, 2]. Our approach considers discrete, or beforehand-discretized, descriptive features. The box constraints define bounds on each class proportion. The classifier is calibrated by computing the prior distribution which maximizes the minimum risk of error over the box-constrained simplex. After showing that this minimum risk of error is a concave piecewise-affine function with a finite number of facets over the simplex, we consider a projected subgradient algorithm to compute the prior distribution maximizing this discrete Bayes risk over a polyhedral domain. The convergence of the algorithm is proved.
© The authors. Published by EDP Sciences, SMAI 2021
This is an Open Access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.