Chair of Artificial Intelligence and Machine Learning

Forward Regression for Online Algorithm Selection (BA/MA)

Topic for a bachelor's or master's thesis

Short Description:

Ridge regression is a standard approach for estimating the coefficients of a linear model when the features are highly correlated [1]. It has been used extensively in online learning settings, as it comes with a closed-form solution for the coefficient estimates, which in turn allows efficient coefficient updates once new data arrives. One practically relevant example of an online learning setting is the online algorithm selection (OAS) problem, in which instances of an algorithmic problem class are presented to an algorithm selector one after another, and the algorithm selector has to quickly select a presumably best algorithm from a fixed set of candidate algorithms [2].

In many online prediction tasks, such as OAS, the feature for which a prediction of the label(s) is to be made is observed before the prediction has to be made. This has motivated adaptations of ridge regression, which only takes previously seen features into account, such that the current feature is considered for the prediction as well. One such adaptation is forward regression [3,4], for which recent work has shown theoretically that it can be advantageous over ridge regression in specific online learning settings [5]. These theoretical results were corroborated by a simulation study on synthetic data sets.

The goal of the thesis is to investigate whether forward regression leads to superior results over ridge regression in OAS problems. To this end, the ridge regression underlying the various algorithm selectors proposed in [6] will be replaced by the forward algorithm, and the resulting algorithm selectors will be evaluated on the ASlib benchmark.
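
To make the difference between the two approaches concrete, the following minimal Python sketch contrasts online ridge regression with the forward algorithm (the Vovk-Azoury-Warmuth forecaster [3,4]). It is an illustration under our own assumptions: the class names and the synthetic-data demonstration are hypothetical and not taken from the algorithm selectors of [6]. Both methods maintain the regularized Gram matrix and the target-weighted feature sum; the forward variant additionally folds the current feature into the Gram matrix before predicting.

    import numpy as np

    # Both methods maintain A_t = lam * I + sum_{s<t} x_s x_s^T and
    # b_t = sum_{s<t} y_s x_s. Ridge predicts x_t^T A_t^{-1} b_t; the
    # forward algorithm adds x_t x_t^T to A_t *before* predicting.
    # Illustrative sketch only, not the implementation used in [6].

    class OnlineRidge:
        def __init__(self, d, lam=1.0):
            self.A_inv = np.eye(d) / lam  # A_t^{-1}, maintained incrementally
            self.b = np.zeros(d)          # b_t

        def predict(self, x):
            # Closed-form coefficient estimate theta_t = A_t^{-1} b_t
            return x @ self.A_inv @ self.b

        def update(self, x, y):
            # Rank-one (Sherman-Morrison) update of A^{-1} after observing (x, y):
            # (A + x x^T)^{-1} = A^{-1} - A^{-1} x x^T A^{-1} / (1 + x^T A^{-1} x)
            Ax = self.A_inv @ x
            self.A_inv -= np.outer(Ax, Ax) / (1.0 + x @ Ax)
            self.b += y * x

    class OnlineForward(OnlineRidge):
        def predict(self, x):
            # Fold the *current* feature into the Gram matrix before
            # predicting, without mutating the stored state.
            Ax = self.A_inv @ x
            A_inv_fwd = self.A_inv - np.outer(Ax, Ax) / (1.0 + x @ Ax)
            return x @ A_inv_fwd @ self.b

    # Tiny demonstration on synthetic data: predict, then update.
    rng = np.random.default_rng(0)
    d = 5
    theta = rng.normal(size=d)
    ridge, forward = OnlineRidge(d), OnlineForward(d)
    cum = np.zeros(2)
    for t in range(1000):
        x = rng.normal(size=d)
        y = x @ theta + 0.1 * rng.normal()
        cum += [(m.predict(x) - y) ** 2 for m in (ridge, forward)]
        for m in (ridge, forward):
            m.update(x, y)
    print("cumulative squared loss (ridge, forward):", cum)

Note that the forward prediction only requires one additional rank-one correction per round, so both methods have the same per-round cost of O(d^2) when the inverse Gram matrix is maintained incrementally.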

Prerequisites

Solid background in machine learning, especially supervised learning (e.g., classification, regression) and experimental evaluation; solid programming skills.

Contact

Dr. Viktor Bengs or Prof. Eyke Hüllermeier

References

  • [1] A.E. Hoerl and R.W. Kennard. Ridge regression: biased estimation for nonorthogonal problems. Technometrics, 12(1): 55–67, 1970.
  • [2] H. Degroote, P. De Causmaecker, B. Bischl, and L. Kotthoff. A Regression-Based Methodology for Online Algorithm Selection. In Proceedings of the Eleventh International Symposium on Combinatorial Search (SOCS), 37–45, 2018.
  • [3] V. Vovk. Competitive on-line statistics. International Statistical Review, 69(2): 213–248, 2001.
  • [4] K.S. Azoury and M.K. Warmuth. Relative loss bounds for on-line density estimation with the exponential family of distributions. Machine Learning, 43(3): 211–246, 2001.
  • [5] R. Ouhamma, O.-A. Maillard, and V. Perchet. Stochastic Online Linear Regression: The Forward Algorithm to Replace Ridge. In Advances in Neural Information Processing Systems 34, 24430–24441, 2021.
  • [6] A. Tornede, V. Bengs, and E. Hüllermeier. Machine Learning for Online Algorithm Selection under Censored Feedback. In Proceedings of the AAAI Conference on Artificial Intelligence, 36(9): 10370–10380, 2022.