Machine Learning in Non-Stationary Environments: Introduction to Covariate Shift Adaptation

Masashi Sugiyama and Motoaki Kawanabe

Print publication date: 2012

Print ISBN-13: 9780262017091

Published to MIT Press Scholarship Online: September 2013

DOI: 10.7551/mitpress/9780262017091.001.0001


3 Model Selection
Machine Learning in Non-Stationary Environments

Masashi Sugiyama

Motoaki Kawanabe

The MIT Press

This chapter addresses the problem of model selection. The success of machine learning techniques depends heavily on the choice of hyperparameters such as the basis functions, the kernel bandwidth, the regularization parameter, and the importance-flattening parameter; model selection is therefore one of the most fundamental topics in machine learning. Standard model selection schemes such as the Akaike information criterion, cross-validation, and the subspace information criterion are each theoretically justified in terms of their unbiasedness as generalization error estimators. However, such guarantees are no longer valid under covariate shift. The chapter introduces modified variants of these criteria based on importance-weighting techniques, and shows that the modified methods remain unbiased even under covariate shift. The usefulness of the modified model selection criteria is illustrated through numerical experiments.
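The importance-weighted criteria mentioned above share one idea: each held-out loss term is reweighted by the importance ratio w(x) = p_test(x) / p_train(x), so that the validation score estimates the test-distribution generalization error. As a rough illustration only (not the book's implementation), here is a minimal importance-weighted cross-validation sketch in Python; the function names, the least-squares toy model, and the placeholder uniform weights are all assumptions for the example:

```python
import numpy as np

def importance_weighted_cv(X, y, weights, fit, loss, n_folds=5, seed=0):
    """k-fold cross-validation in which each held-out loss is weighted
    by the importance w(x) = p_test(x) / p_train(x)."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    folds = np.array_split(idx, n_folds)
    fold_scores = []
    for k in range(n_folds):
        val = folds[k]
        tr = np.concatenate([folds[j] for j in range(n_folds) if j != k])
        predict = fit(X[tr], y[tr])                # train on the other folds
        per_point = loss(y[val], predict(X[val]))  # held-out pointwise losses
        fold_scores.append(np.mean(weights[val] * per_point))
    return float(np.mean(fold_scores))

# Toy model: ordinary least squares with an intercept (illustrative choice).
def fit_linear(X_tr, y_tr):
    A = np.hstack([X_tr, np.ones((len(X_tr), 1))])
    coef, *_ = np.linalg.lstsq(A, y_tr, rcond=None)
    return lambda X_new: np.hstack([X_new, np.ones((len(X_new), 1))]) @ coef

sq_loss = lambda y_true, y_pred: (y_true - y_pred) ** 2

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=100)
# Placeholder uniform weights; in practice w is the (estimated) density ratio.
w = np.ones(len(X))
score = importance_weighted_cv(X, y, w, fit_linear, sq_loss)
```

In a model selection loop, one would compute this score for each candidate model (e.g., each regularization parameter value) and pick the minimizer; with weights equal to the true density ratio, the score is an unbiased estimator of the generalization error under covariate shift.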

Keywords:   machine learning, model selection, importance-weighting techniques, Akaike information criterion, covariate shift
