Function Approximation
This chapter discusses function learning methods under covariate shift. Ordinary empirical risk minimization is not consistent under covariate shift when the model is misspecified; this inconsistency can be resolved by using importance-weighted loss functions. Various importance-weighted empirical risk minimization methods are introduced, including least squares and Huber’s method for regression, and Fisher discriminant analysis, logistic regression, support vector machines, and boosting for classification. Their adaptive and regularized variants are also described. The numerical behavior of these importance-weighted learning methods is illustrated through experiments.
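To make the core idea concrete, the following is a minimal sketch (not taken from the book) of importance-weighted least squares for a misspecified linear model under covariate shift. The toy setup, including the training/test densities and the true regression function, is assumed for illustration; the importance weights w(x) = p_te(x)/p_tr(x) are computed from the known densities here, whereas in practice they must be estimated.

```python
import numpy as np

# Minimal sketch of importance-weighted least squares (IWLS).
# Assumed toy setup: a misspecified straight-line model fitted to data whose
# training input density p_tr(x) differs from the test density p_te(x).
rng = np.random.default_rng(0)

def f(x):
    return np.sinc(x)  # true (nonlinear) regression function; linear model is misspecified

# Covariate shift: training and test inputs drawn from different Gaussians.
x_tr = rng.normal(1.0, 0.5, size=200)
y_tr = f(x_tr) + 0.1 * rng.normal(size=x_tr.size)
x_te = rng.normal(2.0, 0.3, size=1000)

def gauss_pdf(x, mu, sigma):
    return np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

# Importance weights w(x) = p_te(x) / p_tr(x); known here, estimated in practice.
w = gauss_pdf(x_tr, 2.0, 0.3) / gauss_pdf(x_tr, 1.0, 0.5)

# Misspecified model: f_hat(x) = a * x + b.
Phi = np.stack([x_tr, np.ones_like(x_tr)], axis=1)

# Ordinary least squares vs. importance-weighted least squares
# (weighting each sample by sqrt(w) reproduces the weighted squared loss).
theta_ols = np.linalg.lstsq(Phi, y_tr, rcond=None)[0]
sw = np.sqrt(w)
theta_iwls = np.linalg.lstsq(Phi * sw[:, None], y_tr * sw, rcond=None)[0]

# IWLS should yield a smaller prediction error over the test input region.
for name, theta in [("OLS", theta_ols), ("IWLS", theta_iwls)]:
    pred = theta[0] * x_te + theta[1]
    print(name, "test MSE:", np.mean((pred - f(x_te)) ** 2))
```

Because the linear model cannot represent the true function everywhere, ordinary least squares fits the training region well but extrapolates poorly to the shifted test region; reweighting the loss by w(x) steers the fit toward the region where test inputs concentrate.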
Keywords: covariate shift adaptation, misspecified models, importance-weighted empirical risk minimization methods, least squares regression, Huber regression, Fisher discriminant analysis, logistic regression, support vector machines, boosting