Semi-Supervised Learning

Olivier Chapelle, Bernhard Schölkopf, and Alexander Zien

Print publication date: 2006

Print ISBN-13: 9780262033589

Published to MIT Press Scholarship Online: August 2013

DOI: 10.7551/mitpress/9780262033589.001.0001


Data-Dependent Regularization

Chapter:
(p.169) 10 Data-Dependent Regularization
Source:
Semi-Supervised Learning
Author(s):

Adrian Corduneanu

Tommi Jaakkola

Publisher:
The MIT Press
DOI: 10.7551/mitpress/9780262033589.003.0010

This chapter considers two ways of representing the topology over examples: either from complete knowledge of the marginal density, or by grouping together examples whose labels should be related. The learning algorithms and sample-complexity issues that result from each representation are discussed. Information regularization is a principle for assigning labels to unlabeled points in a semi-supervised setting. The broader principle is to find labels that minimize the information induced between examples and labels relative to a topology over the examples: any label variation within a small local region of examples ties together the identities of the examples and their labels, and such variation should be minimized unless it is supported, directly or indirectly, by the available labeled examples.
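
To make the principle concrete, here is a schematic form of an information-regularization objective. The notation is illustrative (a region cover \mathcal{R}, a trade-off weight \lambda, and conditionals Q(y | x)) and is not necessarily the chapter's exact statement. Given a cover of the example space by small overlapping regions R, with masses p(R) and within-region densities p(x | R) obtained from the marginal, each region is charged for the mutual information between an example drawn from it and its label:

\[
J(Q) \;=\; \lambda \sum_{R \in \mathcal{R}} p(R)\, I_R(x; y),
\qquad
I_R(x; y) \;=\; \sum_{x \in R} p(x \mid R)\,
\mathrm{KL}\!\left( Q(y \mid x) \,\Vert\, Q_R(y) \right),
\qquad
Q_R(y) \;=\; \sum_{x \in R} p(x \mid R)\, Q(y \mid x).
\]

Labels for unlabeled points are then obtained by minimizing J(Q) over the conditionals Q(y | x), with the conditionals at labeled points fixed at (or tied to) the observed labels. A labeling that is constant within every region incurs zero cost, so label variation inside a small region is penalized unless it is forced, directly or indirectly, by the labeled data.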

Keywords: topology over examples, marginal density, learning algorithms, sample complexity issues, information regularization, unlabeled data points
