Semi-Supervised Learning

Olivier Chapelle, Bernhard Schölkopf, and Alexander Zien

Print publication date: 2006

Print ISBN-13: 9780262033589

Published to MIT Press Scholarship Online: August 2013

DOI: 10.7551/mitpress/9780262033589.001.0001

Large-Scale Algorithms

Chapter:
18 Large-Scale Algorithms
Source:
Semi-Supervised Learning
Author(s):

Olivier Delalleau

Yoshua Bengio

Nicolas Le Roux

Publisher:
The MIT Press
DOI: 10.7551/mitpress/9780262033589.003.0018

This chapter presents a subset selection method that can be used to reduce the original system to one of size m ≪ n. The idea is to solve for the labels of a subset S ⊂ X of only m points, while still retaining information from the rest of the data by approximating their labels with a linear combination of the labels in S, using the induction formula presented in Chapter 11. This leads to an algorithm whose computational requirements scale as O(m²n) and memory requirements as O(m²), thus allowing one to take advantage of significantly bigger unlabeled data sets than with the original algorithms.
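
The induction step summarized above can be sketched as follows. This is a minimal illustration assuming a Gaussian affinity between points; the function names, the sigma parameter, and the use of NumPy are illustrative choices, not the chapter's own code. In the full algorithm, the m labels of S would first be obtained by solving a reduced m × m linear system (the source of the O(m²n) time and O(m²) memory costs); the remaining n − m labels are then induced as a weighted linear combination of those m labels.

import numpy as np

def rbf_affinity(A, B, sigma=1.0):
    """Gaussian affinities W(a, b) = exp(-||a - b||^2 / (2 * sigma^2))."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def induce_labels(X_rest, X_subset, y_subset, sigma=1.0):
    """Approximate each remaining point's label as a normalized linear
    combination of the m subset labels (a Nadaraya-Watson-style induction).
    The weight matrix here is (n - m) x m; processing X_rest in batches
    keeps memory proportional to m per batch."""
    W = rbf_affinity(X_rest, X_subset, sigma)
    return (W @ y_subset) / W.sum(axis=1)

# Hypothetical usage: y_subset comes from solving the reduced system on S.
# y_rest = induce_labels(X[~subset_mask], X[subset_mask], y_subset)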

Keywords: subset selection method, linear combination, induction formula, algorithm, computational requirements, memory requirements, unlabeled data sets
