 Title Pages
 Series Foreword
 Preface

1 Introduction to Semi-Supervised Learning
2 A Taxonomy for Semi-Supervised Learning Methods
3 Semi-Supervised Text Classification Using EM
4 Risks of Semi-Supervised Learning: How Unlabeled Data Can Degrade Performance of Generative Classifiers
5 Probabilistic Semi-Supervised Clustering with Constraints
6 Transductive Support Vector Machines
7 Semi-Supervised Learning Using Semi-Definite Programming
8 Gaussian Processes and the Null-Category Noise Model
9 Entropy Regularization
10 Data-Dependent Regularization
11 Label Propagation and Quadratic Criterion
12 The Geometric Basis of Semi-Supervised Learning
13 Discrete Regularization
14 Semi-Supervised Learning with Conditional Harmonic Mixing
15 Graph Kernels by Spectral Transforms
16 Spectral Methods for Dimensionality Reduction
17 Modifying Distances
18 Large-Scale Algorithms
19 Semi-Supervised Protein Classification Using Cluster Kernels
20 Prediction of Protein Function from Networks
21 Analysis of Benchmarks
22 An Augmented PAC Model for Semi-Supervised Learning
23 Metric-Based Approaches for Semi-Supervised Regression and Classification
24 Transductive Inference and Semi-Supervised Learning
25 A Discussion of Semi-Supervised Learning and Transduction
 References
 Notation and Symbols
 Contributors
 Index
Graph Kernels by Spectral Transforms
 Chapter:
 15 Graph Kernels by Spectral Transforms
 Source:
 Semi-Supervised Learning
 Author(s):
Xiaojin Zhu
Jaz Kandola
John Lafferty
Zoubin Ghahramani
 Publisher:
 The MIT Press
This chapter develops an approach to searching over a nonparametric family of spectral transforms, using convex optimization to maximize the alignment of the kernel to the labeled data. Order constraints are imposed to encode a preference for smoothness with respect to the graph structure. The result is a flexible family of kernels that is more data-driven than the standard parametric spectral transforms. The approach is formulated as a quadratically constrained quadratic program (QCQP) and is computationally practical for large data sets. Many graph-based semi-supervised learning methods can be viewed as imposing smoothness conditions on the target function with respect to a graph representing the data points to be labeled. The smoothness properties of the functions are encoded in terms of Mercer kernels over the graph. The central quantity in such regularization is the spectral decomposition of the graph Laplacian, a matrix derived from the graph’s edge weights.
Keywords: nonparametric family, spectral transforms, convex optimization, quadratically constrained quadratic program, QCQP, smoothness conditions, Mercer kernels, graph Laplacian
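The ingredients named in the abstract — the graph Laplacian, its spectral decomposition, a spectral transform producing a kernel, and kernel alignment to labels — can be sketched concretely. The following is a minimal illustration, not the chapter's method or experiments: the 4-node graph, the labels, and the diffusion-kernel transform are all hypothetical choices made only to show the pipeline.

```python
import numpy as np

# A toy 4-node graph with symmetric, non-negative edge weights
# (hypothetical data, chosen only for illustration).
W = np.array([[0., 1., 1., 0.],
              [1., 0., 1., 0.],
              [1., 1., 0., 1.],
              [0., 0., 1., 0.]])

# Combinatorial graph Laplacian L = D - W, with D the diagonal degree matrix.
L = np.diag(W.sum(axis=1)) - W

# Spectral decomposition L = sum_i lam_i * phi_i phi_i^T.
# np.linalg.eigh returns eigenvalues in ascending order:
# small lam_i correspond to eigenvectors that are smooth over the graph.
lam, Phi = np.linalg.eigh(L)

# A parametric spectral transform r(.) turns the Laplacian spectrum into a
# graph kernel K = sum_i r(lam_i) * phi_i phi_i^T; a decreasing r encodes a
# preference for smoothness. Here, as one standard example, the diffusion
# kernel r(lam) = exp(-sigma2/2 * lam) with an arbitrary bandwidth sigma2.
sigma2 = 1.0
K = (Phi * np.exp(-sigma2 / 2.0 * lam)) @ Phi.T

# Empirical kernel alignment to labels y (here all four points are treated
# as labeled): A(K, yy^T) = <K, yy^T>_F / (||K||_F * ||yy^T||_F).
y = np.array([1., 1., -1., -1.])
T = np.outer(y, y)
alignment = np.sum(K * T) / (np.linalg.norm(K) * np.linalg.norm(T))
```

The chapter's nonparametric approach replaces the fixed transform r(lam_i) with free spectral weights, required by the order constraints to be non-increasing across the ascending Laplacian eigenvalues, and maximizes this alignment over those weights by solving a QCQP.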