Learning with Invariance via Linear Functionals on Reproducing Kernel Hilbert Space

Xinhua Zhang (NICTA)

NICTA SML SEMINAR

DATE: 2013-08-15
TIME: 11:15:00 - 12:15:00
LOCATION: NICTA - 7 London Circuit

ABSTRACT:
Incorporating invariance information is important for many learning problems. To exploit invariances, most existing methods resort to approximations that either lead to expensive optimization problems such as semi-definite programming, or rely on separation oracles to retain tractability. Some methods further limit the space of functions and settle for non-convex models. In this paper, we propose a framework for learning in reproducing kernel Hilbert spaces (RKHS) using local invariances that explicitly characterize the behaviour of the target function around data instances. These invariances are compactly encoded as linear functionals whose values are penalized by some loss function. Based on a representer theorem that we establish, our formulation can be efficiently optimized via a convex program. For the representer theorem to hold, the linear functionals are required to be bounded in the RKHS, and we show that this is the case for a variety of commonly used RKHSs and invariances. Experiments on learning with unlabelled data and with transform invariances show that the proposed method matches or improves on the state of the art.
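As a rough illustration of the general idea (a sketch, not the paper's algorithm): take the Gaussian kernel on the real line and, as the local invariance, penalize the derivative of the learned function at each training point. The derivative evaluation f ↦ f'(x_j) is a bounded linear functional on this RKHS, so by the representer theorem f = Σ_i α_i k(x_i, ·), and both the data-fit term and the functional penalty are quadratic in α, giving a convex problem. The kernel width, the simple ridge penalty on the coefficients (used here to keep the linear system well conditioned), and the squared loss are all assumptions of this sketch.

```python
import numpy as np

def rbf(x, z, sigma=0.2):
    # Gaussian kernel matrix K[i, j] = exp(-(x_i - z_j)^2 / (2 sigma^2))
    d = x[:, None] - z[None, :]
    return np.exp(-d**2 / (2 * sigma**2))

def rbf_grad(x, z, sigma=0.2):
    # D[i, j] = d/dt k(t, z_j) evaluated at t = x_i, so (D @ alpha)[i] = f'(x_i)
    d = x[:, None] - z[None, :]
    return -(d / sigma**2) * rbf(x, z, sigma)

def fit(x, y, lam=1e-4, mu=0.0, sigma=0.2):
    # Minimize ||K a - y||^2 + lam ||a||^2 + mu ||D a||^2 over coefficients a.
    # The mu-term penalizes the linear functionals f'(x_j); everything is
    # quadratic in a, so the optimum solves one linear system.
    K = rbf(x, x, sigma)
    D = rbf_grad(x, x, sigma)
    A = K @ K + lam * np.eye(len(x)) + mu * D.T @ D
    return np.linalg.solve(A, K @ y)

x = np.linspace(0.0, 1.0, 20)
y = np.sin(2 * np.pi * x)
D = rbf_grad(x, x)
a_plain = fit(x, y, mu=0.0)   # ordinary kernel ridge regression
a_inv = fit(x, y, mu=1.0)     # with the derivative (invariance) penalty
# Penalizing the functionals shrinks |f'| at the data points:
print(np.linalg.norm(D @ a_inv) <= np.linalg.norm(D @ a_plain))  # True
```

The same template covers other invariances: any bounded linear functional (a directional derivative along a transform orbit, a difference of evaluations at an instance and its transformed copy, a local average) just contributes another matrix like D above, leaving the problem convex.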
BIO:
http://users.cecs.anu.edu.au/~xzhang/



Updated: 13 August 2013