Active Learning With Multiple Kernels

Songnam Hong, Jeongmin Chae

Research output: Contribution to journal › Article › peer-review


Online multiple kernel learning (OMKL) has achieved attractive performance in nonlinear function learning tasks. Leveraging a random feature (RF) approximation, the major drawback of OMKL, known as the curse of dimensionality, has recently been alleviated, making RF-based OMKL practical. In this article, we introduce a new research problem, named stream-based active MKL (AMKL), in which a learner is allowed to request labels for selected data from an oracle according to a selection criterion. This is necessary in many real-world applications, where acquiring a true label is costly or time-consuming. We theoretically prove that the proposed AMKL achieves an optimal sublinear regret O(√T), as in OMKL, with little labeled data, implying that the proposed selection criterion indeed avoids unnecessary label requests. Furthermore, we present AMKL with adaptive kernel selection (AMKL-AKS), in which irrelevant kernels can be excluded from the kernel dictionary "on the fly." This approach improves both the efficiency of active learning and the accuracy of function learning. Via numerical tests with real data sets, we verify the superiority of AMKL-AKS, which attains accuracy similar to its OMKL counterpart while using fewer labeled data.
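To make the ingredients of the abstract concrete, the following is a minimal toy sketch of RF-based online learning over a dictionary of RBF kernels, combined with multiplicative (Hedge-style) kernel weights and a simple active-labeling rule. The disagreement threshold `delta`, the warm-start budget, the toy target function, and all parameter values are illustrative assumptions, not the paper's actual selection criterion or algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def rf_map(x, omega):
    # Random Fourier features approximating an RBF kernel:
    # z(x) = [cos(omega x), sin(omega x)] / sqrt(D)
    proj = x @ omega.T
    D = omega.shape[0]
    return np.concatenate([np.cos(proj), np.sin(proj)]) / np.sqrt(D)

# Kernel dictionary: P candidate RBF bandwidths (hypothetical values).
bandwidths = [0.5, 1.0, 2.0]
d, D, P = 2, 50, len(bandwidths)
omegas = [rng.normal(scale=1.0 / s, size=(D, d)) for s in bandwidths]

thetas = [np.zeros(2 * D) for _ in range(P)]  # per-kernel weights in RF space
w = np.ones(P)                                # Hedge weights over kernels
eta, lam = 0.5, 1.0                           # gradient step, Hedge rate (assumed)
delta = 0.1                                   # disagreement threshold (assumed)

labels_used, cum_loss = 0, 0.0
for t in range(200):
    x = rng.normal(size=d)
    y = np.sin(x[0]) + 0.1 * rng.normal()     # toy target with noise
    preds = np.array([rf_map(x, om) @ th for om, th in zip(omegas, thetas)])
    p = w / w.sum()
    yhat = p @ preds                          # combined multi-kernel prediction
    cum_loss += (yhat - y) ** 2
    # Query the oracle only during a short warm start, or when the kernels'
    # predictions disagree -- a proxy for an informative-sample criterion.
    if t < 10 or preds.max() - preds.min() > delta:
        labels_used += 1
        for i in range(P):
            z = rf_map(x, omegas[i])
            thetas[i] -= eta * 2 * (preds[i] - y) * z       # squared-loss step
            w[i] *= np.exp(-lam * (preds[i] - y) ** 2)      # Hedge update
    # AMKL-AKS would additionally prune kernels whose weight p_i becomes
    # negligible; that pruning step is omitted in this sketch.

print(labels_used)
```

Because labels are requested only on disagreement, `labels_used` is typically well below the 200 arriving samples, mirroring the abstract's claim that the selection criterion avoids unnecessary label requests.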


  • Active learning (AL)
  • Biomedical imaging
  • Dictionaries
  • Kernel
  • Labeling
  • Optimization
  • Radio frequency
  • Task analysis
  • multiple kernel learning (MKL)
  • online learning
  • reproducing kernel Hilbert space (RKHS)
