2019

Selecting causal brain features with a single conditional independence test per feature

Mastakouri, A., Schölkopf, B., Janzing, D.

Advances in Neural Information Processing Systems 32, 33rd Annual Conference on Neural Information Processing Systems, December 2019 (conference) Accepted

[BibTex]

A Learnable Safety Measure

Heim, S., Rohr, A. V., Trimpe, S., Badri-Spröwitz, A.

Conference on Robot Learning, November 2019 (conference) Accepted

Arxiv [BibTex]

Neural Signatures of Motor Skill in the Resting Brain

Ozdenizci, O., Meyer, T., Wichmann, F., Peters, J., Schölkopf, B., Cetin, M., Grosse-Wentrup, M.

Proceedings of the IEEE International Conference on Systems, Man and Cybernetics (SMC 2019), October 2019 (conference) Accepted

[BibTex]

Predictive Triggering for Distributed Control of Resource Constrained Multi-agent Systems

Mastrangelo, J. M., Baumann, D., Trimpe, S.

In Proceedings of the 8th IFAC Workshop on Distributed Estimation and Control in Networked Systems, 8th IFAC Workshop on Distributed Estimation and Control in Networked Systems (NecSys), September 2019 (inproceedings) Accepted

arXiv PDF [BibTex]

Beta Power May Mediate the Effect of Gamma-TACS on Motor Performance

Mastakouri, A., Schölkopf, B., Grosse-Wentrup, M.

Engineering in Medicine and Biology Conference (EMBC), July 2019 (conference) Accepted

arXiv PDF [BibTex]

Coordinating Users of Shared Facilities via Data-driven Predictive Assistants and Game Theory

Geiger, P., Besserve, M., Winkelmann, J., Proissl, C., Schölkopf, B.

Proceedings of the 35th Conference on Uncertainty in Artificial Intelligence (UAI), pages: 49, (Editors: Amir Globerson and Ricardo Silva), AUAI Press, July 2019 (conference)

link (url) [BibTex]

Event-triggered Pulse Control with Model Learning (if Necessary)

Baumann, D., Solowjow, F., Johansson, K. H., Trimpe, S.

In Proceedings of the American Control Conference, pages: 792-797, American Control Conference (ACC), July 2019 (inproceedings)

arXiv PDF [BibTex]

The Sensitivity of Counterfactual Fairness to Unmeasured Confounding

Kilbertus, N., Ball, P. J., Kusner, M. J., Weller, A., Silva, R.

Proceedings of the 35th Conference on Uncertainty in Artificial Intelligence (UAI), pages: 213, (Editors: Amir Globerson and Ricardo Silva), AUAI Press, July 2019 (conference)

link (url) [BibTex]

The Incomplete Rosetta Stone problem: Identifiability results for Multi-view Nonlinear ICA

Gresele*, L., Rubenstein*, P. K., Mehrjou, A., Locatello, F., Schölkopf, B.

Proceedings of the 35th Conference on Uncertainty in Artificial Intelligence (UAI), pages: 53, (Editors: Amir Globerson and Ricardo Silva), AUAI Press, July 2019, *equal contribution (conference)

link (url) [BibTex]

Random Sum-Product Networks: A Simple and Effective Approach to Probabilistic Deep Learning

Peharz, R., Vergari, A., Stelzner, K., Molina, A., Shao, X., Trapp, M., Kersting, K., Ghahramani, Z.

Proceedings of the 35th Conference on Uncertainty in Artificial Intelligence (UAI), pages: 124, (Editors: Amir Globerson and Ricardo Silva), AUAI Press, July 2019 (conference)

link (url) [BibTex]

Data-driven inference of passivity properties via Gaussian process optimization

Romer, A., Trimpe, S., Allgöwer, F.

In Proceedings of the European Control Conference, European Control Conference (ECC), June 2019 (inproceedings) Accepted

PDF [BibTex]

Kernel Mean Matching for Content Addressability of GANs

Jitkrittum*, W., Sangkloy*, P., Gondal, M. W., Raj, A., Hays, J., Schölkopf, B.

Proceedings of the 36th International Conference on Machine Learning (ICML), 97, pages: 3140-3151, Proceedings of Machine Learning Research, (Editors: Chaudhuri, Kamalika and Salakhutdinov, Ruslan), PMLR, June 2019, *equal contribution (conference)

PDF link (url) [BibTex]

Challenging Common Assumptions in the Unsupervised Learning of Disentangled Representations

Locatello, F., Bauer, S., Lucic, M., Raetsch, G., Gelly, S., Schölkopf, B., Bachem, O.

Proceedings of the 36th International Conference on Machine Learning (ICML), 97, pages: 4114-4124, Proceedings of Machine Learning Research, (Editors: Chaudhuri, Kamalika and Salakhutdinov, Ruslan), PMLR, June 2019 (conference)

PDF link (url) [BibTex]

Local Temporal Bilinear Pooling for Fine-grained Action Parsing

Zhang, Y., Tang, S., Muandet, K., Jarvers, C., Neumann, H.

In Proceedings IEEE Conf. on Computer Vision and Pattern Recognition (CVPR), IEEE International Conference on Computer Vision and Pattern Recognition (CVPR) 2019, June 2019 (inproceedings)

Abstract
Fine-grained temporal action parsing is important in many applications, such as daily activity understanding, human motion analysis, surgical robotics, and other settings requiring subtle and precise operations over long periods. In this paper we propose a novel bilinear pooling operation, which is used in intermediate layers of a temporal convolutional encoder-decoder net. In contrast to other work, our proposed bilinear pooling is learnable and hence can capture more complex local statistics than the conventional counterpart. In addition, we introduce exact lower-dimensional representations of our bilinear forms, so that the dimensionality is reduced with neither information loss nor extra computation. We perform intensive experiments to quantitatively analyze our model and show superior performance compared with other state-of-the-art work on various datasets.
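
As a rough illustration of the pooling idea described above (not the paper's exact parameterization), the NumPy sketch below contrasts conventional second-order pooling over a temporal window with a learnable low-rank variant; the factor matrices U and V, the window length, and the feature dimension are made-up assumptions.

    import numpy as np

    def bilinear_pool(window):
        """Conventional bilinear pooling: average outer product over a temporal window.

        window: array of shape (T, D), one feature vector per frame.
        Returns a vector of length D*D (flattened second-order statistics).
        """
        T, _ = window.shape
        return (window.T @ window / T).reshape(-1)

    def low_rank_bilinear_pool(window, U, V):
        """Learnable low-rank variant: project each frame with U and V and average
        the element-wise product, giving an R-dimensional output. U and V would be
        learned by backpropagation inside the encoder-decoder (illustrative form)."""
        proj = (window @ U) * (window @ V)        # shape (T, R)
        return proj.mean(axis=0)

    rng = np.random.default_rng(0)
    X = rng.normal(size=(15, 64))                 # 15 frames, 64-dim features
    U, V = rng.normal(size=(64, 8)), rng.normal(size=(64, 8))
    print(bilinear_pool(X).shape)                 # (4096,)
    print(low_rank_bilinear_pool(X, U, V).shape)  # (8,)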

Code video demo pdf link (url) [BibTex]

Generate Semantically Similar Images with Kernel Mean Matching

Jitkrittum*, W., Sangkloy*, P., Gondal, M. W., Raj, A., Hays, J., Schölkopf, B.

6th Workshop Women in Computer Vision (WiCV) (oral presentation), June 2019, *equal contribution (conference) Accepted

[BibTex]

Trajectory-Based Off-Policy Deep Reinforcement Learning

Doerr, A., Volpp, M., Toussaint, M., Trimpe, S., Daniel, C.

In Proceedings of the International Conference on Machine Learning (ICML), International Conference on Machine Learning (ICML), June 2019 (inproceedings) Accepted

Abstract
Policy gradient methods are powerful reinforcement learning algorithms and have been demonstrated to solve many complex tasks. However, these methods are also data-inefficient, afflicted with high variance gradient estimates, and frequently get stuck in local optima. This work addresses these weaknesses by combining recent improvements in the reuse of off-policy data and exploration in parameter space with deterministic behavioral policies. The resulting objective is amenable to standard neural network optimization strategies like stochastic gradient descent or stochastic gradient Hamiltonian Monte Carlo. Incorporation of previous rollouts via importance sampling greatly improves data-efficiency, whilst stochastic optimization schemes facilitate the escape from local optima. We evaluate the proposed approach on a series of continuous control benchmark tasks. The results show that the proposed algorithm is able to successfully and reliably learn solutions using fewer system interactions than standard policy gradient methods.
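
The off-policy ingredient mentioned above, reusing stored rollouts via importance sampling, can be sketched generically as follows; the diagonal Gaussian policy, the trajectory dictionary layout, and the self-normalization are illustrative assumptions, not the authors' exact algorithm.

    import numpy as np

    def gaussian_log_prob(actions, means, std):
        """Log-probability of a rollout's actions under a diagonal Gaussian policy."""
        return -0.5 * np.sum(((actions - means) / std) ** 2
                             + 2 * np.log(std) + np.log(2 * np.pi))

    def off_policy_objective(trajectories, new_means_fn, std):
        """Importance-weighted (self-normalized) return estimate over stored rollouts."""
        weighted_return, weights = 0.0, []
        for traj in trajectories:
            new_lp = gaussian_log_prob(traj["actions"], new_means_fn(traj["states"]), std)
            w = np.exp(new_lp - traj["old_log_prob"])   # trajectory importance weight
            weights.append(w)
            weighted_return += w * traj["return"]
        return weighted_return / (np.sum(weights) + 1e-12)

    # Tiny demo with two fake rollouts collected under an older policy.
    rng = np.random.default_rng(0)
    std, old_policy, new_policy = 0.5, (lambda s: 0.0 * s), (lambda s: 0.1 * s)
    trajectories = []
    for _ in range(2):
        states = rng.normal(size=(20, 3))
        actions = old_policy(states) + std * rng.normal(size=(20, 3))
        trajectories.append({"states": states, "actions": actions,
                             "return": float(rng.uniform(0.0, 1.0)),
                             "old_log_prob": gaussian_log_prob(actions, old_policy(states), std)})
    print(off_policy_objective(trajectories, new_policy, std))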

arXiv PDF [BibTex]

Robustly Disentangled Causal Mechanisms: Validating Deep Representations for Interventional Robustness

Suter, R., Miladinovic, D., Schölkopf, B., Bauer, S.

Proceedings of the 36th International Conference on Machine Learning (ICML), 97, pages: 6056-6065, Proceedings of Machine Learning Research, (Editors: Chaudhuri, Kamalika and Salakhutdinov, Ruslan), PMLR, June 2019 (conference)

PDF link (url) [BibTex]

First-Order Adversarial Vulnerability of Neural Networks and Input Dimension

Simon-Gabriel, C., Ollivier, Y., Bottou, L., Schölkopf, B., Lopez-Paz, D.

Proceedings of the 36th International Conference on Machine Learning (ICML), 97, pages: 5809-5817, Proceedings of Machine Learning Research, (Editors: Chaudhuri, Kamalika and Salakhutdinov, Ruslan), PMLR, June 2019 (conference)

PDF link (url) [BibTex]

Overcoming Mean-Field Approximations in Recurrent Gaussian Process Models

Ialongo, A. D., Van Der Wilk, M., Hensman, J., Rasmussen, C. E.

In Proceedings of the 36th International Conference on Machine Learning (ICML), 97, pages: 2931-2940, Proceedings of Machine Learning Research, (Editors: Chaudhuri, Kamalika and Salakhutdinov, Ruslan), PMLR, June 2019 (inproceedings)

PDF link (url) [BibTex]

Meta learning variational inference for prediction

Gordon, J., Bronskill, J., Bauer, M., Nowozin, S., Turner, R.

7th International Conference on Learning Representations (ICLR), May 2019 (conference) Accepted

arXiv link (url) [BibTex]

Deep Lagrangian Networks: Using Physics as Model Prior for Deep Learning

Lutter, M., Ritter, C., Peters, J.

7th International Conference on Learning Representations (ICLR), May 2019 (conference) Accepted

link (url) [BibTex]

DeepOBS: A Deep Learning Optimizer Benchmark Suite

Schneider, F., Balles, L., Hennig, P.

7th International Conference on Learning Representations (ICLR), May 2019 (conference) Accepted

link (url) [BibTex]

Disentangled State Space Models: Unsupervised Learning of Dynamics across Heterogeneous Environments

Miladinović*, D., Gondal*, M. W., Schölkopf, B., Buhmann, J. M., Bauer, S.

Deep Generative Models for Highly Structured Data Workshop at ICLR, May 2019, *equal contribution (conference) Accepted

link (url) [BibTex]

SOM-VAE: Interpretable Discrete Representation Learning on Time Series

Fortuin, V., Hüser, M., Locatello, F., Strathmann, H., Rätsch, G.

7th International Conference on Learning Representations (ICLR), May 2019 (conference) Accepted

link (url) [BibTex]

Resampled Priors for Variational Autoencoders

Bauer, M., Mnih, A.

22nd International Conference on Artificial Intelligence and Statistics, April 2019 (conference) Accepted

arXiv [BibTex]

Semi-Generative Modelling: Covariate-Shift Adaptation with Cause and Effect Features

von Kügelgen, J., Mey, A., Loog, M.

Proceedings of the 22nd International Conference on Artificial Intelligence and Statistics (AISTATS), 89, pages: 1361-1369, (Editors: Kamalika Chaudhuri and Masashi Sugiyama), PMLR, April 2019 (conference)

PDF link (url) [BibTex]

Feedback Control Goes Wireless: Guaranteed Stability over Low-power Multi-hop Networks

(Best Paper Award)

Mager, F., Baumann, D., Jacob, R., Thiele, L., Trimpe, S., Zimmerling, M.

In Proceedings of the 10th ACM/IEEE International Conference on Cyber-Physical Systems, pages: 97-108, 10th ACM/IEEE International Conference on Cyber-Physical Systems, April 2019 (inproceedings)

Abstract
Closing feedback loops fast and over long distances is key to emerging applications; for example, robot motion control and swarm coordination require update intervals below 100 ms. Low-power wireless is preferred for its flexibility, low cost, and small form factor, especially if the devices support multi-hop communication. Thus far, however, closed-loop control over multi-hop low-power wireless has only been demonstrated for update intervals on the order of multiple seconds. This paper presents a wireless embedded system that tames imperfections impairing control performance such as jitter or packet loss, and a control design that exploits the essential properties of this system to provably guarantee closed-loop stability for linear dynamic systems. Using experiments on a testbed with multiple cart-pole systems, we are the first to demonstrate the feasibility and to assess the performance of closed-loop control and coordination over multi-hop low-power wireless for update intervals from 20 ms to 50 ms.
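
To make the setting concrete, here is a minimal simulation sketch of state-feedback control of a linear plant over a lossy link, where a dropped actuation packet means the actuator holds its previous command; the plant matrices, feedback gain, and loss probability are arbitrary illustrative values, not the paper's wireless protocol or controller design.

    import numpy as np

    # Illustrative discrete-time plant x_{k+1} = A x_k + B u_k, sampled every 20 ms.
    A = np.array([[1.0, 0.02], [0.0, 1.0]])
    B = np.array([[0.0], [0.02]])
    K = np.array([[10.0, 5.0]])      # a stabilizing state-feedback gain for this plant
    p_loss = 0.2                     # probability that an actuation packet is lost

    rng = np.random.default_rng(1)
    x = np.array([[1.0], [0.0]])
    u_applied = np.zeros((1, 1))

    for k in range(500):
        u_new = -K @ x
        if rng.random() > p_loss:    # packet delivered over the multi-hop network
            u_applied = u_new
        # else: actuator holds the previously received command
        x = A @ x + B @ u_applied

    print("final state norm:", float(np.linalg.norm(x)))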

arXiv PDF DOI Project Page [BibTex]

Sobolev Descent

Mroueh, Y., Sercu, T., Raj, A.

Proceedings of the 22nd International Conference on Artificial Intelligence and Statistics (AISTATS), 89, pages: 2976-2985, (Editors: Kamalika Chaudhuri and Masashi Sugiyama), PMLR, April 2019 (conference)

PDF link (url) [BibTex]

Fast and Robust Shortest Paths on Manifolds Learned from Data

Arvanitidis, G., Hauberg, S., Hennig, P., Schober, M.

Proceedings of the 22nd International Conference on Artificial Intelligence and Statistics (AISTATS), 89, pages: 1506-1515, (Editors: Kamalika Chaudhuri and Masashi Sugiyama), PMLR, April 2019 (conference)

PDF link (url) [BibTex]

Fast Gaussian Process Based Gradient Matching for Parameter Identification in Systems of Nonlinear ODEs

Wenk, P., Gotovos, A., Bauer, S., Gorbach, N., Krause, A., Buhmann, J. M.

Proceedings of the 22nd International Conference on Artificial Intelligence and Statistics (AISTATS), 89, pages: 1351-1360, (Editors: Kamalika Chaudhuri and Masashi Sugiyama), PMLR, April 2019 (conference)

PDF PDF link (url) [BibTex]

Active Probabilistic Inference on Matrices for Pre-Conditioning in Stochastic Optimization

de Roos, F., Hennig, P.

Proceedings of the 22nd International Conference on Artificial Intelligence and Statistics (AISTATS), 89, pages: 1448-1457, (Editors: Kamalika Chaudhuri and Masashi Sugiyama), PMLR, April 2019 (conference)

Abstract
Pre-conditioning is a well-known concept that can significantly improve the convergence of optimization algorithms. For noise-free problems, where good pre-conditioners are not known a priori, iterative linear algebra methods offer one way to efficiently construct them. For the stochastic optimization problems that dominate contemporary machine learning, however, this approach is not readily available. We propose an iterative algorithm inspired by classic iterative linear solvers that uses a probabilistic model to actively infer a pre-conditioner in situations where Hessian-projections can only be constructed with strong Gaussian noise. The algorithm is empirically demonstrated to efficiently construct effective pre-conditioners for stochastic gradient descent and its variants. Experiments on problems of comparably low dimensionality show improved convergence. In very high-dimensional problems, such as those encountered in deep learning, the pre-conditioner effectively becomes an automatic learning-rate adaptation scheme, which we also empirically show to work well.
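
A simplified picture of what such a pre-conditioner buys: estimate the Hessian diagonal from noisy Hessian-vector products and check how much it reduces the condition number of an ill-conditioned quadratic. The probing scheme, noise level, and problem data below are made-up assumptions, not the paper's probabilistic inference procedure.

    import numpy as np

    rng = np.random.default_rng(0)
    d = 50
    H = np.diag(np.linspace(1.0, 100.0, d))    # ill-conditioned quadratic (condition number 100)

    def noisy_hvp(v):
        """Hessian-vector product corrupted by Gaussian noise (mini-batch analogue)."""
        return H @ v + rng.normal(scale=2.0, size=d)

    # Probe each coordinate repeatedly and average to estimate the Hessian diagonal.
    est = np.zeros(d)
    for i in range(d):
        e = np.zeros(d)
        e[i] = 1.0
        est[i] = np.mean([noisy_hvp(e)[i] for _ in range(100)])

    P = np.diag(1.0 / np.maximum(est, 1e-3))   # diagonal pre-conditioner, roughly H^{-1}
    print("condition number, plain:          ", np.linalg.cond(H))
    print("condition number, pre-conditioned:", np.linalg.cond(P @ H))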

PDF link (url) [BibTex]

AReS and MaRS: Adversarial and MMD-Minimizing Regression for SDEs

Abbati*, G., Wenk*, P., Osborne, M. A., Krause, A., Schölkopf, B., Bauer, S.

Proceedings of the 36th International Conference on Machine Learning (ICML), 97, pages: 1-10, Proceedings of Machine Learning Research, (Editors: Chaudhuri, Kamalika and Salakhutdinov, Ruslan), PMLR, 2019, *equal contribution (conference)

PDF link (url) [BibTex]

Kernel Stein Tests for Multiple Model Comparison

Lim, J. N., Yamada, M., Schölkopf, B., Jitkrittum, W.

Advances in Neural Information Processing Systems 32, 33rd Annual Conference on Neural Information Processing Systems, 2019 (conference) To be published

[BibTex]

MYND: A Platform for Large-scale Neuroscientific Studies

Hohmann, M. R., Hackl, M., Wirth, B., Zaman, T., Enficiaud, R., Grosse-Wentrup, M., Schölkopf, B.

Proceedings of the 2019 Conference on Human Factors in Computing Systems (CHI), 2019 (conference) Accepted

[BibTex]

A Kernel Stein Test for Comparing Latent Variable Models

Kanagawa, H., Jitkrittum, W., Mackey, L., Fukumizu, K., Gretton, A.

2019 (conference) Submitted

arXiv [BibTex]

From Variational to Deterministic Autoencoders

Ghosh*, P., Sajjadi*, M. S. M., Vergari, A., Black, M. J., Schölkopf, B.

2019, *equal contribution (conference) Submitted

Abstract
Variational Autoencoders (VAEs) provide a theoretically-backed framework for deep generative models. However, they often produce “blurry” images, which is linked to their training objective. Sampling in the most popular implementation, the Gaussian VAE, can be interpreted as simply injecting noise to the input of a deterministic decoder. In practice, this simply enforces a smooth latent space structure. We challenge the adoption of the full VAE framework on this specific point in favor of a simpler, deterministic one. Specifically, we investigate how substituting stochasticity with other explicit and implicit regularization schemes can lead to a meaningful latent space without having to force it to conform to an arbitrarily chosen prior. To retrieve a generative mechanism for sampling new data points, we propose to employ an efficient ex-post density estimation step that can be readily adopted both for the proposed deterministic autoencoders as well as to improve sample quality of existing VAEs. We show in a rigorous empirical study that regularized deterministic autoencoding achieves state-of-the-art sample quality on the common MNIST, CIFAR-10 and CelebA datasets.
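
The ex-post density estimation step mentioned above can be illustrated in isolation: after training a deterministic autoencoder, fit a simple density model to the latent codes of the training data and sample from it to generate. The sketch below uses a scikit-learn Gaussian mixture on stand-in latent codes; the encoder/decoder, the latent dimension, and the choice of 10 mixture components are illustrative assumptions.

    import numpy as np
    from sklearn.mixture import GaussianMixture

    # Stand-ins for the latent codes z = encoder(x) of a trained deterministic
    # autoencoder (random data here, purely for illustration).
    rng = np.random.default_rng(0)
    latent_codes = rng.normal(size=(1000, 16))

    # Ex-post density estimation: fit a mixture model to the latent codes ...
    density = GaussianMixture(n_components=10, covariance_type="full", random_state=0)
    density.fit(latent_codes)

    # ... then sample new codes and decode them to generate data.
    z_new, _ = density.sample(n_samples=5)
    # x_new = decoder(z_new)   # decoder would come from the trained autoencoder
    print(z_new.shape)         # (5, 16)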

arXiv [BibTex]

Fisher Efficient Inference of Intractable Models

Liu, S., Kanamori, T., Jitkrittum, W., Chen, Y.

Advances in Neural Information Processing Systems 32, 33rd Annual Conference on Neural Information Processing Systems, 2019 (conference) To be published

arXiv [BibTex]

2007

Towards compliant humanoids: an experimental assessment of suitable task space position/orientation controllers

Nakanishi, J., Mistry, M., Peters, J., Schaal, S.

In IROS 2007, pages: 2520-2527, (Editors: Grant, E. , T. C. Henderson), IEEE Service Center, Piscataway, NJ, USA, IEEE/RSJ International Conference on Intelligent Robots and Systems, November 2007 (inproceedings)

Abstract
Compliant control will be a prerequisite for humanoid robotics if these robots are supposed to work safely and robustly in human and/or dynamic environments. One view of compliant control is that a robot should control a minimal number of degrees-of-freedom (DOFs) directly, i.e., those relevant DOFs for the task, and keep the remaining DOFs maximally compliant, usually in the null space of the task. This view naturally leads to task space control. However, surprisingly few implementations of task space control can be found in actual humanoid robots. This paper makes a first step towards assessing the usefulness of task space controllers for humanoids by investigating which choices of controllers are available and what inherent control characteristics they have—this treatment will concern position and orientation control, where the latter is based on a quaternion formulation. Empirical evaluations on an anthropomorphic Sarcos master arm illustrate the robustness of the different controllers as well as the ease of implementing and tuning them. Our extensive empirical results demonstrate that simpler task space controllers, e.g., classical resolved motion rate control or resolved acceleration control can be quite advantageous in face of inevitable modeling errors in model-based control, and that well chosen formulations are easy to implement and quite robust, such that they are useful for humanoids.
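
For reference, the classical resolved motion rate controller mentioned in the abstract computes joint velocities from the task-space error through the Jacobian pseudo-inverse; the sketch below shows that textbook formulation on a planar two-link arm with illustrative gains, not the paper's implementation or its quaternion-based orientation control.

    import numpy as np

    def forward_kinematics(q, l1=1.0, l2=1.0):
        """End-effector position of a planar two-link arm."""
        return np.array([l1 * np.cos(q[0]) + l2 * np.cos(q[0] + q[1]),
                         l1 * np.sin(q[0]) + l2 * np.sin(q[0] + q[1])])

    def jacobian(q, l1=1.0, l2=1.0):
        s1, s12 = np.sin(q[0]), np.sin(q[0] + q[1])
        c1, c12 = np.cos(q[0]), np.cos(q[0] + q[1])
        return np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                         [ l1 * c1 + l2 * c12,  l2 * c12]])

    # Resolved motion rate control: qdot = J^+ (xdot_des + K (x_des - x)), with xdot_des = 0 here.
    q = np.array([0.3, 0.5])
    x_des = np.array([1.2, 0.8])
    K, dt = 5.0 * np.eye(2), 0.01

    for _ in range(300):
        error = x_des - forward_kinematics(q)
        qdot = np.linalg.pinv(jacobian(q)) @ (K @ error)
        q = q + dt * qdot

    print("reached:", forward_kinematics(q), "target:", x_des)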

PDF Web DOI [BibTex]

Performance Stabilization and Improvement in Graph-based Semi-supervised Learning with Ensemble Method and Graph Sharpening

Choi, I., Shin, H.

In Korean Data Mining Society Conference, pages: 257-262, Korean Data Mining Society, Seoul, Korea, Korean Data Mining Society Conference, November 2007 (inproceedings)

PDF [BibTex]

Discriminative Subsequence Mining for Action Classification

Nowozin, S., Bakır, G., Tsuda, K.

In ICCV 2007, pages: 1919-1923, IEEE Computer Society, Los Alamitos, CA, USA, 11th IEEE International Conference on Computer Vision, October 2007 (inproceedings)

Abstract
Recent approaches to action classification in videos have used sparse spatio-temporal words encoding local appearance around interesting movements. Most of these approaches use a histogram representation, discarding the temporal order among features. But this ordering information can contain important information about the action itself, e.g. consider the sport disciplines of hurdle race and long jump, where the global temporal order of motions (running, jumping) is important to discriminate between the two. In this work we propose to use a sequential representation which retains this temporal order. Further, we introduce Discriminative Subsequence Mining to find optimal discriminative subsequence patterns. In combination with the LPBoost classifier, this amounts to simultaneously learning a classification function and performing feature selection in the space of all possible feature sequences. The resulting classifier linearly combines a small number of interpretable decision functions, each checking for the presence of a single discriminative pattern. The classifier is benchmarked on the KTH action classification data set and outperforms the best known results in the literature.
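
The weak decision functions mentioned at the end of the abstract each test whether one mined pattern occurs, in order and possibly with gaps, in a video's sequence of spatio-temporal words; a minimal check of that kind, with a made-up vocabulary, looks as follows.

    def contains_subsequence(sequence, pattern):
        """True if `pattern` occurs in `sequence` in the same order (gaps allowed)."""
        it = iter(sequence)
        return all(symbol in it for symbol in pattern)

    # Spatio-temporal word sequences of two clips (illustrative vocabulary).
    hurdle_race = ["run", "run", "jump", "run", "jump", "run"]
    long_jump = ["run", "run", "run", "jump"]

    # One decision stump in the boosted classifier checks for one mined pattern.
    pattern = ["jump", "run", "jump"]
    print(contains_subsequence(hurdle_race, pattern))  # True
    print(contains_subsequence(long_jump, pattern))    # False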

PDF Web DOI [BibTex]

A Hilbert Space Embedding for Distributions

Smola, A., Gretton, A., Song, L., Schölkopf, B.

In Algorithmic Learning Theory, Lecture Notes in Computer Science 4754 , pages: 13-31, (Editors: M Hutter and RA Servedio and E Takimoto), Springer, Berlin, Germany, 18th International Conference on Algorithmic Learning Theory (ALT), October 2007 (inproceedings)

Abstract
We describe a technique for comparing distributions without the need for density estimation as an intermediate step. Our approach relies on mapping the distributions into a reproducing kernel Hilbert space. Applications of this technique can be found in two-sample tests, which are used for determining whether two sets of observations arise from the same distribution, covariate shift correction, local learning, measures of independence, and density estimation.
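
Concretely, the embedding of a sample is its kernel mean, and two samples are compared by the distance between their mean embeddings (the maximum mean discrepancy); below is a small NumPy sketch of the biased estimate with a Gaussian kernel on made-up data.

    import numpy as np

    def gaussian_gram(A, B, sigma=1.0):
        """Gram matrix k(a, b) = exp(-||a - b||^2 / (2 sigma^2))."""
        sq = (A ** 2).sum(1)[:, None] + (B ** 2).sum(1)[None, :] - 2 * A @ B.T
        return np.exp(-sq / (2 * sigma ** 2))

    def mmd2_biased(X, Y, sigma=1.0):
        """Squared distance between the empirical kernel mean embeddings of X and Y."""
        return (gaussian_gram(X, X, sigma).mean()
                + gaussian_gram(Y, Y, sigma).mean()
                - 2 * gaussian_gram(X, Y, sigma).mean())

    rng = np.random.default_rng(0)
    X = rng.normal(0.0, 1.0, size=(200, 2))
    X2 = rng.normal(0.0, 1.0, size=(200, 2))   # same distribution as X
    Y = rng.normal(0.5, 1.0, size=(200, 2))    # shifted distribution
    print("MMD^2, same distribution:     ", mmd2_biased(X, X2))
    print("MMD^2, different distribution:", mmd2_biased(X, Y))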

PDF PDF DOI [BibTex]

Cluster Identification in Nearest-Neighbor Graphs

Maier, M., Hein, M., von Luxburg, U.

In ALT 2007, pages: 196-210, (Editors: Hutter, M. , R. A. Servedio, E. Takimoto), Springer, Berlin, Germany, 18th International Conference on Algorithmic Learning Theory, October 2007 (inproceedings)

Abstract
Assume we are given a sample of points from some underlying distribution which contains several distinct clusters. Our goal is to construct a neighborhood graph on the sample points such that clusters are "identified": that is, the subgraph induced by points from the same cluster is connected, while subgraphs corresponding to different clusters are not connected to each other. We derive bounds on the probability that cluster identification is successful, and use them to predict "optimal" values of k for the mutual and symmetric k-nearest-neighbor graphs. We point out different properties of the mutual and symmetric nearest-neighbor graphs related to the cluster identification problem.
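
In code, cluster identification amounts to building a k-nearest-neighbor graph on the sample and asking whether each cluster induces its own connected component; the sketch below uses scikit-learn and SciPy on two well-separated Gaussian blobs, with the data and the choice k = 5 as illustrative assumptions.

    import numpy as np
    from sklearn.neighbors import kneighbors_graph
    from scipy.sparse.csgraph import connected_components

    rng = np.random.default_rng(0)
    cluster_a = rng.normal(loc=0.0, scale=0.3, size=(100, 2))
    cluster_b = rng.normal(loc=5.0, scale=0.3, size=(100, 2))
    X = np.vstack([cluster_a, cluster_b])

    # Symmetric kNN graph: connect i and j if either is among the other's k nearest neighbors.
    G = kneighbors_graph(X, n_neighbors=5, mode="connectivity")
    G = G.maximum(G.T)           # use G.minimum(G.T) instead for the mutual kNN graph

    n_components, labels = connected_components(G, directed=False)
    print("connected components:", n_components)   # 2 means the clusters are identified
    print("component sizes:", np.bincount(labels))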

PDF PDF DOI [BibTex]

Inducing Metric Violations in Human Similarity Judgements

Laub, J., Macke, J., Müller, K., Wichmann, F.

In Advances in Neural Information Processing Systems 19, pages: 777-784, (Editors: Schölkopf, B. , J. Platt, T. Hofmann), MIT Press, Cambridge, MA, USA, Twentieth Annual Conference on Neural Information Processing Systems (NIPS), September 2007 (inproceedings)

Abstract
Attempting to model human categorization and similarity judgements is both a very interesting and an exceedingly difficult challenge. Some of the difficulty arises because of conflicting evidence whether human categorization and similarity judgements should or should not be modelled as operating on a mental representation that is essentially metric. Intuitively, this has a strong appeal as it would allow (dis)similarity to be represented geometrically as distance in some internal space. Here we show how a single stimulus, carefully constructed in a psychophysical experiment, introduces l2 violations in what used to be an internal similarity space that could be adequately modelled as Euclidean. We term this one influential data point a conflictual judgement. We present an algorithm for analysing such data and identifying the crucial point. Thus there may not be a strict dichotomy between either a metric or a non-metric internal space but rather degrees to which potentially large subsets of stimuli are represented metrically with a small subset causing a global violation of metricity.
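
A quick generic diagnostic for the kind of non-metric structure discussed above is to count triangle-inequality violations in a dissimilarity matrix; the brute-force check below, on a small made-up matrix, is only such a diagnostic and not the analysis algorithm proposed in the paper.

    import numpy as np

    def triangle_violations(D, tol=1e-9):
        """Count triples (i, j, k) with D[i, j] > D[i, k] + D[k, j]."""
        n = D.shape[0]
        count = 0
        for i in range(n):
            for j in range(n):
                for k in range(n):
                    if D[i, j] > D[i, k] + D[k, j] + tol:
                        count += 1
        return count

    # Dissimilarities among four stimuli; one extreme judgement breaks metricity.
    D = np.array([[0.0, 1.0, 1.0, 5.0],
                  [1.0, 0.0, 1.0, 1.0],
                  [1.0, 1.0, 0.0, 1.0],
                  [5.0, 1.0, 1.0, 0.0]])
    print("triangle-inequality violations:", triangle_violations(D))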

PDF Web [BibTex]

Cross-Validation Optimization for Large Scale Hierarchical Classification Kernel Methods

Seeger, M.

In Advances in Neural Information Processing Systems 19, pages: 1233-1240, (Editors: Schölkopf, B. , J. Platt, T. Hofmann), MIT Press, Cambridge, MA, USA, Twentieth Annual Conference on Neural Information Processing Systems (NIPS), September 2007 (inproceedings)

Abstract
We propose a highly efficient framework for kernel multi-class models with a large and structured set of classes. Kernel parameters are learned automatically by maximizing the cross-validation log likelihood, and predictive probabilities are estimated. We demonstrate our approach on large scale text classification tasks with hierarchical class structure, achieving state-of-the-art results in an order of magnitude less time than previous work.

PDF Web [BibTex]

A Local Learning Approach for Clustering

Wu, M., Schölkopf, B.

In Advances in Neural Information Processing Systems 19, pages: 1529-1536, (Editors: B Schölkopf and J Platt and T Hofmann), MIT Press, Cambridge, MA, USA, 20th Annual Conference on Neural Information Processing Systems (NIPS), September 2007 (inproceedings)

Abstract
We present a local learning approach for clustering. The basic idea is that a good clustering result should have the property that the cluster label of each data point can be well predicted based on its neighboring data and their cluster labels, using current supervised learning methods. An optimization problem is formulated such that its solution has the above property. Relaxation and eigen-decomposition are applied to solve this optimization problem. We also briefly investigate the parameter selection issue and provide a simple parameter selection method for the proposed algorithm. Experimental results are provided to validate the effectiveness of the proposed approach.

PDF Web [BibTex]

Branch and Bound for Semi-Supervised Support Vector Machines

Chapelle, O., Sindhwani, V., Keerthi, S.

In Advances in Neural Information Processing Systems 19, pages: 217-224, (Editors: Schölkopf, B. , J. Platt, T. Hofmann), MIT Press, Cambridge, MA, USA, Twentieth Annual Conference on Neural Information Processing Systems (NIPS), September 2007 (inproceedings)

Abstract
Semi-supervised SVMs (S3VMs) attempt to learn low-density separators by maximizing the margin over labeled and unlabeled examples. The associated optimization problem is non-convex. To examine the full potential of S3VMs modulo local minima problems in current implementations, we apply branch and bound techniques for obtaining exact, globally optimal solutions. Empirical evidence suggests that the globally optimal solution can return excellent generalization performance in situations where other implementations fail completely. While our current implementation is only applicable to small datasets, we discuss variants that can potentially lead to practically useful algorithms.

PDF Web [BibTex]

A Kernel Method for the Two-Sample-Problem

Gretton, A., Borgwardt, K., Rasch, M., Schölkopf, B., Smola, A.

In Advances in Neural Information Processing Systems 19, pages: 513-520, (Editors: B Schölkopf and J Platt and T Hofmann), MIT Press, Cambridge, MA, USA, 20th Annual Conference on Neural Information Processing Systems (NIPS), September 2007 (inproceedings)

Abstract
We propose two statistical tests to determine if two samples are from different distributions. Our test statistic is in both cases the distance between the means of the two samples mapped into a reproducing kernel Hilbert space (RKHS). The first test is based on a large deviation bound for the test statistic, while the second is based on the asymptotic distribution of this statistic. The test statistic can be computed in $O(m^2)$ time. We apply our approach to a variety of problems, including attribute matching for databases using the Hungarian marriage method, where our test performs strongly. We also demonstrate excellent performance when comparing distributions over graphs, for which no alternative tests currently exist.
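
As an illustration of the statistic only (the paper's thresholds come from a large-deviation bound and the asymptotic null distribution), the sketch below computes the unbiased MMD estimate with a Gaussian kernel and uses a simple permutation threshold as a stand-in.

    import numpy as np

    def rbf_gram(A, B, sigma=1.0):
        sq = (A ** 2).sum(1)[:, None] + (B ** 2).sum(1)[None, :] - 2 * A @ B.T
        return np.exp(-sq / (2 * sigma ** 2))

    def mmd2_unbiased(X, Y, sigma=1.0):
        """Unbiased MMD^2 estimate: self-similarity (diagonal) terms are excluded."""
        m, n = len(X), len(Y)
        Kxx, Kyy, Kxy = rbf_gram(X, X, sigma), rbf_gram(Y, Y, sigma), rbf_gram(X, Y, sigma)
        return ((Kxx.sum() - np.trace(Kxx)) / (m * (m - 1))
                + (Kyy.sum() - np.trace(Kyy)) / (n * (n - 1))
                - 2 * Kxy.mean())

    def two_sample_test(X, Y, n_perm=200, alpha=0.05, sigma=1.0, seed=0):
        """Reject 'same distribution' if MMD^2 exceeds a permutation-based threshold."""
        rng = np.random.default_rng(seed)
        observed = mmd2_unbiased(X, Y, sigma)
        pooled = np.vstack([X, Y])
        null = []
        for _ in range(n_perm):
            perm = rng.permutation(len(pooled))
            null.append(mmd2_unbiased(pooled[perm[:len(X)]], pooled[perm[len(X):]], sigma))
        return observed > np.quantile(null, 1 - alpha)

    rng = np.random.default_rng(1)
    X = rng.normal(0.0, 1.0, size=(100, 2))
    Y = rng.normal(0.7, 1.0, size=(100, 2))
    print("reject 'same distribution'?", two_sample_test(X, Y))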

PDF Web [BibTex]

An Efficient Method for Gradient-Based Adaptation of Hyperparameters in SVM Models

Keerthi, S., Sindhwani, V., Chapelle, O.

In Advances in Neural Information Processing Systems 19, pages: 673-680, (Editors: Schölkopf, B. , J. Platt, T. Hofmann), MIT Press, Cambridge, MA, USA, Twentieth Annual Conference on Neural Information Processing Systems (NIPS), September 2007 (inproceedings)

Abstract
We consider the task of tuning hyperparameters in SVM models based on minimizing a smooth performance validation function, e.g., smoothed k-fold cross-validation error, using non-linear optimization techniques. The key computation in this approach is that of the gradient of the validation function with respect to hyperparameters. We show that for large-scale problems involving a wide choice of kernel-based models and validation functions, this computation can be very efficiently done; often within just a fraction of the training time. Empirical results show that a near-optimal set of hyperparameters can be identified by our approach with very few training rounds and gradient computations.
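
The general recipe, treating a smoothed cross-validation error as a function of the hyperparameters and minimizing it with a nonlinear optimizer, can be sketched as follows; for brevity the sketch uses kernel ridge regression and a derivative-free simplex search in place of the paper's analytic SVM gradients, and the data are synthetic.

    import numpy as np
    from scipy.optimize import minimize
    from sklearn.kernel_ridge import KernelRidge
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.uniform(-3.0, 3.0, size=(200, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)

    def cv_error(log_params):
        """Validation objective: 5-fold cross-validated mean squared error."""
        log_alpha, log_gamma = log_params
        model = KernelRidge(kernel="rbf", alpha=np.exp(log_alpha), gamma=np.exp(log_gamma))
        return -cross_val_score(model, X, y, cv=5, scoring="neg_mean_squared_error").mean()

    # Nonlinear optimization over log-hyperparameters (regularization and kernel width).
    result = minimize(cv_error, x0=np.array([0.0, 0.0]), method="Nelder-Mead")
    print("tuned alpha:", np.exp(result.x[0]), "tuned gamma:", np.exp(result.x[1]))
    print("cross-validation MSE:", result.fun)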

PDF Web [BibTex]

Learning Dense 3D Correspondence

Steinke, F., Schölkopf, B., Blanz, V.

In Advances in Neural Information Processing Systems 19, pages: 1313-1320, (Editors: B Schölkopf and J Platt and T Hofmann), MIT Press, Cambridge, MA, USA, 20th Annual Conference on Neural Information Processing Systems (NIPS), September 2007 (inproceedings)

Abstract
Establishing correspondence between distinct objects is an important and nontrivial task: correctness of the correspondence hinges on properties which are difficult to capture in an a priori criterion. While previous work has used a priori criteria which in some cases led to very good results, the present paper explores whether it is possible to learn a combination of features that, for a given training set of aligned human heads, characterizes the notion of correct correspondence. By optimizing this criterion, we are then able to compute correspondence and morphs for novel heads.

PDF Web [BibTex]