Advanced Structured Prediction

Nowozin, S., Gehler, P. V., Jancsary, J., Lampert, C. H.

Advanced Structured Prediction, pages: 432, Neural Information Processing Series, MIT Press, November 2014 (book)

Abstract
The goal of structured prediction is to build machine learning models that predict relational information that itself has structure, such as being composed of multiple interrelated parts. These models, which reflect prior knowledge, task-specific relations, and constraints, are used in fields including computer vision, speech recognition, natural language processing, and computational biology. They can carry out such tasks as predicting a natural language sentence, or segmenting an image into meaningful components. These models are expressive and powerful, but exact computation is often intractable. A broad research effort in recent years has aimed at designing structured prediction models and approximate inference and learning procedures that are computationally efficient. This volume offers an overview of this recent research in order to make the work accessible to a broader research community. The chapters, by leading researchers in the field, cover a range of topics, including research trends, the linear programming relaxation approach, innovations in probabilistic modeling, recent theoretical progress, and resource-aware learning.

ps

publisher link (url) [BibTex]

Convertor

Fischer, P., Mark, A.

May 2014 (patent)

pf

[BibTex]

Method and device for blind correction of optical aberrations in a digital image

Schuler, C., Hirsch, M., Harmeling, S., Schölkopf, B.

International Patent Application, No. PCT/EP2012/068868, April 2014 (patent)

ei

[BibTex]


Learning Motor Skills: From Algorithms to Robot Experiments

Kober, J., Peters, J.

97, pages: 191, Springer Tracts in Advanced Robotics, Springer, 2014 (book)

ei

DOI [BibTex]

Single-Source Domain Adaptation with Target and Conditional Shift

Zhang, K., Schölkopf, B., Muandet, K., Wang, Z., Zhou, Z., Persello, C.

In Regularization, Optimization, Kernels, and Support Vector Machines, pages: 427-456, 19, Chapman & Hall/CRC Machine Learning & Pattern Recognition, (Editors: Suykens, J. A. K., Signoretto, M. and Argyriou, A.), Chapman and Hall/CRC, Boca Raton, USA, 2014 (inbook)

ei

[BibTex]

Higher-Order Tensors in Diffusion Imaging

Schultz, T., Fuster, A., Ghosh, A., Deriche, R., Florack, L., Lim, L.

In Visualization and Processing of Tensors and Higher Order Descriptors for Multi-Valued Data, pages: 129-161, Mathematics + Visualization, (Editors: Westin, C.-F., Vilanova, A. and Burgeth, B.), Springer, 2014 (inbook)

ei

[BibTex]

Fuzzy Fibers: Uncertainty in dMRI Tractography

Schultz, T., Vilanova, A., Brecheisen, R., Kindlmann, G.

In Scientific Visualization: Uncertainty, Multifield, Biomedical, and Scalable Visualization, pages: 79-92, 8, Mathematics + Visualization, (Editors: Hansen, C. D., Chen, M., Johnson, C. R., Kaufman, A. E. and Hagen, H.), Springer, 2014 (inbook)

ei

[BibTex]

Nonconvex Proximal Splitting with Computational Errors

Sra, S.

In Regularization, Optimization, Kernels, and Support Vector Machines, pages: 83-102, 4, (Editors: Suykens, J. A. K., Signoretto, M. and Argyriou, A.), CRC Press, 2014 (inbook)

ei

[BibTex]

Addressing of Micro-robot Teams and Non-contact Micro-manipulation

Diller, E., Ye, Z., Giltinan, J., Sitti, M.

In Small-Scale Robotics. From Nano- to Millimeter-Sized Robotic Systems and Applications, pages: 28-38, Springer Berlin Heidelberg, 2014 (incollection)

pi

Project Page [BibTex]

Robot Learning by Guided Self-Organization

Martius, G., Der, R., Herrmann, J. M.

In Guided Self-Organization: Inception, 9, pages: 223-260, Emergence, Complexity and Computation, Springer Berlin Heidelberg, 2014 (incollection)

al

link (url) DOI [BibTex]

Simulated Annealing

Gall, J.

In Encyclopedia of Computer Vision, pages: 737-741, (Editors: Ikeuchi, K.), Springer Verlag, 2014, to appear (inbook)

ps

[BibTex]


2009


Text Clustering with Mixture of von Mises-Fisher Distributions

Sra, S., Banerjee, A., Ghosh, J., Dhillon, I.

In Text mining: classification, clustering, and applications, pages: 121-161, Chapman & Hall/CRC data mining and knowledge discovery series, (Editors: Srivastava, A. N. and Sahami, M.), CRC Press, Boca Raton, FL, USA, June 2009 (inbook)

ei

Web DOI [BibTex]

Data Mining for Biologists

Tsuda, K.

In Biological Data Mining in Protein Interaction Networks, pages: 14-27, (Editors: Li, X. and Ng, S.-K.), Medical Information Science Reference, Hershey, PA, USA, May 2009 (inbook)

Abstract
In this tutorial chapter, we review the basics of frequent pattern mining algorithms, including itemset mining, association rule mining, and graph mining. These algorithms find frequently appearing substructures in discrete data and can discover structural motifs, for example, from mutation data, protein structures, and chemical compounds. As they have primarily been used for business data, biological applications are not yet common, but their potential impact would be large. Recent advances in computing, including multicore machines and ever-increasing memory capacity, support the application of such methods to larger datasets. We explain the technical aspects of the algorithms, but do not go into details. Current biological applications are summarized and possible future directions are given.
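The level-wise search behind the itemset mining the chapter reviews can be illustrated with a minimal Apriori-style sketch (not the chapter's own code; the transaction data below is made up for illustration):

```python
def frequent_itemsets(transactions, min_support):
    """Level-wise (Apriori-style) search: an itemset is frequent if its
    support, the fraction of transactions containing it, is >= min_support.
    Size-(k+1) candidates are built only from frequent size-k itemsets."""
    tx = [frozenset(t) for t in transactions]
    n = len(tx)
    current = [frozenset([i]) for i in {i for t in tx for i in t}]
    result, k = {}, 1
    while current:
        frequent = []
        for cand in current:
            support = sum(1 for t in tx if cand <= t) / n
            if support >= min_support:
                result[cand] = support
                frequent.append(cand)
        k += 1
        current = list({a | b for a in frequent for b in frequent if len(a | b) == k})
    return result

txs = [{"A", "B", "C"}, {"A", "B"}, {"A", "C"}, {"B", "C"}, {"A", "B", "C"}]
found = frequent_itemsets(txs, min_support=0.6)
```

The pruning step is the heart of the method: any superset of an infrequent itemset is itself infrequent, so whole branches of the search space are never counted.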

ei

Web [BibTex]

Large Margin Methods for Part of Speech Tagging

Altun, Y.

In Automatic Speech and Speaker Recognition: Large Margin and Kernel Methods, pages: 141-160, (Editors: Keshet, J. and Bengio, S.), Wiley, Hoboken, NJ, USA, January 2009 (inbook)

ei

Web [BibTex]

Covariate shift and local learning by distribution matching

Gretton, A., Smola, A., Huang, J., Schmittfull, M., Borgwardt, K., Schölkopf, B.

In Dataset Shift in Machine Learning, pages: 131-160, (Editors: Quiñonero-Candela, J., Sugiyama, M., Schwaighofer, A. and Lawrence, N. D.), MIT Press, Cambridge, MA, USA, 2009 (inbook)

Abstract
Given sets of observations of training and test data, we consider the problem of re-weighting the training data such that its distribution more closely matches that of the test data. We achieve this goal by matching covariate distributions between training and test sets in a high dimensional feature space (specifically, a reproducing kernel Hilbert space). This approach does not require distribution estimation. Instead, the sample weights are obtained by a simple quadratic programming procedure. We provide a uniform convergence bound on the distance between the reweighted training feature mean and the test feature mean, a transductive bound on the expected loss of an algorithm trained on the reweighted data, and a connection to single class SVMs. While our method is designed to deal with the case of simple covariate shift (in the sense of Chapter ??), we have also found benefits for sample selection bias on the labels. Our correction procedure yields its greatest and most consistent advantages when the learning algorithm returns a classifier/regressor that is "simpler" than the data might suggest.
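The reweighting idea can be sketched as follows. This is an illustration only: the chapter solves a constrained quadratic program, while this toy minimises the same kernel mean-matching objective by projected gradient descent, and all data and parameter values are invented:

```python
import numpy as np

def rbf(A, B, gamma=0.5):
    """Gaussian RBF kernel matrix between row-vector collections A and B."""
    return np.exp(-gamma * ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1))

def kmm_weights(X_tr, X_te, gamma=0.5, lr=0.05, steps=2000):
    """Minimise ||(1/n) sum_i w_i phi(x_i) - mean_j phi(x_te_j)||^2 in the
    RKHS over w >= 0 by projected gradient descent."""
    n = len(X_tr)
    K = rbf(X_tr, X_tr, gamma)                    # train-train Gram matrix
    kappa = rbf(X_tr, X_te, gamma).mean(axis=1)   # train-test cross term
    w = np.ones(n)
    for _ in range(steps):
        grad = 2 * (K @ w) / n**2 - 2 * kappa / n
        w = np.maximum(w - lr * grad, 0.0)        # project onto w >= 0
    return w

# Training data covers two regions; the test data lies only near zero,
# so the training points near zero should receive the larger weights.
X_tr = np.array([[0.0], [0.1], [2.0], [2.1]])
X_te = np.array([[0.0], [0.05], [0.1]])
w = kmm_weights(X_tr, X_te)
```

The weights can then be plugged into any weighted empirical risk, which is exactly how the chapter corrects a learner trained under covariate shift.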

ei

PDF Web [BibTex]

Metal-Organic Frameworks

Panella, B., Hirscher, M.

In Encyclopedia of Electrochemical Power Sources, pages: 493-496, Elsevier, Amsterdam [et al.], 2009 (incollection)

mms

[BibTex]

Carbon Materials

Hirscher, M.

In Encyclopedia of Electrochemical Power Sources, pages: 484-487, Elsevier, Amsterdam [et al.], 2009 (incollection)

mms

[BibTex]


2007


Support Vector Machine Learning for Interdependent and Structured Output Spaces

Altun, Y., Hofmann, T., Tsochantaridis, I.

In Predicting Structured Data, pages: 85-104, Advances in neural information processing systems, (Editors: Bakir, G. H. , T. Hofmann, B. Schölkopf, A. J. Smola, B. Taskar, S. V. N. Vishwanathan), MIT Press, Cambridge, MA, USA, September 2007 (inbook)

ei

Web [BibTex]

Brisk Kernel ICA

Jegelka, S., Gretton, A.

In Large Scale Kernel Machines, pages: 225-250, Neural Information Processing, (Editors: Bottou, L. , O. Chapelle, D. DeCoste, J. Weston), MIT Press, Cambridge, MA, USA, September 2007 (inbook)

Abstract
Recent approaches to independent component analysis have used kernel independence measures to obtain very good performance in ICA, particularly in areas where classical methods experience difficulty (for instance, sources with near-zero kurtosis). In this chapter, we compare two efficient extensions of these methods for large-scale problems: random subsampling of entries in the Gram matrices used in defining the independence measures, and incomplete Cholesky decomposition of these matrices. We derive closed-form, efficiently computable approximations for the gradients of these measures, and compare their performance on ICA using both artificial and music data. We show that kernel ICA can scale up to much larger problems than yet attempted, and that incomplete Cholesky decomposition performs better than random sampling.
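The incomplete Cholesky decomposition mentioned in the abstract can be sketched as follows (a generic pivoted factorisation of a PSD Gram matrix, not the chapter's ICA pipeline; the test matrix below is invented):

```python
import numpy as np

def incomplete_cholesky(K, tol=1e-8):
    """Pivoted incomplete Cholesky of a PSD matrix K: returns an n x m factor
    L with K ~= L @ L.T, stopping once the residual diagonal falls below tol.
    When the kernel spectrum decays quickly, m << n."""
    n = K.shape[0]
    d = np.diag(K).astype(float).copy()   # residual diagonal
    L = np.zeros((n, n))
    for m in range(n):
        i = int(np.argmax(d))             # pivot on the largest residual
        if d[i] < tol:
            return L[:, :m]
        L[:, m] = (K[:, i] - L[:, :m] @ L[i, :m]) / np.sqrt(d[i])
        d -= L[:, m] ** 2
    return L

# Rank-2 Gram matrix: the factorisation should stop after two columns.
X = np.array([[1.0, 0.0], [0.0, 2.0], [1.0, 2.0], [2.0, 2.0]])
K = X @ X.T
L = incomplete_cholesky(K)
```

Replacing an n x n Gram matrix by an n x m factor is what lets the kernel independence measures in the chapter scale to large ICA problems.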

ei

PDF Web [BibTex]

Predicting Structured Data

Bakir, G., Hofmann, T., Schölkopf, B., Smola, A., Taskar, B., Vishwanathan, S.

pages: 360, Advances in neural information processing systems, MIT Press, Cambridge, MA, USA, September 2007 (book)

Abstract
Machine learning develops intelligent computer systems that are able to generalize from previously seen examples. A new domain of machine learning, in which the prediction must satisfy the additional constraints found in structured data, poses one of machine learning’s greatest challenges: learning functional dependencies between arbitrary input and output domains. This volume presents and analyzes the state of the art in machine learning algorithms and theory in this novel field. The contributors discuss applications as diverse as machine translation, document markup, computational biology, and information extraction, among others, providing a timely overview of an exciting field.

ei

Web [BibTex]

Training a Support Vector Machine in the Primal

Chapelle, O.

In Large Scale Kernel Machines, pages: 29-50, Neural Information Processing, (Editors: Bottou, L. , O. Chapelle, D. DeCoste, J. Weston), MIT Press, Cambridge, MA, USA, September 2007, This is a slightly updated version of the Neural Computation paper (inbook)

Abstract
Most literature on Support Vector Machines (SVMs) concentrates on the dual optimization problem. In this paper, we would like to point out that the primal problem can also be solved efficiently, both for linear and non-linear SVMs, and that there is no reason to ignore this possibility. On the contrary, from the primal point of view new families of algorithms for large scale SVM training can be investigated.
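A minimal version of the primal idea, gradient descent on a regularized squared hinge loss, can be sketched like this (a toy linear case with invented data and step sizes; the chapter itself develops more refined Newton-type solvers):

```python
import numpy as np

def primal_svm(X, y, lam=0.01, lr=0.02, steps=500):
    """Train a linear SVM directly in the primal by gradient descent on
    lam * ||w||^2 + (1/n) * sum_i max(0, 1 - y_i w.x_i)^2.
    The squared hinge makes the objective differentiable everywhere."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(steps):
        margins = 1.0 - y * (X @ w)
        active = margins > 0                      # points violating the margin
        grad = 2 * lam * w - (2.0 / n) * X[active].T @ (y[active] * margins[active])
        w -= lr * grad
    return w

# Two linearly separable clusters, symmetric about the origin (no bias term).
X = np.array([[2.0, 2.0], [2.0, 3.0], [3.0, 2.0],
              [-2.0, -2.0], [-2.0, -3.0], [-3.0, -2.0]])
y = np.array([1.0, 1.0, 1.0, -1.0, -1.0, -1.0])
w = primal_svm(X, y)
```

Only the margin-violating points contribute to the gradient, which mirrors the dual picture where only support vectors carry nonzero coefficients.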

ei

PDF Web [BibTex]

Approximation Methods for Gaussian Process Regression

Quiñonero-Candela, J., Rasmussen, CE., Williams, CKI.

In Large-Scale Kernel Machines, pages: 203-223, Neural Information Processing, (Editors: Bottou, L. , O. Chapelle, D. DeCoste, J. Weston), MIT Press, Cambridge, MA, USA, September 2007 (inbook)

Abstract
A wealth of computationally efficient approximation methods for Gaussian process regression have been recently proposed. We give a unifying overview of sparse approximations, following Quiñonero-Candela and Rasmussen (2005), and a brief review of approximate matrix-vector multiplication methods.
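One member of the family of sparse approximations the chapter unifies, the subset-of-regressors predictive mean, can be sketched as follows (an RBF-kernel toy with invented data and hyperparameters; when the inducing set equals the full training set, the approximation collapses to exact GP regression, which the check below exploits):

```python
import numpy as np

def rbf(A, B, gamma=3.0):
    """Gaussian RBF kernel matrix between row-vector collections A and B."""
    return np.exp(-gamma * ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1))

def sor_mean(X, y, Xu, Xs, gamma=3.0, noise=0.1):
    """Subset-of-regressors predictive mean: only the m inducing inputs Xu
    enter the costly algebra, so the linear solve is m x m instead of n x n."""
    Kuf = rbf(Xu, X, gamma)                              # m x n
    Kuu = rbf(Xu, Xu, gamma) + 1e-10 * np.eye(len(Xu))   # jitter for stability
    Ksu = rbf(Xs, Xu, gamma)                             # test-inducing block
    A = Kuu + Kuf @ Kuf.T / noise                        # m x m system matrix
    return Ksu @ np.linalg.solve(A, Kuf @ y) / noise

X = np.linspace(0.0, 1.0, 4).reshape(-1, 1)
y = np.sin(3.0 * X).ravel()
Xs = np.array([[0.25], [0.75]])
approx = sor_mean(X, y, X, Xs)        # inducing set = full training set
exact = rbf(Xs, X) @ np.linalg.solve(rbf(X, X) + 0.1 * np.eye(4), y)
```

With m inducing inputs the cost drops from O(n^3) to O(n m^2), which is the trade-off the chapter's unifying view makes explicit.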

ei

PDF Web [BibTex]

Trading Convexity for Scalability

Collobert, R., Sinz, F., Weston, J., Bottou, L.

In Large Scale Kernel Machines, pages: 275-300, Neural Information Processing, (Editors: Bottou, L. , O. Chapelle, D. DeCoste, J. Weston), MIT Press, Cambridge, MA, USA, September 2007 (inbook)

Abstract
Convex learning algorithms, such as Support Vector Machines (SVMs), are often seen as highly desirable because they offer strong practical properties and are amenable to theoretical analysis. However, in this work we show how nonconvexity can provide scalability advantages over convexity. We show how concave-convex programming can be applied to produce (i) faster SVMs where training errors are no longer support vectors, and (ii) much faster Transductive SVMs.
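The concave-convex procedure behind these speed-ups can be shown on a one-dimensional toy problem (illustration only: the chapter applies the idea to nonconvex SVM losses, not to this function):

```python
import math

def cccp_minimize(x0, iters=80):
    """Concave-convex procedure on f(x) = x**4 - 2*x**2, written as the sum
    of a convex part x**4 and a concave part -2*x**2. Each iteration replaces
    the concave part by its tangent at the current point x_t, giving the
    convex surrogate x**4 - 4*x_t*x (+ const), whose exact minimiser is the
    cube root of x_t."""
    x = x0
    for _ in range(iters):
        x = math.copysign(abs(x) ** (1.0 / 3.0), x)
    return x

# f has global minima at x = -1 and x = +1; CCCP descends to the nearer one.
x_pos = cccp_minimize(0.5)
x_neg = cccp_minimize(-2.0)
```

Each surrogate upper-bounds f and touches it at x_t, so the objective decreases monotonically, which is the property that makes the nonconvex SVM variants in the chapter reliable to train.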

ei

PDF Web [BibTex]

Classifying Event-Related Desynchronization in EEG, ECoG and MEG signals

Hill, N., Lal, T., Tangermann, M., Hinterberger, T., Widman, G., Elger, C., Schölkopf, B., Birbaumer, N.

In Toward Brain-Computer Interfacing, pages: 235-260, Neural Information Processing, (Editors: G Dornhege and J del R Millán and T Hinterberger and DJ McFarland and K-R Müller), MIT Press, Cambridge, MA, USA, September 2007 (inbook)

ei

PDF Web [BibTex]

Joint Kernel Maps

Weston, J., Bakir, G., Bousquet, O., Mann, T., Noble, W., Schölkopf, B.

In Predicting Structured Data, pages: 67-84, Advances in neural information processing systems, (Editors: GH Bakir and T Hofmann and B Schölkopf and AJ Smola and B Taskar and SVN Vishwanathan), MIT Press, Cambridge, MA, USA, September 2007 (inbook)

ei

Web [BibTex]

Brain-Computer Interfaces for Communication in Paralysis: A Clinical Experimental Approach

Hinterberger, T., Nijboer, F., Kübler, A., Matuz, T., Furdea, A., Mochty, U., Jordan, M., Lal, T., Hill, J., Mellinger, J., Bensch, M., Tangermann, M., Widman, G., Elger, C., Rosenstiel, W., Schölkopf, B., Birbaumer, N.

In Toward Brain-Computer Interfacing, pages: 43-64, Neural Information Processing, (Editors: G. Dornhege and J del R Millán and T Hinterberger and DJ McFarland and K-R Müller), MIT Press, Cambridge, MA, USA, September 2007 (inbook)

ei

PDF Web [BibTex]

Probabilistic Structure Calculation

Rieping, W., Habeck, M., Nilges, M.

In Structure and Biophysics: New Technologies for Current Challenges in Biology and Beyond, pages: 81-98, NATO Security through Science Series, (Editors: Puglisi, J. D.), Springer, Berlin, Germany, March 2007 (inbook)

ei

Web DOI [BibTex]

On the Pre-Image Problem in Kernel Methods

Bakir, G., Schölkopf, B., Weston, J.

In Kernel Methods in Bioengineering, Signal and Image Processing, pages: 284-302, (Editors: G Camps-Valls and JL Rojo-Álvarez and M Martínez-Ramón), Idea Group Publishing, Hershey, PA, USA, January 2007 (inbook)

Abstract
In this chapter we are concerned with the problem of reconstructing patterns from their representation in feature space, known as the pre-image problem. We review existing algorithms and propose a learning based approach. All algorithms are discussed regarding their usability and complexity and evaluated on an image denoising application.

ei

DOI [BibTex]

Dynamics systems vs. optimal control - a unifying view

Schaal, S., Mohajerian, P., Ijspeert, A.

In Progress in Brain Research, (165):425-445, 2007, clmc (inbook)

Abstract
In the past, computational motor control has been approached from at least two major frameworks: the dynamic systems approach and the viewpoint of optimal control. The dynamic system approach emphasizes motor control as a process of self-organization between an animal and its environment. Nonlinear differential equations that can model entrainment and synchronization behavior are among the most favorable tools of dynamic systems modelers. In contrast, optimal control approaches view motor control as the evolutionary or developmental result of a nervous system that tries to optimize rather general organizational principles, e.g., energy consumption or accurate task achievement. Optimal control theory is usually employed to develop appropriate theories. Interestingly, there is rather little interaction between dynamic systems and optimal control modelers as the two approaches follow rather different philosophies and are often viewed as diametrically opposed. In this paper, we develop a computational approach to motor control that offers a unifying modeling framework for both dynamic systems and optimal control approaches. In discussions of several behavioral experiments and some theoretical and robotics studies, we demonstrate how our computational ideas allow both the representation of self-organizing processes and the optimization of movement based on reward criteria. Our modeling framework is rather simple and general, and opens opportunities to revisit many previous modeling results from this novel unifying view.

am

link (url) [BibTex]

Bacteria integrated swimming microrobots

Behkam, B., Sitti, M.

In 50 years of artificial intelligence, pages: 154-163, Springer Berlin Heidelberg, 2007 (incollection)

pi

[BibTex]

Micromagnetism-microstructure relations and the hysteresis loop

Goll, D.

In Handbook of Magnetism and Advanced Magnetic Materials. Vol. 2: Micromagnetism, pages: 1023-1058, John Wiley & Sons Ltd., Chichester, UK, 2007 (incollection)

mms

[BibTex]

Synchrotron radiation techniques based on X-ray magnetic circular dichroism

Schütz, G., Goering, E., Stoll, H.

In Handbook of Magnetism and Advanced Magnetic Materials. Vol. 3: Novel Techniques for Characterizing and Preparing Samples, pages: 1311-1363, John Wiley & Sons Ltd., Chichester, UK, 2007 (incollection)

mms

[BibTex]

Dissipative magnetization dynamics close to the adiabatic regime

Fähnle, M., Steiauf, D.

In Handbook of Magnetism and Advanced Magnetic Materials. Vol. 1: Fundamental and Theory, pages: 282-302, John Wiley & Sons Ltd., Chichester, UK, 2007 (incollection)

mms

[BibTex]


2004


Kernel Methods in Computational Biology

Schölkopf, B., Tsuda, K., Vert, J.

pages: 410, Computational Molecular Biology, MIT Press, Cambridge, MA, USA, August 2004 (book)

Abstract
Modern machine learning techniques are proving to be extremely valuable for the analysis of data in computational biology problems. One branch of machine learning, kernel methods, lends itself particularly well to the difficult aspects of biological data, which include high dimensionality (as in microarray measurements), representation as discrete and structured data (as in DNA or amino acid sequences), and the need to combine heterogeneous sources of information. This book provides a detailed overview of current research in kernel methods and their applications to computational biology. Following three introductory chapters—an introduction to molecular and computational biology, a short review of kernel methods that focuses on intuitive concepts rather than technical details, and a detailed survey of recent applications of kernel methods in computational biology—the book is divided into three sections that reflect three general trends in current research. The first part presents different ideas for the design of kernel functions specifically adapted to various biological data; the second part covers different approaches to learning from heterogeneous data; and the third part offers examples of successful applications of support vector machine methods.

ei

Web [BibTex]

Distributed Command Execution

Stark, S., Berlin, M.

In BSD Hacks: 100 industrial-strength tips & tools, pages: 152-152, (Editors: Lavigne, Dru), O’Reilly, Beijing, May 2004 (inbook)

Abstract
Often you want to execute a command not only on one computer, but on several at once. For example, you might want to report the current statistics on a group of managed servers or update all of your web servers at once.
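The fan-out-and-gather pattern the hack describes can be sketched in Python (the host names are made up, and the demo substitutes a local echo for a real `ssh` call so it runs without any remote machines):

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

def run_everywhere(hosts, command, runner=None):
    """Run `command` on every host concurrently and return {host: output}.
    By default each host is contacted with `ssh host command`; a custom
    runner can be supplied, as in the stand-in demo below."""
    if runner is None:
        runner = lambda host: subprocess.run(
            ["ssh", host, command], capture_output=True, text=True).stdout
    with ThreadPoolExecutor(max_workers=len(hosts)) as pool:
        return dict(zip(hosts, pool.map(runner, hosts)))

# Demo with a local stand-in, so no real hosts or SSH keys are needed.
fake_ssh = lambda host: subprocess.run(
    ["echo", f"uptime report from {host}"],
    capture_output=True, text=True).stdout
reports = run_everywhere(["web1", "web2", "web3"], "uptime", runner=fake_ssh)
```

Running the hosts in a thread pool rather than a sequential loop means the total wall time is roughly that of the slowest host, not the sum over all of them.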

ei

[BibTex]

Gaussian Processes in Machine Learning

Rasmussen, CE.

In Lecture Notes in Computer Science 3176, pages: 63-71, (Editors: Bousquet, O., U. von Luxburg and G. Rätsch), Springer, Heidelberg, 2004, Copyright by Springer (inbook)

Abstract
We give a basic introduction to Gaussian Process regression models. We focus on understanding the role of the stochastic process and how it is used to define a distribution over functions. We present the simple equations for incorporating training data and examine how to learn the hyperparameters using the marginal likelihood. We explain the practical advantages of Gaussian Process and end with conclusions and a look at the current trends in GP work.
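The simple equations for incorporating training data that the abstract mentions reduce to a few lines of linear algebra (a sketch with an RBF kernel; the data and hyperparameters below are invented):

```python
import numpy as np

def gp_predict(X, y, Xs, gamma=3.0, noise=1e-6):
    """GP regression posterior at test inputs Xs:
    mean = k_*^T (K + noise*I)^{-1} y
    var  = k(x_*, x_*) - k_*^T (K + noise*I)^{-1} k_*"""
    def k(A, B):
        return np.exp(-gamma * ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1))
    Kn = k(X, X) + noise * np.eye(len(X))       # noisy training covariance
    Ks = k(Xs, X)                               # test-train covariances
    mean = Ks @ np.linalg.solve(Kn, y)
    var = k(Xs, Xs).diagonal() - np.einsum('ij,ji->i', Ks, np.linalg.solve(Kn, Ks.T))
    return mean, var

X = np.array([[0.0], [0.5], [1.0]])
y = np.sin(3.0 * X).ravel()
mean, var = gp_predict(X, y, X)     # predict back at the training inputs
```

With a tiny noise level, the posterior mean interpolates the observations and the posterior variance collapses there, exactly the behaviour of the distribution over functions described in the chapter.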

ei

PDF PostScript [BibTex]

Protein Classification via Kernel Matrix Completion

Kin, T., Kato, T., Tsuda, K.

In Kernel Methods in Computational Biology, pages: 261-274, (Editors: Schoelkopf, B., K. Tsuda and J.P. Vert), MIT Press, Cambridge, MA, USA, 2004 (inbook)

ei

PDF [BibTex]

Introduction to Statistical Learning Theory

Bousquet, O., Boucheron, S., Lugosi, G.

In Lecture Notes in Artificial Intelligence 3176, pages: 169-207, (Editors: Bousquet, O., U. von Luxburg and G. Rätsch), Springer, Heidelberg, Germany, 2004 (inbook)

ei

PDF [BibTex]

A Primer on Kernel Methods

Vert, J., Tsuda, K., Schölkopf, B.

In Kernel Methods in Computational Biology, pages: 35-70, (Editors: B Schölkopf and K Tsuda and JP Vert), MIT Press, Cambridge, MA, USA, 2004 (inbook)

ei

PDF [BibTex]

Concentration Inequalities

Boucheron, S., Lugosi, G., Bousquet, O.

In Lecture Notes in Artificial Intelligence 3176, pages: 208-240, (Editors: Bousquet, O., U. von Luxburg and G. Rätsch), Springer, Heidelberg, Germany, 2004 (inbook)

ei

PDF [BibTex]

Kernels for graphs

Kashima, H., Tsuda, K., Inokuchi, A.

In Kernel Methods in Computational Biology, pages: 155-170, (Editors: Schoelkopf, B., K. Tsuda and J.P. Vert), MIT Press, Cambridge, MA, USA, 2004 (inbook)

ei

PDF [BibTex]

A primer on molecular biology

Zien, A.

In Kernel Methods in Computational Biology, pages: 3-34, (Editors: Schoelkopf, B., K. Tsuda and J. P. Vert), MIT Press, Cambridge, MA, USA, 2004 (inbook)

Abstract
Modern molecular biology provides a rich source of challenging machine learning problems. This tutorial chapter aims to provide the necessary biological background knowledge required to communicate with biologists and to understand and properly formalize a number of the most interesting problems in this application domain. The largest part of the chapter (its first section) is devoted to the cell as the basic unit of life. Four aspects of cells are reviewed in sequence: (1) the molecules that cells make use of (above all, proteins, RNA, and DNA); (2) the spatial organization of cells ("compartmentalization"); (3) the way cells produce proteins ("protein expression"); and (4) cellular communication and evolution (of cells and organisms). In the second section, an overview is provided of the most frequent measurement technologies, data types, and data sources. Finally, important open problems in the analysis of these data (bioinformatics challenges) are briefly outlined.

ei

PDF PostScript Web [BibTex]

Computational approaches to motor learning by imitation

Schaal, S., Ijspeert, A., Billard, A.

In The Neuroscience of Social Interaction, (1431):199-218, (Editors: Frith, C. D. and Wolpert, D.), Oxford University Press, Oxford, 2004, clmc (inbook)

Abstract
Movement imitation requires a complex set of mechanisms that map an observed movement of a teacher onto one's own movement apparatus. Relevant problems include movement recognition, pose estimation, pose tracking, body correspondence, coordinate transformation from external to egocentric space, matching of observed against previously learned movement, resolution of redundant degrees-of-freedom that are unconstrained by the observation, suitable movement representations for imitation, modularization of motor control, etc. All of these topics by themselves are active research problems in computational and neurobiological sciences, such that their combination into a complete imitation system remains a daunting undertaking - indeed, one could argue that we need to understand the complete perception-action loop. As a strategy to untangle the complexity of imitation, this paper will examine imitation purely from a computational point of view, i.e. we will review statistical and mathematical approaches that have been suggested for tackling parts of the imitation problem, and discuss their merits, disadvantages and underlying principles. Given the focus on action recognition of other contributions in this special issue, this paper will primarily emphasize the motor side of imitation, assuming that a perceptual system has already identified important features of a demonstrated movement and created their corresponding spatial information. Based on the formalization of motor control in terms of control policies and their associated performance criteria, useful taxonomies of imitation learning can be generated that clarify different approaches and future research directions.

am

link (url) [BibTex]

Effect of Grain Boundary Phase Transitions on the Superplasticity in the Al-Zn System

Lopez, G.A., Straumal, B.B., Gust, W., Mittemeijer, E.J.

In Nanomaterials by Severe Plastic Deformation, pages: 642-647, Wiley-VCH Verlag, Weinheim, 2004 (incollection)

mms

[BibTex]
