

2011


Optimization for Machine Learning

Sra, S., Nowozin, S., Wright, S.

pages: 494, Neural information processing series, MIT Press, Cambridge, MA, USA, December 2011 (book)

Abstract
The interplay between optimization and machine learning is one of the most important developments in modern computational science. Optimization formulations and methods are proving to be vital in designing algorithms to extract essential knowledge from huge volumes of data. Machine learning, however, is not simply a consumer of optimization technology but a rapidly evolving field that is itself generating new optimization ideas. This book captures the state of the art of the interaction between optimization and machine learning in a way that is accessible to researchers in both fields. Optimization approaches have enjoyed prominence in machine learning because of their wide applicability and attractive theoretical properties. The increasing complexity, size, and variety of today's machine learning models call for the reassessment of existing assumptions. This book starts the process of reassessment. It describes the resurgence in novel contexts of established frameworks such as first-order methods, stochastic approximations, convex relaxations, interior-point methods, and proximal methods. It also devotes attention to newer themes such as regularized optimization, robust optimization, gradient and subgradient methods, splitting techniques, and second-order methods. Many of these techniques draw inspiration from other fields, including operations research, theoretical computer science, and subfields of optimization. The book will enrich the ongoing cross-fertilization between the machine learning community and these other fields, and within the broader optimization community.

Bayesian Time Series Models

Barber, D., Cemgil, A., Chiappa, S.

pages: 432, Cambridge University Press, Cambridge, UK, August 2011 (book)

Handbook of Statistical Bioinformatics

Lu, H., Schölkopf, B., Zhao, H.

pages: 627, Springer Handbooks of Computational Statistics, Springer, Berlin, Germany, 2011 (book)

Preparation of high-efficiency nanostructures of crystalline silicon at low temperatures, as catalyzed by metals: The decisive role of interface thermodynamics

Wang, Z., Jeurgens, L. P. H., Mittemeijer, E. J.

2011 (mpi_year_book)

Abstract
Metals may help to convert semiconductors from a disordered (amorphous) to an ordered (crystalline) form at low temperatures. A general, quantitative model description has been developed on the basis of interface thermodynamics, which provides a fundamental understanding of this so-called metal-induced crystallization (MIC) of amorphous semiconductors. This understanding can enable the low-temperature (< 200 °C) manufacturing of high-efficiency solar cells and crystalline-Si-based nanostructures on cheap and flexible substrates such as glass, plastics, and possibly even paper.

The sweet coat of living cells – from supramolecular organization and dynamics to biological function

Richter, R.

2011 (mpi_year_book)

Abstract
Many biological cells endow themselves with a sugar-rich coat that plays a key role in protecting the cell and in structuring its environment and communicating with it. An outstanding property of these pericellular coats is their dynamic self-organization into strongly hydrated, gel-like meshworks. Tailor-made model systems constructed from the molecular building blocks of pericellular coats can help us understand how the coats function.


2010


From Motor Learning to Interaction Learning in Robots

Sigaud, O., Peters, J.

pages: 538, Studies in Computational Intelligence; 264, (Editors: O Sigaud, J Peters), Springer, Berlin, Germany, January 2010 (book)

Abstract
From an engineering standpoint, the increasing complexity of robotic systems has made the demand for more autonomously learning robots essential. This book is largely based on the successful workshop "From motor to interaction learning in robots" held at the IEEE/RSJ International Conference on Intelligent Robots and Systems. The major aim of the book is to give students interested in the topics described above a chance to get started faster, and to offer researchers a helpful compendium.

Handbook of Hydrogen Storage

Hirscher, M.

pages: 353, Wiley-VCH, Weinheim, Germany, 2010 (book)


2003


Magnetism and the Microstructure of Ferromagnetic Solids

Kronmüller, H., Fähnle, M.

pages: 432, 1st ed., Cambridge University Press, Cambridge, UK, 2003 (book)


2000


Advances in Large Margin Classifiers

Smola, A., Bartlett, P., Schölkopf, B., Schuurmans, D.

pages: 422, Neural Information Processing, MIT Press, Cambridge, MA, USA, October 2000 (book)

Abstract
The concept of large margins is a unifying principle for the analysis of many different approaches to the classification of data from examples, including boosting, mathematical programming, neural networks, and support vector machines. The fact that it is the margin, or confidence level, of a classification--that is, a scale parameter--rather than a raw training error that matters has become a key tool for dealing with classifiers. This book shows how this idea applies to both the theoretical analysis and the design of algorithms. The book provides an overview of recent developments in large margin classifiers, examines connections with other methods (e.g., Bayesian inference), and identifies strengths and weaknesses of the method, as well as directions for future research. Among the contributors are Manfred Opper, Vladimir Vapnik, and Grace Wahba.
