2016

Interface-controlled phenomena in nanomaterials

Mittemeijer, Eric J.; Wang, Zumin

2016 (mpi_year_book)

Abstract
Nanosized material systems characteristically exhibit an exceptionally high density of internal interfaces. A series of previously unknown phenomena in nanomaterials, all fundamentally caused by the presence of interfaces, has been uncovered. For example, anomalously large and small lattice parameters in nanocrystalline metals, quantum stress oscillations in growing nanofilms, and extraordinary atomic mobility at ultralow temperatures have been observed and explained. The understanding attained of these new phenomena can lead to new, sophisticated applications of nanomaterials in advanced technologies.

link (url) [BibTex]

Robots learn how to see

Geiger, A.

2016 (mpi_year_book)

Abstract
Autonomous vehicles and intelligent service robots could soon contribute to making our lives more pleasant and secure. However, for autonomous operation such systems first need to learn the perception process itself. This involves measuring distances and motions, detecting objects, and interpreting the three-dimensional world as a whole. While humans perceive their environment with seemingly little effort, computers first need to be trained for these tasks. Our research is concerned with developing mathematical models that allow computers to robustly perceive their environment.

link (url) DOI [BibTex]

2014


Advanced Structured Prediction

Nowozin, S., Gehler, P. V., Jancsary, J., Lampert, C. H.

Advanced Structured Prediction, pages: 432, Neural Information Processing Series, MIT Press, November 2014 (book)

Abstract
The goal of structured prediction is to build machine learning models that predict relational information that itself has structure, such as being composed of multiple interrelated parts. These models, which reflect prior knowledge, task-specific relations, and constraints, are used in fields including computer vision, speech recognition, natural language processing, and computational biology. They can carry out such tasks as predicting a natural language sentence, or segmenting an image into meaningful components. These models are expressive and powerful, but exact computation is often intractable. A broad research effort in recent years has aimed at designing structured prediction models and approximate inference and learning procedures that are computationally efficient. This volume offers an overview of this recent research in order to make the work accessible to a broader research community. The chapters, by leading researchers in the field, cover a range of topics, including research trends, the linear programming relaxation approach, innovations in probabilistic modeling, recent theoretical progress, and resource-aware learning.

ps

publisher link (url) [BibTex]
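The abstract above describes models that predict outputs composed of multiple interrelated parts. As an illustration only (not code from the book), the sketch below performs exact MAP inference in a linear-chain model by dynamic programming (Viterbi); the unary and pairwise score arrays are hypothetical placeholders for learned potentials.

```python
# Minimal sketch: exact MAP inference in a chain-structured model (Viterbi).
# Scores are hypothetical; in a real model they come from learned potentials.
import numpy as np

def viterbi(unary, pairwise):
    """MAP labeling of a chain.

    unary:    (T, K) array, unary[t, k] = score of label k at position t
    pairwise: (K, K) array, pairwise[j, k] = score of transition j -> k
    Returns the highest-scoring label sequence of length T.
    """
    T, K = unary.shape
    score = np.zeros((T, K))
    backptr = np.zeros((T, K), dtype=int)
    score[0] = unary[0]
    for t in range(1, T):
        # cand[j, k] = best score ending in label j at t-1 plus transition j -> k
        cand = score[t - 1][:, None] + pairwise
        backptr[t] = cand.argmax(axis=0)
        score[t] = cand.max(axis=0) + unary[t]
    # backtrack from the best final label
    labels = [int(score[-1].argmax())]
    for t in range(T - 1, 0, -1):
        labels.append(int(backptr[t, labels[-1]]))
    return labels[::-1]

# Toy example: 4 positions, 3 labels, random scores.
rng = np.random.default_rng(0)
print(viterbi(rng.normal(size=(4, 3)), rng.normal(size=(3, 3))))
```

For chain-structured models this argmax is exact and runs in O(T·K²); the approximate inference and learning procedures surveyed in the volume address models for which no such tractable decomposition exists.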

Learning Motor Skills: From Algorithms to Robot Experiments

Kober, J., Peters, J.

Springer Tracts in Advanced Robotics, Volume 97, pages: 191, Springer, 2014 (book)

ei

DOI [BibTex]

Exploring complex diseases with intelligent systems

Borgwardt, K.

2014 (mpi_year_book)

Abstract
Physicians are collecting an ever-increasing amount of data describing the health state of their patients. Does this data hide new knowledge about diseases that could lead to better therapies? The field of Machine Learning in Biomedicine is concerned with the development of approaches that help to gain such insights from massive biomedical data.

link (url) [BibTex]


The cellular life-death decision – how mitochondrial membrane proteins can determine cell fate

García-Sáez, Ana J.

2014 (mpi_year_book)

Abstract
Living organisms have a very effective method for eliminating cells that are no longer needed: programmed cell death. Researchers in the group of Ana García Sáez work with a protein called Bax, a key regulator of apoptosis that creates pores of flexible diameter in the outer mitochondrial membrane. This step inevitably triggers the final death of the cell. These insights into the role of key proteins in setting off apoptosis could prove useful for developing drugs that directly influence apoptosis.

link (url) [BibTex]

2007


Predicting Structured Data

Bakir, G., Hofmann, T., Schölkopf, B., Smola, A., Taskar, B., Vishwanathan, S.

pages: 360, Advances in Neural Information Processing Systems, MIT Press, Cambridge, MA, USA, September 2007 (book)

Abstract
Machine learning develops intelligent computer systems that are able to generalize from previously seen examples. A new domain of machine learning, in which the prediction must satisfy the additional constraints found in structured data, poses one of machine learning’s greatest challenges: learning functional dependencies between arbitrary input and output domains. This volume presents and analyzes the state of the art in machine learning algorithms and theory in this novel field. The contributors discuss applications as diverse as machine translation, document markup, computational biology, and information extraction, among others, providing a timely overview of an exciting field.

ei

Web [BibTex]
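As a toy illustration of the structured-prediction setting described in the abstract above (a sketch, not code from the book), the snippet below scores input-output pairs with a joint feature map Phi(x, y) and a linear weight vector, predicts by exhaustive argmax over a tiny candidate set, and applies one structured-perceptron update. The tag set, feature map, and example are hypothetical.

```python
# Minimal sketch: linear structured prediction with a joint feature map.
import itertools
import numpy as np

LABELS = ("N", "V")  # toy tag set

def phi(x, y):
    """Joint feature map for tokens x and tags y: counts of
    (word-is-capitalized, tag) pairs plus tag-bigram counts."""
    feats = np.zeros(len(LABELS) * 2 + len(LABELS) ** 2)
    for word, tag in zip(x, y):
        feats[LABELS.index(tag) * 2 + int(word[0].isupper())] += 1
    for a, b in zip(y, y[1:]):
        feats[len(LABELS) * 2 + LABELS.index(a) * len(LABELS) + LABELS.index(b)] += 1
    return feats

def predict(w, x):
    """Exhaustive argmax over all tag sequences (fine for short x)."""
    candidates = itertools.product(LABELS, repeat=len(x))
    return max(candidates, key=lambda y: w @ phi(x, y))

# One structured-perceptron update on a toy example.
x, y_true = ["Anna", "runs"], ("N", "V")
w = np.zeros(len(LABELS) * 2 + len(LABELS) ** 2)
y_hat = predict(w, x)
if y_hat != y_true:
    w += phi(x, y_true) - phi(x, y_hat)  # reward correct features, penalize predicted ones
print(y_hat, predict(w, x))
```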

Advances in Neural Information Processing Systems 19: Proceedings of the 2006 Conference

Schölkopf, B., Platt, J., Hofmann, T.

Proceedings of the Twentieth Annual Conference on Neural Information Processing Systems (NIPS 2006), pages: 1690, MIT Press, Cambridge, MA, USA, 20th Annual Conference on Neural Information Processing Systems (NIPS), September 2007 (proceedings)

Abstract
The annual Neural Information Processing Systems (NIPS) conference is the flagship meeting on neural computation and machine learning. It draws a diverse group of attendees (physicists, neuroscientists, mathematicians, statisticians, and computer scientists) interested in theoretical and applied aspects of modeling, simulating, and building neural-like or intelligent systems. The presentations are interdisciplinary, with contributions in algorithms, learning theory, cognitive science, neuroscience, brain imaging, vision, speech and signal processing, reinforcement learning, and applications. Only twenty-five percent of the papers submitted are accepted for presentation at NIPS, so the quality is exceptionally high. This volume contains the papers presented at the December 2006 meeting, held in Vancouver.

ei

Web [BibTex]


2002


Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond

Schölkopf, B., Smola, A.

pages: 644, Adaptive Computation and Machine Learning, MIT Press, Cambridge, MA, USA, December 2002, Parts of this book, including an introduction to kernel methods, are available for download. (book)

Abstract
In the 1990s, a new type of learning algorithm was developed, based on results from statistical learning theory: the Support Vector Machine (SVM). This gave rise to a new class of theoretically elegant learning machines that use kernels, a central concept of SVMs, for a number of learning tasks. Kernel machines provide a modular framework that can be adapted to different tasks and domains by the choice of the kernel function and the base algorithm. They are replacing neural networks in a variety of fields, including engineering, information retrieval, and bioinformatics. Learning with Kernels provides an introduction to SVMs and related kernel methods. Although the book begins with the basics, it also includes the latest research. It provides all of the concepts necessary to enable a reader equipped with some basic mathematical knowledge to enter the world of machine learning using theoretically well-founded yet easy-to-use kernel algorithms and to understand and apply the powerful algorithms that have been developed over the last few years.

ei

Web [BibTex]
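As an illustration of the modularity mentioned in the abstract above (a sketch, not code from the book), the snippet below pairs one base algorithm, ridge regression in its dual form, with interchangeable kernel functions; swapping the kernel changes the hypothesis space while the algorithm stays the same. The data and hyperparameters are made up.

```python
# Minimal sketch: kernels as interchangeable components of a base algorithm.
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    """Gaussian (RBF) kernel matrix K[i, j] = exp(-gamma * ||x_i - z_j||^2)."""
    sq = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def poly_kernel(X, Z, degree=3):
    """Polynomial kernel (1 + <x, z>)^degree."""
    return (1.0 + X @ Z.T) ** degree

def kernel_ridge_fit(K, y, lam=1e-2):
    """Dual coefficients alpha = (K + lam * I)^-1 y."""
    return np.linalg.solve(K + lam * np.eye(len(y)), y)

# Toy 1-D regression: learn sin(x) from noisy samples.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=40)
X_test = np.linspace(-3, 3, 5)[:, None]

for kernel in (rbf_kernel, poly_kernel):
    alpha = kernel_ridge_fit(kernel(X, X), y)       # fit in the dual
    y_pred = kernel(X_test, X) @ alpha              # predict via kernel evaluations
    print(kernel.__name__, np.round(y_pred, 2))
```

The prediction only ever touches the data through kernel evaluations, which is why the same dual-form algorithm works unchanged for any positive-definite kernel.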


