Methods of forming dry adhesive structures

Sitti, M., Murphy, M., Aksak, B.

September 2015, US Patent 9,120,953 (patent)

Abstract
Methods of forming dry adhesives, including a method of making a dry adhesive comprising: applying a liquid polymer to the second end of a stem; molding the liquid polymer on the stem in a mold, wherein the mold includes a recess having a cross-sectional area that is less than a cross-sectional area of the second end of the stem; curing the liquid polymer in the mold to form a tip at the second end of the stem, wherein the tip includes a second layer corresponding to the recess in the mold; and removing the tip from the mold after the liquid polymer cures.


[BibTex]



Micro-fiber arrays with tip coating and transfer method for preparing same

Sitti, M., Washburn, N. R., Glass, P. S., Chung, H.

July 2015, US Patent 9,079,215 (patent)

Abstract
The present invention describes a patterned and coated micro- and nano-scale fiber elastomeric material for enhanced adhesion in wet or dry environments. A multi-step fabrication process, including optical lithography, micromolding, polymer synthesis, dipping, stamping, and photopolymerization, is described to produce uniform arrays of micron-scale fibers with mushroom-shaped tips coated with a thin layer of an intrinsically adhesive synthetic polymer, such as lightly crosslinked p(DMA-co-MEA).


[BibTex]



Dry adhesives and methods for making dry adhesives

Sitti, M., Murphy, M., Aksak, B.

March 2015, US Patent App. 14/625,162 (patent)

Abstract
Dry adhesives and methods for forming dry adhesives. A method of forming a dry adhesive structure on a substrate comprises: forming a template backing layer of energy sensitive material on the substrate; forming a template layer of energy sensitive material on the template backing layer; exposing the template layer to a predetermined pattern of energy; removing a portion of the template layer related to the predetermined pattern of energy; and leaving a template structure formed from energy sensitive material and connected to the substrate via the template backing layer.


[BibTex]


2007


Predicting Structured Data

Bakir, G., Hofmann, T., Schölkopf, B., Smola, A., Taskar, B., Vishwanathan, S.

pages: 360, Advances in neural information processing systems, MIT Press, Cambridge, MA, USA, September 2007 (book)

Abstract
Machine learning develops intelligent computer systems that are able to generalize from previously seen examples. A new domain of machine learning, in which the prediction must satisfy the additional constraints found in structured data, poses one of machine learning’s greatest challenges: learning functional dependencies between arbitrary input and output domains. This volume presents and analyzes the state of the art in machine learning algorithms and theory in this novel field. The contributors discuss applications as diverse as machine translation, document markup, computational biology, and information extraction, among others, providing a timely overview of an exciting field.


Web [BibTex]



Advances in Neural Information Processing Systems 19: Proceedings of the 2006 Conference

Schölkopf, B., Platt, J., Hofmann, T.

Proceedings of the Twentieth Annual Conference on Neural Information Processing Systems (NIPS 2006), pages: 1690, MIT Press, Cambridge, MA, USA, 20th Annual Conference on Neural Information Processing Systems (NIPS), September 2007 (proceedings)

Abstract
The annual Neural Information Processing Systems (NIPS) conference is the flagship meeting on neural computation and machine learning. It draws a diverse group of attendees--physicists, neuroscientists, mathematicians, statisticians, and computer scientists--interested in theoretical and applied aspects of modeling, simulating, and building neural-like or intelligent systems. The presentations are interdisciplinary, with contributions in algorithms, learning theory, cognitive science, neuroscience, brain imaging, vision, speech and signal processing, reinforcement learning, and applications. Only twenty-five percent of the papers submitted are accepted for presentation at NIPS, so the quality is exceptionally high. This volume contains the papers presented at the December 2006 meeting, held in Vancouver.


Web [BibTex]
