
Optimization for Machine Learning

(Paperback)


Publishing Details

Full Title:

Optimization for Machine Learning

Contributors:

Edited by Suvrit Sra
Edited by Sebastian Nowozin
Edited by Stephen J. Wright
Contributions by Suvrit Sra
Contributions by Sebastian Nowozin
Contributions by Stephen J. Wright
Contributions by Francis Bach
Contributions by Rodolphe Jenatton
Contributions by Julien Mairal
Contributions by Guillaume Obozinski

ISBN:

9780262537766

Publisher:

MIT Press Ltd

Imprint:

MIT Press

Publication Date:

30th September 2011

Country:

United States

Classifications

Readership:

Professional and Scholarly

Fiction/Non-fiction:

Non Fiction

Main Subject:
Other Subjects:

Robotics

Dewey:

006.31

Physical Properties

Physical Format:

Paperback

Number of Pages:

512

Dimensions:

Width 203mm, Height 254mm, Spine 22mm

Description

An up-to-date account of the interplay between optimization and machine learning, accessible to students and researchers in both communities.

The interplay between optimization and machine learning is one of the most important developments in modern computational science. Optimization formulations and methods are proving to be vital in designing algorithms to extract essential knowledge from huge volumes of data. Machine learning, however, is not simply a consumer of optimization technology but a rapidly evolving field that is itself generating new optimization ideas. This book captures the state of the art of the interaction between optimization and machine learning in a way that is accessible to researchers in both fields.

Optimization approaches have enjoyed prominence in machine learning because of their wide applicability and attractive theoretical properties. The increasing complexity, size, and variety of today's machine learning models call for the reassessment of existing assumptions. This book starts the process of reassessment. It describes the resurgence in novel contexts of established frameworks such as first-order methods, stochastic approximations, convex relaxations, interior-point methods, and proximal methods. It also devotes attention to newer themes such as regularized optimization, robust optimization, gradient and subgradient methods, splitting techniques, and second-order methods. Many of these techniques draw inspiration from other fields, including operations research, theoretical computer science, and subfields of optimization. The book will enrich the ongoing cross-fertilization between the machine learning community and these other fields, and within the broader optimization community.

Author Bio

Suvrit Sra is a Research Scientist at the Max Planck Institute for Biological Cybernetics, Tübingen, Germany. Sebastian Nowozin is a Researcher in the Machine Learning and Perception group (MLP) at Microsoft Research, Cambridge, England. Stephen J. Wright is Professor of Computer Science at the University of Wisconsin-Madison. Dimitri P. Bertsekas is Professor of Electrical Engineering and Computer Science at MIT. Masashi Sugiyama is Associate Professor in the Department of Computer Science at Tokyo Institute of Technology. Léon Bottou is a Research Scientist at NEC Labs America. Yoshua Bengio is Professor of Computer Science at the Université de Montréal.
