Machine Learning: A Bayesian and Optimization Perspective / Edition 2
Current price: $105.00
Machine Learning: A Bayesian and Optimization Perspective, Second Edition, gives a unifying perspective on machine learning by covering both probabilistic and deterministic approaches, which are based on optimization techniques, together with the Bayesian inference approach, whose essence lies in the use of a hierarchy of probabilistic models.
The book builds carefully from the basic classical methods to the most recent trends, with chapters written to be as self-contained as possible, making the text suitable for different courses: pattern recognition, statistical/adaptive signal processing, statistical/Bayesian learning, as well as short courses on sparse modeling, deep learning, and probabilistic graphical models.
Machine Learning: A Bayesian and Optimization Perspective presents the major machine learning methods as they have been developed in different disciplines, such as statistics, statistical and adaptive signal processing, and computer science. Focusing on the physical reasoning behind the mathematics, the various methods and techniques are explained in depth, supported by examples and problems, giving the student and researcher an invaluable resource for understanding and applying machine learning concepts.
New to this edition:
Many more simple examples added to the chapters covering the basic theory, to aid understanding
Complete rewrite of the chapter on Neural Networks and Deep Learning to reflect the latest advances since the first edition
Expanded treatment of Bayesian learning to include Nonparametric Bayesian Learning
All major classical techniques: Mean/Least-Squares regression and filtering, Kalman filtering, stochastic approximation and online learning, Bayesian classification, decision trees, logistic regression and boosting methods
Presents the physical reasoning, mathematical modelling and algorithmic implementation of each method
The latest trends: Sparsity, convex analysis and optimization, online distributed algorithms, learning in RKH spaces, Bayesian inference, graphical and hidden Markov models, particle filtering, deep learning, dictionary learning, and latent variable modeling
Case studies (protein folding prediction, optical character recognition, text authorship identification, fMRI data analysis, change point detection, hyperspectral image unmixing, target localization, channel equalization, and echo cancellation) show how the theory can be applied
MATLAB and Python code for all the main algorithms is available on an accompanying website, enabling the reader to experiment with the code