INFORMATION THEORY, INFERENCE, AND LEARNING ALGORITHMS.

  • 4.00 · 1 Rating
  • 13 Want to read
  • 0 Currently reading
  • 1 Have read

Full text is available online at the book's website.

Publish Date:
Language: Undetermined
Pages: 628


Previews available in: Undetermined, English

Edition Availability

  • Information Theory, Inference and Learning Algorithms (2004, University of Cambridge ESOL Examinations, TBS), in English
  • INFORMATION THEORY, INFERENCE, AND LEARNING ALGORITHMS. (2003, CAMBRIDGE UNIV PRESS, Cambridge University Press)
  • Information Theory, Inference & Learning Algorithms (2003, Cambridge University Press), Hardcover in English, 1st edition


Book Details


Published in: CAMBRIDGE

Classifications

Library of Congress: Q360 .M23 2003

ID Numbers

Open Library: OL22584006M
Internet Archive: informationinfer00davi
ISBN 10: 0521642981
LCCN: 2003055133
OCLC/WorldCat: 52377690
Library Thing: 403618
Goodreads: 201357

Work Description

Book Jacket:

This textbook introduces theory in tandem with applications. Information theory is taught alongside practical communication systems, such as arithmetic coding for data compression and sparse-graph codes for error correction. A toolbox of inference techniques, including message-passing algorithms, Monte Carlo methods, and variational approximations, is developed alongside applications of these tools to clustering, convolutional codes, independent component analysis, and neural networks.
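
As a purely illustrative aside (not drawn from the book itself), the data-compression claim can be made concrete: Shannon's source coding theorem says the entropy H(X) of a source is the lower bound, in bits per symbol, on the expected length of any lossless code, and arithmetic coding approaches this bound. A minimal Python sketch, using a hypothetical four-symbol source:

    import math

    def shannon_entropy(probs):
        # Entropy H(X) in bits per symbol: the lower bound that any
        # lossless code (arithmetic coding, in particular) can approach
        # for an i.i.d. source with these symbol probabilities.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Hypothetical four-symbol source, for illustration only.
    probs = [0.5, 0.25, 0.125, 0.125]
    print(f"H(X) = {shannon_entropy(probs):.3f} bits/symbol")  # -> 1.750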

Publisher Description:

This textbook offers comprehensive coverage of Shannon's theory of information as well as the theory of neural networks and probabilistic data modelling. It includes explanations of Shannon's source coding theorem and noisy-channel coding theorem, as well as descriptions of practical data compression systems. Many examples and exercises make the book ideal for use as a class textbook by students, or as a resource for researchers who need to work with neural networks or state-of-the-art error-correcting codes.
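
As another illustrative aside (again not taken from the book), the noisy-channel side can be made concrete with the binary symmetric channel: its capacity with flip probability f is C = 1 - H2(f) bits per channel use, and the noisy-channel coding theorem guarantees reliable communication at any rate below C. A minimal Python sketch:

    import math

    def binary_entropy(f):
        # Binary entropy H2(f) = -f*log2(f) - (1-f)*log2(1-f), in bits.
        if f in (0.0, 1.0):
            return 0.0
        return -f * math.log2(f) - (1 - f) * math.log2(1 - f)

    def bsc_capacity(f):
        # Capacity of a binary symmetric channel with flip probability f:
        # C = 1 - H2(f) bits per channel use.  Reliable communication is
        # achievable at any rate below C with suitable error-correcting codes.
        return 1.0 - binary_entropy(f)

    print(f"C(f = 0.1) = {bsc_capacity(0.1):.3f} bits per use")  # -> 0.531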

Excerpts

You cannot do inference without making assumptions.
Page 26, added by David. "A central theme of the book."


Community Reviews (0)

No community reviews have been submitted for this work.

Lists

This work does not appear on any lists.

History

December 19, 2023: Edited by ImportBot (import existing book)
November 8, 2023: Edited by raybb (Merge works)
July 19, 2023: Edited by ImportBot (import existing book)
May 3, 2023: Edited by ImportBot (import existing book)
November 16, 2008: Created by ImportBot (Imported from University of Toronto MARC record.)