This textbook tackles the problem of formulating AI systems by combining probabilistic modeling and deep learning. It goes beyond typical predictive modeling, bringing together supervised and unsupervised learning. The resulting paradigm, called deep generative modeling, takes a generative perspective on perceiving the surrounding world: it assumes that each phenomenon is driven by an underlying generative process that defines a joint distribution over random variables and their stochastic interactions, i.e., how events occur and in what order. The adjective "deep" refers to the fact that the distribution is parameterized by deep neural networks. Deep generative modeling has two distinct traits. First, deep neural networks allow rich and flexible parameterization of distributions. Second, modeling stochastic dependencies in the principled manner of probability theory ensures rigorous formulation and prevents flaws in reasoning. Moreover, probability theory provides a unified framework in which the likelihood function plays a crucial role in quantifying uncertainty and defining objective functions.

Deep Generative Modeling is designed to appeal to curious students, engineers, and researchers with a modest mathematical background in undergraduate calculus, linear algebra, and probability theory, and the basics of machine learning, deep learning, and programming in Python and PyTorch (or other deep learning libraries). It will appeal to students and researchers from a variety of backgrounds, including computer science, engineering, data science, physics, and bioinformatics, who wish to become familiar with deep generative modeling. To engage the reader, the book introduces fundamental concepts with concrete examples and code snippets. The full code accompanying the book is available on GitHub.
The ultimate aim of the book is to outline the most important techniques in deep generative modeling and, eventually, enable readers to formulate new models and implement them.
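As a minimal taste of the paradigm the blurb describes — a parameterized distribution fitted by using the likelihood as the objective function — here is a hedged, self-contained sketch (not taken from the book; it fits a simple Gaussian rather than a deep network, purely to show maximum-likelihood training by gradient descent):

```python
import math
import random

# Toy data drawn from an unknown generative process (here: N(2.0, 0.5^2)).
random.seed(0)
data = [random.gauss(2.0, 0.5) for _ in range(1000)]

# Model: p(x) = N(x; mu, sigma^2) with learnable parameters mu and log_sigma.
# In deep generative modeling these parameters would instead be the outputs
# of a neural network; the objective is the same negative log-likelihood.
mu, log_sigma = 0.0, 0.0
lr = 0.01
for _ in range(2000):
    sigma2 = math.exp(2 * log_sigma)
    # Gradients of the mean negative log-likelihood
    # NLL(x) = log sigma + (x - mu)^2 / (2 sigma^2) + const.
    g_mu = sum((mu - x) / sigma2 for x in data) / len(data)
    g_ls = sum(1 - (x - mu) ** 2 / sigma2 for x in data) / len(data)
    mu -= lr * g_mu
    log_sigma -= lr * g_ls

# The fitted parameters recover the data-generating process (roughly 2.0 and 0.5).
print(mu, math.exp(log_sigma))
```

Swapping the two scalar parameters for a neural network that maps inputs (or previous variables) to distribution parameters is, in essence, the step from classical maximum likelihood to the deep generative models covered in the book.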
Contents: Why Deep Generative Modeling? · Autoregressive Models · Flow-based Models · Latent Variable Models · Hybrid Modeling · Energy-based Models · Generative Adversarial Networks · Deep Generative Modeling for Neural Compression · Useful Facts from Algebra and Calculus · Useful Facts from Probability Theory and Statistics · Index.
- Combines probability theory with deep learning to obtain powerful AI systems
- Outlines the most important techniques in deep generative modeling, enabling readers to formulate new models
- All chapters include code snippets to help understand how the presented methods can be implemented

Product details

ISBN: 9783030931605
Published: 2023-02-20
Publisher: Springer Nature Switzerland AG
Height: 235 mm
Width: 155 mm
Audience level: Upper undergraduate
Language: English
Format: Paperback

Author

Biographical note

Jakub Tomczak has been an assistant professor of Artificial Intelligence in the Computational Intelligence group at Vrije Universiteit Amsterdam since November 2019. Previously, from October 2018 to October 2019, he was a deep learning researcher (Staff Engineer) at Qualcomm AI Research in Amsterdam, and from October 2016 to September 2018 he was a Marie Sklodowska-Curie Individual Fellow in Prof. Max Welling's group at the University of Amsterdam. He obtained his Ph.D. in machine learning from the Wroclaw University of Technology. His research interests include probabilistic modeling, deep learning, approximate Bayesian modeling, and deep generative modeling, with a special focus on variational auto-encoders and flow-based models.