Diffusion Generative Models: From Theory To Practice

Bahman Moraffah, Springer
This book investigates the rapidly evolving field of diffusion generative models, which have gained prominence for their capabilities in probabilistic modeling and data synthesis. By intertwining stochastic processes with deep learning techniques, diffusion models provide a versatile framework for producing high-quality samples across domains such as images, text, and audio. The book begins with the fundamental concepts underlying diffusion models, shedding light on their probabilistic foundations and the key elements of stochastic processes and Bayesian inference. It then navigates the deep learning intricacies of these models, examining the neural network architectures and training methodologies they employ, and discusses essential sampling techniques and inference methods for generating realistic samples and efficiently estimating model parameters. The book highlights the latest advancements in diffusion generative modeling, underscoring its scalability, interpretability, and capacity for uncertainty modeling. It also covers a range of applications, including image generation, text synthesis, audio synthesis, and biology, and evaluates metrics for assessing the quality of generated samples. This comprehensive overview aims to be a valuable resource for researchers and practitioners seeking to grasp the theoretical underpinnings and practical applications of diffusion generative modeling.

Contents:

  • Introduction

  • Mathematical Background and Principles Underlying Diffusion Models

  • Exploring Generative Modeling Techniques

  • Diving into the Depths: Understanding Diffusion Generative Models

  • Building Bridges: Exploring the Foundations of Diffusion Models

  • Beyond the Horizon: Advancements in Diffusion Models

  • Training and Inference: Techniques and Tricks

  • Architectural Variants of Diffusion Models

  • From Theory to Practice: Applications

  • Challenges and Future Directions

  • Appendix A: Datasets Used in Generative Modeling

  • Appendix B: Metrics Used for Training and Sampling

  • Appendix C: Summary of Techniques and Applications

  • Appendix D: Gumbel-Softmax Technique

Probability, Statistics, and Random Processes with Applications in Learning Theory

Bahman Moraffah
Probability theory offers a principled, practical, mathematical approach to learning theory. This book provides the fundamental background for doing research in machine learning, engineering, and statistics, giving a unified, comprehensive, and self-contained treatment of the theoretical and practical aspects of probability theory, with many applications. It is targeted at researchers and students in machine learning, engineering, and applied statistics.

Contents:
Part I: Probability Theory

  • Probability Space

  • Continuous and Discrete Random Variables

  • Multiple Random Variables

  • Parametric Point Estimation

  • Probability Theory: Applications

  • Convergence and Asymptotic Behavior

Part II: Random Processes

  • Introduction to Random Processes

  • Markov Processes

  • Poisson Processes

Part III: Applications in Learning Theory

  • Probability and its Application in Machine Learning

  • Bootstrap and Monte Carlo Resampling Methods

  • Information Geometry and its Applications in Machine Learning

Bayesian Modeling and Inference: A Bayesian Approach to Machine Learning

Bahman Moraffah
This book provides a comprehensive treatment of advanced learning theory, spanning a wide range of statistical tools from frequentist to Bayesian. It emphasizes advances in machine learning theory from a Bayesian perspective and can serve as the basis for an advanced graduate course in Bayesian statistics.
Contents:

  • Preliminaries

    • Measure-theoretic definitions

    • Stochastic convergences

    • Information theory

    • Group theory

    • Conceptual foundations

    • Exponential family

    • Gaussian

  • Theory of Point Estimation

    • Likelihood and first-order methods such as the MLE

    • Delta method

    • Asymptotic statistics

    • M and Z estimators

    • U statistics

    • Empirical processes

    • L-statistics

    • Parametric minimax theory

    • Expectation-maximization (EM) algorithm

  • Introduction to Nonparametric Statistics

    • Concentration of measure

    • Density estimation

    • Minimax theory

  • Statistical Decision Theory: A Frequentist Approach

  • Bootstrap

  • Information Geometry

  • Regression

    • Linear regression

  • Introduction to Bayesian Statistics

    • Bayesian statistics: introduction and examples, covering Bayes' rule, exchangeability, the posterior, inference, and marginalization

    • Single-parameter estimation

    • Multi-parameter estimation

    • Choice of prior

    • Frequentist properties of Bayesian modeling

  • Advanced Posterior Computation

    • Markov chain Monte Carlo method

    • Gibbs sampling, including efficient Gibbs sampling

    • Metropolis-Hastings algorithm

    • Building MCMC algorithms

    • Variational Bayes

  • Hierarchical Modeling

  • Mixture Model

  • Robust inference

  • Bayesian Regression

    • Linear Models

    • Generalized linear model

    • Linear regression: Bayesian linear models, model selection and Bayesian comparison, Gibbs sampling, and model averaging

  • Bayesian Decision Theory

  • Poisson Processes

    • Poisson distribution and relationship to multinomial and binomial distribution

    • Definition of Poisson process

    • Campbell’s theorem

  • Bayesian Nonparametrics I

    • Dirichlet process and the gamma process

    • Hierarchical Dirichlet process

    • Dependent Dirichlet process

    • Two-parameter Poisson-Dirichlet process

  • Bayesian Nonparametrics II

    • Beta processes, stick-breaking, and power law

    • Indian buffet process

    • Hierarchical Indian buffet processes

  • Bayesian Nonparametrics III

    • Completely random measure

    • Normalized random measure

    • Kingman paintbox

    • Feature allocation and paintbox

    • Truncated random measure

  • Bayesian Nonparametrics IV

    • Gaussian Processes

    • Learning Functions and Gaussian Processes

    • Bayesian Optimization

  • Bayesian Network and Causal Inference

    • Directed graphical models and inference

    • Causal inference

  • Undirected Graphical Models

    • Definitions

    • Inference on undirected graphs