Awesome Diffusion Samplers

This repository contains a collection of resources and papers on Diffusion Samplers.
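The shared setting across the papers below is drawing samples from a target distribution known only through an unnormalized density (e.g., a Boltzmann density), without access to training data from that target. As a point of reference only (this is a classical baseline, not the method of any paper listed here), the following sketch runs unadjusted Langevin dynamics on a toy two-mode Gaussian mixture; the target, step size, and function names are illustrative assumptions.

```python
import numpy as np

def log_density(x):
    """Unnormalized log-density of a toy 1D Gaussian mixture
    with modes at -3 and +3 and unit variance (illustrative target)."""
    return np.logaddexp(-0.5 * (x + 3.0) ** 2, -0.5 * (x - 3.0) ** 2)

def grad_log_density(x, eps=1e-5):
    """Score, i.e. gradient of the log-density, via central finite differences."""
    return (log_density(x + eps) - log_density(x - eps)) / (2 * eps)

def langevin_sample(n_chains=2000, n_steps=2000, step=0.05, seed=0):
    """Unadjusted Langevin dynamics:
    x <- x + step * score(x) + sqrt(2 * step) * standard_normal_noise."""
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, 1.0, size=n_chains)  # broad initialization around 0
    for _ in range(n_steps):
        noise = rng.normal(size=n_chains)
        x = x + step * grad_log_density(x) + np.sqrt(2 * step) * noise
    return x

samples = langevin_sample()
frac_right_mode = np.mean(samples > 0)  # by symmetry, roughly 0.5
```

Note how chains that start near 0 commit early to one of the two modes and rarely cross the energy barrier afterwards; this mode-mixing failure of plain Langevin dynamics on multi-modal targets is precisely the difficulty that motivates much of the annealing, tempering, and learned-sampler work collected below.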

Papers

Models trained with unnormalized density functions: A need for a course correction
Rishal Aggarwal, et al. ICLR 2025 Blogpost Track.

Reinforced sequential Monte Carlo for amortised sampling
Sanghyeok Choi, et al. [code]

Energy-Weighted Flow Matching: Unlocking Continuous Normalizing Flows for Efficient and Scalable Boltzmann Sampling
Niclas Dern, et al.

Non-equilibrium Annealed Adjoint Sampler
Jaemoo Choi, et al.

Progressive Inference-Time Annealing of Diffusion Models for Sampling from Boltzmann Densities
Tara Akhound-Sadegh, et al. [code]

Rethinking Losses for Diffusion Bridge Samplers
Sebastian Sanokowski, et al.

Adaptive Destruction Processes for Diffusion Samplers
Timofei Gritsaev, et al.

On scalable and efficient training of diffusion samplers
Minkyu Kim, et al.

Importance Weighted Score Matching for Diffusion Samplers with Enhanced Mode Coverage
Chenguang Wang, et al.

NETS: A Non-Equilibrium Transport Sampler
Michael Samuel Albergo, et al.

From discrete-time policies to continuous-time diffusion samplers: Asymptotic equivalences and faster training
Julius Berner, et al. [code]

Importance-Weighted Training of Diffusion Samplers
Sanghyeok Choi, et al. ICML 2025 Workshop on Generative AI and Biology.

Progressive Tempering Sampler with Diffusion
Severi Rissanen, et al. ICML 2025. [code]

Outsourced Diffusion Sampling: Efficient Posterior Inference in Latent Spaces of Generative Models
Siddarth Venkatraman, et al. ICML 2025. [code]

Adjoint Sampling: Highly Scalable Diffusion Samplers via Adjoint Matching
Aaron Havens, et al. ICML 2025. [code]

Training Neural Samplers with Reverse Diffusive KL Divergence
Jiajun He, et al. AISTATS 2025. [code]

Continuously Tempered Diffusion Samplers
Ezra Erives, et al. ICLR 2025 Workshop on Frontiers in Probabilistic Inference. [code]

Improving the evaluation of samplers on multi-modal targets
Louis Grenioux, et al. ICLR 2025 Workshop on Frontiers in Probabilistic Inference.

No Trick, No Treat: Pursuits and Challenges Towards Simulation-free Training of Neural Samplers
Jiajun He, et al. ICLR 2025 Workshop on Frontiers in Probabilistic Inference.

Single-Step Consistent Diffusion Samplers
Pascal Jutras-Dubé, et al. ICLR 2025 Workshop on Frontiers in Probabilistic Inference.

Adaptive teachers for amortized samplers
Minsu Kim, et al. ICLR 2025. [code]

Learned Reference-based Diffusion Sampling for multi-modal distributions
Maxence Noble, et al. ICLR 2025. [code]

Sequential Controlled Langevin Diffusions
Junhua Chen, et al. ICLR 2025. [code]

End-To-End Learning of Gaussian Mixture Priors for Diffusion Sampler
Denis Blessing, et al. ICLR 2025.

Underdamped Diffusion Bridges with Applications to Sampling
Denis Blessing, et al. ICLR 2025. [code]

Improved off-policy training of diffusion samplers
Marcin Sendera, et al. NeurIPS 2024. [code]

Particle Denoising Diffusion Sampler
Angus Phillips, et al. ICML 2024. [code]

Beyond ELBOs: A Large-Scale Evaluation of Variational Methods for Sampling
Denis Blessing, et al. ICML 2024. [code]

Iterated Denoising Energy Matching for Sampling from Boltzmann Densities
Tara Akhound-Sadegh, et al. ICML 2024. [code]

Physics-informed neural networks for sampling
Jingtong Sun, et al. ICLR 2024 Workshop on AI4DifferentialEquations In Science.

Diffusion Generative Flow Samplers: Improving learning signals through partial trajectory optimization
Dinghuai Zhang, et al. ICLR 2024. [code]

Transport meets Variational Inference: Controlled Monte Carlo Diffusions
Francisco Vargas, et al. ICLR 2024. [code]

Improved sampling via learned diffusions
Lorenz Richter, et al. ICLR 2024. [code]

An optimal control perspective on diffusion-based generative modeling
Julius Berner, et al. TMLR 2024. [code]

A Theory of Continuous Generative Flow Networks
Salem Lahlou, et al. ICML 2023. [code]

Denoising Diffusion Samplers
Francisco Vargas, et al. ICLR 2023. [code]

Path Integral Sampler: a Stochastic Control Approach for Sampling
Qinsheng Zhang, et al. ICLR 2022. [code]

Contact

If you have any suggestions or want to add your own work, please feel free to create a pull request or write to greatdraken@gmail.com.
