Guidance with Spherical Gaussian Constraint for Conditional Diffusion

ICML 2024

Lingxiao Yang1,2, Shutong Ding1,2, Yifan Cai1,2, Jingya Wang1,2, Ye Shi1,2
1ShanghaiTech University, 2MoE Key Laboratory of Intelligent Perception and Human Machine Collaboration

DSG mitigates the manifold deviation problem by introducing a Spherical Gaussian constraint, without relying on the linear manifold assumption. At the same time, DSG enables larger guidance step sizes, significantly reducing inference time while improving sample quality.


Abstract

Recent advances in diffusion models attempt to handle conditional generative tasks by utilizing a differentiable loss function for guidance, without the need for additional training. While these methods have achieved some success, they often compromise on sample quality and require small guidance step sizes, leading to long sampling processes.

This paper reveals that the fundamental issue is manifold deviation during the sampling process when loss guidance is employed. We theoretically demonstrate this deviation by establishing a lower bound on the estimation error of the loss guidance.

To mitigate this problem, we propose Diffusion with Spherical Gaussian constraint (DSG), drawing inspiration from the concentration phenomenon in high-dimensional Gaussian distributions. DSG effectively constrains the guidance step within the intermediate data manifold through optimization and enables the use of larger guidance steps. Furthermore, we present a closed-form solution for DSG denoising with the Spherical Gaussian constraint. Notably, DSG can seamlessly integrate as a plugin module within existing training-free conditional diffusion methods. Implementing DSG merely involves a few lines of additional code with almost no extra computational overhead, yet it leads to significant performance improvements. Comprehensive experimental results in various conditional generation tasks validate the superiority and adaptability of DSG in terms of both sample quality and time efficiency.

Method

Experimental Results

1. Linear Inverse Problems

2. FaceID Guidance

3. Text-Segmentation Guidance

4. Style Guidance

5. Text-Style Guidance

BibTeX

@inproceedings{yang2023dsg,
  title     = {Guidance with Spherical Gaussian Constraint for Conditional Diffusion},
  author    = {Lingxiao Yang and Shutong Ding and Yifan Cai and Jingyi Yu and Jingya Wang and Ye Shi},
  booktitle = {International Conference on Machine Learning},
  year      = {2024}
}