A Comparative Study of Noise Schedules in Denoising Diffusion Probabilistic Models
Keywords:
Generative models, diffusion models, denoising diffusion probabilistic models, noise schedule

Abstract
Noise scheduling plays a crucial role in
the performance of denoising diffusion probabilistic
models (DDPMs), affecting both training dynamics and
sample quality. Although various noise schedules
have been proposed, a comprehensive comparative
analysis remains limited. In this work, we evaluate
five widely used noise schedules (linear, cosine,
quadratic, sigmoid, and exponential) across three
datasets of increasing complexity: MNIST, Fashion-MNIST,
and CIFAR-10. We analyze their impact on training
performance and generative quality using metrics such
as Fréchet Inception Distance (FID), Kernel Inception
Distance (KID), and Inception Score (IS). Our quantitative
results show that the linear schedule offers the most
rapid training convergence, whereas the exponential
schedule shows the lowest performance. In contrast,
cosine, quadratic, and sigmoid schedules tend to
produce higher-quality samples, depending on the
complexity of the dataset. Qualitative analysis reveals
that nonlinear schedules such as cosine and exponential
accelerate the formation of structured, recognizable
images in the early stages of training, suggesting they
can produce better samples with less training.
Our findings indicate that nonlinear schedules may be
preferable when early sample quality is critical, while
linear schedules offer advantages in training speed.
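The abstract does not specify the exact parameterizations of the five schedules; a common formulation, sketched below under the assumption of the standard DDPM endpoints (beta from 1e-4 to 0.02, as in Ho et al., 2020) and the cosine schedule of Nichol & Dhariwal (2021), gives the noise level beta_t at each of T diffusion steps:

```python
import numpy as np

def linear_beta(T, beta_start=1e-4, beta_end=0.02):
    # Linear schedule: betas increase linearly between the endpoints.
    return np.linspace(beta_start, beta_end, T)

def quadratic_beta(T, beta_start=1e-4, beta_end=0.02):
    # Quadratic schedule: linear in sqrt(beta), then squared.
    return np.linspace(beta_start**0.5, beta_end**0.5, T) ** 2

def sigmoid_beta(T, beta_start=1e-4, beta_end=0.02):
    # Sigmoid schedule: an S-shaped ramp between the endpoints.
    x = np.linspace(-6, 6, T)
    return beta_start + (beta_end - beta_start) / (1 + np.exp(-x))

def exponential_beta(T, beta_start=1e-4, beta_end=0.02):
    # Exponential schedule: geometric interpolation between the endpoints.
    return np.exp(np.linspace(np.log(beta_start), np.log(beta_end), T))

def cosine_beta(T, s=0.008):
    # Cosine schedule: betas derived from a cosine-shaped cumulative
    # alpha_bar, clipped for numerical stability.
    t = np.arange(T + 1) / T
    alpha_bar = np.cos((t + s) / (1 + s) * np.pi / 2) ** 2
    betas = 1 - alpha_bar[1:] / alpha_bar[:-1]
    return np.clip(betas, 0.0, 0.999)
```

The hyperparameters (endpoints, sigmoid range, cosine offset s) are illustrative defaults, not values taken from the study itself.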