CoDAR: Continuous Diffusion Language Models are More Powerful Than You Think
Published in -, 2026
Revealed the theoretical suboptimality of pointwise token rounding in continuous diffusion language models (DLMs). Proposed a contextual autoregressive decoder that replaces linear rounding, enabling sequence-aware discretization. Demonstrated that continuous DLMs can rival discrete DLMs once rounding is properly modeled.
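The contrast between pointwise rounding and a sequence-aware decoder can be illustrated with a toy sketch (the helper names, the greedy decoding loop, and the bigram prior below are illustrative assumptions, not the paper's actual decoder):

```python
import numpy as np

def pointwise_round(X, E):
    # Pointwise (linear) rounding: map each continuous vector independently
    # to the token whose embedding is most similar, ignoring context.
    sims = X @ E.T          # (L, V) dot-product similarities
    return sims.argmax(axis=1)

def contextual_round(X, E, bigram_logp, bos=0):
    # Toy sequence-aware rounding: greedy left-to-right decoding where each
    # candidate token is scored by embedding similarity PLUS a (hypothetical)
    # bigram log-prior conditioned on the previously decoded token.
    tokens, prev = [], bos
    for x in X:
        scores = E @ x + bigram_logp[prev]
        prev = int(scores.argmax())
        tokens.append(prev)
    return np.array(tokens)

# Tiny example: 4-token vocabulary with one-hot embeddings.
E = np.eye(4)
X = E[[2, 0, 3]]            # continuous vectors sitting exactly on tokens 2, 0, 3

# A prior that strongly favours token 1 right after token 2.
bigram_logp = np.zeros((4, 4))
bigram_logp[2, 1] = 2.0

pointwise_round(X, E)                    # -> [2, 0, 3]
contextual_round(X, E, bigram_logp)      # -> [2, 1, 3]: context overrides the middle token
```

The point of the sketch is only the interface difference: pointwise rounding factorizes over positions, while the contextual decoder's choice at each step depends on what was already emitted.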
Download here
