Posts by Collection

portfolio

publications

CoDAR: Continuous Diffusion Language Models are More Powerful Than You Think

Published in -, 2026

Revealed the theoretical suboptimality of pointwise token rounding in continuous DLMs. Proposed a contextual autoregressive decoder to replace linear rounding, enabling sequence-aware discretization. Demonstrated that continuous diffusion models can rival discrete DLMs when rounding is properly modeled.
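The contrast between pointwise rounding and a contextual decoder can be illustrated with a toy sketch (a minimal illustration, not the paper's method: the embedding table `E`, the bigram context matrix `W`, and all dimensions are hypothetical placeholders; a real contextual decoder would be a learned autoregressive model):

```python
import numpy as np

rng = np.random.default_rng(0)
V, d, T = 8, 4, 5              # toy vocab size, embedding dim, sequence length
E = rng.normal(size=(V, d))    # hypothetical token-embedding table
# Noisy continuous latents, standing in for a diffusion model's output
z = E[rng.integers(0, V, T)] + 0.1 * rng.normal(size=(T, d))

# Pointwise rounding: each position independently snaps to its nearest
# embedding, ignoring the rest of the sequence.
pointwise = np.argmin(((z[:, None, :] - E[None, :, :]) ** 2).sum(-1), axis=1)

# Contextual autoregressive rounding (sketch): score each candidate token by
# nearness to the latent plus a context term from the previously decoded
# token. W is a toy bigram bonus; in practice this term would be learned.
W = 0.01 * rng.normal(size=(V, V))
contextual, prev = [], None
for t in range(T):
    dist = ((z[t] - E) ** 2).sum(-1)
    score = -dist + (W[prev] if prev is not None else 0.0)
    prev = int(np.argmax(score))
    contextual.append(prev)
```

The key structural difference is that `contextual` conditions each decision on earlier decoded tokens, so the discretization is sequence-aware rather than position-by-position.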

talks

teaching
