
Edit Flows: Flow Matching with Edit Operations

📄 Abstract

Autoregressive generative models naturally generate variable-length sequences, while non-autoregressive models struggle, often imposing rigid, token-wise structures. We propose Edit Flows, a non-autoregressive model that overcomes these limitations by defining a discrete flow over sequences through edit operations – insertions, deletions, and substitutions. By modeling these operations within a Continuous-time Markov Chain over the sequence space, Edit Flows enable flexible, position-relative generation that aligns more closely with the structure of sequence data. Our training method leverages an expanded state space with auxiliary variables, making the learning process efficient and tractable. Empirical results show that Edit Flows outperforms both autoregressive and mask models on image captioning and significantly outperforms the mask construction in text and code generation.
Authors (4)
Marton Havasi
Brian Karrer
Itai Gat
Ricky T. Q. Chen
Submitted
June 10, 2025
arXiv Category
cs.LG

Key Contributions

Introduces Edit Flows, a non-autoregressive model that generates variable-length sequences using discrete edit operations (insertions, deletions, substitutions) within a Continuous-time Markov Chain framework. This enables flexible, position-relative generation that outperforms autoregressive and mask-based models on tasks like image captioning and text generation.
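To make the core idea concrete, here is a minimal toy sketch of what "edit operations within a Continuous-time Markov Chain" means: a sequence evolves by random insertions, deletions, and substitutions, with event times drawn via a Gillespie-style simulation. This is an illustrative assumption on our part, not the paper's actual rate parameterization or training procedure (which uses a learned model and an expanded state space); the function names `apply_edit` and `simulate_ctmc` are hypothetical.

```python
import random


def apply_edit(seq, op, pos, token=None):
    """Apply one edit operation (insert/delete/substitute) to a token list."""
    seq = list(seq)
    if op == "insert":        # insert `token` before index `pos`
        seq.insert(pos, token)
    elif op == "delete":      # remove the token at index `pos`
        del seq[pos]
    elif op == "substitute":  # replace the token at index `pos` with `token`
        seq[pos] = token
    else:
        raise ValueError(f"unknown edit op: {op}")
    return seq


def simulate_ctmc(seq, rate_fn, t_max, rng):
    """Gillespie-style CTMC simulation over sequences.

    `rate_fn(seq)` returns a list of candidate events as
    (rate, op, pos, token) tuples; in Edit Flows these rates would be
    produced by a learned model, here they are supplied by the caller.
    """
    t = 0.0
    while True:
        events = rate_fn(seq)
        total = sum(r for r, *_ in events)
        if total == 0:            # absorbing state: no more edits possible
            break
        t += rng.expovariate(total)  # exponential holding time
        if t > t_max:
            break
        # pick one event with probability proportional to its rate
        x = rng.uniform(0, total)
        for r, op, pos, token in events:
            x -= r
            if x <= 0:
                seq = apply_edit(seq, op, pos, token)
                break
    return seq
```

For example, a rate function that only ever deletes the first token drives any sequence to the empty state, illustrating how variable-length generation falls out naturally: the sequence length changes as a side effect of the sampled edits rather than being fixed up front.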

Business Value

Enables more efficient and flexible generation of text, code, and other sequential data, leading to improved AI assistants, content creation tools, and programming aids.