Neural sequence-to-sequence models are finding increasing use in the editing of documents, for example in correcting a text document or repairing source code. In this paper, we argue that common seq2seq models (with a facility to copy single tokens) are not a natural fit for such tasks, as they have to explicitly generate each unchanged token. We present an extension capable of copying entire spans of the input to the output in one step, greatly r...
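To make the motivation concrete, the following is an illustrative sketch (not the paper's model): it counts how many decoder "actions" are needed to produce an edited sequence when the model can copy only single tokens versus entire spans. `difflib` is used purely to find the matching spans that a span-copying decoder could emit in one step; in the actual model these decisions are learned. The example edit and the helper functions are hypothetical.

```python
# Hypothetical illustration of why span copying shortens decoding.
import difflib

def token_copy_actions(src, tgt):
    # One action per output token: copy one token or generate one.
    return len(tgt)

def span_copy_actions(src, tgt):
    # One action per copied span, plus one per newly generated token.
    sm = difflib.SequenceMatcher(a=src, b=tgt, autojunk=False)
    actions = 0
    for tag, i1, i2, j1, j2 in sm.get_opcodes():
        if tag == "equal":
            actions += 1          # copy the whole unchanged span in one step
        else:
            actions += (j2 - j1)  # generate each new token individually
    return actions

# A small source-code edit: add a third parameter to a function.
src = "def add ( a , b ) : return a + b".split()
tgt = "def add ( a , b , c ) : return a + b + c".split()
print(token_copy_actions(src, tgt))  # 16 actions, one per output token
print(span_copy_actions(src, tgt))   # 6 actions: 2 span copies + 4 generated tokens
```

Under this toy accounting, most output tokens are unchanged, so a span-copying decoder needs far fewer decisions than a token-level copy mechanism, which is the gap the proposed extension targets.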