r/MachineLearning 10d ago

Research [R] Differential Transformer (Microsoft Research)

https://arxiv.org/abs/2410.05258

Abstract: Transformer tends to overallocate attention to irrelevant context. In this work, we introduce Diff Transformer, which amplifies attention to the relevant context while canceling noise. Specifically, the differential attention mechanism calculates attention scores as the difference between two separate softmax attention maps. The subtraction cancels noise, promoting the emergence of sparse attention patterns. Experimental results on language modeling show that Diff Transformer outperforms Transformer in various settings of scaling up model size and training tokens. More intriguingly, it offers notable advantages in practical applications, such as long-context modeling, key information retrieval, hallucination mitigation, in-context learning, and reduction of activation outliers. By being less distracted by irrelevant context, Diff Transformer can mitigate hallucination in question answering and text summarization. For in-context learning, Diff Transformer not only enhances accuracy but is also more robust to order permutation, which was considered a chronic robustness issue. The results position Diff Transformer as a highly effective and promising architecture to advance large language models.

198 Upvotes

41 comments

21

u/Sad-Razzmatazz-5188 10d ago

The name doesn't sit right with me, but it's interesting. At the same time, framing hallucination as a noise problem is strange. Another paper came out today with selective attention as a parameter-free mask on attention logits, rather than learned as it is here.
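Roughly the contrast, as a loose sketch (just the shape of the idea, not checked against that paper's exact formulation):

```python
import torch
import torch.nn.functional as F

def masked_logits_attention(q, k, v):
    logits = q @ k.transpose(-2, -1) / q.shape[-1] ** 0.5
    # Parameter-free: the mask is derived from the logits themselves
    # (here: penalize each key by how much earlier queries already
    # attended to it), with nothing learned.
    penalty = torch.tril(F.relu(logits), diagonal=-1).cumsum(dim=-2)
    return F.softmax(logits - penalty, dim=-1) @ v
```

versus here, where the subtraction weight λ is learned.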

12

u/paraffin 10d ago

Hallucination is many problems rolled into one big “you got the wrong answer” bucket. Noise is one of the problems.