~~NOCACHE~~
{{page>.:indexheader}}

\\

==== Next talks ====

[[en:seminaires:StatP6P7:index|Statistics seminar]]\\
Tuesday April 14, 2026, 10:45AM, Jussieu, room 15-16 309\\
**Hugo Cui** (Université Paris-Saclay) //High-dimensional analysis of a single-layer attention for sparse token classification//
\\
When and how can an attention mechanism learn to selectively attend to informative tokens, thereby enabling the detection of weak, rare, and sparsely located features? We address these questions theoretically in a sparse-token classification model in which positive samples embed a weak signal vector in a randomly chosen subset of tokens, whereas negative samples are pure noise. In the long-sequence limit, we show that a simple single-layer attention classifier can in principle achieve vanishing test error when the signal strength grows only logarithmically in the sequence length L, whereas linear classifiers require sqrt(L) scaling. Moving from representational power to learnability, we study training in a high-dimensional regime where sample size and embedding dimension grow proportionally. We prove that just two gradient updates suffice for the query weight vector of the attention classifier to acquire a nontrivial alignment with the hidden signal, inducing an attention map that selectively amplifies informative tokens. We further derive an exact asymptotic expression for the test error and training loss of the trained attention-based classifier, and quantify its capacity, i.e., the largest dataset size that is typically perfectly separable, thereby explaining the advantage of adaptive token selection over nonadaptive linear baselines. Joint work with Nicholas Barnfield and Yue M. Lu.

[[en:seminaires:StatP6P7:index|Statistics seminar]]\\
Tuesday May 12, 2026, 10:45AM, Sophie Germain, room 2018\\
**François Roueff** (Télécom Paris) //To be announced.//
\\

{{page>.:info}}

\\

==== Previous talks ====

\\

=== Year 2026 ===

{{page>.:statp6p72026}}

\\

=== Year 2025 ===

{{page>.:statp6p72025}}

\\

=== Year 2024 ===

{{page>.:statp6p72024}}

\\

=== Year 2023 ===

{{page>.:statp6p72023}}

\\

=== Year 2022 ===

{{page>.:statp6p72022}}

\\

=== Year 2021 ===

{{page>.:statp6p72021}}