Rethinking Attention with Performers (Paper Explained)
Video Statistics and Information
Channel: Yannic Kilcher
Views: 42,144
Keywords: deep learning, machine learning, arxiv, explained, neural networks, ai, artificial intelligence, paper, nlp, natural language processing, natural language understanding, data science, transformer, attention, attention mechanism, transformers, attention is all you need, gpus, tpu, linformer, reformer, explanation, imagenet64, kernels, gaussian kernel, softmax, softmax kernel, approximation, random features, random positive features, random fourier features, google, favor, machine translation
Id: xJrKIPwVwGM
Length: 54min 38sec (3278 seconds)
Published: Mon Oct 26 2020