Bibby's LaTeX Templates
Attention Is All You Need
Ashish Vaswani, Noam Shazeer, Niki Parmar, et al.
Revolutionary paper introducing the Transformer architecture, fundamentally changing how we approach sequence-to-sequence tasks in natural language processing through self-attention mechanisms.
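The self-attention mechanism at the heart of the Transformer can be sketched in a few lines. This is a minimal NumPy illustration of scaled dot-product attention, softmax(QK^T / sqrt(d_k)) V, not the paper's full multi-head implementation; the toy shapes are assumptions for the example.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V — each output row is a weighted
    mix of the value vectors, weighted by query-key similarity."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                       # pairwise similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)        # softmax over keys
    return weights @ V

# Toy example (assumed sizes): 3 tokens, 4-dimensional embeddings
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 4)
```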
Generative Adversarial Networks
Ian J. Goodfellow, Jean Pouget-Abadie, Mehdi Mirza, et al.
Groundbreaking work introducing GANs, a framework for training generative models through adversarial processes between generator and discriminator networks.
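The adversarial objective can be written down directly. A minimal sketch of the two losses, assuming `d_real` and `d_fake` are the discriminator's probability outputs on real and generated samples; the generator loss uses the non-saturating variant suggested in the paper.

```python
import numpy as np

def discriminator_loss(d_real, d_fake):
    # D maximizes E[log D(x)] + E[log(1 - D(G(z)))];
    # written here as the equivalent negative quantity to minimize.
    return -(np.mean(np.log(d_real)) + np.mean(np.log(1.0 - d_fake)))

def generator_loss(d_fake):
    # Non-saturating variant: G maximizes E[log D(G(z))].
    return -np.mean(np.log(d_fake))

# Assumed example probabilities: D is fairly confident on both sides
d_real = np.array([0.9, 0.8, 0.95])
d_fake = np.array([0.1, 0.2, 0.05])
print(discriminator_loss(d_real, d_fake), generator_loss(d_fake))
```

A perfect discriminator (d_real → 1, d_fake → 0) drives its loss toward zero, while pushing the generator's loss up; training alternates gradient steps on each loss.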
A Neural Algorithm of Artistic Style
Leon A. Gatys, Alexander S. Ecker, Matthias Bethge
Pioneering research demonstrating how deep neural networks can separate and recombine content and style of arbitrary images, enabling artistic style transfer.
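The style representation in this line of work is built from Gram matrices of CNN feature maps, which capture which feature channels co-activate while discarding spatial layout. A minimal NumPy sketch with an assumed (channels, height, width) activation tensor:

```python
import numpy as np

def gram_matrix(features):
    """Style representation of one CNN layer: channel-by-channel
    correlations of the feature maps, independent of spatial position.
    features: array of shape (channels, height, width)."""
    c, h, w = features.shape
    F = features.reshape(c, h * w)      # flatten each feature map
    return F @ F.T / (h * w)            # (channels, channels)

# Assumed toy activations: 8 channels on a 16x16 grid
feats = np.random.default_rng(1).standard_normal((8, 16, 16))
G = gram_matrix(feats)
print(G.shape)  # (8, 8)
```

Matching Gram matrices between a generated image and a style image (while matching raw activations to a content image) is what drives the style transfer.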
Deep Residual Learning for Image Recognition
Kaiming He, Xiangyu Zhang, Shaoqing Ren, et al.
Introduction of the ResNet architecture with skip connections, enabling training of extremely deep neural networks and achieving breakthrough performance on ImageNet.
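The skip connection is the key idea: a block computes y = x + F(x), so if the learned residual F contributes nothing, the block still passes its input through. A minimal two-layer NumPy sketch (assumed shapes and a plain ReLU MLP as F, not the paper's convolutional block):

```python
import numpy as np

def residual_block(x, W1, W2):
    """y = relu(x + F(x)) with F a small two-layer ReLU transform.
    The identity path lets gradients bypass F entirely."""
    relu = lambda z: np.maximum(z, 0.0)
    return relu(x + relu(x @ W1) @ W2)

rng = np.random.default_rng(2)
x = rng.standard_normal((1, 16))

# With zero residual weights, F(x) = 0 and the block reduces to relu(x):
W_zero = np.zeros((16, 16))
y = residual_block(x, W_zero, W_zero)
print(np.allclose(y, np.maximum(x, 0.0)))  # True
```

This near-identity default is why stacking many such blocks stays trainable where equally deep plain networks degrade.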
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
Jacob Devlin, Ming-Wei Chang, Kenton Lee, et al.
Revolutionary pre-trained language model using bidirectional training, achieving state-of-the-art results across multiple NLP benchmarks.
Language Models are Few-Shot Learners
Tom B. Brown, Benjamin Mann, Nick Ryder, et al.
Introduction of GPT-3, demonstrating that scaling language models leads to emergent few-shot learning capabilities across diverse tasks.
Mastering the Game of Go with Deep Neural Networks and Tree Search
David Silver, Aja Huang, Chris J. Maddison, et al.
AlphaGo's breakthrough achievement in defeating world champion Go players through a combination of deep learning and Monte Carlo tree search.
ImageNet Classification with Deep Convolutional Neural Networks
Alex Krizhevsky, Ilya Sutskever, Geoffrey E. Hinton
AlexNet's groundbreaking, GPU-accelerated performance on ImageNet, sparking the deep learning revolution in computer vision.