# Large Language Models
In this session we discuss transformer architectures for large language models. We will learn about the motivation for and benefits of transformer models, and how to use pre-trained LLMs through Hugging Face's `transformers` package.
## Learning goals for this session
- understand the limitations of RNNs and LSTMs
- understand the basics of transformer-based architectures
- be able to use the `transformers` package to access pre-trained large language models (see the sketch after this list)
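As a first taste of the last goal, here is a minimal sketch of loading a pre-trained model through the `transformers` pipeline API. The choice of `distilgpt2` is only an illustrative example of a small pre-trained LLM, not the model used in this session.

```python
# Minimal sketch, assuming the `transformers` package is installed and a
# small pre-trained model (here `distilgpt2`, chosen only for illustration)
# can be downloaded from the Hugging Face Hub.
from transformers import pipeline

# Load a pre-trained causal language model wrapped in a text-generation pipeline.
generator = pipeline("text-generation", model="distilgpt2")

# Generate a short continuation for a prompt.
result = generator("Transformers are", max_new_tokens=20, num_return_sequences=1)
print(result[0]["generated_text"])
```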
## Slides
Here are the slides for this session.