Advanced AI: Transformers for NLP using Large Language Models
- 10.11.2022

LinkedIn Learning
Duration: 1h 8m | Video: 1280x720 30fps | Audio: AAC, 48 kHz, 2ch | Size: 134 MB
Genre: eLearning | Language: English
Transformers have quickly become the go-to architecture for natural language processing (NLP). As a result, knowing how to use them is now a business-critical skill in your AI toolbox. In this course, instructor Jonathan Fernandes walks you through many of the key large language models developed since GPT-3. He presents a high-level overview of GLaM, Megatron-Turing NLG, Gopher, Chinchilla, PaLM, OPT, and BLOOM, relaying some of the most important insights from each model.
Get a high-level overview of large language models, where and how they are used in production, and why they are so important to NLP. Additionally, discover the basics of transfer learning and transformer training to optimize your AI models as you go. By the end of this course, you'll be up to speed with what's happened since OpenAI first released GPT-3 as well as the key contributions of each of these large language models.
https://rapidgator.net/file/75e99f99e1b6be3df59c43f338479a88/Advanced_AI_Transformers_for_NLP_using_Large_Language_Models.rar.html
https://uploadgig.com/file/download/Eb7e144729F27e7a/Advanced_AI_Transformers_for_NLP_using_Large_Language_Models.rar