The “Generative AI with LLMs” course from DeepLearning.AI offers an in-depth study of generative AI using large language models (LLMs). Here are the main points:
- Content: The course covers the key stages of a typical LLM-based generative AI lifecycle, from data collection and model selection to performance evaluation and deployment.
- Technical details: You'll learn how the transformer architecture that powers LLMs works, how these models are trained, and how fine-tuning tailors them to specific use cases (a minimal fine-tuning sketch follows this list).
- Practical skills: You will learn to use empirical scaling laws to optimize a model's objective function across dataset size, compute budget, and inference requirements (see the scaling-law sketch after this list).
- Application: You will learn to apply advanced training, tuning, inference, tooling, and deployment techniques to maximize model performance within the specific constraints of your project.
- Business perspective: After hearing stories from researchers and industry practitioners, you will discuss the challenges and opportunities that generative AI creates for business.
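The fine-tuning point can be made concrete with a small sketch. The snippet below, assuming the Hugging Face `transformers` and `peft` libraries, wraps a base model with LoRA adapters so that only a small set of added weights is updated during fine-tuning; the checkpoint name and hyperparameters are illustrative placeholders, not the course's own lab code.

```python
# A minimal sketch, assuming the Hugging Face transformers and peft libraries.
# The checkpoint name and LoRA hyperparameters are illustrative placeholders.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base = AutoModelForCausalLM.from_pretrained("gpt2")  # placeholder checkpoint

lora = LoraConfig(
    r=8,                        # rank of the low-rank update matrices
    lora_alpha=16,              # scaling factor applied to the update
    target_modules=["c_attn"],  # GPT-2's fused attention projection layer
    task_type="CAUSAL_LM",
)
model = get_peft_model(base, lora)

# Only the adapter weights are trainable; the original model stays frozen,
# so the same base model can be adapted to many different use cases.
model.print_trainable_parameters()
```

The appeal of this parameter-efficient approach is that one pretrained model can serve many use cases by swapping in small adapter weights rather than retraining the full network.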
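For the scaling-law point, a rough back-of-the-envelope calculation in the spirit of the Chinchilla results (Hoffmann et al., 2022) is sketched below. The constants used (about 6 FLOPs per parameter per training token, and roughly 20 training tokens per parameter) are common approximations, not figures taken from the course.

```python
import math

def compute_optimal(flops_budget: float) -> tuple[float, float]:
    """Estimate (parameters, training tokens) for a given FLOPs budget,
    using C ~= 6 * N * D together with the rule of thumb D ~= 20 * N."""
    params = math.sqrt(flops_budget / (6 * 20))
    tokens = 20 * params
    return params, tokens

# Example: under these assumptions, a 1e23 FLOPs budget suggests a model of
# roughly 29B parameters trained on roughly 580B tokens.
n, d = compute_optimal(1e23)
print(f"params ≈ {n / 1e9:.1f}B, tokens ≈ {d / 1e9:.0f}B")
```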
This course is suitable for data scientists, research engineers, machine learning engineers, and anyone interested in generative AI.