- CS 598 (Spring 2024): Principles of Generative AI:
Recent advancements in generative AI have equipped machine learning algorithms with the ability to learn from observed data and generate new, similar data instances. This course provides an in-depth exploration of the key algorithmic developments in generative models, together with their underlying mathematical principles. Topics include normalizing flows, variational autoencoders, Langevin algorithms, generative adversarial networks, diffusion models, and sequence generation models.
- CS 598 (Fall 2024): Machine Learning Algorithms for Large Language Models:
This course gives a general overview of the machine learning algorithms used in the current development of large language models (LLMs). It covers a relatively broad range of topics, starting with mathematical models for sequence generation and important neural network architectures, with a focus on transformers. We then investigate variants of transformer-based language models, along with algorithms for prompt engineering and for improving reasoning capabilities. Other topics include ML techniques for studying LLM safety and hallucination, fine-tuning of LLMs, alignment (reinforcement learning from human feedback), multimodal LLMs, and common methods for accelerating training and inference.