
Modern Natural Language Processing

Cohort-based Course

See how OpenAI and Cohere pre-train and fine-tune language models like BERT, GPT, and T5 to create powerful and fair NLP pipelines

Hosted by

Sinan Ozdemir

Textbook author, Johns Hopkins lecturer, and Founder of Kylie.ai (acquired)

Course overview

Master Large Language Models in 2 weeks

Build fair NLP pipelines with modern techniques and architectures through a unique hands-on approach.

Who is this course for

01

NLP / Machine Learning Engineers who want to learn about modern large language models.

02

Data Scientists looking to reduce bias in their NLP pipelines.

03

AI enthusiasts exploring NLP to build fun projects!

What you will do

Build a robust information retrieval system and Q/A system

Use S-BERT to index documents at scale and both GPT and BERT to answer questions given indexed documents.
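To give a flavor of what this looks like in code, here is a minimal sketch of semantic indexing with S-BERT plus extractive question answering. The model checkpoints and the toy corpus are illustrative placeholders, not necessarily the exact ones used in the course.

```python
# Sketch: embed a corpus with S-BERT, retrieve the best document for a question,
# then let a BERT-style QA model extract the answer span from it.
from sentence_transformers import SentenceTransformer, util
from transformers import pipeline

corpus = [
    "The Transformer architecture was introduced in 2017.",
    "BERT is an encoder-only model pre-trained with masked language modeling.",
    "GPT models are decoder-only and trained to predict the next token.",
]

encoder = SentenceTransformer("all-MiniLM-L6-v2")          # illustrative checkpoint
qa = pipeline("question-answering", model="distilbert-base-cased-distilled-squad")
corpus_embeddings = encoder.encode(corpus, convert_to_tensor=True)

def answer(question: str) -> str:
    query_embedding = encoder.encode(question, convert_to_tensor=True)
    best_hit = util.semantic_search(query_embedding, corpus_embeddings, top_k=1)[0][0]
    context = corpus[best_hit["corpus_id"]]                 # most relevant document
    return qa(question=question, context=context)["answer"]

print(answer("How is BERT pre-trained?"))
```

In the course we scale this pattern up to much larger document collections.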

Build multi-task NLP pipelines

Use GPT and other causal language models to perform both sentiment analysis and abstractive summarization at once using the same training dataset (see the sketch below).
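One common way to frame several tasks as a single text-generation problem for a causal LM is to encode the task in the prompt. The template and field names below are assumptions for the sketch, not the course's exact format.

```python
# Sketch: turn sentiment and summarization examples into prompt/target strings
# that a single GPT-style model can be fine-tuned on.
def to_prompt(example: dict) -> str:
    if example["task"] == "sentiment":
        return f"Review: {example['text']}\nSentiment: {example['label']}"
    return f"Article: {example['text']}\nSummary: {example['summary']}"

dataset = [
    {"task": "sentiment", "text": "Loved every chapter.", "label": "positive"},
    {"task": "summarize", "text": "A long article about attention ...", "summary": "Attention explained."},
]
training_texts = [to_prompt(ex) for ex in dataset]  # fine-tune the causal LM on these strings
print(training_texts[0])
```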

Train an image captioning system

Combine the Vision Transformer with GPT to create an automatic image captioning system from scratch.
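As a rough sketch of the architecture, recent versions of Hugging Face transformers let you bolt a ViT encoder onto a GPT-2 decoder in a few lines; the checkpoints below are common public ones used purely for illustration, and the combined model still needs fine-tuning on a captioning dataset before it produces sensible captions.

```python
# Sketch: wire a ViT encoder to a GPT-2 decoder (untrained pairing, for illustration only).
from transformers import VisionEncoderDecoderModel, ViTImageProcessor, GPT2TokenizerFast
from PIL import Image

model = VisionEncoderDecoderModel.from_encoder_decoder_pretrained(
    "google/vit-base-patch16-224-in21k", "gpt2"
)
image_processor = ViTImageProcessor.from_pretrained("google/vit-base-patch16-224-in21k")
tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model.config.decoder_start_token_id = tokenizer.bos_token_id
model.config.pad_token_id = tokenizer.eos_token_id

image = Image.open("example.jpg")                      # any local image
pixel_values = image_processor(images=image, return_tensors="pt").pixel_values
caption_ids = model.generate(pixel_values, max_length=20)
print(tokenizer.decode(caption_ids[0], skip_special_tokens=True))
```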

Quantify and mitigate biases in language models

Learn how LLMs pick up on bias and how to mitigate it at both the data and model level.
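One simple probing technique we can start from is comparing what a masked language model predicts for gendered templates. The two sentences below are illustrative; a real audit uses a much larger, curated probe set.

```python
# Sketch: probe a masked LM for occupational gender associations.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

for template in ["He worked as a [MASK].", "She worked as a [MASK]."]:
    top = fill_mask(template, top_k=5)
    print(template, [(p["token_str"], round(p["score"], 3)) for p in top])
```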

Share and deploy models with the world

Share what you build by creating Streamlit apps powered by LLMs and deploying them to Hugging Face.
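A bare-bones version of such an app (assuming a recent Streamlit version) fits in a single file; this is a sketch of the pattern, not the course's deployment code. Deploying it to a Hugging Face Space then amounts to pushing the app file and a requirements file to the Space repository.

```python
# Sketch: app.py for a minimal Streamlit demo wrapping a text-generation pipeline.
import streamlit as st
from transformers import pipeline

@st.cache_resource  # load the model once per server process
def load_generator():
    return pipeline("text-generation", model="gpt2")

st.title("Text Generation Demo")
prompt = st.text_area("Enter a prompt")
if st.button("Generate") and prompt:
    result = load_generator()(prompt, max_length=60)[0]["generated_text"]
    st.write(result)
```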

Meet your instructor

Sinan Ozdemir

Sinan is an experienced teacher, engineer, author, and entrepreneur.


  • He has written 5 books about Data Science and Machine Learning
  • He was an adjunct lecturer at Johns Hopkins University, teaching Data Science
  • He started and sold Kylie.ai, a company dedicated to providing top-tier automations for Fortune 100 companies
  • He lectures for O'Reilly and Pearson on Large Language Models
  • He advises VCs in the Bay Area on AI strategy and implementation

Course syllabus

01

Introduction to Large Language Models

Learn the history of modern NLP, from the introduction of deep learning to solve language tasks, to language modeling and the modern Transformer architecture.

02

Natural Language Understanding with BERT

We will dissect the BERT architecture and the scaled dot-product attention formula to understand how the encoder stack of the Transformer can achieve state-of-the-art results on Natural Language Understanding (NLU) tasks.
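For reference, the scaled dot-product attention formula is Attention(Q, K, V) = softmax(QKᵀ / √d_k)V. Here is a minimal NumPy sketch of it for intuition; real implementations add batching and multiple heads.

```python
# Sketch: scaled dot-product attention, the core operation inside BERT's encoder layers.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                            # query-key similarity
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)    # softmax over the keys
    return weights @ V                                         # weighted sum of values

Q = np.random.rand(4, 8)   # 4 query tokens, dimension 8
K = np.random.rand(6, 8)   # 6 key tokens
V = np.random.rand(6, 8)
print(scaled_dot_product_attention(Q, K, V).shape)  # (4, 8)
```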

03

Natural Language Generation with GPT and Cohere

We will dissect the GPT architecture and see how the altered masked attention formula makes the decoder stack of the Transformer able to achieve state-of-the-art results on Natural Language Generation (NLG) tasks. We will also use the GPT-3 API as well as Cohere to solve NLP tasks.
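The "alteration" is a causal mask: each token may only attend to positions at or before itself, so the decoder can never peek at future tokens. A small NumPy sketch of the idea (the GPT-3 and Cohere API usage from the course is not shown here):

```python
# Sketch: causal (masked) self-attention as used in GPT-style decoders.
import numpy as np

def causal_attention(Q, K, V):
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    # Upper-triangular positions (future tokens) are set to -inf before the softmax
    mask = np.triu(np.ones_like(scores), k=1).astype(bool)
    scores = np.where(mask, -np.inf, scores)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V

seq = np.random.rand(5, 8)  # self-attention: queries, keys, and values share the sequence
print(causal_attention(seq, seq, seq).shape)  # (5, 8)
```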

04

Detecting and Mitigating Algorithmic Bias

Large Language Models learn from the text they process. If that text comes from less-than-reliable sources, then the models themselves will absorb and reproduce any bias. We will see how we can identify and quantify such biases and work to remove them as much as possible in our fine-tuned models.

05

Using T5 and the Vision Transformer

We will use two powerful Transformer-based models, T5 and the Vision Transformer, to solve some of the more complex NLP tasks, including image captioning and abstractive summarization.
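As a quick taste of T5's text-to-text interface, abstractive summarization can be run through a pipeline in a few lines; the checkpoint and example text below are placeholders for illustration, not course materials.

```python
# Sketch: abstractive summarization with a small T5 checkpoint.
from transformers import pipeline

summarizer = pipeline("summarization", model="t5-small")
article = (
    "The Transformer architecture replaced recurrence with self-attention, "
    "allowing models to be trained in parallel on much larger corpora and "
    "leading to pre-trained models such as BERT, GPT, and T5."
)
print(summarizer(article, max_length=30, min_length=5)[0]["summary_text"])
```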

06

Sharing custom language models with the open-source community

It is crucial that we learn how to share our work with others and understand the different licenses out there. We will explore the Hugging Face website and all it has to offer, including its built-in Inference API, model-sharing features, and how to create Spaces using Streamlit.

Course schedule

4-6 hours per week
  • Tuesdays & Thursdays

    8:00am - 10:00am PT

    Sessions are twice a week for two hours each. Each session will have breaks and Q/A baked in.

  • Weekly office hours

    1 hour per week

We will have at least one office hour a week where you can ask me anything about our course or something AI-related. I'd love to hear about what you're working on and even pair program!


Learning is better with cohorts

Active learning, not passive watching

This course builds on live workshops and hands-on projects

Interactive and project-based

You’ll be interacting with other learners through breakout rooms and project teams

Learn with a cohort of peers

Join a community of like-minded people who want to learn and grow alongside you

Frequently Asked Questions

What happens if I can’t make a live session?
I work full-time, what is the expected time commitment?
What’s the refund policy?
Will this course be mostly code or theory?