Rise of the SLMs: The inevitable future of Language models

Hosted by Mahesh Yadav

Thu, Oct 10, 2024

3:00 PM UTC (30 minutes)

Virtual (Zoom)

Free to join

140 signed up

What you'll learn

Optimize AI Models for the Edge

Get a peek into how to shrink large language models for faster, more efficient performance on edge devices.

Master the Shift from LLMs to SLMs

Understand how the AI industry is transitioning to SLMs and why they are becoming essential in driving AI applications on edge devices.

Implement SLMs in Real-World Applications

Discover practical techniques for deploying small language models (SLMs) directly on devices, without relying on the cloud.

Why this topic matters

SLMs represent the future of AI on edge devices, offering faster and more secure applications, which will be critical in scaling AI products for real-world use. Big picture, mastering SLMs equips professionals with cutting-edge skills, allowing them to stay competitive and drive innovation in AI-driven industries like tech, healthcare, and legal.

You'll learn from

Mahesh Yadav

GenAI Product Lead at Google, Ex-Meta, Ex-Amazon & Ex-Microsoft | AI PM Mentor

Mahesh has 20 years of experience building products on AI teams at Meta, Microsoft, and AWS. He has worked across all layers of the AI stack, from AI chips to LLMs, and has a deep understanding of how GenAI companies ship value to customers. His work on AI has been featured at the Nvidia GTC conference, at Microsoft Build, and on Meta blogs.

His mentorship has helped many students build real-time products and careers in GenAI product management.



Previously at

Meta
Microsoft
Amazon Web Services

© 2024 Maven Learning, Inc.