The DeepSeek Paradox: Will LLM Optimization Save the Planet?
Hosted by Pascal Joly
What you'll learn
When smaller is better
DeepSeek V3 and R1 have just been released, and the race for model optimization is on. How much energy can these techniques save?
The rebound effect
When high-performance, low-cost models are released, everyone wants to use them, with unintended consequences.
Making the right choice
The rise of LLM reasoning models is increasing energy usage during inference. How much, and why should we care?
Why this topic matters
When DeepSeek R1 launched, the stock prices of some tech companies took a beating: bigger is not always better.
For a short moment, those who worry about AI's impact on the climate crisis could take a victory lap: finally, truly energy-efficient models available to all!
Then came the realization that, with the Jevons paradox (efficiency gains can drive total consumption up), all bets are off. LLM optimization is not the end game.
You'll learn from
Pascal Joly
25 years of hands-on experience in the IT industry
Pascal combines deep expertise in data center operations with a focus on AI sustainability. As a Terra.do fellow and certified AI/ML professional, he bridges the gap between technical efficiency and environmental impact. His years of experience optimizing data centers through automation, as an engineer and product leader, give him unique insights into measuring and reducing AI's carbon footprint.
Advised, taught and worked for
Go deeper with a course
Sustainable AI - Reducing Carbon Footprint and Optimizing Performance

Pascal Joly
Sustainability Consultant, IT Climate Ed
Keep exploring