ML Cult
September 12th, 2023 - Frontiers in AI: From Pint-Sized Powerhouses and Pruned Datasets to Multilingual Mastery and Image Restoration
September 12, 2023
Marcus Edel
Chapters
0:00 Intro
2:20 Textbooks Are All You Need II: phi-1.5 technical report
4:42 DiffBIR: Towards Blind Image Restoration with Generative Diffusion Prior
5:59 When Less is More: Investigating Data Pruning for Pretraining LLMs at Scale
7:21 MADLAD-400: A Multilingual And Document-Level Large Audited Dataset
8:32 FIAT: Fusing learning paradigms with Instruction-Accelerated Tuning
10:01 Optimize Weight Rounding via Signed Gradient Descent for the Quantization of LLMs