AI Says It Can Compress Better Than FLAC?! Hold My Entropy (Ep. 268)

Data Science at Home - A podcast by Francesco Gadaleta


Can AI really out-compress PNG and FLAC? Or is it just another overhyped tech myth? In this episode of Data Science at Home, Francesco dives deep into the wild claims that Large Language Models (LLMs) like Chinchilla 70B are beating traditional lossless compression algorithms. But before you toss out your FLAC collection, let's break down Shannon's Source Coding Theorem and why entropy sets the ultimate limit on lossless compression.

We explore:

- How LLMs leverage probabilistic patterns for compression
- Why compression efficiency doesn't equal general intelligence
- The practical (and ridiculous) challenges of using AI for compression
- Can AI actually BREAK Shannon's limit, or is it just an illusion?

If you love AI, algorithms, or just enjoy some good old myth-busting, this one's for you. Don't forget to hit subscribe for more no-nonsense takes on AI, and join the conversation on Discord! Let's decode the truth together.

Join the discussion on the new Discord channel of the podcast: https://discord.gg/4UNKGf3

Don't forget to subscribe to our new YouTube channel: https://www.youtube.com/@DataScienceatHome

References

Have you met Shannon? https://datascienceathome.com/have-you-met-shannon-conversation-with-jimmy-soni-and-rob-goodman-about-one-of-the-greatest-minds-in-history/
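The entropy-limit argument is easy to make concrete. Below is a minimal Python sketch (not from the episode) that measures the order-0 byte entropy of a message, derives the corresponding Shannon floor, and compares it with what a standard compressor (zlib) actually achieves. It also illustrates why "beating" an entropy estimate is not magic: the floor depends on the probability model, and a model that captures sequence structure (as an LLM does) assigns the data a lower entropy than independent byte counts do.

```python
import math
import zlib
from collections import Counter

def shannon_entropy_bits_per_byte(data: bytes) -> float:
    """Order-0 empirical Shannon entropy: H = -sum(p * log2(p))
    over the byte-frequency distribution of `data`."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A highly repetitive message: only two distinct bytes, each with p = 0.5,
# so the order-0 entropy is exactly 1 bit per byte.
msg = b"abab" * 1000
h = shannon_entropy_bits_per_byte(msg)
floor_bytes = h * len(msg) / 8  # Shannon bound under the i.i.d. byte model
compressed = len(zlib.compress(msg, 9))

print(f"order-0 entropy: {h:.3f} bits/byte")
print(f"i.i.d. Shannon floor: {floor_bytes:.0f} bytes")
print(f"zlib output: {compressed} bytes")

# zlib lands far below the order-0 floor. That is not a violation of
# Shannon's theorem: the bytes here are not i.i.d., and zlib's LZ77
# matching exploits the repetition that the order-0 model ignores.
# A better probability model yields a lower (true) entropy bound --
# which is exactly the lever an LLM-based compressor pulls.
```

Under this framing, an LLM "out-compressing" FLAC is a statement about model quality, not a breach of Shannon's limit: the limit always holds relative to the true source distribution.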
