
How your favorite movie is changing language learning technology

Ryan sits down with CTO Aruna Srivastava and CPO Ruslan Mukhamedvaleev of Koel Labs to talk about how they're advancing speech technology with the help of AI and classic movies. They also tell Ryan about their time in the Mozilla Builders Accelerator and their experiences as student co-founders in an ever-changing economic and technological landscape.

Attention isn’t all we need; we need ownership too

Ryan welcomes Illia Polosukhin, co-author of the original "Attention Is All You Need" Transformer paper and co-founder of NEAR, to talk about the development and impact of the Transformer model, his perspective on modern AI and machine learning as an early innovator in the field, and the importance of decentralized, user-owned AI built on the blockchain.

Programming problems that seem easy, but aren't, featuring Jon Skeet

Jon Skeet, the first Stack Overflow user to reach one million reputation, sits down with Ryan to share his wealth of knowledge on all things development: the deceptively simple but genuinely complicated problem of time zones, the importance of clear documentation for programmers, handling breaking changes and upgrading legacy systems, and the need for improved communication skills among developers.

“We’re not worried about compute anymore”: The future of AI models

Ryan Donovan and Ben Popper sit down with Jamie de Guerre, SVP of Product at Together AI, to discuss the evolving landscape of AI and open-source models. They explore the significance of infrastructure in AI, the differences between open-source and closed-source models, and the ethical considerations surrounding AI technology. Jamie emphasizes the importance of leveraging internal data for model training and the need for transparency in AI practices.

“The future is agents”: Building a platform for RAG agents

Douwe Kiela, CEO and co-founder of Contextual AI, joins Ryan and Ben to explore the intricacies of retrieval-augmented generation (RAG). They discuss the early research Douwe did at Meta that jump-started the whole field, the challenges of hallucinations, and the significance of context windows in AI applications.