Art of the Problem | ChatGPT: 30 Year History | How AI Learned to Talk @ArtOfTheProblem | Uploaded 10 months ago | Updated 9 hours ago
This video explores the journey of AI language models, from their modest beginnings through the development of OpenAI's GPT models. Our journey takes us through the key moments in generative neural network research centered on next-word prediction. We delve into early experiments with tiny language models in the 1980s, highlighting significant contributions by researchers like Jordan, who introduced Recurrent Neural Networks, and Elman, whose work on learning word boundaries revolutionized our understanding of language processing. It leaves us with a question: what is thought? Is simulated thought, thought? Featuring Noam Chomsky, Douglas Hofstadter, Michael I. Jordan, Jeffrey Elman, Geoffrey Hinton, Ilya Sutskever, Andrej Karpathy, Yann LeCun, Sam Altman, and more.

My script, references & visualizations here: docs.google.com/document/d/1s7FNPoKPW9y3EhvzNgexJaEG2pP4Fx_rmI4askoKZPA

Consider joining my channel as a YouTube member: youtube.com/channel/UCotwjyJnb-4KW7bmsOoLfkg/join


This is the last video in the series "The Pattern Machine"; you can watch the whole series here: youtube.com/playlist?list=PLbg3ZX2pWlgKV8K6bFJr5dhM7oOClExUJ

00:00 - Introduction
00:32 - Hofstadter's thoughts on ChatGPT
01:00 - recap of supervised learning
01:55 - first paper on sequential learning
02:55 - first use of state units (RNN)
04:33 - first observation of word boundary detection
05:30 - first observation of word clustering
07:16 - first "large" language model Hinton/Sutskever
10:10 - sentiment neuron (Ilya | OpenAI)
12:30 - transformer explanation
15:50 - GPT-1
17:00 - GPT-2
17:55 - GPT-3
18:20 - In-context learning
19:40 - ChatGPT
21:10 - tool use
23:25 - philosophical question: what is thought?