OpenAI's Greg Brockman: The Future of LLMs, Foundation & Generative Models (DALL·E 2 & GPT-3)
Video Statistics and Information
Channel: Scale AI
Views: 167,998
Id: Rp3A5q9L_bg
Length: 46min 9sec (2769 seconds)
Published: Sun Oct 23 2022
It's interesting that he specifically references Kurzweil's The Singularity Is Near. Ideas that seemed a little crazy in 2005 are a lot less so now.
I'd start with these two sound bites!
Why did he choose 40TB of text? Did they hit that number for GPT-4? That would be many trillions of tokens...
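For context, a rough back-of-the-envelope conversion (assuming roughly 4 bytes of raw text per BPE token, a common rule of thumb that is not a figure from the video):

```python
# Rough estimate: how many tokens would 40 TB of raw text be?
# Assumption: ~4 bytes per token for BPE-tokenized English text;
# the true ratio depends on the tokenizer and the corpus.

corpus_bytes = 40 * 10**12      # 40 TB of text
bytes_per_token = 4             # assumed average bytes per token

tokens = corpus_bytes / bytes_per_token
print(f"~{tokens / 1e12:.0f} trillion tokens")  # -> ~10 trillion tokens
```

Under that assumption, 40 TB of text would indeed come out to on the order of ten trillion tokens.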
Hmmm...
Transcript.
Edit: and my first snip: https://youtu.be/LFx5q3m_F68