Can OpenAI Codex Recreate Itself?
Video Statistics and Information
Channel: Edan Meyer
Views: 7,834
Keywords: openai codex, github copilot, AI, codex ai, machine learning, openai copilot, ai singularity, self improving ai, ai that codes, ai that codes for you, self programming ai github, meta machine learning, nlp, nlp for code, GPT, gpt-4, gpt-3, machine learning model, openai codex demo, openai codex tutorial, codex demo, what is openai codex, how to use openai codex, two minute papers, openai, codex, self-replicating ai, codex openai, open ai codex
Id: 7QWVJ5rWy2s
Length: 31min 24sec (1884 seconds)
Published: Sat Sep 18 2021
Interesting. As far as I can tell, the answer to your question of whether Codex could recreate itself is a little inconclusive. It was certainly not a complete failure, so the question cannot definitively be answered in the negative.
As far as I can see, it mostly generates syntactically valid code and got close to producing a proper "Hello World". Do you think the model you designed and Codex built would have gotten there with more training time and some tweaking of the GPT-2 model parameters?
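For context, "tweaking the GPT-2 model parameters" mostly means adjusting the config knobs before training. A minimal sketch using the Hugging Face transformers API; the layer/width values here are purely illustrative, not the ones from the video:

```python
from transformers import GPT2Config, GPT2LMHeadModel

# A deliberately small GPT-2 variant. These sizes are hypothetical
# examples; GPT-2 small itself uses n_layer=12, n_embd=768.
config = GPT2Config(
    n_layer=6,       # number of transformer blocks
    n_head=8,        # attention heads (n_embd must be divisible by this)
    n_embd=512,      # hidden/embedding width
    n_positions=512, # maximum context length
)
model = GPT2LMHeadModel(config)  # randomly initialized, ready to train
print(f"{model.num_parameters():,} parameters")
```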
An alternative title could have been "Can you bootstrap a GPT-3 using GPT-2?". Unfortunately, you had to abandon that question because of limited resources. There has been a lot of discussion suggesting that just making models bigger isn't a silver bullet, so it would be significant if anyone could distill the Codex knowledge into a much smaller model. Although distillation happens frequently (e.g. with ImageNet models and GPT-J), being able to do it yourself with your own model would be cool.
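For anyone curious, the standard recipe here is soft-target knowledge distillation (Hinton et al., 2015): train the small student to match the teacher's softened output distribution. A minimal PyTorch sketch of the loss; the temperature value is just an illustrative default:

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Soft-target distillation: push the student's predicted
    distribution toward the teacher's temperature-softened one."""
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    log_student = F.log_softmax(student_logits / temperature, dim=-1)
    # KL divergence from teacher to student, scaled by T^2 so gradient
    # magnitudes stay comparable across temperatures.
    return F.kl_div(log_student, soft_teacher,
                    reduction="batchmean") * temperature ** 2
```

In a full training loop this is typically mixed with the ordinary cross-entropy loss on the hard labels, weighted by some hyperparameter.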