
EleutherAI Announces GPT-NeoX-20B

by Dmitry Spodarets

EleutherAI has announced its newest open-source language model, GPT-NeoX-20B [https://blog.eleuther.ai/announcing-20b/]. It is a 20-billion-parameter model trained with the GPT-NeoX framework on GPUs generously provided by the team at CoreWeave. GPT-NeoX-20B is the largest publicly available pretrained general-purpose autoregressive language model, and the team expects it to perform well on many tasks.

In addition, a #20b channel will be created on Discord for conversations about this model. Bear in mind that GPT-NeoX and GPT-NeoX-20B are research artifacts, and the team does not recommend deploying them in a production setting without careful consideration.


Data Phoenix Digest

Subscribe to the weekly digest for a summary of the top research papers, articles, news, and our community events to keep track of trends and grow in the Data & AI world!
