EleutherAI Announces GPT-NeoX-20B

EleutherAI has announced its newest open-source language model: GPT-NeoX-20B. It is a 20 billion parameter model trained with the GPT-NeoX framework on GPUs generously provided by CoreWeave. GPT-NeoX-20B is the largest publicly available pretrained general-purpose autoregressive language model, and the team expects it to perform well on many tasks.
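For readers who want to try the model once the weights are released, the sketch below shows one possible way to load and sample from it with the Hugging Face transformers library. The model identifier EleutherAI/gpt-neox-20b is an assumption here, and running a 20 billion parameter model requires substantial memory, on the order of 40 GB even in half precision.

```python
# Minimal sketch, assuming the checkpoint is published on the Hugging Face Hub
# under "EleutherAI/gpt-neox-20b" and that enough GPU/CPU memory is available.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "EleutherAI/gpt-neox-20b"  # assumed Hub identifier
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to roughly halve memory use
    device_map="auto",          # spread layers across available devices (needs accelerate)
)

prompt = "GPT-NeoX-20B is a 20 billion parameter language model that"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```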

In addition, a 20b channel will be created on Discord for conversations about this model. Bear in mind that GPT-NeoX and GPT-NeoX-20B are research artifacts, and the team does not recommend deploying them in a production setting without careful consideration.