
Mistral AI has updated its code generation model Codestral to make it more efficient
Mistral recently announced an update to its code generation model, equipping it with a new architecture and an improved tokenizer. Codestral 25.01 supports a leading 256K context window, performs strongly on popular coding benchmarks, and outperforms its rivals on fill-in-the-middle (FIM) tasks.
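In a fill-in-the-middle task, the model is given the code both before and after a gap and must generate the missing span, which is how editor autocomplete typically works. A minimal sketch of how such a prompt might be assembled; the `<PRE>`/`<SUF>`/`<MID>` sentinel tokens here are illustrative placeholders, not Codestral's actual control tokens:

```python
def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Assemble a fill-in-the-middle prompt from the code around a gap.

    The sentinel tokens below are hypothetical placeholders; a real model
    uses special tokens defined by its own tokenizer.
    """
    return f"<PRE>{prefix}<SUF>{suffix}<MID>"

# Example: ask the model to fill in the body of a function, given
# its signature (prefix) and its return statement (suffix).
prefix = "def add(a, b):\n"
suffix = "\n    return result\n"
prompt = build_fim_prompt(prefix, suffix)
# The model would generate the middle span, e.g. "    result = a + b"
```

The key point is that the suffix gives the model context a left-to-right completion never sees, which is why FIM performance is benchmarked separately from ordinary code generation.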