
Cohere launches Command-R, a scalable model focusing on RAG and tool use

Cohere recently launched Command-R, a RAG-focused model incorporating several features designed with scalable implementation in mind, including a large context window, business proficiency in 10 languages, and tool access via external APIs. Command-R is available for deployment starting March 11.

by Ellie Ramirez-Camara
Image credit: Cohere

Cohere designed Command-R to be used alongside its other models, Embed and Rerank, to deliver state-of-the-art performance in RAG applications and other enterprise use cases. Among the features aimed at scalable implementation are a 128K-token context window at a lower cost, business proficiency in 10 languages, and tool access via external APIs, all with an emphasis on security and privacy. Command-R is available starting March 11 via Cohere's API and its demo environment, with availability on most major cloud providers coming soon. Cohere for AI, the company's non-profit research lab, is releasing the model weights on Hugging Face to support academic research and independent evaluation.
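For readers who want to try the hosted model, access goes through a standard chat request in Cohere's Python SDK. The snippet below is a minimal sketch assuming the `cohere` client available around launch; the API key, model identifier, and prompt are placeholders and should be checked against Cohere's current documentation.

```python
import cohere

# Placeholder API key; obtain a real key from the Cohere dashboard.
co = cohere.Client("YOUR_API_KEY")

# Minimal chat call against the hosted Command-R endpoint
# (model name as advertised at launch; verify the exact identifier).
response = co.chat(
    model="command-r",
    message="Summarize the key points of our Q3 planning meeting in three bullets.",
)

print(response.text)
```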

In RAG applications, Command-R works in concert with Embed and Rerank: the two companion models are designed for the retrieval stage of the process, while Command-R handles the augmented generation step. Embed makes searches more accurate by providing contextual and semantic understanding across large collections of documents. Rerank then refines the retrieved results by scoring them against criteria such as relevance, increasing the value of the information passed on. Finally, Command-R can perform several tasks on the retrieved data, including summarization and analysis, and "generally put that information to work in ways that help employees be more productive, or to create a magical new product experience." While performing these tasks on private information, Command-R mitigates the risk of hallucination by providing transparent citations that let users dig into the underlying context.
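That division of labor maps onto three API calls: Embed for retrieval, Rerank for re-ordering candidates, and Command-R for grounded generation with citations. The sketch below assumes Cohere's Python SDK and the embed/rerank model names current around the Command-R launch; the in-memory document store, query, and attribute names are illustrative rather than definitive.

```python
import cohere

co = cohere.Client("YOUR_API_KEY")  # placeholder key

# Illustrative in-memory "document store"; a real deployment would use a vector DB.
docs = [
    "Q3 revenue grew 12% year over year, driven by enterprise subscriptions.",
    "The marketing budget was reallocated toward APAC campaigns in Q3.",
    "Headcount remained flat in Q3 while cloud costs dropped 8%.",
]
query = "What drove revenue growth last quarter?"

# 1) Embed: semantic representations for retrieval (model name assumed from
#    Cohere's embed v3 family; confirm against current docs).
doc_vectors = co.embed(texts=docs, model="embed-english-v3.0",
                       input_type="search_document").embeddings
query_vector = co.embed(texts=[query], model="embed-english-v3.0",
                        input_type="search_query").embeddings[0]
# ... nearest-neighbor search over doc_vectors would normally go here ...

# 2) Rerank: re-score the retrieved candidates against the query.
reranked = co.rerank(query=query, documents=docs, top_n=2,
                     model="rerank-english-v2.0")
top_docs = [{"title": f"doc-{r.index}", "snippet": docs[r.index]}
            for r in reranked.results]

# 3) Command-R: augmented generation grounded in the selected documents,
#    returning citations that point back into them.
answer = co.chat(model="command-r", message=query, documents=top_docs)
print(answer.text)
print(answer.citations)  # spans of the answer tied to specific documents
```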

Command-R delivers effective and scalable enterprise performance even when deployed on its own, outperforming other models advertised as ideal for scalable deployments, such as Mixtral, Llama 2 70B, and GPT-3.5 Turbo. Command-R differentiates itself from other RAG-focused models through its ability to use APIs to call user-defined external tools and infrastructure, such as databases, CRMs, and search engines. Moreover, its proficiency in English, French, Spanish, Italian, German, Portuguese, Japanese, Korean, Arabic, and Chinese makes Command-R well suited to extracting value from a wide array of resources, enabling native speakers of any of these languages to interact with the model through clear and accurate dialogue. Command-R's performance is further boosted by its longer context length, at lower pricing compared with Command.
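Tool use is exposed through the same chat endpoint: the caller describes the available tools, and the model responds with structured calls that the application then executes. The snippet below is a rough sketch of that flow based on the tool-use format Cohere documented around launch; the tool name, parameters, and schema keys are hypothetical and should be checked against the current API reference.

```python
import cohere

co = cohere.Client("YOUR_API_KEY")  # placeholder key

# Illustrative tool definition: a fictional CRM lookup the application would implement.
tools = [
    {
        "name": "lookup_crm_account",  # hypothetical tool name
        "description": "Retrieve account details for a customer from the CRM.",
        "parameter_definitions": {
            "customer_name": {
                "description": "Full name of the customer account to look up.",
                "type": "str",
                "required": True,
            },
        },
    }
]

response = co.chat(
    model="command-r",
    message="Pull up the account details for Acme Corp.",
    tools=tools,
)

# The model returns structured tool calls rather than free text; the application
# executes them and can feed the results back for a final grounded answer.
for call in response.tool_calls or []:
    print(call.name, call.parameters)
```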

Prospective users can contact Cohere's sales team for additional details on pricing and deployments.

News of Command-R follows the recent announcement of Cohere's collaboration with Accenture, through which Cohere will be able to deliver solutions powered by its Command and Embed models and its Rerank search tool.



