Together AI announces its $102.5M Series A

Together AI, the company behind a cloud platform for building and running custom open models, recently raised $102.5 million in a Series A round led by Kleiner Perkins. The firm was joined by other investors, including NVIDIA, Emergence Capital, and many of Together's seed investors. As part of the financing, Bucky Moore of Kleiner Perkins will join the Together AI board. According to the announcement, the capital will go toward accelerating Together AI's goal of "creating the fastest cloud platform for generative AI applications."

The company reports wide adoption of its products among startups and enterprises since launching in June 2023, and it aims to build on this momentum by scaling its offerings and shipping new products that simplify integrating AI into applications. In the words of Bucky Moore, "AI is a new layer of infrastructure that is changing how we develop software." He also emphasized that making this new infrastructure available to all developers is essential to maximizing its potential.

Together AI's mission speaks to enterprises and startups that want a generative AI strategy that does not depend on closed, proprietary solutions or lock them into a single vendor. To that end, Together is harnessing open-source models as a foundation for enterprise applications. Developers using the platform can pre-train custom models from scratch or fine-tune industry-leading open-source models, and customers retain ownership of their custom models, which can be run on any platform they choose.
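To make the workflow concrete, the sketch below queries an open model hosted on the platform through Together's OpenAI-compatible Python SDK. It is a minimal illustration, not Together's documented quickstart: the model identifier is an example, and it assumes the `together` package's `chat.completions` interface and a `TOGETHER_API_KEY` environment variable.

```python
# Minimal sketch: querying an open model served on Together AI.
# Assumes the `together` Python SDK (pip install together) and an API key
# exported as TOGETHER_API_KEY; the model name below is illustrative.
from together import Together

client = Together()  # reads TOGETHER_API_KEY from the environment

response = client.chat.completions.create(
    model="meta-llama/Llama-2-70b-chat-hf",  # any open or custom model ID
    messages=[
        {"role": "user", "content": "Summarize why open models reduce vendor lock-in."}
    ],
    max_tokens=200,
)
print(response.choices[0].message.content)
```

A fine-tuned model would be addressed the same way, by passing its custom model ID in the `model` field; per the ownership terms described above, the resulting weights can also be run outside the platform.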

Beyond its commitment to customers, Together AI takes a research-driven approach to building its products and services, and it publishes its research under open-source licenses for the benefit of the AI community. The company released RedPajama-V2, an open dataset of 30 trillion tokens for LLM training that it describes as the largest of its kind. It is also behind contributions such as the FlashAttention-2 algorithm, an inference stack it bills as the fastest for Transformer models, and research on sub-quadratic model architectures.
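Readers who want to inspect RedPajama-V2 can stream a few documents rather than download the full corpus. The sketch below assumes the dataset is published on the Hugging Face Hub as `togethercomputer/RedPajama-Data-V2` with a smaller `sample` configuration and a `raw_content` text field, as it was at the time of writing.

```python
# Minimal sketch: streaming a handful of RedPajama-V2 documents for inspection.
# Assumes the Hugging Face Hub dataset "togethercomputer/RedPajama-Data-V2"
# exposes a "sample" configuration; streaming avoids downloading the
# full 30-trillion-token corpus.
from itertools import islice

from datasets import load_dataset

ds = load_dataset(
    "togethercomputer/RedPajama-Data-V2",
    name="sample",
    split="train",
    streaming=True,
    trust_remote_code=True,  # the dataset ships a loading script
)

for doc in islice(ds, 3):
    text = doc.get("raw_content", "")
    print(text[:200].replace("\n", " "), "...")
```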

None of this would be possible without the company's compute infrastructure: Together GPU Clusters, which is growing to 20 exaflops of compute. The infrastructure was originally built so that Together could train its own models, such as RedPajama, but since the company opened its NVIDIA-powered clusters to the public, companies such as Pika Labs, NexusFlow, Voyage AI, and Cartesia have been building custom models on them. The Together GPU Clusters are the latest step in a string of notable releases, and with Together AI showing no signs of slowing down, it will be interesting to see the company bring its plans to fruition.
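To give the 20-exaflop figure some scale, the back-of-the-envelope sketch below converts it into an approximate accelerator count. The per-GPU throughput used (roughly 1 petaflop of dense BF16 compute for an NVIDIA H100) is an assumption for illustration; the announcement does not say how the exaflops are counted.

```python
# Back-of-the-envelope: roughly how many accelerators 20 exaflops implies.
# The ~0.99 PFLOPS dense BF16 figure for an H100 SXM is an assumed
# illustrative value, not a number from the announcement.
CLUSTER_EXAFLOPS = 20
H100_PFLOPS_BF16_DENSE = 0.99  # assumed per-GPU throughput

total_pflops = CLUSTER_EXAFLOPS * 1_000        # 1 exaflop = 1,000 petaflops
approx_gpus = total_pflops / H100_PFLOPS_BF16_DENSE
print(f"~{approx_gpus:,.0f} H100-class GPUs")  # on the order of 20,000 GPUs
```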