
Magic announced a new funding round and a strategic partnership with Google

Magic AI has announced a major partnership with Google Cloud to build next-generation AI supercomputers powered by NVIDIA GPUs, alongside a significant $465 million funding round from notable investors.

by Ellie Ramirez-Camara
Credit: Magic

Magic, a startup developing a coding assistant powered by LLMs that accept ultra-long contexts of 5 to 10 million tokens, has announced a strategic partnership with Google Cloud and a substantial funding round. These milestones are key steps toward Magic's goal of building two supercomputers on Google Cloud: Magic-G4, leveraging NVIDIA H100 Tensor Core GPUs, and Magic-G5, powered by NVIDIA GB200 NVL72. Magic expects that the G5 supercomputer will "scale to tens of thousands of Blackwell GPUs over time."

It is easy to see why ultra-long context understanding is an attractive feature for an AI-powered assistant. Magic envisions AI-powered software engineering assistants that will be more useful because they can keep a developer's (or perhaps even an organization's) code, documentation, and libraries in context. To get there, the startup has been addressing the limitations of traditional long-context evaluations, such as the needle-in-a-haystack test.
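For readers unfamiliar with the format: a needle-in-a-haystack evaluation hides a single fact (the "needle") inside a long filler document and checks whether the model can retrieve it. The sketch below is purely illustrative and not Magic's harness; all function names are hypothetical, and token counts are approximated by word counts for simplicity.

```python
def make_needle_haystack(filler: str, needle: str, n_words: int, depth: float) -> str:
    """Build a long filler document with a 'needle' fact inserted at a
    relative depth (0.0 = start, 1.0 = end). Word count stands in for
    token count here."""
    base = filler.split()
    words = (base * (n_words // len(base) + 1))[:n_words]
    pos = int(depth * len(words))
    return " ".join(words[:pos] + [needle] + words[pos:])

def passed(model_answer: str, expected: str) -> bool:
    """Score a run with a simple substring check on the model's reply."""
    return expected.lower() in model_answer.lower()

# Usage: bury the needle halfway into a ~1,000-word haystack, then send the
# document plus a question like "What is the secret code?" to the model.
doc = make_needle_haystack(
    "the quick brown fox jumps over the lazy dog",
    "The secret code is 7421.",
    n_words=1000,
    depth=0.5,
)
assert passed("I think the secret code is 7421.", "7421")
```

A criticism of this style of test, which Magic's blog post echoes, is that a distinctive needle is trivially easy to spot against repetitive filler, so high scores can overstate real long-context understanding.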

Magic does not yet have a commercially available product, so as proof of progress it has shared that its latest 100M-token context model, LTM-2-mini, achieved two remarkable feats: although it is orders of magnitude smaller than frontier models, LTM-2-mini created a calculator using a custom in-context GUI framework and autonomously implemented a password strength meter for the open-source repo Documenso. Magic also reports that it is currently training a larger LTM-2 model and expects that the G5 supercomputer, built on the GB200 NVL72 system, will boost inference and training efficiency for its models.

Magic has raised $465 million in total funding, most of it from a recent $320 million round with notable investors including Eric Schmidt, Jane Street, Sequoia, and Atlassian. Existing investors such as Nat Friedman, Daniel Gross, Elad Gil, and CapitalG also participated in the round. The startup is actively looking to grow its 23-person team with more researchers, engineers, supercomputing and systems engineers (who would join Ben Chess, OpenAI's former Supercomputing Lead), and a new Head of Security.


Data Phoenix Digest

