Get direct access to the small language models (SLMs) that power Arcee Orchestra, our new end-to-end, SLM-powered agentic AI platform. Sign up for the public beta of the Arcee Model Engine today.
The Arcee AI research team is honored to be among the contributors to the world's first fully decentralized training of a large language model (LLM). Read about the game-changing project led by Prime Intellect, and how we brought our post-training expertise to the effort.
Strategic collaboration empowers organizations of all sizes with efficient, high-performance generative AI solutions.
As we approach the end of 2024, it's clear that the rise of small language models (SLMs) has been one of the defining developments in AI this year. For those of you still stuck in LLM-world, here's a crash course on SLMs.
Today Arcee AI makes its latest contribution to AI in underserved global languages with the release of Arcee-VyLinh, a 3B-parameter Vietnamese SLM.
New research details a novel approach to model merging that could help improve enterprise AI deployments.
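For readers new to the technique: the simplest form of model merging is a weighted average of the parameters of two checkpoints that share an architecture. The sketch below illustrates that baseline idea only; it is not the novel method from the research above, and the model names and 50/50 weighting are hypothetical.

```python
# Minimal sketch of weight-space model merging: linearly interpolate the
# parameters of two checkpoints with identical architectures.
# NOTE: this is the plain averaging baseline, not the novel method from the
# research above. Model names and the 0.5 weight are hypothetical.
import torch
from transformers import AutoModelForCausalLM

MODEL_A = "example-org/finetune-a"  # hypothetical checkpoint names
MODEL_B = "example-org/finetune-b"
ALPHA = 0.5                         # interpolation weight given to model A

model_a = AutoModelForCausalLM.from_pretrained(MODEL_A, torch_dtype=torch.bfloat16)
model_b = AutoModelForCausalLM.from_pretrained(MODEL_B, torch_dtype=torch.bfloat16)

state_b = model_b.state_dict()
merged = {
    # alpha * A + (1 - alpha) * B for every tensor in the network
    name: ALPHA * tensor + (1.0 - ALPHA) * state_b[name]
    for name, tensor in model_a.state_dict().items()
}

# Write the merged weights into one model object and save the result.
model_a.load_state_dict(merged)
model_a.save_pretrained("merged-model")
```

In practice, more sophisticated merge strategies (e.g., SLERP, TIES, DARE) are available in Arcee's open-source MergeKit toolkit.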
Hot on the heels of our top-performing 72B Arabic-language model Arcee-Meraj, we bring you a 7B version: Arcee-Mini-Meraj, which boasts exceptional performance in instruction following, long-text generation, structured-data understanding, and structured-output generation.
First came our flagship 70B SuperNova, followed by the 8B SuperNova-Lite. Today we add to this family of superpowered small language models with the release of the 14B SuperNova-Medius.
Arcee AI unveils SuperNova: a customizable 70B-parameter model for enterprises seeking data privacy, stability, and an ownable AI alternative.