Strategic collaboration empowers organizations of all sizes with efficient, high-performance generative AI solutions
As we approach the end of 2024, it's clear that the rise of small language models (SLMs) has been one of the defining developments in AI this year. For those of you still stuck in LLM-world, here's a crash course on SLMs.
Today Arcee AI makes its latest contribution to AI in underserved global languages with the release of Arcee-VyLinh, a 3B Vietnamese SLM.
New research details a novel approach to model merging that could help improve enterprise AI deployments.
Hot on the heels of our top-performing 72B Arabic-language model Arcee-Meraj, we bring you a 7B version: Arcee-Mini-Meraj, which boasts exceptional performance in instruction-following, long-text generation, structured-data understanding, and structured-output generation.
First came our flagship 70B SuperNova, followed by the 8B SuperNova-Lite. Today we add to this family of superpower Small Language Models with the release of the 14B SuperNova-Medius.
Arcee AI unveils SuperNova: a customizable 70B parameter model for enterprises seeking data privacy, stability, and an ownable AI alternative.
Meet Arcee-SuperNova: a groundbreaking model with state-of-the-art abilities in instruction-following and strong alignment with human preferences.
We trained Arcee SuperNova-70B and Arcee SuperNova-8B to be generally intelligent Llama-3.1-405B derivatives, using intelligent distillation, novel post-training, and model-merging techniques.