A community credit grant for developers, researchers, and open-source builders working with Trinity models. Apply for free inference access on the Arcee API.
Learn how to set up Hermes Agent powered by Trinity-Large-Thinking. This guide covers installation, tool configuration, and launching your AI assistant.
Trinity-Large-Thinking is live: a frontier open reasoning model for complex, long-horizon agents and multi-turn tool calling, released under Apache 2.0.
A deep dive into Trinity Large, covering architecture, sparsity, training at scale, and why we shipped Preview, Base, and TrueBase checkpoints.
Learn how Kimi Delta Attention was distilled into AFM-4.5B using knowledge distillation, long-context training, and Arcee’s open-source DistillKit.
Arcee introduces Trinity Mini, a compact MoE model trained end-to-end in the U.S., offering open weights, strong reasoning, and full control for developers.
Effective Friday, October 31, 2025, we are returning MergeKit to the GNU Lesser General Public License v3.
MergeKit helped evaluate merged checkpoints and select top performers, reflecting a broader enterprise shift to open models and reproducible tooling.
Explore how to optimize small language models on Intel’s latest CPU, using Arcee AI’s AFM-4.5B and Intel-optimized inference libraries.