Arcee Blog

Company
July 30, 2025

Arcee AI Secures Strategic Investment to Accelerate Enterprise-Grade AI Platform Built on AFM Foundation Models

Prosperity7 Ventures, M12, Hitachi Ventures, JC2, Wipro, Samsung, and Guidepoint are now backing Arcee AI.

Open-Source SLMs
July 29, 2025

Announcing the Official Launch of AFM-4.5B

Today, we’re excited to officially take AFM-4.5B out of preview and release the weights of both AFM-4.5B and AFM-4.5B-Base on Hugging Face. This marks a major milestone for our team at Arcee AI as we open up access to a new, enterprise-grade language model designed for both flexibility and performance across a wide range of deployment environments.

Open-Source SLMs
July 18, 2025

Arcee AI Models Excel Across Yupp.ai Leaderboards

Small and mighty!

Research
July 9, 2025

Is Running Language Models on CPU Really Viable?

Running language models on CPUs has been discussed for some time, but delivering accurate results at production-level performance remains unproven. So, is CPU inference for language models truly viable in production?

Open-Source SLMs
June 30, 2025

Releasing Five Open-Weights Models

SuperNova 70B, Virtuoso-Large 72B, Caller 32B, GLM-4-32B-Base-32K, and Homunculus 12B

Case Studies
June 24, 2025

Research Spotlight: 3 Learnings from 3 MergeKit Use Cases

Merging for pre-training, data privacy in healthcare, and language support

Research
June 23, 2025

Extending AFM-4.5B to 64k Context Length

From 4k to 64k context through aggressive experimentation, model merging, distillation, and a concerning amount of soup.

Company
June 18, 2025

Deep Dive: AFM-4.5B, the First Arcee Foundation Model

Built for performance, compliance, and affordability.

Company
June 18, 2025

Announcing Arcee Foundation Models

The first release—AFM-4.5B—is a 4.5-billion-parameter model that delivers excellent accuracy, strict compliance, and very high cost-efficiency.