Model optimization

Two groundbreaking techniques that differentiate Arcee AI’s SLM training.

What is Spectrum?

Spectrum is a novel training methodology pioneered by Arcee AI researchers and built into our Small Language Model (SLM) training platform from the ground up. It was designed to optimize the training process of language models by selectively training specific layers based on their signal-to-noise ratio (SNR).

Learn more

What is Model Merging?

Model Merging allows customers to train an open source LLM on their data, then blend or “merge” that model with another open source LLM.

The merged model has the “brains” of both input LLMs, including the domain-specific customer data, but the size and inference cost of just one input model.
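Conceptually, the simplest form of weight-space merging is a linear blend of two models' parameters. The sketch below illustrates that idea only; it is not Arcee AI's production merging method, and the function name `merge_linear` and toy parameter values are hypothetical.

```python
# Illustrative sketch of linear weight-space merging (assumption:
# both models share the same architecture, so parameters line up).
# Each "model" is represented as a dict of parameter-name -> values.

def merge_linear(model_a, model_b, alpha=0.5):
    """Blend two same-architecture models: alpha * A + (1 - alpha) * B."""
    assert model_a.keys() == model_b.keys(), "architectures must match"
    return {
        name: [alpha * a + (1 - alpha) * b
               for a, b in zip(model_a[name], model_b[name])]
        for name in model_a
    }

# Toy example: a domain-tuned model blended with a general model.
domain_model = {"layer0.weight": [1.0, 3.0]}
general_model = {"layer0.weight": [3.0, 1.0]}
merged = merge_linear(domain_model, general_model, alpha=0.5)
# merged["layer0.weight"] == [2.0, 2.0]
```

The merged parameter set is the same size as either input, which is why the merged model keeps the inference cost of a single model.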

Learn more

Spectrum-powered pre-training and model merging

Arcee AI’s revolutionary training routine

Spectrum-powered pre-training

Pre-training is the critical foundation of any training routine. At Arcee AI, we've enhanced this fundamental process with Spectrum, which is why we call it 'Spectrum-powered pre-training.'

This technique, pioneered by Arcee AI, revolutionizes the traditional training approach. Instead of updating every layer during training, Spectrum analyzes the layers and prioritizes those that contribute most significantly to performance improvements (high SNR), while layers with low SNR remain frozen.
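The core selection step can be sketched in a few lines: rank layers by SNR, mark the top fraction as trainable, and freeze the rest. This is a minimal illustration of the idea, not the actual Spectrum implementation; the function name, the `train_fraction` parameter, and the toy SNR values are assumptions for the example.

```python
# Illustrative sketch of SNR-based layer selection (not the actual
# Spectrum code): train only the top fraction of layers ranked by
# signal-to-noise ratio; freeze everything else.

def select_trainable_layers(layer_snr, train_fraction=0.25):
    """Return per-layer flags: True = train, False = freeze."""
    ranked = sorted(layer_snr, key=layer_snr.get, reverse=True)
    k = max(1, round(len(ranked) * train_fraction))
    trainable = set(ranked[:k])
    return {name: name in trainable for name in layer_snr}

# Toy per-layer SNR values (hypothetical numbers).
snr = {"layers.0": 0.9, "layers.1": 0.2, "layers.2": 0.7, "layers.3": 0.1}
flags = select_trainable_layers(snr, train_fraction=0.5)
# layers.0 and layers.2 are trained; layers.1 and layers.3 stay frozen
```

Because frozen layers need no gradient computation or optimizer state, training fewer layers is what yields the speed and memory savings described above.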

Starting from open-source model checkpoints, this method delivers state-of-the-art results with training speed improvements of up to 42%.

Model merging and evolutionary (“metric-guided”) model merging

Next, we merge that model with another open source one, resulting in a single model that combines the unique strengths of both: a more specialized and powerful model that excels at domain-specific tasks while maintaining general capabilities.
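"Metric-guided" merging can be pictured as a search over merge coefficients, keeping whichever candidate scores best on an evaluation metric. The sketch below uses a simple hill-climbing search and a toy stand-in metric; real pipelines would score each merged model on held-out benchmark tasks, and every name here is hypothetical rather than Arcee AI's actual algorithm.

```python
import random

# Illustrative sketch of metric-guided merge search: find the blend
# coefficient alpha that maximizes an evaluation metric.

def toy_metric(alpha):
    """Hypothetical benchmark score, peaking at alpha = 0.7."""
    return 1.0 - (alpha - 0.7) ** 2

def search_alpha(metric, generations=50, seed=0):
    """Hill-climb on alpha: mutate, keep only improving candidates."""
    rng = random.Random(seed)
    best_alpha, best_score = 0.5, metric(0.5)
    for _ in range(generations):
        candidate = min(1.0, max(0.0, best_alpha + rng.gauss(0, 0.1)))
        score = metric(candidate)
        if score > best_score:  # accept only improvements
            best_alpha, best_score = candidate, score
    return best_alpha

best = search_alpha(toy_metric)
# best lands at or above the score of the starting point, near the peak
```

An evolutionary variant would maintain a population of candidate merges and recombine the fittest, but the feedback loop is the same: the metric, not a human, steers the merge.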

Powered by this SOTA training pipeline, our SLMs set the standard. Start implementing cutting-edge GenAI with Arcee AI SLMs.

Contact Us