A new standard for intelligent model routing.
Conductor intelligently routes your prompt to the best model, delivering precise results efficiently for any task.
Get started with $20 free credit.
With new AI models released daily, it's hard to keep up with which model is best for your business. Our pioneering work in small language models (SLMs) gives us unique insights into which model is the right one for your tasks or queries.
That's why we built Arcee Conductor, a model-agnostic platform that gives you access to a complete suite of top-performing SLMs, as well as other industry-leading LLMs.
Arcee Conductor intelligently routes your query to the optimal model based on factors like industry/specialty, complexity, efficiency, and cost–all in an easy-to-use interface that requires no technical expertise.
Automatically routes your prompt or query based on complexity, type of task, industry or domain, language, and whether it involves tool / function calling.
Reduce cost-per-prompt by over 99% compared to using just a single premium LLM. Save on routine prompts that you may currently be overpaying for with a single LLM. Gain real-time visibility into your model spending per prompt.
Based on model router classifications, your prompt is routed to one of the language models in Arcee Conductor. Available models include Arcee SLMs plus the latest models from leading providers like OpenAI, Anthropic, and DeepSeek.
Supports chain-of-thought reasoning for enhanced analytical capabilities, and automatic compute scaling for task-specific optimization.
Directly invoke Arcee Conductor via an API. Switch a few parameters in the OpenAI-compatible endpoint to use model routing without having to rebuild infrastructure, as shown in the example below.
User-defined preferences for routing parameters and model prioritization, with preset profiles for different tasks.
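As an illustration of the API integration mentioned above, here is a minimal sketch using the OpenAI Python SDK pointed at an OpenAI-compatible endpoint. The base URL, the environment variable name, and the "auto" model identifier are assumptions for illustration; check your Conductor dashboard and documentation for the exact values.

```python
# Minimal sketch: calling Arcee Conductor through the OpenAI Python SDK.
# Assumed for illustration: the base URL, the ARCEE_CONDUCTOR_API_KEY
# environment variable, and the "auto" routing identifier.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://conductor.arcee.ai/v1",        # assumed Conductor endpoint
    api_key=os.environ["ARCEE_CONDUCTOR_API_KEY"],   # assumed env var holding your key
)

response = client.chat.completions.create(
    model="auto",  # assumed identifier that lets Conductor pick the best model
    messages=[
        {"role": "user", "content": "Summarize this quarter's support tickets in three bullet points."},
    ],
)

print(response.choices[0].message.content)
```

Because the endpoint is OpenAI-compatible, existing code built on the OpenAI SDK only needs the base URL, API key, and model name changed to start using Conductor's routing.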
Some tasks require an LLM, but over 80% of the tasks that companies use an LLM for can be handled with equal accuracy by an SLM.
Arcee Conductor's intelligent routing gives you cost efficiencies with optionality.
Conductor analyzes each prompt based on complexity, domain, and cost, then dynamically routes it to the optimal model, large or small, to maximize cost efficiency without compromising performance.
Featured Arcee SLMs included with Conductor are showcased below. You can also log in to Conductor to explore our complete model catalog.
Our most affordable SLM, this 24B is designed to be fast and efficient, with exceptional world knowledge for its size. A practical workhorse model that can tackle a range of tasks without the overhead of larger architectures.
A versatile and powerful 32B SLM, capable of handling varied tasks with precision and adaptability across multiple domains. Ideal for dynamic use cases that require significant computational power.
Our most powerful and versatile general-purpose SLM, this 72B is designed to excel at handling complex tasks across domains. Its scalability and depth make it ideal for enterprises seeking comprehensive AI solutions.
Our most advanced reasoning model, this 32B SLM tackles the most complex problem-solving scenarios with precision and depth. It excels at handling multifactorial decision-making, abstract reasoning, and scenario modeling.
A 32B SLM optimized for managing complex tool-based interactions and API function calls. Its strength lies in precise execution, intelligent orchestration, and effective communication between systems – making it ideal for automation pipelines.
“Auto” uses Arcee AI's intelligent model router to route each prompt to the optimal, most efficient language model based on task and domain complexity.
For nuanced questions that require more than a surface-level reply, select “Auto Reasoning” so your prompt is directed to the reasoning model within Conductor best suited to the request.
To manage tool integrations across models, use “Auto Tools” mode, which routes your query to the function-calling models within Conductor and selects the best tool-calling model based on the query's complexity.
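If you call Conductor through the API rather than the web interface, the mode you would pick in the UI corresponds to the model identifier sent with each request. The sketch below assumes hypothetical identifiers ("auto", "auto-reasoning", "auto-tools") and the same assumed endpoint as the earlier example; confirm the actual names in your Conductor model list.

```python
# Sketch of selecting a Conductor mode per request via the model field.
# The identifiers below are assumed for illustration; confirm the exact
# names in your Conductor model list.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://conductor.arcee.ai/v1",        # assumed Conductor endpoint
    api_key=os.environ["ARCEE_CONDUCTOR_API_KEY"],   # assumed env var
)

MODES = {
    "general": "auto",              # default intelligent routing
    "reasoning": "auto-reasoning",  # route only among reasoning models
    "tools": "auto-tools",          # route only among function-calling models
}

def ask(prompt: str, mode: str = "general") -> str:
    """Send a prompt to Conductor using the chosen mode."""
    response = client.chat.completions.create(
        model=MODES[mode],
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

# A multi-step planning question is a natural fit for the reasoning mode.
print(ask("Compare three rollout strategies for our pricing change.", mode="reasoning"))
```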
To unlock additional options–like volume discounts, custom model configuration, and dedicated SLAs–contact Sales to learn about our Enterprise Tier.
Currently, companies face significant challenges: with new AI models released daily, it's hard to know which one is right for each task, and sending every prompt to a single premium LLM means overpaying for routine work.
While selecting the best model for your task improves response quality, that isn't enough – on its own – to solve these challenges completely. This is why intelligent model routing is essential. It automatically selects the optimal model for each prompt, helping you to reclaim your profit margins without sacrificing quality.
Arcee Conductor is an intelligent model routing platform that directs each input to its ideal AI model based on complexity, domain, cost, and other requirements. By dynamically routing between large language models (LLMs) and small language models (SLMs), Conductor maximizes cost efficiency without compromising performance. You get the right model for each prompt, every time.
In Arcee Conductor, your prompt is automatically routed to the most suitable model through an advanced routing mechanism.
Available models include purpose-built Arcee SLMs for specific tasks alongside the latest models from leading providers like OpenAI, Anthropic, and DeepSeek.
Log in to Conductor to see the full list of currently available models.
A mode on Arcee Conductor is a selection of specialized AI models grouped by functionality. Each mode serves a different purpose, giving you the flexibility to use the most suitable approach for your task, whether for general inquiries, complex reasoning, or function-calling.
Each mode serves a specific purpose to deliver the most effective results for your particular needs.
You can sign up here to begin using Arcee Conductor today. First-time users receive $20 in credits (equivalent to approximately 400 million tokens) towards your Conductor usage.
Arcee Conductor uses usage-based pricing. When you sign up for Conductor for the first time, you'll receive $20 in credits automatically, so you can start using the platform right away.
After you've used your $20 in credits, charges are applied to your payment method based on your usage. Rates vary by model, according to each model's per-token pricing (see the table below).
Model (category): input / output price per 1M tokens
Arcee-Blitz (General Purpose): $0.45 / $0.75
Virtuoso-Medium (General Purpose): $0.50 / $0.80
Virtuoso-Large (General Purpose): $0.75 / $1.20
Maestro (Reasoning): $0.90 / $3.30
Caller-Large (Function Calling): $0.55 / $0.85
Claude 3.7 Sonnet: $3.00 / $15.00
GPT-4.1: $2.00 / $8.00
DeepSeek-R1: $3.00 / $7.00
OpenAI o3-mini: $1.10 / $4.40
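As a worked example of how usage-based billing adds up, the sketch below estimates the cost of a single request using the rates in the table above (read here as dollars per 1 million input and output tokens); the token counts are illustrative.

```python
# Worked example: estimating the cost of one request, reading the table
# above as dollars per 1 million input / output tokens.
PRICES = {  # model: (input $/1M tokens, output $/1M tokens)
    "Arcee-Blitz": (0.45, 0.75),
    "Virtuoso-Large": (0.75, 1.20),
    "Claude 3.7 Sonnet": (3.00, 15.00),
}

def request_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Return the estimated dollar cost of a single request."""
    in_price, out_price = PRICES[model]
    return input_tokens / 1_000_000 * in_price + output_tokens / 1_000_000 * out_price

# A routine prompt with 1,000 input tokens and a 500-token answer:
print(request_cost("Arcee-Blitz", 1_000, 500))        # ~$0.0008 on an Arcee SLM
print(request_cost("Claude 3.7 Sonnet", 1_000, 500))  # ~$0.0105 on a premium LLM
```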
First-time users of Arcee Conductor can get started with a one-time $20 credit. Optimize output across models, reduce usage costs, and maximize performance with intelligent routing.