The first quarter of 2024 isn’t over yet, but we’ve already had epic developments at Arcee: TechCrunch announced our seed round, and we decided to merge with mergekit.
On Day 3 of March Merge Madness we're striking a more serious note, with a discussion of our deep commitment to the open source community.
Why the state-of-the-art technique of model merging works so well for our Small Language Model (SLM) system and domain-specific tasks
To celebrate Arcee’s recent merger with mergekit, we’re bringing you a month of resources and knowledge on model merging.
Arcee's recent merger with mergekit has made us a leader in model merging research and development. Check out our video interview with mergekit Founder Charles Goddard, who has joined our team as a Senior Research Engineer.
At Arcee, we believe in a world of smaller, specialized models that we call SLMs. The “S” stands for smaller, specialized, scalable, and secure. These models are grounded in your data, run entirely in your own environment, and scale to all your use cases.
Several months ago, I stumbled upon an innovative technique in the world of language model training known as Model Merging. This SOTA approach fuses two or more LLMs into a single, cohesive model, offering a novel and experimental method for creating sophisticated models at a fraction of the cost of training from scratch.
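To make the idea concrete, here is a minimal sketch of the simplest form of merging, linear weight averaging, assuming two fine-tuned checkpoints that share the same architecture and tokenizer. The checkpoint names and the alpha value are hypothetical placeholders; in practice you would reach for mergekit, which implements this along with more advanced methods such as SLERP, TIES, and DARE.

```python
import torch
from transformers import AutoModelForCausalLM

# Hypothetical parent checkpoints; any two fine-tunes of the same base model work.
model_a = AutoModelForCausalLM.from_pretrained("org/finetune-a")
model_b = AutoModelForCausalLM.from_pretrained("org/finetune-b")

alpha = 0.5  # interpolation weight between the two parents

state_b = model_b.state_dict()
merged_state = {}
for name, param_a in model_a.state_dict().items():
    param_b = state_b[name]
    if param_a.is_floating_point():
        # Interpolate each floating-point tensor between the two parents.
        merged_state[name] = alpha * param_a + (1.0 - alpha) * param_b
    else:
        # Keep integer buffers (e.g., token position ids) from one parent as-is.
        merged_state[name] = param_a

model_a.load_state_dict(merged_state)
model_a.save_pretrained("./merged-model")
```

The merged model often retains capabilities from both parents without any additional training, which is what makes merging so cheap relative to fine-tuning from scratch.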
Arcee, a new startup, is creating a platform with tools for enterprise customers to securely build, deploy and manage GenAI models.
After intensive research and development, the Arcee team is excited to release our commercial product for contextualizing language models in the Arcee platform.