What a week here at Arcee AI. On the heels of Arcee-Scribe yesterday, today we bring you Arcee-Nova, our highest-performing open source model to date. Evaluated on the same stack as the OpenLLM Leaderboard 2.0, it is the top-performing open source model we have tested on that stack, and its performance approaches that of GPT-4 from May 2023, a significant milestone.
Nova is a merge of Qwen2-72B-Instruct with a custom model tuned on a generalist dataset mixture.
Performance
Evaluated on the OpenLLM Leaderboard 2.0 stack (see the evaluation sketch after this list)
Top-performing open source model on this stack
Approaches GPT-4 (May 2023) performance levels
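For anyone who wants to reproduce these numbers, the OpenLLM Leaderboard 2.0 stack is built on EleutherAI's lm-evaluation-harness. Below is a minimal sketch of running its leaderboard task group against the model, assuming the weights are published as arcee-ai/Arcee-Nova on Hugging Face and that a recent harness release with that task group is installed; it illustrates the stack rather than our exact evaluation setup.

```python
# pip install lm-eval  (EleutherAI's lm-evaluation-harness)
import lm_eval

# Run the Open LLM Leaderboard 2.0 task group against the model
# through the Hugging Face backend. The repo id is an assumption.
results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=arcee-ai/Arcee-Nova,dtype=bfloat16",
    tasks=["leaderboard"],
    batch_size="auto",
)

# Per-task metrics are returned under the "results" key.
for task, metrics in results["results"].items():
    print(task, metrics)
```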
Technical Details
A merge of Qwen2-72B-Instruct with a custom model tuned on a generalist dataset mixture, followed by RLHF
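To make the "merge" part concrete: merging combines the weights of two models directly, with no additional training. The sketch below shows the simplest form, a linear average of parameters between Qwen2-72B-Instruct and a hypothetical custom-tuned checkpoint. The checkpoint name and the 50/50 weighting are placeholders, and this is not our actual merge recipe; tools like MergeKit implement far more sophisticated methods.

```python
import torch
from transformers import AutoModelForCausalLM

BASE = "Qwen/Qwen2-72B-Instruct"            # base instruct model
TUNED = "your-org/custom-generalist-tune"   # hypothetical custom-tuned checkpoint
ALPHA = 0.5                                 # placeholder mixing weight

# Load both checkpoints in bf16 (a 72B merge needs a few hundred GB of RAM;
# production tooling streams tensors from disk instead of loading everything).
base = AutoModelForCausalLM.from_pretrained(BASE, torch_dtype=torch.bfloat16)
tuned = AutoModelForCausalLM.from_pretrained(TUNED, torch_dtype=torch.bfloat16)

# Linear merge: theta_merged = (1 - alpha) * theta_base + alpha * theta_tuned
tuned_state = tuned.state_dict()
merged_state = {
    name: (1 - ALPHA) * param + ALPHA * tuned_state[name]
    for name, param in base.state_dict().items()
}

base.load_state_dict(merged_state)
base.save_pretrained("nova-style-linear-merge")
```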
Use Cases
Customer Service: Advanced chatbots and virtual assistants
Content Creation: High-quality marketing and documentation
Software Development: Code generation and quality improvement
Data Analysis: Enhanced interpretation and reporting
Research and Development: Literature reviews and hypothesis generation
Legal and Compliance: Contract analysis and regulatory checks
Education and Training: Adaptive learning systems
Acknowledgments
We thank the open source AI community for their ongoing contributions and the Qwen team for their foundational work on Qwen2-72B.
Looking Ahead
We invite researchers, developers, and businesses to explore Arcee-Nova's capabilities. Our commitment to open source AI advancement continues, and we look forward to seeing how the community builds upon this technology.
This post was created with assistance from Arcee-Nova.
Give Arcee a Try
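The quickest way to explore the model from code is Hugging Face Transformers. Here is a minimal sketch, assuming the weights are available under the arcee-ai/Arcee-Nova repo id; keep in mind that a 72B-parameter model needs multiple GPUs or aggressive quantization to run locally.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "arcee-ai/Arcee-Nova"  # assumed Hugging Face repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",  # spread the model across available GPUs
)

messages = [{"role": "user", "content": "Explain model merging in two sentences."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```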