Arcee just open‑sourced two new LLM families under Apache‑2.0: the 26B Trinity Mini and the 6B Trinity Nano preview. Both are built on the AFMoE Mixture‑of‑Experts architecture, bringing DeepSeek‑style efficiency to the open‑source community. Check out the link for details on the architecture, training tricks, and how they stack up against Qwen. A big step for open‑source AI! #Apache2 #Arcee #TrinityMini #AFMoE
🔗 https://aidailypost.com/news/arcee-releases-apache20-trinity-mini-26b-nano-preview-6b-models
