Arcee's U.S.-made, open source Trinity Large and 10T-checkpoint offer rare look at raw model intelligence
San Francisco-based AI lab Arcee made waves last year as one of the only U.S. companies to train large language models (LLMs) from scratch and release them to the public under open or partially open source licenses, enabling developers, solo entrepreneurs, and even medium-to-large enterprises to use the powerful AI models for free and customize them at will.

Now Arcee is back this week with the release of its largest, most performant open language model to date: Trinity Large, a 400-billion-parameter mixture-of-experts (MoE) model, available now in preview.

Alongside the flagship release, Arcee is shipping a "raw" checkpoint, Trinity-Large-TrueBase, that lets researchers study what a 400B sparse MoE learns from raw data alone, before any instruction tuning or reinforcement learning has been applied.