#Molmo

2025-12-15

Tuesday, Dec 16 from 1-2pm PST: join an AMA with AI2 researchers (authors of the fully open Olmo & Molmo models). Ask your questions now! #AI2 #Olmo #Molmo #AIResearch #OpenModels #OpenModeling

reddit.com/r/LocalLLaMA/commen

o lavrovsky loleg@hachyderm.io
2025-12-08

@frescosecco #Molmo is stubbornly refusing to accept my position on the matter ... Maybe the two of you know something I don't 😅

A screenshot of a chat log with the Molmo chatbot from Ai2. The bot insists the dish is a piece of meat, even after I ask it twice to correct the response.
2024-10-06

These Mini AI Models Match OpenAI With 1,000 Times Less Data

Jason Dorrier discusses the AI industry's focus on scaling up models and contrasts it with the Allen Institute for AI's (Ai2) approach of building smaller, more efficient models like Molmo.
Molmo outperforms larger models by training on high-quality data, and it is open source.

#ArtificialIntelligence #LLM #OpenSource #Ai2
#OpenAI #Molmo

singularityhub.com/2024/10/04/

2024-10-01

@codepo8 Multimodal LLMs like #Molmo can even read text from images.
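Below is a minimal sketch of that OCR-style use via Hugging Face transformers. It follows the usage pattern published on Ai2's Molmo model card; the allenai/Molmo-7B-D-0924 checkpoint name and the processor.process() / model.generate_from_batch() helpers are taken from that card, and the image URL is a placeholder.

```python
# Sketch: reading the text in an image with Molmo, per the pattern on
# Ai2's model card. Checkpoint name and helper methods are assumptions
# drawn from that card; the image URL below is a placeholder.
import requests
from PIL import Image
from transformers import AutoModelForCausalLM, AutoProcessor, GenerationConfig

MODEL_ID = "allenai/Molmo-7B-D-0924"
processor = AutoProcessor.from_pretrained(MODEL_ID, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID, trust_remote_code=True, torch_dtype="auto", device_map="auto"
)

# Placeholder image URL: swap in any image that contains text.
image = Image.open(requests.get("https://example.com/sign.jpg", stream=True).raw)

inputs = processor.process(images=[image], text="Read all the text in this image.")
inputs = {k: v.to(model.device).unsqueeze(0) for k, v in inputs.items()}  # add batch dim

output = model.generate_from_batch(
    inputs,
    GenerationConfig(max_new_tokens=300, stop_strings="<|endoftext|>"),
    tokenizer=processor.tokenizer,
)
# Decode only the newly generated tokens (the prompt tokens come first).
new_tokens = output[0, inputs["input_ids"].size(1):]
print(processor.tokenizer.decode(new_tokens, skip_special_tokens=True))
```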

2024-09-26

The Allen Institute for AI debuts its Multimodal Open Language Model, #Molmo, the most capable #opensource #AI model with visual abilities yet

wired.com/story/molmo-open-sou

#Ai2 #multimodal

2024-09-26

• 🧠 #AI2 unveils #opensource #Molmo #LLM family, competing with top proprietary models

• 🏆 72B-parameter Molmo outperforms #GPT4 in image and document comprehension tests

• 🎯 7B-parameter version approaches state-of-the-art performance with significantly less data

• 📊 Trained on 600k high-quality, annotated images vs. billions in other models

• 👆 New "pointing" capability allows Molmo to identify specific elements in images (see the sketch after this post)

• 🌐 Available for developers on #HuggingFace, promoting open-source #AI development

technologyreview.com/2024/09/2
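The "pointing" capability above is driven entirely by prompting: ask Molmo to point to something and it replies with coordinate markup embedded in its text answer. A hedged sketch of parsing that output follows; the `<point x=... y=...>` tag format and percentage-based coordinates are assumptions based on Ai2's public demos, and the function reuses the `model` and `processor` loaded in the earlier snippet.

```python
# Sketch: prompting Molmo to "point" and parsing the coordinates it returns.
# ASSUMPTION: Molmo answers pointing prompts with markup like
#   <point x="61.5" y="40.2" alt="mug">mug</point>
# where x/y are percentages of image width/height (based on Ai2's demos).
import re
from transformers import GenerationConfig

def ask_for_points(model, processor, image, target):
    """Return pixel (x, y) coordinates for every point Molmo emits."""
    inputs = processor.process(images=[image], text=f"Point to the {target}.")
    inputs = {k: v.to(model.device).unsqueeze(0) for k, v in inputs.items()}
    output = model.generate_from_batch(
        inputs,
        GenerationConfig(max_new_tokens=100, stop_strings="<|endoftext|>"),
        tokenizer=processor.tokenizer,
    )
    answer = processor.tokenizer.decode(
        output[0, inputs["input_ids"].size(1):], skip_special_tokens=True
    )
    # Grab every x/y attribute pair (covers both <point x= y=> and the
    # multi-point <points x1= y1= x2= y2=> variant), then scale to pixels.
    pairs = re.findall(r'x\d*="([\d.]+)"\s+y\d*="([\d.]+)"', answer)
    w, h = image.size
    return [(float(x) / 100 * w, float(y) / 100 * h) for x, y in pairs]
```

With the model loaded as in the earlier snippet, `ask_for_points(model, processor, image, "stop sign")` would return a list of pixel coordinates, one per point the model emits.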

Benjamin Carr, Ph.D. 👨🏻‍💻🧬BenjaminHCCarr@hachyderm.io
2024-09-25

A tiny new open-source AI model performs as well as powerful big ones
The Allen Institute for Artificial Intelligence (#Ai2) has released a family of models, called #Molmo, that it says perform as well as top proprietary models from OpenAI, Google, and Anthropic. The results suggest that training models on less, but higher-quality, data can lower computing costs.
Ai2 claims its biggest Molmo model, which has 72B parameters, outperforms GPT-4o, which is estimated to have over a trillion parameters.
technologyreview.com/2024/09/2

White hand icon on a blue, computer-output-like background
