#Devstral

2025-05-31

I asked the #LLM #devstral to "write a bash script to store a command's standard output and standard error in an sqlite3 database".

It churned out a perfectly plausible-looking script, broken down into functions, with comments and everything…

Except that:

# Capture standard output and standard error in separate variables
output=$(($command 2>&1 >/tmp/output) ; cat /tmp/output)
error=$(($command 2>&1 >/tmp/error) ; cat /tmp/error)

I'll spare you the festival of SQL injection in the function that records the data.

I ask it ONE question, it answers with crap. In short, I used an LLM.
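For reference, a minimal sketch of what a working version could look like (not the model's output; the commands.db file, the runs table and the esc helper are illustrative names): capture the two streams into separate temporary files, then double the single quotes before building the INSERT, since the sqlite3 CLI offers no real bind parameters to a shell script.

#!/usr/bin/env bash
# Run the command passed as arguments, capturing stdout and stderr separately.
cmd=("$@")
out_file=$(mktemp)
err_file=$(mktemp)
"${cmd[@]}" >"$out_file" 2>"$err_file"
status=$?
out=$(<"$out_file")
err=$(<"$err_file")
rm -f "$out_file" "$err_file"

# No bind parameters in the sqlite3 CLI, so escape by doubling single quotes.
esc() { printf '%s' "$1" | sed "s/'/''/g"; }

sqlite3 commands.db <<SQL
CREATE TABLE IF NOT EXISTS runs (cmd TEXT, status INTEGER, stdout TEXT, stderr TEXT);
INSERT INTO runs (cmd, status, stdout, stderr)
VALUES ('$(esc "${cmd[*]}")', $status, '$(esc "$out")', '$(esc "$err")');
SQL

Invoked as, say, ./store_run.sh sh -c 'echo ok; echo oops >&2' (the script name is arbitrary), it records the exit status, "ok" as stdout and "oops" as stderr in a single row.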

Nicolas Fränkel 🇺🇦🇬🇪 frankel@mastodon.top
2025-05-25
2025-05-23

Cobbled together an #ExoLabs cluster to fuck around with #devstral a bit, since it's kinda too big for my M3 Max daily driver. While bringing up the nodes, the model hit a bug in the #MLX #Python module that deals with inference model sharding, related to passing around MLX vs. Numpy data structures.

For shits and giggles and also not being a top-tier #Numpy data structure debugging guy I asked Devstral to look at the bug and figure out a fix. After one wrong turn it came up with a fix which I applied to the other nodes and now it's happily sharding the bigger Devstral models. Not sure about vibe coding as a social contagion but from a “How close are we to #Skynet”-perspective I think we're cooked, chat.

Anyway enjoy your Memorial Day weekend 🎉

Figure 1. A very heterogeneous Exo cluster.

🤖 #MistralAI and All Hands AI have presented #Devstral - a new LLM for complex coding tasks.

The model scores 46.8% on SWE-Bench Verified - 6% higher than any other open model, while its size lets you run it even on an RTX 3090 or a Mac with 32 GB of RAM.

🔗 mistral.ai/news/devstral

Sara Zan zansara
2025-05-23

📢 Don't overlook this in the wave of releases! Mistral AI has a new coding LLM: it's Devstral, an open model perfect for on-prem, private and local deployments 🐈

📰 Have a look at the announcement: mistral.ai/news/devstral

SWE Bench results for Devstral
Berndt.Legal BerndtLegal
2025-05-22

Really strong - and it runs on a Mac Mini with 32 GB of RAM under Ollama. mistral.ai/news/devstral

st1nger :unverified: 🏴‍☠️ :linux: :freebsd: st1nger@infosec.exchange
2025-05-22

#Devstral is an #agentic #LLM for #software #engineering tasks, built in collaboration between #Mistral #AI and All Hands AI. 24 billion parameters, context window of up to 128k tokens.

ollama.com/library/devstral
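For anyone who wants to try it locally the same way, a minimal sketch with the Ollama CLI, assuming the devstral tag from the library page above:

# Pull the default quantization of the 24B model and chat with it locally.
ollama pull devstral
ollama run devstral "Write a bash script that stores a command's stdout and stderr in sqlite3"
# Ollama also exposes a local HTTP API on localhost:11434 for editor and agent integrations.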

2025-05-22

Together with AllHandsAI, Mistral AI has released its own agentic LLM for software engineering tasks - #Devstral.

The key point is that it can run on your local machine and still, according to their own benchmarks, achieves the best performance of all open-source models.

That way your data stays in your own hands.

#MistralAI #Claude #DevLife #AI #AgenticAI #LLM #Codex

mistral.ai/news/devstral

2025-05-21

#Devstral: New #opensource Model for Coding Agents by #MistralAI & #AllHandsAI 🧠

• 🏆 #Devstral achieves 46.8% on #SWEBench Verified, outperforming previous #opensource models by over 6 percentage points and surpassing #GPT4 mini by 20%

🧵👇#AI #coding

Rod2ik 🇪🇺 🇨🇵 🇪🇸 🇺🇦 🇨🇦 🇩🇰 🇬🇱 rod2ik
2025-05-21

#Mistral unveils #Devstral, an #IA #AI model built for #code which, according to #Mistral, outperforms #Google's #Gemma 3 27B and #DeepSeek V3 www.mac4ever.com/ia/189403-mi...

N-gated Hacker News ngate
2025-05-21

🥳🎉 Behold, the messiah 'Devstral' has arrived, promising to revolutionize software development by finally making your job obsolete! 🚀🔧 Because nothing screams "cutting-edge innovation" like regurgitating code snippets under the guise of being more "agentic" than your average bear. 🙄🤖
mistral.ai/news/devstral
