#Codecompletion

2025-07-02

How to pass an AI coding benchmark: train on the questions

SWE-Bench Verified by OpenAI tests how well a model can solve real bugs in real Python code from GitHub. These bugs are all public information — so the AI models have almost certainly trained on the actual text of the bug and on the fix for the bug. In “The SWE-Bench Illusion,” researchers at Purdue […]

pivot-to-ai.com/2025/07/02/how
#Codecompletion #Papers #O3

Miguel Afonso Caetanoremixtures@tldr.nettime.org
2025-07-02

The GitHub Copilot Chat client for VS Code is now open source under the MIT license. Here's the source code:

"As Copilot Chat releases in lockstep with VS Code due to its deep UI integration, every new version of Copilot Chat is only compatible with the latest and newest release of VS Code. This means that if you are using an older version of VS Code, you will not be able to use the latest Copilot Chat.

Only the latest Copilot Chat versions will use the latest models provided by the Copilot service, as even minor model upgrades require prompt changes and fixes in the extension. An older version of Copilot Chat will still use the latest version of Copilot completions."

github.com/microsoft/vscode-co

#AI #GenerativeAI #CodeCompletion #Copilot #Github #Programming #SoftwareDevelopment #VSCode #VisualStudioCode #Microsoft

2025-05-24

AI coding bot allows prompt injection with a pull request

GitLab is a program code repository. It’s got an AI coding bot, because of course it does — it’s called Duo and it runs on Claude. Duo will make suggestions, analyse submitted pull requests and even give security tips! [GitLab] You can prompt-inject the bot by submitting a malicious pull request, and the bot will […]

pivot-to-ai.com/2025/05/24/ai-
#Codecompletion

All Things Openallthingsopen
2024-09-12

🚀 NEW on We ❤️ Open Source 🚀

Learn how to create your own AI coding assistant with Continue and Ollama, entirely open-source! Ty Dunn walks you through setting up models, optimizing for your needs, and more.

buff.ly/47nPHwS

Left side says We Love Open Source. #WeLoveOpenSource. ATO. A community education resource from All Things Open. Right side has a painting that looks like waves in various shades of blue.
2024-04-11

Das neue #PhpStorm hat jetzt zeilenweises #CodeCompletion per #KI. Das ist schon spooky, wie gut die erkennt, was ich gerade tippen wollte.. 😮

Wenn die KI etwas vorschlug traf es in meinem Fall bisher immer zu. Einmal Tab drücken und der Code, den ich gerade tippen wollte, steht da.. Das beschleunigt vor allem die langweiligen Codeteile, die ohne großes Nachdenken entstehen, doch erheblich. Hilfreich!
👍

2024-03-08

StarCoder2 is a family of code generation models (3B, 7B, and 15B), trained on 600+ programming languages from The Stack v2 and some natural language text such as Wikipedia, Arxiv, and GitHub issues. The models use Grouped Query Attention, a context window of 16,384 tokens, with sliding window attention of 4,096 tokens. The 3B & 7B models were trained on 3+ trillion tokens, while the 15B was trained on 4+ trillion tokens. For more details check out the paper.

StarCoder2 @ Github

StarCoder2 is a family of open LLMs for code and comes in 3 different sizes with 3B, 7B and 15B parameters. The flagship StarCoder2-15B model is trained on over 4 trillion tokens and 600+ programming languages from The Stack v2. All models use Grouped Query Attention, a context window of 16,384 tokens with a sliding window attention of 4,096 tokens, and were trained using the Fill-in-the-Middle objective.

StarCoder2 offers three model sizes: a 3 billion-parameter model trained by ServiceNow, a 7 billion-parameter model trained by Hugging Face, and a 15 billion-parameter model trained by NVIDIA using NVIDIA NeMo on NVIDIA accelerated infrastructure:

StarCoder2 @ Hugging Face

 

https://www.symphora.com/2024/03/starcoder2-open-source-code-completion-models/

#AI #codeCompletion #Huggingface #LLM #NVIDIA #StarCoder2

Client Info

Server: https://mastodon.social
Version: 2025.04
Repository: https://github.com/cyevgeniy/lmst