#codecompletion


The GitHub Copilot Chat client for VS Code is now open source under the MIT license. Here's the source code:

"As Copilot Chat releases in lockstep with VS Code due to its deep UI integration, every new version of Copilot Chat is only compatible with the latest and newest release of VS Code. This means that if you are using an older version of VS Code, you will not be able to use the latest Copilot Chat.

Only the latest Copilot Chat versions will use the latest models provided by the Copilot service, as even minor model upgrades require prompt changes and fixes in the extension. An older version of Copilot Chat will still use the latest version of Copilot completions."

github.com/microsoft/vscode-copilot-chat

microsoft/vscode-copilot-chat on GitHub: Copilot Chat extension for VS Code.
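
For context on that lockstep: VS Code extensions declare which editor versions they can run in via the engines field of their package.json, and the editor refuses to install an extension whose requirement it does not satisfy. A hypothetical manifest pinned this way might look like the following (an illustrative sketch, not Copilot Chat's actual manifest; name and version are made up):

    {
      "name": "copilot-chat-example",
      "publisher": "example-publisher",
      "engines": {
        "vscode": "^1.90.0"
      }
    }

An extension built against a newer engine version than your installed VS Code simply will not install, which is why older editors cannot pick up the latest Copilot Chat.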

The new #PhpStorm now has line-by-line #CodeCompletion via #KI (AI). It's a bit spooky how well it recognizes what I was just about to type... 😮

So far, whenever the AI suggested something, it was spot on in my case. Press Tab once and the code I was about to type is right there. That really speeds up the boring parts of the code, the ones written without much thought. Helpful!
👍

StarCoder2 is a family of code generation models (3B, 7B, and 15B), trained on 600+ programming languages from The Stack v2 and some natural language text such as Wikipedia, arXiv, and GitHub issues. The models use Grouped Query Attention and a context window of 16,384 tokens with sliding window attention of 4,096 tokens. The 3B and 7B models were trained on 3+ trillion tokens, while the 15B was trained on 4+ trillion tokens. For more details, check out the paper.

StarCoder2 @ GitHub
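
As a concrete illustration, here is a minimal completion sketch using the Hugging Face transformers library, assuming the public bigcode/starcoder2-3b checkpoint (swap in the 7B or 15B model id as needed):

    # Minimal code-completion sketch with StarCoder2; assumes the
    # `transformers` and `torch` packages and enough memory for the
    # 3B checkpoint.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    checkpoint = "bigcode/starcoder2-3b"
    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    model = AutoModelForCausalLM.from_pretrained(checkpoint)

    prompt = "def fibonacci(n):"
    inputs = tokenizer(prompt, return_tensors="pt")
    # Greedy decoding of up to 64 new tokens continuing the prompt.
    outputs = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))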

StarCoder2 is a family of open LLMs for code and comes in three sizes: 3B, 7B, and 15B parameters. The flagship StarCoder2-15B model is trained on over 4 trillion tokens and 600+ programming languages from The Stack v2. All models use Grouped Query Attention, a context window of 16,384 tokens with sliding window attention of 4,096 tokens, and were trained using the Fill-in-the-Middle objective.
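
The Fill-in-the-Middle objective means the model can complete a gap given code on both sides, not just a left-to-right prefix. A sketch of the prompt format, assuming the StarCoder-style FIM special tokens (<fim_prefix>, <fim_suffix>, <fim_middle>):

    # Fill-in-the-Middle prompt sketch: the model generates the missing
    # middle segment after the <fim_middle> marker. Token names assume
    # the StarCoder-style FIM vocabulary.
    prefix = "def average(values):\n    total = "
    suffix = "\n    return total / len(values)"
    prompt = f"<fim_prefix>{prefix}<fim_suffix>{suffix}<fim_middle>"
    # Fed to the model above, this would ideally complete the gap with
    # something like: sum(values)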

StarCoder2 offers three model sizes: a 3-billion-parameter model trained by ServiceNow, a 7-billion-parameter model trained by Hugging Face, and a 15-billion-parameter model trained by NVIDIA using NVIDIA NeMo on NVIDIA accelerated infrastructure:

StarCoder2 @ Hugging Face

 

https://www.symphora.com/2024/03/starcoder2-open-source-code-completion-models/

The Stack v2 @ Hugging Face: huggingface.co/datasets/bigcode/the-stack-v2