Tom Smith discusses how Hugging Face’s new extension enables GitHub Copilot Chat users to access and test open-source language models in VS Code, broadening developers’ options for AI-assisted coding.

Hugging Face Opens GitHub Copilot Chat to Open-Source Models

Hugging Face has released an extension for Visual Studio Code that brings open-source large language models (LLMs) directly to GitHub Copilot Chat. This integration allows developers to test models like Kimi K2, DeepSeek V3.1, and GLM 4.5 in the familiar VS Code chat interface, offering more flexibility and choice in AI-assisted coding workflows.

How It Works

  • Installation: Install the Hugging Face Copilot Chat extension from the marketplace.
  • Setup: Open VS Code’s chat interface, select Hugging Face as your provider, add your token, and choose the desired model.
  • Requirements: VS Code version 1.104.0 or newer is required.
  • Switching Models: Easily change between models and providers without leaving your editor.

Key Features & Benefits

  • Open-Source Model Access: Connects Copilot Chat to Hugging Face’s network of inference providers, surfacing open-source LLMs that were previously unavailable in the VS Code chat interface.
  • Domain-Specific Choice: Models can be optimized for languages, frameworks, or industries, enabling tailored developer workflows.
  • Streamlined Workflow: Reduces the need to switch tools or tabs for model comparison.
  • Developer Flexibility: Lets developers experiment with models suited to their programming style or project requirements, such as those optimized for Rust or data science workflows.

Technical Foundation

  • Unified API: Utilizes Hugging Face Inference Providers, providing a single API for accessing different models.
  • Easy Integration: Switching models is as simple as selecting a different provider, requiring minimal code changes.
  • Compatibility: Works with existing OpenAI SDKs, lowering integration friction.
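The unified, OpenAI-compatible API described above can be sketched with the Python standard library. This is a minimal illustration, not the extension’s own code: the router URL and the model identifier follow Hugging Face’s published conventions but should be checked against the current Inference Providers documentation before use.

```python
import json
import os
import urllib.request

# Assumed OpenAI-compatible chat-completions route on the HF router;
# verify against the current Inference Providers docs.
ROUTER_URL = "https://router.huggingface.co/v1/chat/completions"


def build_chat_request(model: str, prompt: str, token: str) -> urllib.request.Request:
    """Build an OpenAI-style chat completion request for the HF router."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        ROUTER_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


# Model id is illustrative; switching models means changing only this string.
req = build_chat_request(
    "deepseek-ai/DeepSeek-V3.1",
    "Write a binary search in Python.",
    os.environ.get("HF_TOKEN", "hf_placeholder"),
)

# With a valid token in HF_TOKEN, the request could be sent like this:
#   with urllib.request.urlopen(req) as resp:
#       print(json.load(resp)["choices"][0]["message"]["content"])
```

Because the payload shape matches the OpenAI chat-completions format, the same request works through existing OpenAI SDKs by pointing their base URL at the router.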

Pricing and Access

  • Free Tier: Offers monthly inference credits for basic experimentation.
  • Upgraded Plans: Pro, Team, and Enterprise plans add higher capacity with transparent, pass-through pricing based on usage.

Real-World Use Cases

  • Code Generation: Use models specialized for frameworks, programming languages, or domain tasks.
  • Experimentation: Compare how different models handle the same coding problem without setting up parallel environments.
  • Evaluation: Teams can make informed decisions on which models fit their needs before committing to a workflow.
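The experimentation workflow above — one prompt, several models, answers side by side — can be sketched as a small helper. The model ids below are illustrative (check each model’s page on the Hub for the exact name), and the `send` callable stands in for whatever inference call a team actually uses:

```python
# Illustrative Hub model ids; verify the exact names on huggingface.co.
MODELS = [
    "moonshotai/Kimi-K2-Instruct",
    "deepseek-ai/DeepSeek-V3.1",
    "zai-org/GLM-4.5",
]


def compare_models(prompt, send):
    """Run one prompt against every model in MODELS.

    `send` is a caller-supplied function (model_id, prompt) -> str,
    e.g. a thin wrapper around an OpenAI-compatible client.
    Returns a dict of answers keyed by model id.
    """
    return {model: send(model, prompt) for model in MODELS}


# Offline usage example with a stub in place of a real inference call:
answers = compare_models(
    "Refactor this loop into a list comprehension.",
    lambda model, prompt: f"[{model}] (answer would appear here)",
)
```

Swapping the stub for a real client turns this into a quick side-by-side evaluation without setting up parallel environments.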

Signs of an Open AI Future

This integration shows the growing maturity and openness of AI tooling within developer workflows. Developers now have alternatives beyond proprietary models and are empowered to customize their coding environment according to their preferred stacks and tasks.

How to Get Started

  1. Update VS Code to version 1.104.0 or later.
  2. Install the Hugging Face Copilot Chat extension from the marketplace.
  3. Create a Hugging Face account and generate an access token.
  4. Experiment on the free tier, switching between open-source models as needed.

Developers are encouraged to try out various LLMs for their coding workflows and enjoy increased flexibility when integrating AI into their development process.

This post appeared first on “DevOps Blog”.