We are grateful for your interest in contributing to the Confluent VSCode extension. This guide includes information on:
- getting started for the first time
- a quick reference for contributors who need a refresher on the process
The best developers are those who are familiar with the user experience of the software they create, so we ask that you review the project's official documentation before diving in. We also recommend reading the VS Code extension development documentation to familiarize yourself with the extension development process.
There are three main ways to contribute:
- File an Issue
- You can report bugs, request features, or ask questions by filing an issue in our GitHub repository.
- Submit or Review Pull Requests (PR)
- In this case, we ask that you read the rest of this guide to understand the process and requirements for contributing code.
- Ask A Question Or Contribute to a Discussion
- Use our GitHub Discussions page to participate in the conversation around the extension's development. As always, we ask that you follow our Code of Conduct.
Quicklinks:
Our README contains a project overview, including a description of the extension's features and functionality. Again, you can also find the official documentation for the extension on the Confluent website.
You can report problems or comment on issues without installing the tools, getting the code, or building the code. All you need is a GitHub account.
If you spot a problem with the app, code, or docs, first search to see whether an issue already exists. If a related issue doesn't exist, you can open a new issue using a relevant issue form.
Scan through our existing issues to find one that
interests you. You can narrow down the search using labels as filters. See GitHub's
label reference docs
for more information.
Note
As a general rule, you are welcome to open a PR with a fix unless that issue is already assigned to someone else, or someone else has added a comment that they are working on it. Currently unassigned issues can be found using this filter.
All changes are made through pull requests (PRs). Every PR's Semaphore CI/CD build must pass. The Confluent team will review PRs and provide feedback; once the changes in the PR are acceptable, the team will merge the PR onto the appropriate branch.
Note that while a SonarQube report of less than 80% code coverage will not block a PR, we do expect that the code coverage will improve with each PR. A member of the Confluent team will need to comment "/sem-approve" on the PR to approve external PRs for merging.
To create a PR, you must create a fork of this repository and set up your machine with the tools needed for development. These steps are outlined below.
- Install the required tools and dependencies (see Tools and Prerequisites).
- Clone your fork of the repository.
- Set up your local Git repository to track the upstream repository, learn to build, clean and format locally (see Sync your repo with ours).
If you want to work with this project's codebase and maybe contribute to it, you will need to have some development tools. We use GitHub, so consult the GitHub docs if you are unfamiliar with the GitHub flow. This project uses the following software that you may already have:
- Git — version 2.40.0 or later
- Node.js — version 22.17.0 or later. npm is usually installed alongside it
- Visual Studio Code — version 1.87.0 or later
See the links above for installation instructions on your platform. You can verify that the versions you have installed are working:

- `git --version` should be 2.40.0 or later,
- `node --version` should be 22.17.0 or later, and
- `code --version` should be 1.87.0 or later.

See the VS Code Command Line Interface Guide for more information about `code` command usage.
The project also uses these tools:
- NVM — optional, Node.js version manager.
- Gulp — task automation tool. It is installed along with other Node.js dependencies, but you may want to install a global CLI package as well:
```shell
npm install -g gulp
```

The project also uses several services:
- GitHub — this project is on GitHub, so to contribute you'll need a GitHub account.
- Semaphore CI/CD — continuous integration and deployment service. You should not need an account.
Go to this repository on GitHub and click the "Fork" button near the upper right corner of the page. Complete the form and click the "Create fork" button to create your own https://github.com/YOUR-USERNAME/vscode repository. This is the repository to which you will upload your proposed changes and create pull requests. See the GitHub documentation for details.
To work locally on the code, you need to pull the code onto your machine. At a terminal, go to the directory in which you want to place a local clone of this repository, and run the following commands to use SSH authentication (recommended):
```shell
git clone git@github.com:YOUR-USERNAME/vscode.git
```

or with HTTPS:

```shell
git clone https://github.com/YOUR-USERNAME/vscode.git
```

This will create a `vscode` directory and pull the contents of your forked repository. Change into that directory:

```shell
cd vscode
```

If you intend to propose changes to our upstream repository, you should next configure your local repository to be able to pull code from the project's remote repository, called the upstream repository.
Use the following command to see the current remotes for your fork:
```shell
git remote -v
```

which will output something like:

```
origin    git@github.com:YOUR-USERNAME/vscode.git (fetch)
origin    git@github.com:YOUR-USERNAME/vscode.git (push)
```

or if you used HTTPS:

```
origin    https://github.com/YOUR-USERNAME/vscode.git (fetch)
origin    https://github.com/YOUR-USERNAME/vscode.git (push)
```

Then run the following command to add the project's repository as a remote called `upstream`:

```shell
git remote add upstream git@github.com:confluentinc/vscode.git
```

or if you used HTTPS:

```shell
git remote add upstream https://github.com/confluentinc/vscode.git
```

To verify the new upstream repository you have specified for your fork, run this command again:

```shell
git remote -v
```

You should see the URL for your fork as `origin`, and the URL for the project's upstream repository as `upstream`. If you used SSH, this will look something like:

```
origin    git@github.com:YOUR-USERNAME/vscode.git (fetch)
origin    git@github.com:YOUR-USERNAME/vscode.git (push)
upstream  git@github.com:confluentinc/vscode.git (fetch)
upstream  git@github.com:confluentinc/vscode.git (push)
```

Once set up, you can periodically sync your fork with the upstream repository using just a few Git commands. The most common way is to keep your local main branch in sync with the upstream repository's main branch:

```shell
git checkout main
git fetch upstream
git pull upstream main
```

You can create local branches from main and do your development there.
Note
You don't need to keep the main branch on your remote https://github.com/YOUR-USERNAME/vscode
repository in sync, but you can if you want:
```shell
git push origin main
```
For more details and other options, see "Syncing a fork" in GitHub's documentation.
To install frontend-related dependencies, use NPM:
```shell
npm ci
```

We recommend using `npm ci` over `npm install` so that you get a reproducible set of dependencies, as defined by `package-lock.json`.
Now that you have the source code and installed all the tools, you can build the project locally.
First check out the main branch:
```shell
git checkout main
```

and pull the latest changes from the project's repository:

```shell
git pull upstream main
```

Now you can compile the extension code:

```shell
gulp build
```

When using VS Code, you can run the extension from the Run and Debug tab. The project includes the necessary configs in the `.vscode` folder to define what needs to be done to run the extension in debug mode.

To check the code against style conventions and potential bugs:

```shell
gulp lint
gulp check
```

To get a brief overview of the existing automated tasks:

```shell
gulp --tasks
```

The build will create a lot of local files. You can clean up these generated files with:

```shell
gulp clean
```

Cleaning is often useful to ensure that all generated files, JARs, and executables are removed before rerunning the build and tests.
We use Prettier for code formatting. To format the code, run:
```shell
gulp format
```

You can also install the Prettier extension for VS Code, which will format TypeScript documents on save based on the `.prettierrc` and `.vscode/settings.json` files in the project.
This project uses Husky to run automated checks before commits:
- Linting runs automatically before each commit via `.husky/pre-commit`
- To fix issues before committing: `npx gulp lint --fix <pattern>`
- To bypass the checks (not recommended): `git commit --no-verify`
This project uses unit, integration, functional (webview), and end-to-end (E2E) tests to verify functionality and identify regressions.
We use Mocha for unit and integration tests, and
Sinon for stubbing. These tests are located in the same directories as the
production code they test, and follow the *.test.ts naming pattern. (For example,
src/commands/connections.test.ts contains tests for
src/commands/connections.ts.)
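The co-location convention above can be sketched as a small helper; `testFileFor` is an illustrative name, not part of the codebase:

```typescript
// Map a production source file to its co-located Mocha test file,
// following the `*.test.ts` naming pattern described above.
// `testFileFor` is a hypothetical helper for illustration only.
function testFileFor(sourcePath: string): string {
  if (!sourcePath.endsWith(".ts") || sourcePath.endsWith(".test.ts")) {
    throw new Error(`not a production TypeScript file: ${sourcePath}`);
  }
  return sourcePath.replace(/\.ts$/, ".test.ts");
}

// src/commands/connections.ts → src/commands/connections.test.ts
console.log(testFileFor("src/commands/connections.ts"));
```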
To run the existing unit/integration tests:
```shell
gulp test
```

You can also specify a test name/title (either from a `describe` or `it` block) to run specific tests:

```shell
gulp test -t "Extension manifest tests"      # describe() title
gulp test -t "should register all commands"  # it() title
```

Or add `.only` after a `describe` or `it` block in the test file (see docs):

```typescript
describe.only("Extension manifest tests", () => {
  it("should register all commands", () => {
    // test code
  });
});
```

Warning
Remember to remove .only after running the test(s) and before raising a PR for review!
Functional tests are written for the content the extension displays in webviews. These tests run using the Playwright Test framework and cover the webviews' content and behavior from the perspective of user interactions.
To run all existing functional tests:
```shell
gulp functional
```

End-to-end (E2E) tests are written for the extension's functionality in a real VS Code environment.
These tests use Playwright with Electron to launch
VS Code and interact with it programmatically. The tests are located in the
tests/e2e directory.
See E2E tests using Playwright for more information on running, writing, and debugging E2E tests.
To run all existing E2E tests:
```shell
gulp e2e
```

Similar to the Mocha tests, Playwright tests (functional or E2E) can also be filtered by test name/title using the `-t` flag:

```shell
# Functional/webview tests
gulp functional -t "should render with the correct title"

# E2E tests
gulp e2e -t "Project Scaffolding"
```

The `.only` pattern is also supported by using `test.describe.only` and/or `test.only` in the test file.
Warning
Remember to remove .only after running the test(s) and before raising a PR for review!
- Clone the ide-sidecar repo and check out the branch/commit you want to test against. (Also ensure any sidecar prerequisites are installed.)
- Within the cloned `ide-sidecar` directory, run the following `make` command to build the native executable:

  ```shell
  make clean mvn-package-native-no-tests
  ```

- Set the environment variable `$VS_CODE_EXTENSION_PROJECT` to the local directory holding this repository:

  ```shell
  export VS_CODE_EXTENSION_PROJECT=~/Code/vscode
  ```

  Next, copy the native executable to the `bin` directory in the VS Code extension project:

  ```shell
  cp ./target/ide-sidecar-0.*.0-runner $VS_CODE_EXTENSION_PROJECT/bin
  ```

  Otherwise, use your preferred method to copy the native executable from the sidecar's target directory to the `vscode/bin/` directory.

- Update `.versions/ide-sidecar.txt` to ensure it matches the version in the native executable's file name. (For example, testing against `ide-sidecar-0.123.0-runner` requires `.versions/ide-sidecar.txt` to be `v0.123.0`.)
- If there are any changes to the sidecar's OpenAPI spec, copy `ide-sidecar/src/generated/resources/openapi.yaml` into `src/clients/sidecar-openapi-specs/sidecar.openapi.yaml` and run `gulp apigen` (see more here).
- If there are any changes to the sidecar's GraphQL schema, copy `ide-sidecar/src/generated/resources/schema.graphql` into `src/graphql/sidecar.graphql` and similarly run `gulp apigen` (see more here).
- Run `gulp ci` to check for TypeScript and ESLint errors and confirm the extension builds successfully.
- Run the extension either via Run > Start Debugging (F5) or by packaging the .vsix and installing with `gulp clicktest`.
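The version-matching rule in the steps above can be sketched as a check. `sidecarVersionMatches` and its inputs are illustrative, not part of the repo's tooling:

```typescript
// Check that the version embedded in a sidecar executable's file name
// (e.g. "ide-sidecar-0.123.0-runner") matches the contents of
// .versions/ide-sidecar.txt (e.g. "v0.123.0"). Hypothetical helper
// for illustration; not actual project code.
function sidecarVersionMatches(executableName: string, versionsFileText: string): boolean {
  const m = /^ide-sidecar-(\d+\.\d+\.\d+)-runner$/.exec(executableName);
  if (m === null) {
    return false; // file name doesn't follow the runner naming pattern
  }
  return versionsFileText.trim() === `v${m[1]}`;
}

console.log(sidecarVersionMatches("ide-sidecar-0.123.0-runner", "v0.123.0\n")); // true
```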
Most development occurs on the main branch. Larger feature branches may be created off of main
for development, and then merged back into main when complete.
We use semantic versioning, so our version numbers are of the form
vMAJOR.MINOR.PATCH, such as v1.2.0. We create release branches off of main using a
vMAJOR.MINOR.x pattern (for example, v1.2.x), and create and publish releases off of those
branches.
If we need to patch a release, we will create a PR against the associated vMAJOR.MINOR.x release
branch before releasing a patch version (for example, v1.2.1). If we need to make additional
fixes, we'll continue to do so against this same branch and release subsequent patch versions (e.g.,
v1.2.2, v1.2.3, etc.).
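The branch-naming convention can be sketched as follows; `releaseBranch` is a hypothetical helper for illustration, not part of the release tooling:

```typescript
// Derive the release branch name (vMAJOR.MINOR.x) for a given
// semantic version tag such as "v1.2.1". Illustrative only.
function releaseBranch(version: string): string {
  const m = /^v(\d+)\.(\d+)\.\d+$/.exec(version);
  if (m === null) {
    throw new Error(`unexpected version format: ${version}`);
  }
  return `v${m[1]}.${m[2]}.x`;
}

console.log(releaseBranch("v1.2.0")); // "v1.2.x"
console.log(releaseBranch("v1.2.1")); // "v1.2.x" — patches target the same branch
```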
This project's releases (and pre-releases) are published to GitHub releases.
We use openapi-generator-cli with the
typescript-fetch generator to
create the client code from OpenAPI specs.
The generated client code helps to make requests to the services defined in the OpenAPI specs without needing to manually write the request/response structures, middlewares, handlers, and more.
To generate the client code, run the apigen task:
```shell
gulp apigen
```

This task generates the client code for all OpenAPI specs in the src/clients directory.
- Copy the associated OpenAPI spec file(s) to the `src/clients` directory.
  - For requests handled by the sidecar*, place them in the `src/clients/sidecar-openapi-specs` directory.
  - For other requests (like to the local Docker engine API), place them in the `src/clients` directory.
- Update the `apigen` task's `clients` array in the `Gulpfile.js` to include the path of the new OpenAPI spec file(s) and their destination directory. For example:

  ```diff
   const clients = [
     // existing clients
     ["src/clients/sidecar-openapi-specs/sidecar.openapi.yaml", "src/clients/sidecar"],
     ["src/clients/sidecar-openapi-specs/ce-kafka-rest.openapi.yaml", "src/clients/kafkaRest"],
     ["src/clients/sidecar-openapi-specs/schema-registry.openapi.yaml", "src/clients/schemaRegistryRest"],
     ["src/clients/sidecar-openapi-specs/scaffolding-service.openapi.yaml", "src/clients/scaffoldingService"],
  -  ["src/clients/docker.openapi.yaml", "src/clients/docker"]
  +  ["src/clients/docker.openapi.yaml", "src/clients/docker"],
  +  ["src/clients/sidecar-openapi-specs/new-service-openapi.yaml", "src/clients/newService"],
   ];
  ```

- Run the `apigen` task:

  ```shell
  gulp apigen
  ```
*For sidecar-handled requests, update
SidecarHandle
with any custom headers and/or other configurations.
Sometimes, we need to make manual adjustments to OpenAPI specs before generating the client code. To
ensure these changes are not lost, we have a
src/clients/sidecar-openapi-specs/patches directory
where we can store these changes as .patch files.
The apigen task tries to apply these patches to the OpenAPI specs before generating the client
code by using a glob pattern to find all .patch files in the patches directory.
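The glob-then-apply flow can be sketched as below. Only the file-selection step is shown; the helper name and sorting behavior are assumptions, not the actual Gulpfile code:

```typescript
// Select the .patch files that the apigen task would apply before
// generating client code. A sketch of the glob step only; the real
// task also applies each patch to the specs, which is omitted here.
function selectPatchFiles(dirListing: string[]): string[] {
  return dirListing.filter((name) => name.endsWith(".patch")).sort();
}

const listing = ["README.md", "sidecar.openapi.patch", "kafka-rest.patch"];
console.log(selectPatchFiles(listing)); // ["kafka-rest.patch", "sidecar.openapi.patch"]
```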
We use GraphQL queries to the sidecar process to retrieve information about available connections and their resources, such as environments, Kafka clusters, Schema Registry instances, and Flink compute pools.
To enable type-safe GraphQL operations, we rely on gql.tada to
generate TypeScript types from our GraphQL schema. The
src/graphql/sidecar.graphql
file contains the GraphQL schema for the sidecar, which is used to generate the TypeScript
declarations file at
src/graphql/sidecarGraphQL.d.ts.
This file is auto-generated and should not be edited manually.
Usually, type generation will be done automatically based on tsconfig.json. If you update the
schema and need to manually regenerate sidecarGraphQL.d.ts, run the following command from the
root of the repository:
```shell
npx gql-tada generate output
```

(Or re-run `gulp apigen`.)
See the
gql.tada documentation
for more details.
Note
The LICENSE.txt file contains the full text of the Apache License, Version 2.0. This file will never need to be updated.
A Semaphore CI/CD pipeline (See "Update third party notices PR" block in .semaphore/semaphore.yml)
automatically raises a Pull Request to update the THIRD_PARTY_NOTICES.txt and NOTICE-vsix.txt
files, on the following conditions (when a PR is merged into the main branch or a release branch,
e.g., v1.2.x):
- Any change to the `package.json` file (e.g., adding a new dependency, updating an existing one)
- Any change to the `NOTICE.txt` file
- Any change to the `scripts/notices/NOTICE-vsix_PREAMBLE.txt` file
The pipeline calls the make update-third-party-notices-pr target, which in turn calls the
following targets:
- `make generate-third-party-notices` to generate the `THIRD_PARTY_NOTICES.txt` file
- `make collect-notices-vsix` to generate the `NOTICE-vsix.txt` file: appends `NOTICE.txt`, `scripts/notices/NOTICE-vsix_PREAMBLE.txt`, and `NOTICE*` files from all dependency NPM packages.
The raised PR must be reviewed and merged by a maintainer. The PR title will be suffixed
with [ci skip] to avoid triggering the pipeline again.
Hidden settings
Some extension settings are intentionally not exposed in package.json to avoid confusing users
with experimental, internal, or pre-MVP features. These "hidden" settings allow more flexible
development and testing without risking breaking user workflows, and can be considered even less
mature than VS Code's
experimental-tagged settings.
When to use hidden settings in development
Hidden settings are appropriate for:
- Pre-MVP features that need more work and/or testing and aren't ready for wider use
- Experimental functionality that may change or be removed
- Developer-only features that shouldn't appear in VS Code's Settings UI
- Settings that require technical knowledge to configure correctly
How hidden settings work
In src/extensionSettings/constants.ts, hidden settings
use the Setting<T> base class instead of ExtensionSetting<T>:
```typescript
// Regular setting (in package.json)
export const SHOW_SIDECAR_EXCEPTIONS = new ExtensionSetting<boolean>(
  "confluent.debugging.showSidecarExceptions",
  SettingsSection.GENERAL,
);

// Hidden setting (not in package.json)
export const ENABLE_MEDUSA_CONTAINER = new Setting<boolean>(
  "confluent.localDocker.medusaEnable",
  undefined, // no section = hidden
);
```

Key differences:

- `sectionTitle` is `undefined` for hidden settings
- `defaultValue` returns `undefined` (no default from package.json)
- `value` may return `undefined` if not configured by the user
- Code using hidden settings must handle `undefined` values
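Because a hidden setting's value may be `undefined`, consuming code typically guards for it. A minimal sketch of that pattern — the `medusaImage` helper and its inputs are hypothetical, not from the codebase:

```typescript
// Build a full Medusa image reference from hidden-setting values, which
// may be undefined when the user hasn't configured them. Hypothetical
// sketch of the guard-for-undefined pattern; not actual extension code.
function medusaImage(repo: string | undefined, tag: string | undefined): string | undefined {
  if (repo === undefined) {
    return undefined; // hidden setting unset: feature stays disabled
  }
  return `${repo}:${tag ?? "latest"}`; // fall back to a default tag
}

console.log(medusaImage(undefined, undefined)); // undefined
console.log(medusaImage("example.dev/medusa", "0.2.1")); // "example.dev/medusa:0.2.1"
```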
Making hidden settings available for testing
Users can manually configure hidden settings in their VS Code settings.json:
- Open the Command Palette (`Cmd+Shift+P` on macOS, `Ctrl+Shift+P` on Windows/Linux)
- Select "Preferences: Open User Settings (JSON)"
- Add the setting(s) directly, for example:

  ```json
  {
    "confluent.localDocker.medusaEnable": true,
    "confluent.localDocker.medusaImageRepo": "us-east1-docker.pkg.dev/medusa-prod-env/medusa/medusa",
    "confluent.localDocker.medusaImageTag": "0.2.1"
  }
  ```

(These settings will likely show an "Unknown Configuration Setting" warning in the Settings UI until/unless they are added to package.json.)
Currently available hidden settings
- `confluent.localDocker.medusaEnable` (boolean): Enable the Medusa container for local Docker environments.
- `confluent.localDocker.medusaImageRepo` (string): Custom Docker image repository for the Medusa container.
- `confluent.localDocker.medusaImageTag` (string): Custom Docker image tag for the Medusa container.
Warning
These settings are subject to change and may be removed in future releases. As they are not exposed by default in the extension, users should use them with caution and expect potential bugs, missing features, and/or instability.