Best way to develop extension for JupyterLab >= 3

I’m currently building a JupyterLab extension and have been doing so since shortly before the JupyterLab 4 release. I then decided to simply upgrade to 4.x and adapt my extension, thinking I would have to do it at some point anyway.

My issue now is that the intended users of my extension (a group of students) have mostly not upgraded to JupyterLab 4 and will mainly stay on 3.x for the upcoming semester. Since I’ve already been through the process of adapting my calls to the JupyterLab API in my extension to match the new JupyterLab 4 API, I could do it again the other way around and downgrade my extension to 3.x. However, some students might already be using JupyterLab 4, so I would actually like to have my extension available for JupyterLab >= 3.

My questions are the following:

  1. Is it possible to have an extension that works for both versions of JupyterLab without having two separate development repos/branches that I have to keep in sync myself? If yes, can someone detail the most convenient development workflow to achieve this?
  2. In case I release this extension as a Python package, will I need to have two different packages, or can I somehow check the version of JupyterLab the client installing the package is using to determine which version of my package should be installed?

Thank you for your help!


It is possible to support both, but as the API surface area used by the extension increases, it can become more complex. Some big gotchas are the new sharedModel, CodeMirror 6, and others. If the software in question works for both, happy days! At the packaging level, it’s fairly straightforward to claim this support, e.g. in package.json:

"dependencies": {
  "@jupyterlab/application": "^3.5.0 || ^4.0.5"
}

In pyproject.toml:

# ...
dependencies = ["jupyterlab >=3.5,<5"]

However, if things start breaking, at the source level, things get trickier… a few approaches:

  • consolidate all API relevant changes into a single file, e.g. lab-compatibility.ts
    • do checks to see if the API is supported
    • e.g.
    if ((model as any).metadata) {
      // do a lab 3 thing
    } else {
      // do a lab 4 thing
    }
  • split code into multiple js packages with e.g. lerna
    • so instead of a monolithic:
      • @my-org/my-extension in src/index.ts
    • it might be three packages:
      • @my-org/my-lib in packages/my-lib/src/index.ts
        • only reference “stable” APIs (for the value of stable the extension needs)
        • this would not be a labextension, but would be used as a sharedPackages in…
      • @my-org/my-lab4-extension in packages/my-lab4-extension/src/index.ts
        • "@jupyterlab/application": "^4.0.5", "@my-org/my-lib": "..."
      • @my-org/my-lab3-extension in packages/my-lab3-extension/src/index.ts
        • "@jupyterlab/application": "^3.5.0", "@my-org/my-lib": "..."

The latter approach would probably still generate some startup warnings in the browser console, but would preserve solid type-checking at development time; the former, with its many as any checks, loses that. Either should otherwise be harmless to include in the same Python distribution.
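To make the first approach concrete, here is a minimal, self-contained sketch of what such a lab-compatibility.ts could contain, using duck typing rather than version sniffing. The helper names are invented for illustration; the two metadata shapes do reflect the real Lab 3 to Lab 4 model API change (observable map with get/set vs. getMetadata/setMetadata on the model):

```typescript
// lab-compatibility.ts: a sketch (not the poster's actual code) of
// consolidating version-dependent calls behind duck-typed helpers.

/** Read a metadata key from a notebook/cell model on either JupyterLab version. */
export function getMetadata(model: any, key: string): unknown {
  if (typeof model.getMetadata === 'function') {
    return model.getMetadata(key); // JupyterLab 4 path
  }
  if (model.metadata && typeof model.metadata.get === 'function') {
    return model.metadata.get(key); // JupyterLab 3 path (IObservableJSON)
  }
  return undefined;
}

/** Write a metadata key on either JupyterLab version. */
export function setMetadata(model: any, key: string, value: unknown): void {
  if (typeof model.setMetadata === 'function') {
    model.setMetadata(key, value); // JupyterLab 4 path
  } else if (model.metadata && typeof model.metadata.set === 'function') {
    model.metadata.set(key, value); // JupyterLab 3 path
  }
}
```

Callers then use getMetadata(model, 'tags') everywhere and never touch the version-specific API directly, keeping the as any noise confined to one file.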

The most important thing in either case is to actually test both versions, whether in CI or by hand: such a repo would need to pick exactly one version for development/packaging (as the yarn.lock file would not be compatible between the two).
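A hypothetical CI sketch of that idea: build the wheel once, then smoke-test it against both majors. All names and version floors here are illustrative, not taken from any real repo:

```yaml
# Illustrative GitHub Actions fragment: one build, two JupyterLab targets.
jobs:
  smoke-test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        lab: ['>=3.5,<4', '>=4.0.5,<5']
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: '3.11'
      - run: pip install build && python -m build --wheel
      - run: pip install "jupyterlab${{ matrix.lab }}" dist/*.whl
      # fail the job if the extension is not listed as OK under this Lab version
      - run: jupyter labextension list 2>&1 | grep -iE "my-extension.*(OK|enabled)"
```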

The “blessed” way of doing white-box testing, with galata, would require duplicating the tests and test infrastructure (~200 MB), as it assumes the entire JupyterLab API.

A black-box approach would be to use “raw” puppeteer, or a higher-order testing approach such as robotframework-jupyterlibrary (disclaimer: maintainer). The challenge here is in selecting good externally-visible DOM selectors, etc., some of which differ between the two versions. CodeMirror 6 is definitely harder to test with black-box approaches, as much of its internal state (like syntax tokens) has been made all-but-opaque to the DOM.

All that being said: maintaining two branches (e.g. 0.3.x and 0.4.x) with tighter pinning in pyproject.toml and package.json would probably be a smoother experience, just with twice the PRs. Again, doing good code-level isolation of functionality (e.g. the manager.ts approach, common in the JupyterLab codebase) would hopefully limit the amount of thrash between the two.


@bollwyvl When we set dependencies like this:

"dependencies": {
  "@jupyterlab/application": "^3.5.0 || ^4.0.5"
}

the package will always install the most recent version, right? I remember testing it, and irrespective of the JupyterLab version, I always got version 4.* of the @jupyterlab packages. Maybe the environments in my tests were messed up.

Is jlpm smart enough to install the proper version from the npm registry based on the existing jupyterlab version?


install the most recent version, right

Yes. The multi-package approach (with disjoint pins) exploits the “feature” of yarn to create multiple node_modules folders at different levels… deduplicatable packages will be at the root of the repo (next to ./package.json) while the tightly-pinned versions will be at the individual package level (e.g. ./packages/my-lab3-extension).

The root level would check that devDependencies contains "@jupyterlab/builder": "^4.0.5", so that jupyter labextension build works properly. A disclaimer I should have added: I haven’t tested this approach, but it should work, given my imperfect understanding of yarn version resolution.

smart enough to install proper version

No, this is not relevant at build time. The as-built extension won’t have any control over which version it is using, due to the magic of webpack “federated modules”. The pin just controls whether jupyter labextension list and browser console warnings will show incompatibilities. This is how, e.g., @jupyter-widgets/jupyterlab-manager works.
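For context, the sharing/singleton behaviour mentioned here is declared in the extension’s package.json under the jupyterlab key. A sketch, reusing the hypothetical package names from the earlier example (the outputDir and whether a given package should be bundled or a singleton are extension-specific):

```json
{
  "jupyterlab": {
    "extension": true,
    "outputDir": "my_extension/labextension",
    "sharedPackages": {
      "@my-org/my-lib": {
        "bundled": true,
        "singleton": true
      }
    }
  }
}
```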


I have been exposed to this question in the context of jupytext, and maybe this feedback will be of use to others.

how it is done now

  • the original author @mwouts had expressed his preference for hosting the jlab extension in the same repo as the bulk of jupytext - which also contains a substantial part as a regular Python module
  • so after toying around in a separate repo, we managed to come up with a single git repo, and to produce code that works for both jlab3 and 4
  • to achieve that (keep in mind that the primary goal was time to prod)
    • we keep on building under jlab3, like it was the case originally
    • and as @bollwyvl suggested, the code primarily checks for the existence of attributes to guess what to do
    • which means that we largely lose the benefit of TypeScript in the mix

why it’s interesting

  • we now have a single pip package that installs a jlab extension that works regardless of the user having jlab3 or jlab4

why it’s suboptimal

  • because the code really is ugly (everything simple, like reading a metadata entry, takes 4 or 5 lines)
  • because it builds under jlab3 (although a separate issue I guess)
  • because I have no idea how to address jlab5 in this scheme if/when it shows up (god help us :wink: )

Thanks for the advice. I decided to opt for having a single code base that I develop in JupyterLab 3 and then consolidate all API changes in a single file compatibility.ts.

I tested that same code in two repositories, one extension built against JupyterLab 3.x and one built against JupyterLab 4.x (by simply copy/pasting the source into both), and it works fine in each. My problem is that I’m not completely sure how to adapt package.json, pyproject.toml, and any other files so that the package built from the JupyterLab 3.x repository works on JupyterLab >= 3, which is why I first tried the same code in two different extensions. I do know the dependencies need attention, since when I build the extension package under JupyterLab 3.x and install it in a fresh JupyterLab 4.x environment I get errors such as:

Unsatisfied version 4.0.0 from @jupyterlab/application-top of shared singleton module @jupyterlab/settingregistry (required ^3.5.3)
Unsatisfied version 2.1.1 from @jupyterlab/application-top of shared singleton module @lumino/signaling (required ^1.10.0)

in my developer console upon JupyterLab start-up. My extension is still activated, but all the features that rely on the settingregistry don’t work.

@bollwyvl @parmentelat could you provide me with some guidance on which files (other than package.json and pyproject.toml?) I should adapt, and which dependencies I should make sure to modify in my extension repository, so I can keep developing my extension in JupyterLab 3 and have JupyterLab 4 users install and use it without problems?

Here are the dependencies of my package.json:

  "dependencies": {
    "@jupyterlab/application": "^4.0.0",
    "@jupyterlab/settingregistry": "^4.0.0"
  },
  "devDependencies": {
    "@jupyterlab/builder": "^3.5.3 || ^4.0.0",
    "@jupyterlab/testutils": "^4.0.0",
    "@types/jest": "^29.2.0",
    "@types/json-schema": "^7.0.11",
    "@types/react": "^18.0.26",
    "@types/uuid": "^9.0.3",
    "@typescript-eslint/eslint-plugin": "^5.55.0",
    "@typescript-eslint/parser": "^5.55.0",
    "css-loader": "^6.7.1",
    "eslint": "^8.36.0",
    "eslint-config-prettier": "^8.7.0",
    "eslint-plugin-prettier": "^4.2.1",
    "jest": "^29.2.0",
    "npm-run-all": "^4.1.5",
    "prettier": "^2.8.7",
    "rimraf": "^4.4.1",
    "source-map-loader": "^1.0.2",
    "style-loader": "^3.3.1",
    "stylelint": "^14.9.1",
    "stylelint-config-prettier": "^9.0.4",
    "stylelint-config-recommended": "^8.0.0",
    "stylelint-config-standard": "^26.0.0",
    "stylelint-prettier": "^2.0.0",
    "typescript": "~5.0.2",
    "yjs": "^13.5.0"
  }

Here is my pyproject.toml:

[build-system]
requires = ["hatchling>=1.5.0", "jupyterlab>=4.0.0,<5", "hatch-nodejs-version"]
build-backend = "hatchling.build"

[project]
name = "send_3_test"
readme = ""
license = { file = "LICENSE" }
requires-python = ">=3.8"
classifiers = [
    "Framework :: Jupyter",
    "Framework :: Jupyter :: JupyterLab",
    "Framework :: Jupyter :: JupyterLab :: 4",
    "Framework :: Jupyter :: JupyterLab :: Extensions",
    "Framework :: Jupyter :: JupyterLab :: Extensions :: Prebuilt",
    "License :: OSI Approved :: BSD License",
    "Programming Language :: Python",
    "Programming Language :: Python :: 3",
    "Programming Language :: Python :: 3.8",
    "Programming Language :: Python :: 3.9",
    "Programming Language :: Python :: 3.10",
    "Programming Language :: Python :: 3.11",
]
dependencies = []
dynamic = ["version", "description", "authors", "urls", "keywords"]

[tool.hatch.version]
source = "nodejs"

[tool.hatch.metadata.hooks.nodejs]
fields = ["description", "authors", "urls"]

[tool.hatch.build.targets.sdist]
artifacts = ["send_3_test/labextension"]
exclude = [".github", "binder"]

[tool.hatch.build.targets.wheel.shared-data]
"send_3_test/labextension" = "share/jupyter/labextensions/send-3-test"
"install.json" = "share/jupyter/labextensions/send-3-test/install.json"

[tool.hatch.build.hooks.version]
path = "send_3_test/"

[tool.hatch.build.hooks.jupyter-builder]
dependencies = ["hatch-jupyter-builder>=0.5"]
build-function = "hatch_jupyter_builder.npm_builder"
ensured-targets = [
    "send_3_test/labextension/static/style.js",
    "send_3_test/labextension/package.json",
]
skip-if-exists = ["send_3_test/labextension/static/style.js"]

[tool.hatch.build.hooks.jupyter-builder.build-kwargs]
build_cmd = "build:prod"
npm = ["jlpm"]

[tool.hatch.build.hooks.jupyter-builder.editable-build-kwargs]
build_cmd = "install:extension"
npm = ["jlpm"]
source_dir = "src"
build_dir = "send_3_test/labextension"

[tool.jupyter-releaser.options]
version_cmd = "hatch version"

[tool.jupyter-releaser.hooks]
before-build-npm = [
    "python -m pip install 'jupyterlab>=4.0.0,<5'",
    "jlpm build:prod"
]
before-build-python = ["jlpm clean:all"]

[tool.check-wheel-contents]
ignore = ["W002"]

This (the plain ^4.0.0 pins under dependencies) probably also needs to be e.g. ^3.5.0 || ^4.0.5 (where 3.5 is whatever version you test against).
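Concretely, a sketch of what those blocks could become (the floors are illustrative; pick whatever you actually test against):

```json
"dependencies": {
  "@jupyterlab/application": "^3.5.0 || ^4.0.0",
  "@jupyterlab/settingregistry": "^3.5.0 || ^4.0.0"
},
"devDependencies": {
  "@jupyterlab/builder": "^4.0.0"
}
```

Note the single builder pin: only one JupyterLab version actually runs the build.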

This (the || range on @jupyterlab/builder) might not work. If going down the all-in-one route, only one Lab version should do the building. I’ve ended up, thus far, doing a full wheel build against 4.x (though you could stay on 3.x) and then testing in another, separate env. Having all needed deps in the development environment and building with pyproject-build --no-isolation can drastically speed up the cycle, but it’s still not great, of course.


Basically, the code I have now for my extension works fine from JupyterLab 3.1.0 up to the latest release, and I want a package that my target users can install that works with all those versions. Whether it is the same package or not does not matter in the end, so in that respect working with different branches could do the trick. I’m not super experienced with package releases, so I’m trying to clarify a few things:

  1. Could it work with two branches, one for JupyterLab >=3.1.0,<4 and one for >=4.0.0?
  2. What’s the best way to make sure I keep things in sync and working on both versions? I have my core code, with most of my extension logic, that works on all versions, and I have put all the code that deals with the API changes in one file named compatibility.ts. Should I then seed two extensions, one for 3 and one for 4, develop in two different conda environments with their own package.json dependencies, and link both repos to their respective branches?
  3. And finally, I would then always publish a release on both branches whenever a feature is ready, which means two different Python packages, one versioned 3.x.x and the other 4.x.x. When a user then runs pip install <package-name>, how can I enforce that the right package is installed by default, based on the user’s JupyterLab version? Is this part of the pyproject.toml sufficient?
requires = ["hatchling>=1.5.0", "jupyterlab>=3.1.0,<4", "hatch-nodejs-version"]

Thank you for the help!

This now sounds like conflating a bunch of approaches.

If shipping from one branch:

  • would only upload one .whl per logical release (e.g. adding a feature)
  • the pyproject.toml#/project/dependencies would include jupyterlab >=3.1,<5.0.0a0
  • package.json would need all the || version descriptors
  • use compatibility.ts, 50% of which would be fairly ugly types

If shipping from multiple branches, e.g. main and 0.1.x:

  • would upload a .whl-per-branch-per-adding-a-feature
  • pyproject.toml#/project/dependencies would be one of:
    • main: jupyterlab >=4.0.5,<5.0.0a0
    • 0.1.x: jupyterlab >=3.1,<4.0.0a0
  • would need to do the work on one branch, then backport it
  • but would get 100% accurate types
  • backports would be generally simpler, as the files would be named the same on both sides

One could interleave the versions, but that’s generally… a very poor idea, as folks generally understand version ranges not to flip-flop back and forth between point releases.

Once multiple package.json files are in the mix, different approaches could be taken, but again, it gets rather complex quickly, and cross-linked submodules between branches are generally no fun.