Is there support for, plans for, or acceptance of domain-specific distributions built on top of jupyterlab-desktop?

Hi all,

Just curious: has there been any discussion of making it possible for domain-specific researchers to distribute jupyterlab-desktop plus a set of packages specific to their domain?

E.g., an astrolab distribution would have astropy and related packages already installed.

This would be particularly useful for someone who is planning to develop domain-specific, pre-built JavaScript plug-ins (namely, me).

This might be a bad idea for multiple reasons I don’t understand. Please feel free to tell me about them.

Thanks,
David


Yes, this was discussed in Host a JupyterLab metapackage · Issue #23 · jupyterlab-contrib/jupyterlab-contrib.github.io · GitHub and I would love to have this too!


Perfect, thank you @krassowski – I had a feeling you were one of the people who would know, and I also had a feeling @tonyfast and @bollwyvl would have opinions about this already 🙂

I will read that issue in detail, thank you.
I would definitely be willing to help dry-run this process.
The “quantum chemistry” use case built with constructor that @bollwyvl describes in this comment sounds like what I have in mind: Host a JupyterLab metapackage · Issue #23 · jupyterlab-contrib/jupyterlab-contrib.github.io · GitHub

To be clear, I’m thinking of something that’s more like a “domain-specific distro” than a metapackage.
So you could have:

  • geojupyterlab-desktop
  • astrojupyterlab-desktop
  • quantum-jupyterlab-desktop
  • molecular-simulation-…
  • etc.
    (each managed by its respective community)

I’m not sure whether this would be of interest to, say, astropy right now, given that they already have a good thing going.

But I also think there are a lot of domains that really need more GUI functionality, and being able to just build on top of jupyterlab-desktop instead of dealing with all of the Python ecosystem’s GUI pain points could be a huge draw for them.


Right: I threw this out on today’s lab call, but theoretically one could reverse-engineer the current repo into a cookiecutter that, by default, would emit the same content right back out. This might be better than re-architecting the existing repo.

From some previous efforts to this effect (abandoned for other reasons), the strings that would need replacing are:

    # Literal strings in the upstream repo, mapped to the cookiecutter template
    # variables that would replace them.
    REPLACEMENTS = {
        "JupyterLab Desktop": "{{ cookiecutter.product_title }}",
        "https://github.com/jupyterlab/jupyterlab-desktop/issues": "{{ cookiecutter.issues_url }}",
        "https://github.com/jupyterlab/jupyterlab-desktop/releases/latest/download/latest.yml": "{{ cookiecutter.releases_yaml_url }}",
        "https://github.com/jupyterlab/jupyterlab-desktop/releases": "{{ cookiecutter.releases_url }}",
        "https://github.com/jupyterlab/jupyterlab-desktop": "{{ cookiecutter.repo_url }}",
        "/jupyterlab-desktop/": "/{{ cookiecutter.package_path_unix }}/",
        "\\jupyterlab-desktop\\": "\\{{ cookiecutter.package_path_win }}\\",
        "JupyterLabDesktop": "{{ cookiecutter.product_path }}"
    }
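
As an illustration of how such a table could be used, here is a minimal sketch (not existing tooling) that rewrites a checked-out copy of the repo into template sources by applying the REPLACEMENTS mapping above; the source/destination paths and the UTF-8 text-vs-binary handling are assumptions:

    # Sketch: turn an upstream jupyterlab-desktop checkout into cookiecutter
    # template sources by applying the REPLACEMENTS mapping above to every
    # text file. SRC/DST locations are placeholders, not a real layout.
    from pathlib import Path

    SRC = Path("jupyterlab-desktop")   # local checkout of the upstream repo
    DST = Path("template-out")         # where the generated template sources go

    for src_file in SRC.rglob("*"):
        if not src_file.is_file() or ".git" in src_file.parts:
            continue
        dst_file = DST / src_file.relative_to(SRC)
        dst_file.parent.mkdir(parents=True, exist_ok=True)
        try:
            text = src_file.read_text(encoding="utf-8")
        except UnicodeDecodeError:
            # Icons and other binary assets are copied through unchanged.
            dst_file.write_bytes(src_file.read_bytes())
            continue
        for needle, variable in REPLACEMENTS.items():
            text = text.replace(needle, variable)
        dst_file.write_text(text, encoding="utf-8")

Run with default values that mirror the current strings, a template generated this way would, as noted, emit the same content right back out.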

As a downstream, one could then choose to:

  • manually…
    • run the cookiecutter (once)
    • check it in
    • hack on it
    • never update it
    • push
    • wait for CI
    • download the releases
  • or, in an automated fashion… (see the sketch below)
    • run the cookiecutter
    • automatically patch the generated repo
    • check that in (so that e.g. the crucial GitHub Actions work correctly)
    • push
    • wait for CI
    • download the releases

Or a hybrid of the two:

  • e.g. generate a workflow that can propose PRs when the upstream changes.
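
To make the automated path a bit more concrete, here is a rough sketch of a downstream regeneration step using cookiecutter’s Python API; the template URL, the example-org names, and the context keys (mirroring the replacement table above) are all placeholders rather than anything that exists today:

    # Sketch of the automated downstream flow: regenerate from a (hypothetical)
    # cookiecutter template with this distro's values, then patch, commit, and
    # push so that CI builds the installers. All names and URLs are placeholders.
    from cookiecutter.main import cookiecutter

    project_dir = cookiecutter(
        "https://github.com/example-org/cookiecutter-lab-desktop",  # hypothetical template repo
        no_input=True,
        extra_context={
            "product_title": "AstroJupyterLab Desktop",
            "repo_url": "https://github.com/example-org/astrojupyterlab-desktop",
            "issues_url": "https://github.com/example-org/astrojupyterlab-desktop/issues",
            "releases_url": "https://github.com/example-org/astrojupyterlab-desktop/releases",
            "releases_yaml_url": "https://github.com/example-org/astrojupyterlab-desktop/releases/latest/download/latest.yml",
            "package_path_unix": "astrojupyterlab-desktop",
            "package_path_win": "astrojupyterlab-desktop",
            "product_path": "AstroJupyterLabDesktop",
        },
    )

    # At this point a downstream would apply its own patches (extra packages,
    # branding, icons), commit the result, and push to trigger the build CI.
    print("generated project at", project_dir)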

As for dependency robustness: there are a few places (like the construct.yaml) where the specificity/auditability of the product could be substantially improved (e.g. by injecting conda-lock outputs).
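
For instance, a minimal sketch of what “injecting conda-lock outputs” could look like, assuming conda-lock’s unified lockfile layout (a top-level package list with name/version/platform/manager fields) and a construct.yaml that carries a specs list; the file paths are placeholders:

    # Sketch: replace the loose specs in constructor's construct.yaml with exact
    # pins taken from a conda-lock unified lockfile, so the bundled environment
    # is reproducible and auditable. Lockfile layout and paths are assumptions.
    import yaml

    with open("conda-lock.yml") as f:
        lock = yaml.safe_load(f)

    # Exact name==version pins for one target platform.
    pins = sorted(
        f'{pkg["name"]}=={pkg["version"]}'
        for pkg in lock["package"]
        if pkg["platform"] == "linux-64" and pkg["manager"] == "conda"
    )

    with open("construct.yaml") as f:
        construct = yaml.safe_load(f)

    construct["specs"] = pins  # one such pass per platform the installer targets

    with open("construct.yaml", "w") as f:
        yaml.safe_dump(construct, f, sort_keys=False)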


Looks useful, thank you @bollwyvl – I need to put some other pieces in place, but I will definitely come back to this.