Recommended way to install JupyterLab globally on Mac?

Hi there! I want to have one Jupyter(Lab) installation that I can use across all my virtual environments. On a Mac, I have several options to do that:

  1. Install JupyterLab globally using pip install jupyterlab
  2. Using Homebrew: brew install jupyterlab
  3. Create a dedicated virtual environment for JupyterLab

What is the recommended way? Global pip installs tend to get messy, so I tried the Homebrew version. The problem there is extensions like Jupytext, which require additional pip packages: I have to install Jupytext into the Homebrew install path, but then the jupytext command is not globally available (unless I create an alias). Using a dedicated virtual env would be a bit more convenient, but that means having to source it before I can use the JupyterLab or jupytext command. Nothing terrible and all manageable, but I was wondering what others are doing :slight_smile:
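To make option 3 concrete, here is the kind of setup I mean (paths are just examples, not a recommendation). The entry-point scripts in a venv's bin/ embed the venv's interpreter path, so they run correctly without activating the environment first, which means symlinks can make them globally available:

```shell
# Create a dedicated environment just for JupyterLab and its extensions
python3 -m venv ~/.venvs/jupyterlab
~/.venvs/jupyterlab/bin/pip install jupyterlab jupytext

# The scripts in bin/ point at the venv's own Python via their shebang,
# so they work without sourcing the activate script.
mkdir -p ~/.local/bin
ln -sf ~/.venvs/jupyterlab/bin/jupyter  ~/.local/bin/jupyter
ln -sf ~/.venvs/jupyterlab/bin/jupytext ~/.local/bin/jupytext
# (make sure ~/.local/bin is on your PATH)
```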

Hi!

Suggestion - don’t install it globally. It’ll get messy whether you use Apple’s Python distribution or Homebrew. If it’s OK with you, use a Docker image instead, and share your file system with it. After working with Python/IPython Notebook/Jupyter for 5 years, I believe this is the sanest way to proceed.

Such a setup also has another bonus: it lets you manage your notebooks, Python packages, and support files with any version control system (e.g. Git, Hg, whatever) without having to worry about virtualenv.

Last bonus: you can move the deployment from your Mac to your friend’s Windows machine, or to a beefy cloud server without having to worry about a full installation in each case.

Have a look here: https://www.dataquest.io/blog/docker-data-science/

Ask me for details about Mac file system idiosyncrasies if you decide to follow that route. I have a (dormant) open source project based on this; you can have a look here: https://github.com/pr3d4t0r/SSScoring

Cheers!

pr3d


Thank you! I’ve heard about Docker, but it seemed a bit overkill for my use case, and Python officially recommends pipenv. I was hoping to find a way to use that without having to install JupyterLab x times. I’ll check your links and give Docker another try though :slight_smile:

@cecep - Excellent.

Using Docker doesn’t invalidate using pipenv or similar. For your purposes, a Docker container is a Debian box enabled to run the Jupyter notebook. You can add things via pip, conda, apt-get, etc. within the container. In 99% of cases where you intend to deploy your notebooks and setup to multiple machines (and maybe even multiple architectures) Docker tends to be the simplest, most effective solution.
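To show how little is involved, here is a minimal sketch using one of the community Jupyter Docker Stacks images (jupyter/base-notebook is just one example image; the port and mount path are the usual defaults for those images, adjust to taste):

```shell
# Run JupyterLab in a container, sharing the current directory with the
# container's work folder so your notebooks live (and persist) on the host.
docker run -it --rm \
  -p 8888:8888 \
  -v "$PWD":/home/jovyan/work \
  jupyter/base-notebook \
  start.sh jupyter lab
```

Newer images in that stack start JupyterLab by default, so the trailing `start.sh jupyter lab` may be unnecessary depending on the image version you pull.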

Let me know if you need some help / more details, happy to oblige. You can find me as pr3d4t0r in irc://irc.freenode.net/#python or in various other dev channels. Have a great weekend!


When you use conda for your virtual environments, the jupyterlab package is hard-linked across environments (provided the dependency resolution allows it to be the exact same version, obviously). So in that case you physically have only one copy of JupyterLab on your computer.
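If you want to verify the hard-linking yourself, comparing inode numbers of the same file in two environments is a quick check (the environment names here are illustrative):

```shell
# Create two environments pinned to the same jupyterlab version
conda create -y -n jl-a jupyterlab
conda create -y -n jl-b jupyterlab

# Hard-linked files share an inode: the first column of `ls -i` should
# print the same number for both copies if conda could hard-link them.
ls -i "$(conda info --base)"/envs/jl-a/bin/jupyter-lab
ls -i "$(conda info --base)"/envs/jl-b/bin/jupyter-lab
```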


@michaelaye Thanks for the tip, didn’t try conda yet!

I’m a new user and reported a related issue / feature request:

I’ve linked this thread in that issue. There are multiple ways to automate environment isolation (Docker and conda mentioned here, tox in my issue), so it might come down to personal preference (I haven’t used conda, and Docker isn’t second nature to me yet for getting a local dev environment up and running quickly). One thing I’ve done in other projects is use tox for running tests locally (you could also run jupyterhub itself from a virtualenv) and re-use those same tox environments in the Travis configuration, so local and remote CI are all the same.
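For anyone unfamiliar with that pattern, a tox setup along these lines is what I mean (env names and deps are illustrative, not from a real project):

```ini
[tox]
envlist = py37

[testenv]
deps =
    jupyterlab
    pytest
commands =
    pytest {posargs}
```

Running `tox` locally then builds the same isolated virtualenv that the CI configuration can reuse, so the two stay in sync.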