I’ve recently been struggling to help someone run some notebooks against a Python kernel with a range of Python package dependencies.
They’re on a Windows box — I’m not — and have had a range of issues:
- repo2docker didn’t work (return os.geteuid() raised AttributeError: module 'os' has no attribute 'geteuid' — unsurprising in hindsight, since os.geteuid is Unix-only and doesn’t exist on Windows);
- Docker didn’t work with an image I’d prebuilt (how do you reliably mount the current directory into a container as a volume?!);
- installing packages didn’t work / couldn’t be found (I have no idea what environmental catastrophe was going on!).
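For the record, the volume-mount frustration usually comes down to how "the current directory" is written in each shell, which is exactly where Windows setups trip up. A sketch of the per-shell variants, assuming a placeholder image name my/notebook-image and the Jupyter-stack convention of mounting work under /home/jovyan/work:

```shell
# The troublesome part is only the "current directory" expression;
# "my/notebook-image" is a placeholder image name.

# Linux/macOS (bash):
#   docker run -p 8888:8888 -v "$(pwd)":/home/jovyan/work my/notebook-image

# Windows PowerShell:
#   docker run -p 8888:8888 -v "${PWD}:/home/jovyan/work" my/notebook-image

# Windows cmd.exe:
#   docker run -p 8888:8888 -v "%cd%:/home/jovyan/work" my/notebook-image

# Quoting matters on all three: an unquoted path with spaces in it
# silently splits the -v argument.
echo "current directory resolves to: $(pwd)"
```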
There was a preference for not working on a public machine (so no uploading notebooks to a public BinderHub), and we don’t have an internal JupyterHub or BinderHub server.
Which got me thinking. Is there a way of simply packaging a Python kernel with custom packages preinstalled so that it can be shared?
For example, installing the kernel might do something like:
- make sure Python environment support is available;
- create a new Python environment;
- install the required packages into it;
- register that environment as a new, custom-named Jupyter kernel in an existing Jupyter-supporting environment.
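The steps above can already be sketched with nothing more than venv, pip and ipykernel; a minimal sketch, assuming Python 3 and an existing Jupyter install, where myPykernel is a placeholder name and the requirements.txt content here is just a stand-in for the notebooks' real dependency list:

```shell
# Stand-in dependency list (in practice this file already exists
# alongside the notebooks; ipykernel must end up in the env so it
# can act as a Jupyter kernel).
printf 'ipykernel\n' > requirements.txt

# 1. create a new Python environment (venv ships with Python 3)
python3 -m venv myPykernel_env

# 2. install the listed packages into that environment
myPykernel_env/bin/python -m pip install -r requirements.txt

# 3. register the environment as a custom-named kernel with the
#    user's Jupyter (writes a kernelspec under the user prefix)
myPykernel_env/bin/python -m ipykernel install --user --name myPykernel
```

On Windows the environment's interpreter lives at myPykernel_env\Scripts\python.exe rather than myPykernel_env/bin/python, which is the only platform-specific wrinkle in the recipe.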
Or perhaps, rather than having a binder folder in a directory and running repo2docker, we could have a kernel directory containing a requirements.txt and run something like jupyter addkernel --name myPykernel . ?
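No such Jupyter subcommand exists, but a hypothetical sketch of what it could do under the hood fits in a small shell function (the addkernel name just mirrors the imagined command; nothing here is official Jupyter tooling):

```shell
# Hypothetical "jupyter addkernel"-style helper: point it at a
# directory containing a requirements.txt and give the kernel a name.
# Assumes Python 3 and network access for pip.
addkernel () {
    name="$1"
    dir="${2:-.}"
    env_dir="$dir/.${name}_env"
    # create the env, install the listed dependencies plus ipykernel,
    # then register the env as a named kernel with the user's Jupyter
    python3 -m venv "$env_dir" &&
    "$env_dir/bin/python" -m pip install -r "$dir/requirements.txt" ipykernel &&
    "$env_dir/bin/python" -m ipykernel install --user --name "$name"
}
```

Running addkernel myPykernel . would then build a .myPykernel_env environment next to the notebooks and register it, so it shows up in the Jupyter kernel picker like any other kernel.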