I have sketched a JupyterHub Service and a JupyterLab extension (bundled with a tiny server extension to support it) that together enable users on the same Hub to easily share notebooks, along with information about the environment they should be run in. It’s similar to the Google Docs feature “Anyone with link can read (and make a copy)”.
In this demo GIF, Alice logs in, opens a notebook, and clicks a button to create a shareable link. A dialog box appears, saying:
> For the next hour, any other user on this JupyterHub who has this link will be able to fetch a copy of your latest saved version of dask-examples/array.ipynb.
She copies the link and gives it to Bob. Then, on the right, Bob logs in and pastes the link into his browser. He is given a copy of Alice’s notebook.
See danielballan/jupyterhub-share-link for the source and instructions for trying it.
Notable aspects of this design:
- This works for local process spawners and container-based spawners.
- If a container-based spawner is used, the share link encodes which container image the sender was running, and it opens a server running the same image for the recipient.
- The copying happens entirely via the Jupyter REST API. It does not use the file system directly at all; users do not need to be on a shared file system.
- There is no extra database involved; the only state is a key pair controlled by the Hub Service, which it uses to sign and verify the share links.
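To make the stateless design concrete, here is a minimal sketch of signing and verifying a share-link token. The payload fields (`user`, `path`, `image`, `exp`) and the token format are assumptions for illustration, not the service's actual wire format, and an HMAC with a shared secret stands in for the Hub Service's key pair to keep the example in the standard library:

```python
import base64
import hashlib
import hmac
import json
import time

# Stand-in secret; the real service holds an asymmetric key pair so that
# only the Hub Service can mint valid links.
SECRET = b"hub-service-secret"

def create_share_link_token(user, path, image, ttl=3600):
    """Serialize and sign a share payload (hypothetical field names)."""
    payload = json.dumps({
        "user": user,    # sender
        "path": path,    # notebook path on the sender's server
        "image": image,  # container image the sender was running
        "exp": int(time.time()) + ttl,  # expiry, one hour by default
    }).encode()
    sig = hmac.new(SECRET, payload, hashlib.sha256).digest()
    return (base64.urlsafe_b64encode(payload).decode()
            + "." + base64.urlsafe_b64encode(sig).decode())

def verify_share_link_token(token):
    """Check the signature and expiry, then return the payload."""
    payload_b64, sig_b64 = token.split(".", 1)
    payload = base64.urlsafe_b64decode(payload_b64)
    expected = hmac.new(SECRET, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, base64.urlsafe_b64decode(sig_b64)):
        raise ValueError("invalid signature")
    data = json.loads(payload)
    if data["exp"] < time.time():
        raise ValueError("link expired")
    return data
```

Because everything the service needs (sender, path, image, expiry) travels inside the signed token itself, no database row has to be created or cleaned up per link.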
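The REST-based copy can be sketched with the Jupyter Contents API: a `GET` on the sender's server fetches the latest saved notebook model, and a `PUT` on the recipient's server writes the copy. The Hub URL and token handling below are assumptions for illustration; this is not the extension's actual code:

```python
import json
import urllib.request

def contents_url(hub_url, user, path):
    """Build the Contents API URL for a file on a user's single-user server."""
    return f"{hub_url}/user/{user}/api/contents/{path}"

def fetch_notebook(hub_url, user, path, api_token):
    # GET /api/contents/<path> returns a JSON model including "content".
    req = urllib.request.Request(
        contents_url(hub_url, user, path),
        headers={"Authorization": f"token {api_token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def save_copy(hub_url, user, path, api_token, model):
    # PUT /api/contents/<path> writes the copy into the recipient's server,
    # so no shared file system is needed.
    body = json.dumps({"type": "notebook", "content": model["content"]}).encode()
    req = urllib.request.Request(
        contents_url(hub_url, user, path),
        data=body,
        method="PUT",
        headers={"Authorization": f"token {api_token}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

Since both sides speak the same Contents API, the copy works identically whether the servers are local processes or containers.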
See the Uses and Limitations section of the README for thoughts on how this fits into other kinds of sharing folks might want.
This work owes a lot to conversations about sharing notebooks at the Jupyter for Scientific User Facilities and HPC Community Workshop. I have not put this in front of users yet, but I intend to deploy it on NSLS-II’s JupyterHub soon. Feedback and ideas welcome!