How to run a notebook using the command line

I know a user can convert the notebook into a script and run python ...py in the terminal.
Is there a more convenient way to do this, something like python ...py, but to run the notebook directly?

Yes, you can run a notebook file directly from the command line. The two main ways I use are nbconvert and papermill. See the third paragraph here for links to the documentation for each and some examples.

By the way, since you mentioned .py files, jupytext can convert a script back to a notebook and run that notebook in one command, here. It can also do the conversion from notebook to script on the command line.
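If you prefer to stay in Python, here is a rough sketch of that same convert-and-run idea using jupytext’s Python API together with nbclient (assuming both are installed; script.py and script.ipynb are placeholder names, not from the docs):

import jupytext
import nbformat
from nbclient import NotebookClient

nb = jupytext.read("script.py")                      # parse the script as a notebook
NotebookClient(nb, kernel_name="python3").execute()  # run all cells in place
nbformat.write(nb, "script.ipynb")                   # save the executed notebook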

Here’s an example from the nbconvert 6.0 docs using the nbconvert API: https://github.com/jupyter/nbconvert/blob/master/docs/api_examples/template_path/make_html.py
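In case that link moves, a minimal sketch of the same idea, just using the default template instead of a custom template path (mynotebook.ipynb is a placeholder name):

import nbformat
from nbconvert import HTMLExporter

# read the notebook and render it to HTML with the default template
nb = nbformat.read("mynotebook.ipynb", as_version=4)
body, resources = HTMLExporter().from_notebook_node(nb)

with open("mynotebook.html", "w", encoding="utf-8") as f:
    f.write(body)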

Thanks, I found the solution.

Executing notebooks from the command line
The same functionality of executing notebooks is exposed through a command line interface or a Python API interface. As an example, a notebook can be executed from the command line with:

jupyter nbconvert --to notebook --execute mynotebook.ipynb
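For the Python API interface mentioned above, a minimal sketch using nbconvert’s ExecutePreprocessor (mynotebook.ipynb is a placeholder, and I’m assuming a local python3 kernel):

import nbformat
from nbconvert.preprocessors import ExecutePreprocessor

# load the notebook, execute every cell, and write out the executed copy
nb = nbformat.read("mynotebook.ipynb", as_version=4)
ep = ExecutePreprocessor(timeout=600, kernel_name="python3")
ep.preprocess(nb, {"metadata": {"path": "."}})
nbformat.write(nb, "mynotebook.nbconvert.ipynb")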

Is it possible to just execute the notebook, but not generate an output file?
I’d like to just time the execution speed of a bunch of notebooks.


We have an issue to discuss and track this here, in case you’d like to weigh in: https://github.com/jupyter/nbclient/issues/4


papermill {test_file_name.ipynb} /dev/null {args...} is a convenient way to do that


@MSeal has papermill considered adding a --no-save flag, or something like that? For folks who are not as familiar with coding (or who are on Windows), /dev/null may be a bit of a stretch to remember and use. Happy to open an issue to discuss if you think it’d be helpful.

The library is a little opinionated in that it expects that when something goes wrong you want to read the output, so it hasn’t been made easy to drop the output. Another option, by the way, is to make the output the same as the input (which will write in place) or to output to a dummy output path. A --no-save flag could be a decent issue to discuss. The implementation would be a little tricky to make conditional on the --no-save flag, since the click configuration is set up to always expect both an input and an output path. Happy to discuss more on GitHub if you want.
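For reference, a sketch of what the write-in-place option looks like via papermill’s Python API (the filename is a placeholder):

import papermill as pm

pm.execute_notebook(
    "test_file_name.ipynb",  # input notebook
    "test_file_name.ipynb",  # same path as output, so it overwrites in place
)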

Thanks for all the feedback. I left a comment on https://github.com/jupyter/nbclient/issues/4#issuecomment-648616654


This solution doesn’t work for me. My notebook (sample_nn_Master.ipynb) contains print statements, but when I run it via this command I don’t see any of them. The only output I get is…

$  jupyter nbconvert --to notebook --execute sample_nn_Master.ipynb 
[NbConvertApp] Converting notebook sample_nn_Master.ipynb to notebook
[NbConvertApp] Executing notebook with kernel: python3
[NbConvertApp] Writing 11971 bytes to sample_nn_Master.nbconvert.ipynb

I defined a bash alias/function like so:

nbrun() { jupyter nbconvert --to script "$1"; cat "${1%.*}".py | grep -v get_ipython > run_this.py; python3 run_this.py;}

and then I just run

$ nbrun mynotebook.ipynb

It’s not a full solution: the grep -v get_ipython strips out converted calls like !pip instead of actually running pip. But it’s getting me going for now.

The command you quoted executes the notebook and writes the results into a new notebook file. If you open sample_nn_Master.nbconvert.ipynb, you should now see the output of your print statements there. Running it from the command line just triggers execution of the notebook; the output produced inside the notebook doesn’t go to stdout in the terminal where you launched it.

If you want magics and sending shell commands via ! to work, why isn’t your bash function the following?

nbrun() { jupyter nbconvert --to script "$1"; mv "${1%.*}".py run_this.ipy; ipython run_this.ipy;}

More details on that suggestion, plus the Jupytext route from a notebook to running as Python/IPython

For those following along later who are maybe less bash-oriented, I’m going to break down the steps in my suggested edit of the bash function, and as part of those steps use Jupytext, which I posted about near the start of this thread since it is a nice tool to have in your toolbox. Jupytext does some closely related steps and adds some convenience if you want the produced script to automatically look cleaner and closer to a true Python script (in other words, none of the get_ipython() cruft).
If you want to run the notebook as a script, you can use jupytext to convert it to Python on the command line, as described here:

jupytext --to py notebook.ipynb 

Following that, run the script like normal Python: python <converted_notebook>.py. (Specifically, if you were using the example from jupytext where the notebook is named ‘notebook.ipynb’, the command to run the converted notebook as a script would be python notebook.py.) Then, if you had print statements inside your original notebook, their output will go to stdout.

However, @drscotthawley raised the issue of magics and sending shell commands via !. Jupytext and IPython can help you there. (See below for why you should be using the pip magic and not !pip.) Jupytext includes an option not to comment out magics and lines beginning with exclamation points, via comment_magics=false. You can then rename the generated script to an IPython script by changing the extension to .ipy and run it with ipython instead of python.

So the steps on the command line all together would be:

jupytext --to py notebook.ipynb --opt comment_magics=false
mv notebook.py notebook.ipy
ipython notebook.ipy

Along the lines of @drscotthawley’s bash function above, you can substitute jupyter nbconvert --to script for the Jupytext line.


The best practice for running pip install and conda install in notebooks is now to use the magics, so it should be %pip and not !pip. With conda, the best practice is now to use %conda install in a notebook. See here. Those are IPython magics, and so they work with IPython, too.
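For example, in a notebook cell (or in the generated .ipy file you run with ipython), installs would look like this; the package names are just examples:

%pip install requests    # preferred over !pip install requests
%conda install numpy     # preferred over !conda install numpy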


Thank you very much! One other thing I was confused about was that sometimes, instead of creating a .py file, jupyter nbconvert --to script would create a .txt file, ruining my scheme. With jupytext I don’t have this problem. Based on what you wrote, I’m settling on this:

nbrun() { jupytext --to py "$1";  mv  "${1%.*}".py run_this.ipy; ipython run_this.ipy;}


You could use nbtoolbelt: nbtb run -n nb.ipynb.