Keep Jupyter Notebook (dockerHDDM) running without browser

I am using Jupyter Notebook to run dockerHDDM. I am very new to Jupyter Notebook and computational modeling. Running the code and fitting the model is taking quite some time and I am wondering if it is possible to keep the process running when closing the browser or if there is an alternative to using the browser to run the code.

Any help is very appreciated!

The answer to that is ‘yes’. Traditionally, code lives in plain text files and you point the language’s interpreter at them; that is at least how code is classically run.
I see from looking at this notebook that you are using Python, and so that gives us some specifics to work with. (When posting at forums like this, please keep in mind that Jupyter runs many languages depending on the kernel used, so it is helpful to at least mention the language in your post to help those trying to help you.)

And so you can put the code in a text file, give it a .py extension, and call it with Python from the command line. (If you are using some of the IPython conveniences in your code, you can do similar by leaving them in and calling the file with ipython. If you are scaling anything like that up, though, you will probably benefit greatly from converting it to pure Python.) If you run that script in an environment where you have already set things up, it will continue to run, and you can monitor it with tools like ps.
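As a minimal sketch of that workflow (the file name `fit_model.py` and its one-line body are placeholders standing in for your actual model-fitting code):

```shell
# Stand-in for your real script: in practice you would paste the
# code from your notebook cells into this file instead.
cat > fit_model.py <<'EOF'
print("model fit finished")
EOF

# nohup keeps the process alive even if the terminal closes;
# stdout/stderr are redirected to fit.log so you can check on it later.
nohup python3 fit_model.py > fit.log 2>&1 &

wait          # only for this demo: block until the background job finishes
cat fit.log   # inspect the logged output
```

While the job runs you can check it from another terminal with something like `ps aux | grep fit_model`.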
Usually, though, if you already know the job will take a long time, you would start a screen session before launching it. Normally, if you start a process in a terminal, it is killed the instant you close that terminal; if you are inside a screen session, you can detach, safely close the terminal, and later reattach to monitor the job or see the results. This gives you a lot of flexibility for long-running processes. You can do similar with tmux or other, more modern screen alternatives.
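Sketched out, the screen workflow looks like this (the session name `hddm-fit` and script name `fit_model.py` are just examples; these are interactive commands typed one at a time, not a script to paste in wholesale, and they assume `screen` is installed):

```shell
# Start a named screen session:
screen -S hddm-fit

# Inside the session, launch the long-running job:
python3 fit_model.py

# Detach with Ctrl-a then d; the job keeps running, and you can
# now close the terminal (or your SSH connection) without killing it.

# Later, list sessions and reattach to monitor or collect results:
screen -ls
screen -r hddm-fit
```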

You can combine or swap in these steps on machines with more power, perhaps remotely or in the cloud. Depending on your affiliation, some institutions have partnerships with computing facilities. (I’d offer more guidance, but it varies by country and I only know a fraction of what is available in the U.S.A.) In fact, if you have access to more powerful machines, running Jupyter there may be all you need to do, although you’ll benefit from having multiple options available.

If you search around about ‘long-running computation’ in relation to Jupyter, you’ll find related guidance. There are options, such as those offered here, if you need or want to stay with Jupyter. Additionally, here is an option addressing executing Jupyter notebooks from the command line. (I myself like Jupytext for executing notebooks from the command line, separate from the browser; see here for further discussion of that, with an available example.)
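For instance, assuming your notebook is named `analysis.ipynb` (a placeholder), two common command-line routes are nbconvert, which ships with Jupyter, and Jupytext, which is installed separately:

```shell
# Re-run the notebook top to bottom with no browser involved;
# --inplace writes the executed cell outputs back into the same file.
jupyter nbconvert --to notebook --execute --inplace analysis.ipynb

# Or convert the notebook to a plain Python script with Jupytext
# (pip install jupytext) and run that instead:
jupytext --to py analysis.ipynb   # writes analysis.py
python3 analysis.py
```

Either of these can then be combined with the nohup or screen approaches above for long runs.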

I suspect the developers of this package are offering the tutorials as Jupyter .ipynb files because notebooks offer a nice literate computing environment where formatted text documentation and code sit alongside each other and the text is easily read. Jupyter is also meant to emphasize the ‘interactive’; in fact, it grew out of the ‘Interactive Python (IPython)’ and ‘IPython Notebook’ projects, so in cases where the code runs reasonably fast you can continue on to interact with the results and make nice plots, for example. These conveniences, though, can and sometimes should be traded away when you need to run long calculations or scale up to processing many jobs.