I spin up a custom mybinder.org instance from a Docker image. Inside the notebook, I launch a custom background process that the notebook communicates with over the network.
For some reason, after a short period of activity, the instance terminates with “A connection to the Jupyter server could not be established”, even though memory usage stays well within limits (only about 280 MB used). While the notebook is active, everything works just as I expect.
This does not happen when I launch the same image locally using jupyter-repo2docker.
So the instance terminates while I’m actively interacting with the notebook, usually within a couple of minutes of starting it.
If you are using a typical notebook that isn’t demanding resource-wise, that shouldn’t be happening: you should keep a good connection and a working notebook for much longer than several minutes while actively using it. The easiest way to see the user experience you should expect is to go here, launch a session with the minimal Dockerfile example, and work in a notebook. If launches from your repo don’t behave similarly, there is an issue.

In the first line of your original post you describe doing some complex stuff. I suspect that is your problem, combined with the fact that you are using the discouraged Dockerfile approach, which is meant as a last resort, rather than requirements.txt / environment.yml. You are probably exceeding resource limits without realizing it, or getting shut down by security measures meant to stop people abusing the system.
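For reference, here is the kind of minimal setup that avoids a Dockerfile entirely: a requirements.txt at the root of the repo, which repo2docker picks up automatically. The package names and versions below are placeholders, not taken from your repo; list and pin whatever your notebook actually imports.

```text
# requirements.txt at the root of the Binder repo
# (hypothetical example packages; pin the versions you actually need)
numpy==1.26.4
matplotlib==3.8.4
```

If you need conda packages or a specific Python version, an environment.yml serves the same role; either file lets Binder build the image for you instead of you maintaining a Dockerfile.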
Issues with permissions are discussed in the minimal Dockerfile example repo. I’d suggest reviewing what it says and working backwards from the minimal example code until new, empty notebooks work in sessions launched from your repo. Fixing that may also help you sort out the connection loss you’re seeing in sessions.
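If you do stick with a Dockerfile, the usual permissions pitfall is running as root or leaving the repo contents owned by root. A sketch along the lines of what the minimal Dockerfile example and the repo2docker docs describe (the base image and installed packages here are placeholders, not your setup):

```dockerfile
FROM python:3.11-slim

RUN pip install --no-cache-dir notebook

# Binder runs the container as a non-root user whose UID is
# passed in at build time, so create a matching user.
ARG NB_USER=jovyan
ARG NB_UID=1000
ENV USER=${NB_USER}
ENV HOME=/home/${NB_USER}

RUN adduser --disabled-password --gecos "Default user" \
    --uid ${NB_UID} ${NB_USER}

# Copy the repo contents in and make sure that user owns them,
# otherwise notebooks can't be saved and the server may misbehave.
COPY . ${HOME}
RUN chown -R ${NB_UID} ${HOME}

USER ${NB_USER}
WORKDIR ${HOME}
```

The key points are the non-root `USER` directive and the `chown` of `$HOME`; without them, sessions can launch but fail in confusing ways.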