Build fails while collecting package metadata

I’ve been following the contributing instructions to get BinderHub running with minikube, but I run into a failure during the build, right after this log output:

+ conda env create -p /srv/conda/envs/notebook -f /tmp/environment.yml
Collecting package metadata (repodata.json): ...working...

With debugging enabled I see a 404:

[D 191203 15:39:23 rest:219] response body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"pods \"build-xxxxxxxx-e8225e-d93d43\" not found","reason":"NotFound","details":{"name":"build-xxxxxxxx-e8225e-d93d43","kind":"pods"},"code":404}

I encounter the same problem on macOS and CentOS, so I’m clearly doing something wrong in both places. Is there some step here that I am missing? Many thanks for your help.

Could you give a bit more detail on where the second bit of log output came from? I suspect it is from the binderhub pod, but I’m not sure.

What is it that fails in your build? What repository are you trying to build?

+ conda env create -p /srv/conda/envs/notebook -f /tmp/environment.yml
Collecting package metadata (repodata.json): ...working...

by itself doesn’t look wrong; the build should print some kind of error after this that points towards what exactly isn’t working.
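
If you want to poke at the build pod directly while a build is running, something along these lines should work (just a rough sketch; it assumes the binder-test namespace and the component=binderhub-build label that show up in your debug output, and <build-pod-name> is whatever the first command prints):

# list build pods in the test namespace
kubectl get pods -n binder-test -l component=binderhub-build
# stream the repo2docker logs of a specific build pod
kubectl logs -n binder-test -f <build-pod-name>
# if the pod disappears (like the 404 in your debug output suggests), recent events often say why
kubectl get events -n binder-test --sort-by=.lastTimestamp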

I think there’s something more fundamentally wrong with my installation. When I run pytest -svx -m "not auth_test" I get an error about JupyterHub’s availability, i.e. JupyterHub available at http://192.168.99.106:30123 ERROR

I was able to get JupyterHub running fine using the Zero to JupyterHub instructions (and using minikube tunnel to handle the LoadBalancer), but it seems like JupyterHub is not accessible when I try to follow the binderhub/minikube testing instructions.

OK, I never see an EXTERNAL-IP when running kubectl get svc proxy-public -n binder-test. That seems like a problem. Any ideas on how to fix it? I’m running minikube on macOS.

Can you create a short step-by-step list of what you did since downloading minikube?

For me, following the instructions “just works”. I never used minikube tunnel, so maybe the fact that you think you need it points towards a difference in how minikube is configured? In addition to the exact step-by-step commands you run, another thing to check is what minikube ip says. Once JupyterHub is up, you should be able to reach it at http://<minikube ip>:30123 (open it in a browser and it should show some kind of Jupyter-themed error page).
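
A quick sanity check from a terminal would be something like the following (assuming the :30123 NodePort from the testing instructions; curl here is just a stand-in for opening the page in a browser):

# print the VM's IP
minikube ip
# the hub's login page should answer with something Jupyter-themed,
# not a connection refused or a timeout
curl -sI "http://$(minikube ip):30123/hub/login"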

Thank you for responding. After adding c.BinderHub.debug = True to binderhub_config.py, I followed these steps:

curl https://raw.githubusercontent.com/kubernetes/helm/master/scripts/get | bash
helm init
helm repo add jupyterhub https://jupyterhub.github.io/helm-chart/
helm repo update
python3 -m pip install -e . -r dev-requirements.txt
./testing/minikube/install-hub
eval $(minikube docker-env)
python3 -m binderhub -f testing/minikube/binderhub_config.py

When I visit http://<minikube ip>:30123 I get redirected to http://<minikube ip>:30123/hub/login and receive a Jupyter-themed “403 : Forbidden”.

Hi all, I seem to be bumping into a similar issue…

Here is the (rather long) log of running pytest -svx -m "not auth_test":

Any hints?

(binderhub) romain@plume:~/inria/binderhub$ pytest -svx -m "not auth_test"
============================================================================================================ test session starts =============================================================================================================
platform linux -- Python 3.8.1, pytest-5.3.2, py-1.8.1, pluggy-0.13.1 -- /home/romain/miniconda3/envs/binderhub/bin/python3
cachedir: .pytest_cache
rootdir: /home/romain/inria/binderhub
plugins: cov-2.8.1, asyncio-0.10.0
collected 68 items / 3 deselected / 65 selected                                                                                                                                                                                              

binderhub/tests/test_app.py::test_help PASSED
binderhub/tests/test_app.py::test_help_all PASSED
binderhub/tests/test_build.py::test_build[gh/binderhub-ci-repos/requirements/d687a7f9e6946ab01ef2baa7bd6d5b73c6e904fd] JupyterHub available at http://192.168.39.130:30123
Waiting for build to start...
Picked Git content provider.
Cloning into '/tmp/repo2dockerzfru3_n6'...
HEAD is now at d687a7f lighten default requirements.txt
Building conda environment for python=3.7Using PythonBuildPack builder
Building conda environment for python=3.7Building conda environment for python=3.7Step 1/51 : FROM buildpack-deps:bionic
 ---> 84f9345349b1
Step 2/51 : ENV DEBIAN_FRONTEND=noninteractive
 ---> Using cache
 ---> 40a2d5db1977
Step 3/51 : RUN apt-get -qq update &&     apt-get -qq install --yes --no-install-recommends locales > /dev/null &&     apt-get -qq purge &&     apt-get -qq clean &&     rm -rf /var/lib/apt/lists/*
 ---> Using cache
 ---> c2e5f09e89d6
Step 4/51 : RUN echo "en_US.UTF-8 UTF-8" > /etc/locale.gen &&     locale-gen
 ---> Using cache
 ---> d847ac357778
Step 5/51 : ENV LC_ALL en_US.UTF-8
 ---> Using cache
 ---> 0db893527001
Step 6/51 : ENV LANG en_US.UTF-8
 ---> Using cache
 ---> 711cc499ab7c
Step 7/51 : ENV LANGUAGE en_US.UTF-8
 ---> Using cache
 ---> 80ea136844bf
Step 8/51 : ENV SHELL /bin/bash
 ---> Using cache
 ---> 173e1d2ae213
Step 9/51 : ARG NB_USER
 ---> Using cache
 ---> 4670563e9391
Step 10/51 : ARG NB_UID
 ---> Using cache
 ---> ded23be936dd
Step 11/51 : ENV USER ${NB_USER}
 ---> Using cache
 ---> 60e1435647e0
Step 12/51 : ENV HOME /home/${NB_USER}
 ---> Using cache
 ---> 09ae90dfa8aa
Step 13/51 : RUN adduser --disabled-password     --gecos "Default user"     --uid ${NB_UID}     ${NB_USER}
 ---> Using cache
 ---> eb9096d270cc
Step 14/51 : RUN wget --quiet -O - https://deb.nodesource.com/gpgkey/nodesource.gpg.key |  apt-key add - &&     DISTRO="bionic" &&     echo "deb https://deb.nodesource.com/node_10.x $DISTRO main" >> /etc/apt/sources.list.d/nodesource.list &&     echo "deb-src https://deb.nodesource.com/node_10.x $DISTRO main" >> /etc/apt/sources.list.d/nodesource.list
 ---> Using cache
 ---> 15b598ce7dc8
Step 15/51 : RUN apt-get -qq update &&     apt-get -qq install --yes --no-install-recommends        less        nodejs        unzip        > /dev/null &&     apt-get -qq purge &&     apt-get -qq clean &&     rm -rf /var/lib/apt/lists/*
 ---> Using cache
 ---> 33c840fb36d0
Step 16/51 : EXPOSE 8888
 ---> Using cache
 ---> 5cb53ed2674c
Step 17/51 : ENV APP_BASE /srv
 ---> Using cache
 ---> dbb9aa519008
Step 18/51 : ENV NPM_DIR ${APP_BASE}/npm
 ---> Using cache
 ---> aae28f7349e8
Step 19/51 : ENV NPM_CONFIG_GLOBALCONFIG ${NPM_DIR}/npmrc
 ---> Using cache
 ---> 9d45c2139fe8
Step 20/51 : ENV CONDA_DIR ${APP_BASE}/conda
 ---> Using cache
 ---> 254b1ecd5cc1
Step 21/51 : ENV NB_PYTHON_PREFIX ${CONDA_DIR}/envs/notebook
 ---> Using cache
 ---> 6c27ebb14d99
Step 22/51 : ENV KERNEL_PYTHON_PREFIX ${NB_PYTHON_PREFIX}
 ---> Using cache
 ---> c747ad77dc6e
Step 23/51 : ENV PATH ${NB_PYTHON_PREFIX}/bin:${CONDA_DIR}/bin:${NPM_DIR}/bin:${PATH}
 ---> Using cache
 ---> 8e7a5c6b70d2
Step 24/51 : COPY conda/activate-conda.sh /etc/profile.d/activate-conda.sh
 ---> Using cache
 ---> c815acb575e0
Step 25/51 : COPY conda/environment.py-3.7.frozen.yml /tmp/environment.yml
 ---> Using cache
 ---> a8d1bca9944f
Step 26/51 : COPY conda/install-miniconda.bash /tmp/install-miniconda.bash
 ---> Using cache
 ---> a274d85f90c5
Step 27/51 : RUN mkdir -p ${NPM_DIR} && chown -R ${NB_USER}:${NB_USER} ${NPM_DIR}
 ---> Using cache
 ---> d02f7e63f3e6
Step 28/51 : USER ${NB_USER}
 ---> Using cache
 ---> d154cb01bb92
Step 29/51 : RUN npm config --global set prefix ${NPM_DIR}
 ---> Using cache
 ---> 3816b2486c49
Step 30/51 : USER root
 ---> Using cache
 ---> cf0bc3130e5a
Step 31/51 : RUN bash /tmp/install-miniconda.bash && rm /tmp/install-miniconda.bash /tmp/environment.yml
 ---> Running in a9ef642cac53
++ dirname /tmp/install-miniconda.bash
+ cd /tmp
+ MINICONDA_VERSION=4.6.14
+ CONDA_VERSION=4.7.10
+ MD5SUM=718259965f234088d785cad1fbd7de03
+ URL=https://repo.continuum.io/miniconda/Miniconda3-4.6.14-Linux-x86_64.sh
+ INSTALLER_PATH=/tmp/miniconda-installer.sh
+ unset HOME
+ wget --quiet https://repo.continuum.io/miniconda/Miniconda3-4.6.14-Linux-x86_64.sh -O /tmp/miniconda-installer.sh
+ chmod +x /tmp/miniconda-installer.sh
+ md5sum --quiet -c -
+ echo '718259965f234088d785cad1fbd7de03  /tmp/miniconda-installer.sh'
+ bash /tmp/miniconda-installer.sh -b -p /srv/conda
PREFIX=/srv/conda
installing: python-3.7.3-h0371630_0 ...
Python 3.7.3
installing: ca-certificates-2019.1.23-0 ...
installing: libgcc-ng-8.2.0-hdf63c60_1 ...
installing: libstdcxx-ng-8.2.0-hdf63c60_1 ...
installing: libffi-3.2.1-hd88cf55_4 ...
installing: ncurses-6.1-he6710b0_1 ...
installing: openssl-1.1.1b-h7b6447c_1 ...
installing: xz-5.2.4-h14c3975_4 ...
installing: yaml-0.1.7-had09818_2 ...
installing: zlib-1.2.11-h7b6447c_3 ...
installing: libedit-3.1.20181209-hc058e9b_0 ...
installing: readline-7.0-h7b6447c_5 ...
installing: tk-8.6.8-hbc83047_0 ...
installing: sqlite-3.27.2-h7b6447c_0 ...
installing: asn1crypto-0.24.0-py37_0 ...
installing: certifi-2019.3.9-py37_0 ...
installing: chardet-3.0.4-py37_1 ...
installing: idna-2.8-py37_0 ...
installing: pycosat-0.6.3-py37h14c3975_0 ...
installing: pycparser-2.19-py37_0 ...
installing: pysocks-1.6.8-py37_0 ...
installing: ruamel_yaml-0.15.46-py37h14c3975_0 ...
installing: six-1.12.0-py37_0 ...
installing: cffi-1.12.2-py37h2e261b9_1 ...
installing: setuptools-41.0.0-py37_0 ...
installing: cryptography-2.6.1-py37h1ba5d50_0 ...
installing: wheel-0.33.1-py37_0 ...
installing: pip-19.0.3-py37_0 ...
installing: pyopenssl-19.0.0-py37_0 ...
installing: urllib3-1.24.1-py37_0 ...
installing: requests-2.21.0-py37_0 ...
installing: conda-4.6.14-py37_0 ...
installation finished.
+ export PATH=/srv/conda/bin:/srv/conda/envs/notebook/bin:/srv/conda/bin:/srv/npm/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
+ PATH=/srv/conda/bin:/srv/conda/envs/notebook/bin:/srv/conda/bin:/srv/npm/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
+ conda config --system --add channels conda-forge
+ conda config --system --set auto_update_conda false
+ conda config --system --set show_channel_urls true
+ echo 'update_dependencies: false'
+ [[ 4.7.10 != \4\.\6\.\1\4 ]]
+ conda install -yq conda==4.7.10
Collecting package metadata: ...working... FAILED

================================================================================================================== FAILURES ==================================================================================================================
__________________________________________________________________________ test_build[gh/binderhub-ci-repos/requirements/d687a7f9e6946ab01ef2baa7bd6d5b73c6e904fd] ___________________________________________________________________________

app = <binderhub.app.BinderHub object at 0x7fd82b692a90>, needs_build = None, needs_launch = None, always_build = None, slug = 'gh/binderhub-ci-repos/requirements/d687a7f9e6946ab01ef2baa7bd6d5b73c6e904fd'
pytestconfig = <_pytest.config.Config object at 0x7fd82f612610>

    @pytest.mark.asyncio(timeout=900)
    @pytest.mark.parametrize("slug", [
        "gh/binderhub-ci-repos/requirements/d687a7f9e6946ab01ef2baa7bd6d5b73c6e904fd",
        "git/{}/d687a7f9e6946ab01ef2baa7bd6d5b73c6e904fd".format(
            quote("https://github.com/binderhub-ci-repos/requirements", safe='')
        ),
        "git/{}/master".format(
            quote("https://github.com/binderhub-ci-repos/requirements", safe='')
        ),
        "gl/minrk%2Fbinderhub-ci/0d4a217d40660efaa58761d8c6084e7cf5453cca",
    ])
    @pytest.mark.remote
    async def test_build(app, needs_build, needs_launch, always_build, slug, pytestconfig):
        # can't use mark.github_api since only some tests here use GitHub
        if slug.startswith('gh/') and "not github_api" in pytestconfig.getoption('markexpr'):
            pytest.skip("Skipping GitHub API test")
        build_url = f"{app.url}/build/{slug}"
        r = await async_requests.get(build_url, stream=True)
        r.raise_for_status()
        events = []
        async for line in async_requests.iter_lines(r):
            line = line.decode('utf8', 'replace')
            if line.startswith('data:'):
                event = json.loads(line.split(':', 1)[1])
                events.append(event)
>               assert 'message' in event
E               AssertionError: assert 'message' in {'phase': 'Failed'}

binderhub/tests/test_build.py:43: AssertionError
------------------------------------------------------------------------------------------------------------- Captured log setup -------------------------------------------------------------------------------------------------------------
INFO     tornado.application:app.py:691 BinderHub starting on port 7200
------------------------------------------------------------------------------------------------------------- Captured log call --------------------------------------------------------------------------------------------------------------
DEBUG    traitlets:repoproviders.py:724 Fetching https://api.github.com/repos/binderhub-ci-repos/requirements/commits/d687a7f9e6946ab01ef2baa7bd6d5b73c6e904fd
DEBUG    traitlets:repoproviders.py:709 GitHub rate limit remaining 4983/5000. Reset in -646 days, 21:58:51.
DEBUG    docker.utils.config:config.py:21 Trying paths: ['/home/romain/.docker/config.json', '/home/romain/.dockercfg']
DEBUG    docker.utils.config:config.py:25 Found file at path: /home/romain/.docker/config.json
DEBUG    docker.auth:auth.py:197 Couldn't find auth-related section ; attempting to interpret as auth-only file
DEBUG    docker.auth:auth.py:134 Auth data for auths is absent. Client might be using a credentials store instead.
DEBUG    docker.auth:auth.py:134 Auth data for HttpHeaders is absent. Client might be using a credentials store instead.
DEBUG    kubernetes.client.rest:rest.py:219 response body: {"kind":"PodList","apiVersion":"v1","metadata":{"selfLink":"/api/v1/namespaces/binder-test/pods","resourceVersion":"4733"},"items":[]}

DEBUG    tornado.application:build.py:100 0 build pods
DEBUG    tornado.application:build.py:147 Build phase summary: {}
DEBUG    kubernetes.client.rest:rest.py:219 response body: {"kind":"PodList","apiVersion":"v1","metadata":{"selfLink":"/api/v1/namespaces/binder-test/pods","resourceVersion":"4733"},"items":[]}

DEBUG    kubernetes.client.rest:rest.py:219 response body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"build-test-2dadb309f1f4c3c6a6d9b398536f8780f4-2db-8908ff-d687a7","namespace":"binder-test","selfLink":"/api/v1/namespaces/binder-test/pods/build-test-2dadb309f1f4c3c6a6d9b398536f8780f4-2db-8908ff-d687a7","uid":"c854ef05-de8b-40fb-88d9-9fadb408898a","resourceVersion":"4734","creationTimestamp":"2020-01-16T16:38:13Z","labels":{"component":"binderhub-build","name":"build-test-2dadb309f1f4c3c6a6d9b398536f8780f4-2db-8908ff-d687a7"},"annotations":{"binder-repo":"https://github.com/binderhub-ci-repos/requirements"}},"spec":{"volumes":[{"name":"docker-socket","hostPath":{"path":"/var/run/docker.sock","type":"Socket"}},{"name":"default-token-6ksk8","secret":{"secretName":"default-token-6ksk8","defaultMode":420}}],"containers":[{"name":"builder","image":"jupyter/repo2docker:0.10.0","args":["jupyter-repo2docker","--ref","d687a7f9e6946ab01ef2baa7bd6d5b73c6e904fd","--image","test-2dadb309f1f4c3c6a6d9b398536f8780f4-2dbinderhub-2dci-2drepos-2drequirements-8908ff:d687a7f9e6946ab01ef2baa7bd6d5b73c6e904fd","--no-clean","--no-run","--json-logs","--user-name","jovyan","--user-id","1000","https://github.com/binderhub-ci-repos/requirements"],"resources":{"limits":{"memory":"0"},"requests":{"memory":"0"}},"volumeMounts":[{"name":"docker-socket","mountPath":"/var/run/docker.sock"},{"name":"default-token-6ksk8","readOnly":true,"mountPath":"/var/run/secrets/kubernetes.io/serviceaccount"}],"terminationMessagePath":"/dev/termination-log","terminationMessagePolicy":"File","imagePullPolicy":"IfNotPresent"}],"restartPolicy":"Never","terminationGracePeriodSeconds":30,"dnsPolicy":"ClusterFirst","serviceAccountName":"default","serviceAccount":"default","securityContext":{},"affinity":{"podAntiAffinity":{"preferredDuringSchedulingIgnoredDuringExecution":[{"weight":100,"podAffinityTerm":{"labelSelector":{"matchLabels":{"component":"binderhub-build"}},"topologyKey":"kubernetes.io/hostname"}}]}},"schedulerName":"default-scheduler","tolerations":[{"key":"hub.jupyter.org/dedicated","operator":"Equal","value":"user","effect":"NoSchedule"},{"key":"hub.jupyter.org_dedicated","operator":"Equal","value":"user","effect":"NoSchedule"},{"key":"node.kubernetes.io/not-ready","operator":"Exists","effect":"NoExecute","tolerationSeconds":300},{"key":"node.kubernetes.io/unreachable","operator":"Exists","effect":"NoExecute","tolerationSeconds":300}],"priority":0,"enableServiceLinks":true},"status":{"phase":"Pending","qosClass":"BestEffort"}}

INFO     tornado.application:build.py:296 Started build build-test-2dadb309f1f4c3c6a6d9b398536f8780f4-2db-8908ff-d687a7
INFO     tornado.application:build.py:298 Watching build pod build-test-2dadb309f1f4c3c6a6d9b398536f8780f4-2db-8908ff-d687a7
INFO     tornado.application:build.py:329 Watching logs of build-test-2dadb309f1f4c3c6a6d9b398536f8780f4-2db-8908ff-d687a7
INFO     tornado.application:build.py:357 Finished streaming logs of build-test-2dadb309f1f4c3c6a6d9b398536f8780f4-2db-8908ff-d687a7
----------------------------------------------------------------------------------------------------------- Captured log teardown ------------------------------------------------------------------------------------------------------------
ERROR    asyncio:base_events.py:1703 Task was destroyed but it is pending!
task: <Task pending name='Task-7' coro=<<async_generator_athrow without __name__>()>>
============================================================================================================== warnings summary ==============================================================================================================
/home/romain/miniconda3/envs/binderhub/lib/python3.8/site-packages/_pytest/mark/structures.py:323
  /home/romain/miniconda3/envs/binderhub/lib/python3.8/site-packages/_pytest/mark/structures.py:323: PytestUnknownMarkWarning: Unknown pytest.mark.auth_test - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/latest/mark.html
    warnings.warn(

/home/romain/miniconda3/envs/binderhub/lib/python3.8/site-packages/_pytest/mark/structures.py:323
  /home/romain/miniconda3/envs/binderhub/lib/python3.8/site-packages/_pytest/mark/structures.py:323: PytestUnknownMarkWarning: Unknown pytest.mark.remote - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/latest/mark.html
    warnings.warn(

/home/romain/miniconda3/envs/binderhub/lib/python3.8/site-packages/html5lib/_trie/_base.py:3
  /home/romain/miniconda3/envs/binderhub/lib/python3.8/site-packages/html5lib/_trie/_base.py:3: DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated since Python 3.3, and in 3.9 it will stop working
    from collections import Mapping

/home/romain/miniconda3/envs/binderhub/lib/python3.8/site-packages/_pytest/mark/structures.py:323
  /home/romain/miniconda3/envs/binderhub/lib/python3.8/site-packages/_pytest/mark/structures.py:323: PytestUnknownMarkWarning: Unknown pytest.mark.github_api - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/latest/mark.html
    warnings.warn(

binderhub/tests/test_build.py::test_build[gh/binderhub-ci-repos/requirements/d687a7f9e6946ab01ef2baa7bd6d5b73c6e904fd]
  /home/romain/miniconda3/envs/binderhub/lib/python3.8/site-packages/docker/utils/utils.py:283: DeprecationWarning: urllib.parse.splitnport() is deprecated as of 3.8, use urllib.parse.urlparse() instead
    host, port = splitnport(parsed_url.netloc)

-- Docs: https://docs.pytest.org/en/latest/warnings.html
Recorded http responses for api.github.com in http-record.api.github.com.json
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! stopping after 1 failures !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
=========================================================================================== 1 failed, 2 passed, 3 deselected, 5 warnings in 45.38s ===========================================================================================
Task was destroyed but it is pending!
task: <Task pending name='Task-1' coro=<BinderHub.watch_build_pods() running at /home/romain/inria/binderhub/binderhub/app.py:688> wait_for=<Future pending cb=[<TaskWakeupMethWrapper object at 0x7fd82b619fd0>()]>>
Task was destroyed but it is pending!
task: <Task pending name='Task-6' coro=<BuildHandler.keep_alive() running at /home/romain/inria/binderhub/binderhub/builder.py:91> wait_for=<Future pending cb=[<TaskWakeupMethWrapper object at 0x7fd82ad70b80>()]> cb=[IOLoop.add_future.<locals>.<lambda>() at /home/romain/miniconda3/envs/binderhub/lib/python3.8/site-packages/tornado/ioloop.py:690]>
Task was destroyed but it is pending!
task: <Task pending name='Task-3' coro=<HTTP1ServerConnection._server_request_loop() running at /home/romain/miniconda3/envs/binderhub/lib/python3.8/site-packages/tornado/http1connection.py:817> wait_for=<Future pending cb=[<TaskWakeupMethWrapper object at 0x7fd82ad78640>()]> cb=[IOLoop.add_future.<locals>.<lambda>() at /home/romain/miniconda3/envs/binderhub/lib/python3.8/site-packages/tornado/ioloop.py:690]>
Task was destroyed but it is pending!
task: <Task pending name='Task-4' coro=<RequestHandler._execute() running at /home/romain/miniconda3/envs/binderhub/lib/python3.8/site-packages/tornado/web.py:1699> wait_for=<Future pending cb=[<TaskWakeupMethWrapper object at 0x7fd82ad9fc40>()]> cb=[_HandlerDelegate.execute.<locals>.<lambda>() at /home/romain/miniconda3/envs/binderhub/lib/python3.8/site-packages/tornado/web.py:2329]>
(binderhub) romain@plume:~/inria/binderhub$

Hi @stakats, just to report that in my case (which seems similar to yours) the crash was apparently due to the minikube VM exhausting its memory (I believe by default it only allocates 2000 MB of main memory to the VM, at least on my setup).
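
In case it is useful to others: the memory setting is only honoured when the VM is created, so bumping it means recreating the VM, roughly like this (the exact numbers are just an example, adjust for your machine):

# --memory only takes effect when the VM is (re)created
minikube delete
minikube start --memory 4096 --cpus 2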

Yes, I believe you are correct! Thank you, @rprimet. I increased minikube’s memory allocation, and I no longer have this problem. @betatim, is it worth thinking about trying to catch this error so that people know what’s happening when the VM runs out of memory?