Cull/kill idle JupyterHub notebooks

We are using JupyterHub, with the kernel configured to submit code as PySpark jobs.
I would like to cull/kill idle notebooks after 30 minutes.

Please find below the pip list and the kernel.json of the Python 3 kernel.
I tried the two steps below, but neither culled/killed the notebooks; idle notebooks are still running.

  1. Set the properties below in jupyterhub_config.py (see the sketch after this list):
    c.NotebookApp.shutdown_no_activity_timeout = 900
    c.MappingKernelManager.cull_idle_timeout = 900
    c.MappingKernelManager.cull_connected = True

  2. Installed jupyterhub_idle_culler and set the config below in jupyterhub_config.py:
    import sys

    c.JupyterHub.services = [
        {
            'name': 'idle-culler',
            'admin': True,
            'command': [
                sys.executable,
                '-m', 'jupyterhub_idle_culler',
                '--timeout=1800',
            ],
        }
    ]
    Currently the only way I can kill the notebooks is with the "yarn application -kill" command.
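
For completeness, here is a minimal sketch of how I would expect the step 1 settings to be laid out if they are actually read by the single-user notebook server rather than by the hub. This is an assumption on my part (it may be part of what I am getting wrong); the 1800-second value matches the 30-minute target above:

    # jupyter_notebook_config.py -- assumption: this is read by each single-user
    # notebook server at startup, not by jupyterhub_config.py
    c = get_config()

    # Shut the whole single-user server down after 30 minutes with no activity
    c.NotebookApp.shutdown_no_activity_timeout = 1800

    # Cull kernels idle for 30 minutes, checking every 5 minutes,
    # even if a browser tab is still connected to them
    c.MappingKernelManager.cull_idle_timeout = 1800
    c.MappingKernelManager.cull_interval = 300
    c.MappingKernelManager.cull_connected = True
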

pip list entries related to JupyterHub:
jupyter 1.0.0
jupyter-client 6.1.12
jupyter-console 6.4.0
jupyter-contrib-core 0.3.3
jupyter-contrib-nbextensions 0.5.1
jupyter-core 4.7.1
jupyter-highlight-selected-word 0.2.0
jupyter-latex-envs 1.4.6
jupyter-nbextensions-configurator 0.4.1
jupyter-packaging 0.7.12
jupyter-server 1.4.1
jupyter-telemetry 0.1.0
jupyterhub 1.4.1
jupyterhub-idle-culler 1.1
jupyterhub-ldapauthenticator 1.2.2
jupyterlab 3.0.14
jupyterlab-pygments 0.1.2
jupyterlab-server 2.4.0
jupyterlab-widgets 1.0.0
notebook 6.4.0

kernel.json

{
  "argv": [
    "/usr/share/miniconda2/envs/py36/bin/python",
    "-m",
    "ipykernel",
    "-f",
    "{connection_file}"
  ],
  "env": {
    "SPARK_HOME": "/usr/hdp/current/spark2-client",
    "PYTHONPATH": "/usr/hdp/current/spark2-client/python:/usr/hdp/current/spark2-client/python/lib/py4j-0.10.7-src.zip",
    "PYTHONSTARTUP": "/usr/hdp/current/spark2-client/python/pyspark/shell.py",
    "PYSPARK_PYTHON": "/usr/share/miniconda2/envs/py36/bin/python3",
    "PYSPARK_SUBMIT_ARGS": "--master yarn --deploy-mode client --jars /usr/hdp/current/hive_warehouse_connector/hive-warehouse-connector-assembly-1.0.0.3.1.4.37-1.jar --py-files /usr/hdp/current/hive_warehouse_connector/pyspark_hwc-1.0.0.3.1.4.37-1.zip --conf spark.security.credentials.hiveserver2.enabled=false --conf spark.port.maxRetries=50 --queue analytics --executor-memory 1620m --num-executors 2 --driver-memory 2760m pyspark-shell"
  },
  "display_name": "Python 3",
  "language": "python"
}

What am I doing wrong? How can I cull/kill idle notebooks? Please assist.

Which is the correct way to cull idle kernels and notebook? - #7 by minrk has some information on how the cullers work.

You haven’t said what user interface you’re using. If it’s JupyterLab this may be relevant: Cull idle kernels · Issue #6893 · jupyterlab/jupyterlab · GitHub
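
One thing you could check is whether the single-user server is actually reporting those kernels as idle. Here is a minimal sketch (the server URL and API token are placeholders you would replace with your own) that lists each kernel's execution state and last activity:

    # List the kernels on a running single-user server and show how the server
    # sees them; the kernel culler only shuts down kernels reported as idle.
    import requests

    SERVER_URL = "http://localhost:8888"   # placeholder: your single-user server
    TOKEN = "<api-token>"                  # placeholder: a valid API token

    resp = requests.get(f"{SERVER_URL}/api/kernels",
                        headers={"Authorization": f"token {TOKEN}"})
    resp.raise_for_status()
    for kernel in resp.json():
        print(kernel["id"], kernel["execution_state"], kernel["last_activity"])

If a PySpark kernel shows up here as busy, the kernel culler will leave it alone regardless of the timeout.
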

Thanks for the reply. I am using JupyterHub with Jupyter Notebook as the UI.

If you can’t resolve the problem, could you try a basic Python notebook without using PySpark or any other external libraries, and see if it’s culled? That will help narrow down whether it’s a general problem or something specific to PySpark.
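
You could also ask the hub what it thinks the last activity was for each user, since that is the information jupyterhub_idle_culler acts on. A rough sketch, assuming you have an admin API token (both values below are placeholders):

    # Query the JupyterHub REST API for per-user last_activity,
    # which is what jupyterhub_idle_culler compares against its --timeout.
    import requests

    HUB_API = "http://localhost:8081/hub/api"  # placeholder: your hub API URL
    ADMIN_TOKEN = "<admin-api-token>"          # placeholder: an admin API token

    resp = requests.get(f"{HUB_API}/users",
                        headers={"Authorization": f"token {ADMIN_TOKEN}"})
    resp.raise_for_status()
    for user in resp.json():
        print(user["name"], user.get("last_activity"))

If last_activity keeps updating even while the notebook looks idle, that would explain why the culler never fires.
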