In my current Kubernetes (k8s) deployment of JupyterHub, users can choose between two kernels: Python 3 (ipykernel) and a PySpark kernel. In my custom spawner I set certain environment variables via `pod.spec.containers[0].env`, but they are only reflected in the Python 3 kernel, not in the PySpark kernel (I fetch the env vars with `os.environ`).
Are the environment variables being set in the kernel's context? And how can I set them so that certain env vars are common to all kernels?
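For context, here is a minimal sketch of how the spawner injects the variables. The function and variable names are illustrative, not the actual spawner code; it just shows the `pod.spec.containers[0].env` structure being modified:

```python
# Sketch: appending env vars to the first container of a pod spec,
# the way a custom spawner (or a modify_pod_hook) might do it.
# The dict mirrors the Kubernetes pod manifest structure.
def add_env_vars(pod, extra_env):
    """Append name/value pairs to pod.spec.containers[0].env."""
    env = pod["spec"]["containers"][0].setdefault("env", [])
    for name, value in extra_env.items():
        env.append({"name": name, "value": value})
    return pod

pod = {"spec": {"containers": [{"name": "notebook"}]}}
add_env_vars(pod, {"MY_VAR": "hello"})
# pod["spec"]["containers"][0]["env"] now contains
# [{"name": "MY_VAR", "value": "hello"}]
```

Variables set this way land in the container's environment, so any process started in that container (including kernels) should inherit them unless something overrides them later.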
Z2JH doesn’t do anything fancy with environment variables; it just sets them at the container level. Since they show up in ipykernel, it sounds like everything is configured correctly on the spawner side.
Have you checked the PySpark kernel's documentation for how it handles environment variables? Spark supports parallel/remote compute, and environment variables may differ across nodes, so the kernel may deliberately manage its own environment rather than inheriting the container's.
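One common culprit worth checking: a Jupyter kernelspec can define its own environment overrides in its `kernel.json` via the `env` field, and PySpark kernels often set Spark-related variables this way. A sketch of what such a kernelspec might look like (the paths and values below are illustrative, not from your setup):

```json
{
  "display_name": "PySpark",
  "language": "python",
  "argv": ["python", "-m", "ipykernel_launcher", "-f", "{connection_file}"],
  "env": {
    "SPARK_HOME": "/opt/spark",
    "PYSPARK_PYTHON": "python3"
  }
}
```

Anything listed under `env` here is applied to the kernel's process environment when it launches, which could explain why the PySpark kernel sees a different environment than ipykernel does. You can locate the installed kernelspecs with `jupyter kernelspec list` and inspect the PySpark one's `kernel.json` directly.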