User pods' connection to an external Spark cluster

I have JupyterHub deployed on Kubernetes, and I want to submit PySpark jobs to an external standalone Spark cluster (outside the Kubernetes cluster).

I can connect to the Spark cluster, but when the cluster needs to talk back to the user pod, the connection cannot be established: the driver advertises the internal pod IP, which of course is not reachable from outside the Kubernetes cluster.

```
24/12/09 10:47:39 WARN NettyRpcEnv: Ignored failure: java.io.IOException: Connecting to /10.42.1.138:2222 timed out (120000 ms)
```

To my eye, this means the user pods would need to be exposed as Kubernetes Services for such a connection to be established.

In any case, what are my options here?

One option I can see is KubeSpawner.services_enabled, which will create a Service for every user pod; a sketch of what I mean follows.