Hi all,
My all-spark-notebook gets successfully spawned and I can work with spark in local mode.
Now I need to get this working in a more sophisticated setup as well.
We run a Spark cluster on Kubernetes. In this setup the all-spark-notebook needs to open some ports. During Spark context creation we specify the following options:
conf.set("spark.driver.port", "xxxx")
conf.set("spark.driver.blockManager.port", "yyyy")
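For completeness, here is roughly how the context creation looks in our notebook (a PySpark sketch; the master URL, driver host name, and port numbers are placeholders, not our actual values):

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .master("k8s://https://kubernetes.default.svc")        # placeholder master URL
    .config("spark.driver.host", "jupyter-meikel-driver")  # assumed: name executors resolve back to
    .config("spark.driver.port", "2222")                   # example fixed driver RPC port
    .config("spark.driver.blockManager.port", "7777")      # example fixed block manager port
    .getOrCreate()
)
```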
The Spark driver, running in the notebook, dynamically spawns Spark executors that have to connect back to their driver pod on the specified ports.
We thought that, besides the driver pod, we would also need corresponding Services to be created, e.g. named something like "jupyter--driver", that expose the configured ports. Is that the right approach?
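To illustrate what we have in mind, here is a rough sketch of such a (headless) Service created with the Python kubernetes client. The service name, namespace, pod labels, and ports are made up for this example and would have to match the actual singleuser notebook pod:

```python
from kubernetes import client, config

config.load_incluster_config()  # assumes this runs inside the cluster

# Hypothetical names and labels -- they must match the real notebook pod.
svc = client.V1Service(
    metadata=client.V1ObjectMeta(name="jupyter-meikel-driver"),
    spec=client.V1ServiceSpec(
        cluster_ip="None",  # headless: DNS resolves directly to the driver pod IP
        selector={"hub.jupyter.org/username": "meikel"},
        ports=[
            client.V1ServicePort(name="driver-rpc", port=2222, target_port=2222),
            client.V1ServicePort(name="blockmanager", port=7777, target_port=7777),
        ],
    ),
)
client.CoreV1Api().create_namespaced_service(namespace="jupyterhub", body=svc)
```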
Any idea on that would be very appreciated!
Thanks and all the best,
Meikel