Running multiple users with PySpark contexts

Hello All,

I'm setting up a cluster with 1 master and 2 worker nodes on OpenStack CentOS 7 instances, running Anaconda (Python 3.8) and PySpark 3.1.2.
If I create multiple users and each of them creates a Spark context, the contexts seem to stall. I hit the same stall if I open a second notebook. Any ideas?
Do I need to install Livy / SparkMagic?
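Not the original poster, but one common cause worth ruling out (assuming a Spark standalone cluster): by default `spark.cores.max` is unset, so the first application grabs every core in the cluster, and any context started afterwards sits in the WAITING state until the first one releases resources, which looks exactly like a stall. A sketch of a cap, with hypothetical values you'd tune to your cluster:

```
# conf/spark-defaults.conf -- illustrative values, adjust for your nodes
spark.cores.max        2
spark.executor.memory  2g
```

With per-application limits like these, several users (or notebooks) can hold contexts concurrently instead of queueing behind the first one. You can check the Spark master web UI (port 8080 by default) to see whether the second application is listed as WAITING, which would confirm this is resource contention rather than a genuine hang.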

Since no one's replied, it might be worth asking on a PySpark forum.