Jupyter notebook with a Spark Scala notebook via the spylon kernel

I used to have a clean, working install of the spylon kernel, but after upgrading to a clean Ubuntu 18.04 I am struggling to get it working again. As the screenshot shows, the spark-shell quits right after it starts (note the repeated "scala> :quit"), and this keeps happening over and over, indefinitely.

I tried installing the spylon kernel with both pip and conda, but it simply isn't working. Here is the content of the kernel.json file. Are there any specific logs elsewhere that I can look at to help fix this issue?

{
  "display_name": "sparky",
  "language": "scala",
  "argv": [
    "/home/ops/Documents/spark-3.0.0-preview2-bin-hadoop2.7/bin/spark-shell",
    "{connection_file}"
  ],
  "env": {
    "SPARK_HOME": "/home/ops/Documents/spark-3.0.0-preview2-bin-hadoop2.7/"
  }
}
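For comparison: as far as I understand it, the kernelspec that `python -m spylon_kernel install` generates does not point argv at spark-shell directly. It launches the spylon kernel process, which speaks the Jupyter kernel protocol and starts Spark internally; spark-shell itself does not understand the connection file Jupyter passes it, which would be consistent with it exiting immediately. A sketch of such a kernel.json (display name and SPARK_HOME path here are assumptions, adapted to the paths above):

```json
{
  "display_name": "spylon-kernel",
  "language": "scala",
  "argv": [
    "python",
    "-m",
    "spylon_kernel",
    "-f",
    "{connection_file}"
  ],
  "env": {
    "SPARK_HOME": "/home/ops/Documents/spark-3.0.0-preview2-bin-hadoop2.7/"
  }
}
```

Note that standard JSON also requires straight double quotes; curly quotes (which forum software often substitutes on paste) will make the kernelspec unparseable.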

ops@ops-XPS-15-9560:~/.ipython/kernels$ jupyter kernelspec list
Available kernels:
python2 /home/ops/.local/share/jupyter/kernels/python2
python3 /home/ops/.local/share/jupyter/kernels/python3
spylon_kernel /home/ops/.local/share/jupyter/kernels/spylon_kernel
ops@ops-XPS-15-9560:~/.ipython/kernels$

So the installation looks fine, and spark-shell also runs fine on its own.
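As for logs: a couple of generic Jupyter-side checks that usually surface kernel-launch problems (run on the notebook server machine; the kernelspec path is the one from the listing above):

```shell
# Start the notebook server with verbose logging to see exactly
# how the kernel process is being launched
jupyter notebook --debug

# Try the kernel outside the notebook UI; launch errors print
# straight to the terminal (requires the jupyter-console package)
jupyter console --kernel spylon_kernel

# Inspect the kernelspec Jupyter actually resolves
jupyter kernelspec list
cat /home/ops/.local/share/jupyter/kernels/spylon_kernel/kernel.json
```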

Hi Brook,
If you want to use a PySpark environment via JupyterHub, the link below might be helpful.

Regards,
Zain