PySpark - Read file from Azure blob: error - not found

I'm not sure this is the correct forum; if not, please point me to where I should post it.
I am trying to read a file from Azure Blob Storage and am getting this error:

java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.azure.NativeAzureFileSystem$Secure not found

From what I found on the web, additional libraries need to be installed, so I tried:

pip install azure-mgmt-resource 
pip install azure-mgmt-datalake-store 
pip install azure-datalake-store

but the problem wasn't solved.
Other posts only describe how to include jars when submitting a script (e.g. via spark-submit), but I guess that's not relevant for a Jupyter notebook on k8s.

Thank you in advance,

Setting the Java jars in the Spark config solved the problem!

import pyspark
from pyspark.sql import SparkSession

# Set up the configuration: pull in the hadoop-azure package and
# register the storage account access key under the Hadoop config key
conf = pyspark.SparkConf()
conf.set("spark.jars.packages", "org.apache.hadoop:hadoop-azure:3.3.4")
conf.set("spark.hadoop.fs.azure.account.key.<STORAGE_ACCOUNT>.blob.core.windows.net", "<TOKEN>")
spark = SparkSession.builder.config(conf=conf).getOrCreate()
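With the jars and account key configured, the blob can then be read through a wasbs:// URL. A minimal sketch, assuming the session above was created as `spark` and that `<CONTAINER>`, `<STORAGE_ACCOUNT>`, and the file path are placeholders for your own values:

```python
# Read a CSV file from the blob container via the wasbs:// scheme;
# <CONTAINER> and <STORAGE_ACCOUNT> are placeholders, not real names.
df = spark.read.csv(
    "wasbs://<CONTAINER>@<STORAGE_ACCOUNT>.blob.core.windows.net/path/to/file.csv",
    header=True,
)
df.show()
```

This requires live Azure credentials and a running Spark session, so it can't be executed standalone.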