How to load AWS profiles into Jupyter PySpark or Spark sessions

I have Jupyter Notebook running in a local Docker container, started with the shell script below inside the container. I can access the Jupyter Notebook sessions (PySpark, Spark, or Python3); however, I would like to make my AWS profile (credentials) available to my Jupyter Notebook sessions.

How can I achieve this? I normally use export AWS_PROFILE=<profile name>, which provides the credentials for that terminal session so that I can run Spark jobs from the terminal, but I would like to do the same in my Jupyter Notebook sessions as well.

Is there a workaround to set the profile, or some other approach?

#!/bin/bash
# Start the Livy server in the background, then run Jupyter in the foreground.
nohup /home/livy/bin/livy-server &
/usr/local/bin/jupyter notebook --allow-root --NotebookApp.token='' --NotebookApp.password='' --no-browser --ip=0.0.0.0

You could try setting environment variables for all your AWS credentials on the Docker command line, something like:

docker run -e AWS_ACCESS_KEY_ID=<...> -e AWS_SECRET_ACCESS_KEY=<...> -e AWS_SESSION_TOKEN=<...> ...
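With those variables set, a quick way to confirm the credentials are visible from a notebook cell is an STS identity call. A minimal sketch, assuming boto3 is installed in the container; boto3 reads AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and AWS_SESSION_TOKEN straight from the environment, so no extra configuration is needed:

import boto3

# boto3 picks up the AWS_* variables from the environment automatically;
# if this prints an ARN, the credentials reached the container.
print(boto3.client("sts").get_caller_identity()["Arn"])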

or you could try mounting your ~/.aws directory into the container:

docker run -e AWS_PROFILE=<profile name> -v ~/.aws:/home/jovyan/.aws:ro ...

You may need to modify your Docker user or the permissions on your ~/.aws files to ensure they’re readable inside the Docker container.
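If the Spark session itself needs to read from S3 (s3a:// paths), one option is to resolve the profile with boto3 and hand the keys to the s3a connector. This is a rough sketch, assuming a locally launched PySpark kernel with pyspark, boto3, and the hadoop-aws s3a connector available (for Livy-managed sessions, the credentials would need to reach the Livy server's environment instead); the bucket path is a placeholder:

import boto3
from pyspark.sql import SparkSession

# Resolve whatever credentials AWS_PROFILE points at. This works for
# both the env-var approach and the mounted ~/.aws approach above.
creds = boto3.Session().get_credentials().get_frozen_credentials()

spark = (
    SparkSession.builder.appName("aws-profile-demo")
    .config("spark.hadoop.fs.s3a.access.key", creds.access_key)
    .config("spark.hadoop.fs.s3a.secret.key", creds.secret_key)
    .getOrCreate()
)

# Placeholder path; if the profile yields temporary (STS) credentials,
# also set spark.hadoop.fs.s3a.session.token to creds.token.
df = spark.read.csv("s3a://my-bucket/some/path/")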