IAM role not working for S3 storage of notebook code


The documentation for https://github.com/danielfrg/s3contents/ says:
It is also possible to use IAM Role-based access to the S3 bucket from an Amazon EC2 instance; to do that, just leave access_key_id and secret_access_key set to their default values (None), and ensure that the EC2 instance has an IAM role which provides sufficient permissions for the bucket and the operations necessary.

I have done this successfully once by setting values like this in jupyter_notebook_config.py:

from s3contents import S3ContentsManager
import os
c = get_config()
# Tell Jupyter to use S3ContentsManager for all storage.
c.NotebookApp.contents_manager_class = S3ContentsManager
c.NotebookApp.ip = '*'
# use S3 Contents Manager to store files
c.S3ContentsManager.access_key_id = None
c.S3ContentsManager.secret_access_key = None
c.S3ContentsManager.bucket = "somebucket"

And then I deployed the Jupyter notebook in EKS, and attached the S3FullAccess policy to the IAM role of the worker EC2 nodes. Everything worked well.
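For anyone debugging a setup like this, a stdlib-only sanity check (not from the original post; this just inspects one step of botocore's documented credential chain) can show whether any `AWS_*` environment variables are set in the environment the notebook actually runs in:

```python
import os

# botocore's default credential chain checks, in order: explicit keys,
# AWS_* environment variables, the shared credentials file
# (~/.aws/credentials), and finally the EC2 instance metadata service
# (the IAM-role path). This only inspects the environment-variable step.
aws_env = {k: v for k, v in os.environ.items() if k.startswith("AWS_")}
if aws_env:
    print("AWS_* variables present:", sorted(aws_env))
else:
    print("no AWS_* variables set; botocore will fall through to the "
          "credentials file and then the instance metadata service")
```

If nothing is printed for the first branch, the IAM role on the instance is the only remaining source of credentials.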

Now, I am trying to deploy JupyterHub using this: https://zero-to-jupyterhub.readthedocs.io/en/stable/

I did a similar thing: I created another jupyter_notebook_config.py with the same settings as above (access key and secret key set to None), built the Docker image, and pushed it to ECR.

I then ran a command like this:
helm upgrade $RELEASE jupyterhub/jupyterhub --namespace $NAMESPACE --version=0.8.2 --values config.yaml --debug

Where the config.yaml looks like this:

proxy:
  secretToken: "297df077562a5116d995d5bbaebbb91bb042f5f339022462d5acc5d0d5e5dda1"
singleuser:
  image:
    name: XXXXXXXX.dkr.ecr.us-XXXX-1.amazonaws.com/bulk_jupyter
    pullPolicy: "Always"
    tag: basenotebooknothing

(The basenotebooknothing image is the one I had built and pushed to ECR, with the access key and secret key set to None.)
The new notebook image cannot be spawned, and I get an error like this:

File "/opt/conda/bin/jupyterhub-singleuser", line 10, in <module>
File "/opt/conda/lib/python3.7/site-packages/jupyterhub/singleuser.py", line 660, in main
return SingleUserNotebookApp.launch_instance(argv)
File "/opt/conda/lib/python3.7/site-packages/jupyter_core/application.py", line 270, in launch_instance
return super(JupyterApp, cls).launch_instance(argv=argv, **kwargs)
File "/opt/conda/lib/python3.7/site-packages/traitlets/config/application.py", line 663, in launch_instance
File "/opt/conda/lib/python3.7/site-packages/jupyterhub/singleuser.py", line 558, in initialize
return super().initialize(argv)
File "", line 2, in initialize
File "/opt/conda/lib/python3.7/site-packages/traitlets/config/application.py", line 87, in catch_config_error
return method(app, *args, **kwargs)
File "/opt/conda/lib/python3.7/site-packages/notebook/notebookapp.py", line 1766, in initialize
File "/opt/conda/lib/python3.7/site-packages/notebook/notebookapp.py", line 1385, in init_configurables
File "/opt/conda/lib/python3.7/site-packages/s3contents/s3manager.py", line 61, in __init__
File "/opt/conda/lib/python3.7/site-packages/s3contents/s3_fs.py", line 94, in __init__
File "/opt/conda/lib/python3.7/site-packages/s3contents/s3_fs.py", line 98, in __init__
File "/opt/conda/lib/python3.7/site-packages/s3contents/s3_fs.py", line 166, in mkdir
File "/opt/conda/lib/python3.7/site-packages/s3fs/core.py", line 448, in touch
self._call_s3(self.s3.put_object, kwargs, Bucket=bucket, Key=key)
File "/opt/conda/lib/python3.7/site-packages/s3fs/core.py", line 184, in _call_s3
return method(**additional_kwargs)
File "/opt/conda/lib/python3.7/site-packages/botocore/client.py", line 316, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/opt/conda/lib/python3.7/site-packages/botocore/client.py", line 613, in _make_api_call
operation_model, request_dict, request_context)
File "/opt/conda/lib/python3.7/site-packages/botocore/client.py", line 632, in _make_request
return self._endpoint.make_request(operation_model, request_dict)
File "/opt/conda/lib/python3.7/site-packages/botocore/endpoint.py", line 102, in make_request
return self._send_request(request_dict, operation_model)
File "/opt/conda/lib/python3.7/site-packages/botocore/endpoint.py", line 132, in _send_request
request = self.create_request(request_dict, operation_model)
File "/opt/conda/lib/python3.7/site-packages/botocore/endpoint.py", line 116, in create_request
File "/opt/conda/lib/python3.7/site-packages/botocore/hooks.py", line 356, in emit
return self._emitter.emit(aliased_event_name, **kwargs)
File "/opt/conda/lib/python3.7/site-packages/botocore/hooks.py", line 228, in emit
return self._emit(event_name, kwargs)
File "/opt/conda/lib/python3.7/site-packages/botocore/hooks.py", line 211, in _emit
response = handler(**kwargs)
File "/opt/conda/lib/python3.7/site-packages/botocore/signers.py", line 90, in handler
return self.sign(operation_name, request)
File "/opt/conda/lib/python3.7/site-packages/botocore/signers.py", line 160, in sign
File "/opt/conda/lib/python3.7/site-packages/botocore/auth.py", line 357, in add_auth
raise NoCredentialsError
botocore.exceptions.NoCredentialsError: Unable to locate credentials

I have the S3FullAccess policy attached to the IAM role that is attached to my worker EC2 instances. Do you know why I am getting the above error? Thanks

I should mention that when I hardcode the access key, secret key, and session token in my jupyter_notebook_config.py, the S3 storage works. I don't want to do this, however; I want to use an IAM role.
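One assumption worth testing from inside a spawned single-user pod (not something confirmed in this thread): the IAM-role path only works if the container can reach the EC2 instance metadata service at 169.254.169.254. If that address is blocked or not routed from the pod, botocore falls through its whole credential chain and raises exactly this NoCredentialsError. A stdlib-only probe:

```python
import urllib.request

# If this endpoint is unreachable from the pod, the node's IAM role can
# never be picked up, regardless of which policies are attached to it.
URL = "http://169.254.169.254/latest/meta-data/iam/security-credentials/"
try:
    with urllib.request.urlopen(URL, timeout=2) as resp:
        print("metadata service reachable, role name:", resp.read().decode())
except Exception as exc:
    # Unreachable metadata from here means NoCredentialsError later in boto.
    print("metadata service not reachable:", exc)
```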


@nishi1288 did you find any solution for this?

hey @jonasneves, no I didn't… We've moved on to using EFS instead of S3 for storing notebook code; it seems to be easier for a large number of users
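For anyone landing here who still wants the IAM-role route on EKS, one possible alternative to the node's instance profile is IAM Roles for Service Accounts ("IRSA"): annotate a Kubernetes service account with a role ARN and point the single-user pods at it. This is a sketch under assumptions, not something tried in this thread; the role and service-account names below are hypothetical, and you would need a z2jh version that exposes `singleuser.serviceAccountName`:

```yaml
# config.yaml fragment (illustrative names only).
# The "notebook-s3" service account would be created separately,
# annotated with something like:
#   eks.amazonaws.com/role-arn: arn:aws:iam::<account-id>:role/notebook-s3-role
singleuser:
  serviceAccountName: notebook-s3
```

With this, credentials are scoped to the pod rather than the whole worker node, which also avoids depending on pods being able to reach the instance metadata service.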
