Not able to mount S3 buckets in JupyterLab

Hello, I have a JupyterLab Docker image which I use to run an instance in my OpenShift cluster.
I have modified the image by adding some custom source code that mounts S3 buckets created with an S3-compatible storage (NooBaa) running in the same OpenShift cluster.
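For reference, the contents-manager wiring in my custom code looks roughly like the sketch below. The exact classes are inferred from the traceback (hybridcontents delegating to s3contents), and the bucket name, endpoint URL, and credentials are placeholders:

```python
# jupyter_server_config.py -- sketch only; all values below are placeholders
from hybridcontents import HybridContentsManager
from jupyter_server.services.contents.filemanager import FileContentsManager
from s3contents import S3ContentsManager

c.ServerApp.contents_manager_class = HybridContentsManager
c.HybridContentsManager.manager_classes = {
    # Root of the file tree stays on local disk
    "": FileContentsManager,
    # Everything under /s3 is served from the NooBaa bucket
    "s3": S3ContentsManager,
}
c.HybridContentsManager.manager_kwargs = {
    "": {},
    "s3": {
        "access_key_id": "<noobaa-access-key>",
        "secret_access_key": "<noobaa-secret-key>",
        "endpoint_url": "<noobaa-s3-endpoint>",
        "bucket": "<bucket-name>",
    },
}
```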

However, I get the following error when running the image:

    Traceback (most recent call last):
      File "/opt/conda/lib/python3.9/site-packages/traitlets/", line 645, in get
        value = obj._trait_values[]
    KeyError: 'managers'

    During handling of the above exception, another exception occurred:

    Traceback (most recent call last):
      File "/opt/conda/lib/python3.9/site-packages/tornado/", line 1704, in _execute
        result = await result
      File "/opt/conda/lib/python3.9/site-packages/jupyter_server/services/contents/", line 112, in get
        if await ensure_async(cm.is_hidden(path)) and not cm.allow_hidden:
      File "/opt/conda/lib/python3.9/site-packages/hybridcontents/", line 86, in _wrapper
        prefix, mgr, mgr_path = _resolve_path(path, self.managers)
      File "/opt/conda/lib/python3.9/site-packages/traitlets/", line 686, in __get__
        return self.get(obj, cls)
      File "/opt/conda/lib/python3.9/site-packages/traitlets/", line 648, in get
        default = obj.trait_defaults(
      File "/opt/conda/lib/python3.9/site-packages/traitlets/", line 1752, in trait_defaults
        return self._get_trait_default_generator(names[0])(self)
      File "/opt/conda/lib/python3.9/site-packages/hybridcontents/", line 175, in _managers_default
        return {
      File "/opt/conda/lib/python3.9/site-packages/hybridcontents/", line 176, in <dictcomp>
        key: mgr_cls(parent=self,
      File "/opt/conda/lib/python3.9/site-packages/s3contents/", line 75, in __init__
        self._fs = S3FS(
      File "/opt/conda/lib/python3.9/site-packages/s3contents/", line 116, in __init__
      File "/opt/conda/lib/python3.9/site-packages/s3contents/", line 120, in init
      File "/opt/conda/lib/python3.9/site-packages/s3contents/", line 202, in mkdir
      File "/opt/conda/lib/python3.9/site-packages/fsspec/", line 86, in wrapper
        if loop[0] is None:
      File "/opt/conda/lib/python3.9/site-packages/fsspec/", line 66, in sync
        as an attribute of the instance.
      File "/opt/conda/lib/python3.9/site-packages/fsspec/", line 26, in _runner
      File "/opt/conda/lib/python3.9/site-packages/s3fs/", line 924, in _touch
        "size": out["ContentLength"],
      File "/opt/conda/lib/python3.9/site-packages/s3fs/", line 325, in _call_s3
      File "/opt/conda/lib/python3.9/site-packages/s3fs/", line 473, in set_session
        if fill_cache is None:
      File "/opt/conda/lib/python3.9/site-packages/aiobotocore/", line 37, in __aenter__
        self._client = await self._coro
      File "/opt/conda/lib/python3.9/site-packages/aiobotocore/", line 121, in _create_client
        client = await client_creator.create_client(
      File "/opt/conda/lib/python3.9/site-packages/aiobotocore/", line 46, in create_client
        return service_client
    AttributeError: 'AioClientCreator' object has no attribute '_register_lazy_block_unknown_fips_pseudo_regions'
[W 2022-07-11 14:23:24.787 ServerApp] Unhandled error

I cannot tell whether this error is caused by a dependency conflict or by something else breaking.
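In case it is relevant, here is a small snippet I could run inside the image to list the installed versions of the packages that appear in the traceback (package names are assumed from the import paths shown above):

```python
# Print installed versions of the packages implicated in the traceback;
# aiobotocore in particular pins a narrow botocore version range, so a
# mismatch there is a common source of AttributeErrors like the one above.
from importlib import metadata

for pkg in ("s3contents", "s3fs", "fsspec", "aiobotocore", "botocore"):
    try:
        print(pkg, metadata.version(pkg))
    except metadata.PackageNotFoundError:
        print(pkg, "not installed")
```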

Could you please help me resolve this issue?