Real-time collaboration for JupyterHub

Hi there,
I’ve been experimenting with the RTC extension on JupyterHub recently, and I’ve been stuck on the creating-collaboration-accounts step for a while.

The docs mention that "This configuration code runs when jupyterhub starts up", so I included the example code block under hub.extraConfig.'myConfig.py', but then I got this error: TypeError: 'LazyConfigValue' object is not subscriptable. I suspect I didn’t put the collab-user creation code in the correct place?
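A minimal reproduction of the error outside the Hub, assuming it comes from traitlets’ lazy config placeholders (the option names here just mirror my config and are illustrative):

```python
from traitlets.config import Config

c = Config()

# Reading an as-yet-unset option on a traitlets Config returns a
# LazyConfigValue placeholder rather than a real dict or list.
lazy = c.JupyterHub.load_groups
print(type(lazy).__name__)  # LazyConfigValue

# The placeholder supports deferred methods like append()/extend(),
# but not subscripting -- which is exactly what my loop does:
try:
    lazy["collaborative"].append("some_collab_user")
except TypeError as e:
    print(e)  # 'LazyConfigValue' object is not subscriptable
```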


Traceback:

[D 2023-11-08 07:54:45.453 JupyterHub application:905] Looking for /usr/local/etc/jupyterhub/jupyterhub_config in /srv/jupyterhub
Loading /usr/local/etc/jupyterhub/secret/values.yaml
No config at /usr/local/etc/jupyterhub/existing-secret/values.yaml
Loading extra config: myConfig.py
[E 2023-11-08 07:54:45.934 JupyterHub app:3382]
    Traceback (most recent call last):
      File "/usr/local/lib/python3.11/site-packages/jupyterhub/app.py", line 3379, in launch_instance_async
        await self.initialize(argv)
      File "/usr/local/lib/python3.11/site-packages/jupyterhub/app.py", line 2857, in initialize
        self.load_config_file(self.config_file)
      File "/usr/local/lib/python3.11/site-packages/traitlets/config/application.py", line 113, in inner
        return method(app, *args, **kwargs)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      File "/usr/local/lib/python3.11/site-packages/traitlets/config/application.py", line 953, in load_config_file
        for config, fname in self._load_config_files(
      File "/usr/local/lib/python3.11/site-packages/traitlets/config/application.py", line 912, in _load_config_files
        config = loader.load_config()
                 ^^^^^^^^^^^^^^^^^^^^
      File "/usr/local/lib/python3.11/site-packages/traitlets/config/loader.py", line 626, in load_config
        self._read_file_as_dict()
      File "/usr/local/lib/python3.11/site-packages/traitlets/config/loader.py", line 659, in _read_file_as_dict
        exec(compile(f.read(), conf_filename, "exec"), namespace, namespace)  # noqa
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      File "/usr/local/etc/jupyterhub/jupyterhub_config.py", line 497, in <module>
        exec(config_py)
      File "<string>", line 63, in <module>
    TypeError: 'LazyConfigValue' object is not subscriptable
    
[D 2023-11-08 07:54:45.937 JupyterHub application:1031] Exiting application: jupyterhub

Deployment Info:

  • Kubernetes 1.27
  • JupyterHub Helm chart 3.1.0 + singleuser 4.0.2

Can you share your full configuration?

proxy:
  service:
    type: NodePort
ingress:
  enabled: true
  annotations:
    kubernetes.io/ingress.class: nginx
    nginx.ingress.kubernetes.io/proxy-body-size: 512m
    cert-manager.io/cluster-issuer: letsencrypt-prod
  hosts:
    - xxxxxxxxxx
  tls:
    - hosts:
      - xxxxxxxxxx
      secretName: xxxxxxxxxx
hub:
  networkPolicy:
    egress:
      - ports:
          - port: 6443
          - port: 443
  config:
    GenericOAuthenticator:
      client_id: xxxxxxxxxx
      client_secret: xxxxxxxxxx
      oauth_callback_url: xxxxxxxxxx
      authorize_url: xxxxxxxxxx
      token_url: xxxxxxxxxx
      userdata_url: xxxxxxxxxx
      login_service: xxxxxxxxxx
      username_claim: preferred_username
      enable_auth_state: true
      scope: 
        - openid 
      claim_groups_key: groups
      admin_groups:
        - xxxxxxxxxx
    JupyterHub:
      authenticator_class: generic-oauth
  templatePaths:
    - /tmp/jupyterhub/custom/templates
  extraConfig: 
    myConfig.py: |
      import z2jh
      from pathlib import Path

      ## Create a profile for a singleuser server, reduce redundant config
      def make_profile(display_name, description, environment: dict,
                       image="xxxxxxxxxx",
                       cpu_limit=1, cpu_guarantee=0.5, mem_limit="1G", mem_guarantee="500M",
                       is_default=False):
        profile = {
          'display_name': display_name,
          'description': description,
          'default': is_default,
          'kubespawner_override': {
            'image': image,
            'default_url': '/lab',
            'cpu_limit': cpu_limit,
            'cpu_guarantee': cpu_guarantee,
            'mem_limit': mem_limit,
            'mem_guarantee': mem_guarantee,
            'environment': environment,
          }
        }
        return profile

      ## Make profile map
      profile_map = {
          'oauth_group_name': make_profile(
            display_name='Collaboration',
            description='Collaboration Environment',
            environment={
              xxxxxxxxxx
            }
          )
      }
      # Show profiles based on groups
      async def custom_options_form(spawner):
          ## Get auth state
          auth_state = await spawner.user.get_auth_state()
          # spawner.log.debug(auth_state)
          ## Get groups
          groups = auth_state["oauth_user"]["groups"]
          ## Show profiles based on groups
          if "xxxxxxxxxx" in groups:
              spawner.profile_list = list(profile_map.values())
          else:
              spawner.profile_list = [profile_map[k] for k in profile_map if k in groups]
          if not spawner.profile_list:
              raise PermissionError("You don't have access to any profiles")
          return spawner._options_form_default()
      # Set profile options
      c.KubeSpawner.options_form = custom_options_form

      ## Create collaboration users
      for profile_key in profile_map.keys():
        # define a new user for the collaboration
        collab_user = f"{profile_key}_collab"
        # add the collab user to the 'collaborative' group
        # so we can identify it as a collab account
        c.JupyterHub.load_groups["collaborative"].append(collab_user)

        # finally, grant members of the project collaboration group
        # access to the collab user's server,
        # and the admin UI so they can start/stop the server
        c.JupyterHub.load_roles.append(
            {
                "name": f"collab-access-{profile_key}",
                "scopes": [
                    f"access:servers!user={collab_user}",
                    f"admin:servers!user={collab_user}",
                    "admin-ui",
                    f"list:users!user={collab_user}",
                ],
                "groups": [profile_key],
            }
        )
      # def pre_spawn_hook(spawner):
      #   group_names = {group.name for group in spawner.user.groups}
      #   if "collaborative" in group_names:
      #     spawner.log.info(f"Enabling RTC for user {spawner.user.name}")
      #     spawner.args.append("--LabApp.collaborative=True")
      # c.KubeSpawner.pre_spawn_hook = pre_spawn_hook


      # Add login page to template directory
      login_page = Path('/tmp/jupyterhub/custom/templates/login.html')
      login_page.parent.mkdir(exist_ok=True, parents=True)
      login_page.write_text(z2jh.get_config('custom.login_page'))

      # Extra config
      c.KubeSpawner.start_timeout = 30000
      c.KubeSpawner.http_timeout = 30000

singleuser:
  storage:
    type: none
  image:
    pullPolicy: Always
    pullSecrets: 
      - "xxxxxxxxxx"
  startTimeout: 300
  ## Execute the before_start_hook script baked into the prebuilt image, following
  ## https://jupyter-docker-stacks.readthedocs.io/en/latest/using/common.html#startup-hooks
  ## (the JupyterHub k8s chart doesn't run it by default; JupyterHub on Docker works fine)
  lifecycleHooks:
    postStart:
      exec:
        command:
          - "/bin/bash"
          - "-c"
          - |
            BEFORE_START_HOOK=/usr/local/bin/before-notebook.d/before_start_hook.sh
            [ -x "$BEFORE_START_HOOK" ] && bash "$BEFORE_START_HOOK"

scheduling:
  podPriority:
    enabled: true
  userPlaceholder:
    enabled: true
    replicas: 5
    # replicas: 20
  userPods:
    nodeAffinity:
      matchNodePurpose: require
  userScheduler:
    enabled: true

cull:
  enabled: true
  timeout: 3600
  every: 300
  maxAge: 43200

debug:
  enabled: true

You’re missing the initialisation of the dict and list properties: see the "First, we are going to prepare to define the roles and groups:" code block in the creating-collaboration-accounts docs.
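Concretely, a sketch of that initialisation to place near the top of myConfig.py, before the for loop (following the docs' example; your existing loop then works unchanged):

```python
# Initialise the containers explicitly. Until they are assigned, the
# traitlets config object hands back LazyConfigValue placeholders, and
# subscripting one (load_groups["collaborative"]) raises the TypeError.
c.JupyterHub.load_groups = {
    # collaboration accounts will be appended to this group
    "collaborative": [],
}
c.JupyterHub.load_roles = []
```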


Ah, I thought it was a default config object. Thanks, it works now!