JupyterHub helm upgrade fails when using the official notebook image jupyter/datascience-notebook:ubuntu-20.04

When I upgrade via the helm command:

helm upgrade jhub --namespace jupns --values config10.yaml --debug /root/jupyterhub --timeout 1200s

it fails. I use multiple notebook profiles: with two profiles, the official jupyter r-notebook and scipy images, everything works correctly, but when I add the datascience-notebook image as a third option it fails. It even fails with the datascience-notebook as the first option.

This is the failure log:
client.go:299: [debug] Starting delete for "hook-image-puller" DaemonSet
client.go:128: [debug] creating 1 resource(s)
client.go:299: [debug] Starting delete for "hook-image-awaiter" ServiceAccount
client.go:128: [debug] creating 1 resource(s)
client.go:299: [debug] Starting delete for "hook-image-awaiter" Role
client.go:128: [debug] creating 1 resource(s)
client.go:299: [debug] Starting delete for "hook-image-awaiter" RoleBinding
client.go:128: [debug] creating 1 resource(s)
client.go:299: [debug] Starting delete for "hook-image-awaiter" Job
client.go:128: [debug] creating 1 resource(s)
client.go:528: [debug] Watching for changes to Job hook-image-awaiter with timeout of 5m0s
client.go:556: [debug] Add/Modify event for hook-image-awaiter: ADDED
client.go:595: [debug] hook-image-awaiter: Jobs active: 0, jobs failed: 0, jobs succeeded: 0
client.go:556: [debug] Add/Modify event for hook-image-awaiter: MODIFIED
client.go:595: [debug] hook-image-awaiter: Jobs active: 1, jobs failed: 0, jobs succeeded: 0
upgrade.go:430: [debug] warning: Upgrade "jhub" failed: pre-upgrade hooks failed: timed out waiting for the condition
Error: UPGRADE FAILED: pre-upgrade hooks failed: timed out waiting for the condition
helm.go:88: [debug] pre-upgrade hooks failed: timed out waiting for the condition
UPGRADE FAILED
main.newUpgradeCmd.func2
helm.sh/helm/v3/cmd/helm/upgrade.go:202
github.com/spf13/cobra.(*Command).execute
github.com/spf13/cobra@v1.2.1/command.go:856
github.com/spf13/cobra.(*Command).ExecuteC
github.com/spf13/cobra@v1.2.1/command.go:974
github.com/spf13/cobra.(*Command).Execute
github.com/spf13/cobra@v1.2.1/command.go:902
main.main
helm.sh/helm/v3/cmd/helm/helm.go:87
runtime.main
runtime/proc.go:225
runtime.goexit
runtime/asm_amd64.s:1371
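The job that times out here is the chart's pre-puller hook (the hook-image-awaiter waits for the hook-image-puller DaemonSet to finish pulling images). For reference, in environments where this hook can never succeed, the zero-to-jupyterhub-k8s chart also exposes a values-file switch to disable it entirely (key shown as documented for chart 1.x; verify against your chart version):

```yaml
prePuller:
  hook:
    enabled: false
```

With the hook disabled, user images are pulled on first spawn instead of during the upgrade, so the first server start per node can be slow.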


My config10.yaml file content:

proxy:
  secretToken: "254334e575a7d8ebf8ac36581a17a3580cda1d362fdf668cf4e9204683c17c1d"
  https:
    enabled: true
    hosts:
      - kavosh..
    type: secret
    secret:
      name: "--tls"
singleuser:
  image:
    name: library/scipyodbc
    tag: "1.0.0"
  profileList:
    - display_name: "Python3"
      description: ""
      default: true
    - display_name: "R"
      description: ""
      kubespawner_override:
        image: jupyter/r-notebook:r-4.1.2
    - display_name: "Julia"
      description: "**"
      kubespawner_override:
        image: jupyter/datascience-notebook:ubuntu-20.04
  defaultUrl: "/lab"
  storage:
    dynamic:
      storageClass: storage-hdd
  cpu:
    limit: 3
    guarantee: 2
  memory:
    limit: 10G
    guarantee: 10G
hub:
  extraConfig:
    templates: |
      ## Configure JupyterHub to look for the templates we inject with a ConfigMap
      c.JupyterHub.template_paths.insert(0, "/etc/jupyterhub/templates")
      ##c.JupyterHub.logo_file = "/usr/local/share/jupyterhub/static/logo.png"
  extraVolumes:
    - name: hub-templates
      configMap:
        name: hub-templates
    - name: hub-external
      configMap:
        name: hub-external
  extraVolumeMounts:
    - mountPath: /etc/jupyterhub/templates
      name: hub-templates
    - mountPath: /usr/local/share/jupyterhub/static/images
      name: hub-external
  config:
    JupyterHub:
      admin_access: True
      admin_users:
        - saljooghi
      authenticator_class: ldapauthenticator.LDAPAuthenticator
    LDAPAuthenticator:
      escape_userdn: false
      lookup_dn: true
      lookup_dn_search_filter: ({login_attr}={login})
      lookup_dn_search_password: *******
      lookup_dn_search_user: Jupyter@.
      lookup_dn_user_dn_attribute: cn
      server_address: 192.168.41.201
      user_attribute: sAMAccountName
      user_search_base: OU=Users,OU=,DC=,DC=*


Found the solution. During the helm upgrade, the hook pods look for this image: jupyterhub/k8s-image-awaiter:1.2.0. Since my server is isolated and has no internet connection, the upgrade fails. So I pulled the image with docker on a connected machine, saved it as a tar file, and imported it into the MicroK8s local image cache.
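The steps above can be sketched as follows (the image name and tag are assumed to match what chart version 1.2.0 requests; check the failing pod's events for the exact reference your cluster asks for):

```shell
# On an internet-connected machine: pull the hook image and save it to a tarball
docker pull jupyterhub/k8s-image-awaiter:1.2.0
docker save jupyterhub/k8s-image-awaiter:1.2.0 -o k8s-image-awaiter-1.2.0.tar

# Copy the tarball to the isolated server (e.g. with scp), then import it
# into MicroK8s' local containerd image cache:
microk8s ctr image import k8s-image-awaiter-1.2.0.tar

# Verify the image is now present in the cache
microk8s ctr images ls | grep image-awaiter
```

The same pull/save/import cycle applies to the user notebook images (e.g. jupyter/datascience-notebook:ubuntu-20.04) if the image-puller also needs them on the air-gapped nodes.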

Hi, can you please document the steps for how you did it?