I can upgrade the kernel-spark-py kernelspec Docker image from Spark 3.2.2 to 3.2.4, but upgrading to any version >= 3.3.* fails with the error below:
++ id -u
+ myuid=1000
++ id -g
+ mygid=100
+ set +e
++ getent passwd 1000
+ uidentry=jovyan:x:1000:100::/home/jovyan:/bin/bash
+ set -e
+ '[' -z jovyan:x:1000:100::/home/jovyan:/bin/bash ']'
+ '[' -z /usr/lib/jvm/java ']'
+ SPARK_CLASSPATH=':/opt/spark/jars/*'
+ env
+ grep SPARK_JAVA_OPT_
+ sort -t_ -k4 -n
+ sed 's/[^=]*=\(.*\)/\1/g'
+ readarray -t SPARK_EXECUTOR_JAVA_OPTS
+ '[' -n '' ']'
+ '[' -z '' ']'
+ '[' -z '' ']'
+ '[' -n '' ']'
+ '[' -z '' ']'
+ '[' -z x ']'
+ SPARK_CLASSPATH='/opt/spark/conf::/opt/spark/jars/*'
+ case "$1" in
+ shift 1
+ CMD=("$SPARK_HOME/bin/spark-submit" --conf "spark.driver.bindAddress=$SPARK_DRIVER_BIND_ADDRESS" --deploy-mode client "$@")
+ exec /usr/bin/tini -g -- /opt/spark/bin/spark-submit --conf spark.driver.bindAddress=10.1.51.157 --deploy-mode client --properties-file /opt/spark/conf/spark.properties --class org.apache.spark.deploy.PythonRunner local:///usr/local/bin/kernel-launchers/python/scripts/launch_ipykernel.py --RemoteProcessProxy.kernel-id 1dc796da-10c1-4122-9865-3da2254d0b8d --RemoteProcessProxy.port-range 0..0 --RemoteProcessProxy.response-address 10.1.37.148:8877 --RemoteProcessProxy.public-key MIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKBgQDRg22pQLuU9vfR8/JNwMf1bCPLgkIycV1eEd2w/1p6qG7qEme/zFmyRGXMhhnCiUUvkSXMC+dI3Qa0LQ2LyjyOXc0nS1WGsfh6jNwcabHxdL1gr6GsQalX7JowjHO2oZ74DCQkVlIU1dFRvjs6ObEjSTrn4CdmDYmH2uxWL1lplQIDAQAB --RemoteProcessProxy.spark-context-initialization-mode lazy
24/02/02 12:41:58 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
[D 2024-02-02 12:41:58,779.779 launch_ipykernel] Using connection file '/tmp/kernel-1dc796da-10c1-4122-9865-3da2254d0b8d_1c5mw58g.json'.
[I 2024-02-02 12:41:58,780.780 launch_ipykernel] Signal socket bound to host: 0.0.0.0, port: 59075
[D 2024-02-02 12:41:58,780.780 launch_ipykernel] JSON Payload 'b'{"shell_port": 45637, "iopub_port": 56141, "stdin_port": 45915, "control_port": 41093, "hb_port": 56491, "ip": "0.0.0.0", "key": "20fd6bb2-43da-4247-afef-a28201464eaa", "transport": "tcp", "signature_scheme": "hmac-sha256", "kernel_name": "", "pid": 60, "pgid": 14, "comm_port": 59075, "kernel_id": "1dc796da-10c1-4122-9865-3da2254d0b8d"}''
[D 2024-02-02 12:41:58,793.793 launch_ipykernel] Encrypted Payload 'b'eyJ2ZXJzaW9uIjogMSwgImtleSI6ICJHNVBZeUcyMUljd2FIa1NBQm1JeERDRlM2QStQTUtHajZtM0YxZ1dJUCtaNmVibzl0L2lLUXQ2OVBEaFdTZUhJeUFpRkJWbnBGNGRXL2dwSlZYdDg3eGhCd2U0ZDY5ODBzOWs4eGl4cnYvOVJZZ2h2NTBReGNmUXdwMXIrWldsSmMweWg2WkFPY3psaXIwc0w1T2RiQUdDUGJFK1N4QUVjVFU0cmRReTN3d0U9IiwgImNvbm5faW5mbyI6ICJpdEc1VitRZExyMUVqNGlOSmxFamhBUmIrNlE4Y3cxaG80ZG1vMFdxMU1xbnJ6NDNVaE9tUkxSQ2FEdGlMUHFYVUVQdjVNVk92YldnYzVtbXQrZm0rNzFWelNvdTl3Z0RPcEdmZEZSamo0SUtWOEZHOU1XUzVGRDlIVG9vZUVMS09oQjA2dTJZK0dzQU9lSmZheWFBVUxLZmhkNjdldVJnRnl5ZFMrQ3pMUGlKL2xiUWVXVVFxd3hXMTBLRk5UbUNpR2ZzTnZBWERjNFRYeWozZm04VVpQbStJdFEzQWwvQlYrSnV2dFVRQ2dHbWtMUjc2a2FhVXRuMWN0YlNNL0xBckJnYnZ6bkJIN2NXdmVibU9KTFdLanEybE1YYURVRWJPY2gwcGQ1OHlJSncvWFN3NXVwemxtaUFNUVFleE5QMVRjMUEzMWQrODE3NTZvZjY5K0VxZ1pIMU81VjNuMmhXZHVXc1ZpanlkcVI4T2JMS3BkZlVVU0lydENMOENHbklHOEdhNHdUQ2I1eUhZdnMzaitNUXpBbGF2MWI4TGVLTURKYThmSWFlc1lreHd3Sys0elJac1JMQUxKL2JmUEhiVGtNYVUrNW9qcE1IaWNFK3FscGpKdz09In0=''
24/02/02 12:41:59 INFO SparkContext: Running Spark version 3.3.0
24/02/02 12:41:59 INFO ResourceUtils: ==============================================================
24/02/02 12:41:59 INFO ResourceUtils: No custom resources configured for spark.driver.
24/02/02 12:41:59 INFO ResourceUtils: ==============================================================
24/02/02 12:41:59 INFO SparkContext: Submitted application: jovyan-1dc796da-10c1-4122-9865-3da2254d0b8d
24/02/02 12:41:59 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: , offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
24/02/02 12:41:59 INFO ResourceProfile: Limiting resource is cpus at 1 tasks per executor
24/02/02 12:41:59 INFO ResourceProfileManager: Added ResourceProfile id: 0
24/02/02 12:41:59 INFO SecurityManager: Changing view acls to: jovyan
24/02/02 12:41:59 INFO SecurityManager: Changing modify acls to: jovyan
24/02/02 12:41:59 INFO SecurityManager: Changing view acls groups to:
24/02/02 12:41:59 INFO SecurityManager: Changing modify acls groups to:
24/02/02 12:41:59 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(jovyan); groups with view permissions: Set(); users with modify permissions: Set(jovyan); groups with modify permissions: Set()
24/02/02 12:42:00 INFO Utils: Successfully started service 'sparkDriver' on port 7078.
24/02/02 12:42:00 INFO SparkEnv: Registering MapOutputTracker
24/02/02 12:42:00 INFO SparkEnv: Registering BlockManagerMaster
24/02/02 12:42:00 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
24/02/02 12:42:00 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
24/02/02 12:42:00 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
24/02/02 12:42:00 INFO DiskBlockManager: Created local directory at /var/data/spark-1e0ad5fa-5cab-4ebe-b86e-0df84656b14e/blockmgr-5b0bc958-c7e8-4755-aeb0-7f1ce0d39e7b
24/02/02 12:42:00 INFO MemoryStore: MemoryStore started with capacity 413.9 MiB
24/02/02 12:42:00 INFO SparkEnv: Registering OutputCommitCoordinator
24/02/02 12:42:00 INFO Utils: Successfully started service 'SparkUI' on port 4040.
24/02/02 12:42:00 INFO SparkKubernetesClientFactory: Auto-configuring K8S client using current context from users K8S config file
24/02/02 12:42:02 INFO ExecutorPodsAllocator: Going to request 2 executors from Kubernetes for ResourceProfile Id: 0, target: 2, known: 0, sharedSlotFromPendingPods: 2147483647.
24/02/02 12:42:02 ERROR ExecutorPodsSnapshotsStoreImpl: Going to stop due to IllegalArgumentException
java.lang.IllegalArgumentException: 'jovyan-1dc796da-10c1-4122-9865-3da2254d0b8d-fc982a8d69d71956' in spark.kubernetes.executor.podNamePrefix is invalid. must conform https://kubernetes.io/docs/concepts/overview/working-with-objects/names and the value length <= 47
	at org.apache.spark.internal.config.TypedConfigBuilder.$anonfun$checkValue$1(ConfigBuilder.scala:108)
	at org.apache.spark.internal.config.TypedConfigBuilder.$anonfun$transform$1(ConfigBuilder.scala:101)
	at scala.Option.map(Option.scala:230)
	at org.apache.spark.internal.config.OptionalConfigEntry.readFrom(ConfigEntry.scala:239)
	at org.apache.spark.internal.config.OptionalConfigEntry.readFrom(ConfigEntry.scala:214)
	at org.apache.spark.SparkConf.get(SparkConf.scala:261)
	at org.apache.spark.deploy.k8s.KubernetesConf.get(KubernetesConf.scala:70)
	at org.apache.spark.deploy.k8s.KubernetesExecutorConf.<init>(KubernetesConf.scala:156)
	at org.apache.spark.deploy.k8s.KubernetesConf$.createExecutorConf(KubernetesConf.scala:246)
	at org.apache.spark.scheduler.cluster.k8s.ExecutorPodsAllocator.$anonfun$requestNewExecutors$1(ExecutorPodsAllocator.scala:392)
	at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:158)
	at org.apache.spark.scheduler.cluster.k8s.ExecutorPodsAllocator.requestNewExecutors(ExecutorPodsAllocator.scala:385)
	at org.apache.spark.scheduler.cluster.k8s.ExecutorPodsAllocator.$anonfun$onNewSnapshots$35(ExecutorPodsAllocator.scala:349)
	at org.apache.spark.scheduler.cluster.k8s.ExecutorPodsAllocator.$anonfun$onNewSnapshots$35$adapted(ExecutorPodsAllocator.scala:342)
	at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
	at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
	at org.apache.spark.scheduler.cluster.k8s.ExecutorPodsAllocator.onNewSnapshots(ExecutorPodsAllocator.scala:342)
	at org.apache.spark.scheduler.cluster.k8s.ExecutorPodsAllocator.$anonfun$start$3(ExecutorPodsAllocator.scala:120)
	at org.apache.spark.scheduler.cluster.k8s.ExecutorPodsAllocator.$anonfun$start$3$adapted(ExecutorPodsAllocator.scala:120)
	at org.apache.spark.scheduler.cluster.k8s.ExecutorPodsSnapshotsStoreImpl$SnapshotsSubscriber.org$apache$spark$scheduler$cluster$k8s$ExecutorPodsSnapshotsStoreImpl$SnapshotsSubscriber$$processSnapshotsInternal(ExecutorPodsSnapshotsStoreImpl.scala:138)
	at org.apache.spark.scheduler.cluster.k8s.ExecutorPodsSnapshotsStoreImpl$SnapshotsSubscriber.processSnapshots(ExecutorPodsSnapshotsStoreImpl.scala:126)
	at org.apache.spark.scheduler.cluster.k8s.ExecutorPodsSnapshotsStoreImpl.$anonfun$addSubscriber$1(ExecutorPodsSnapshotsStoreImpl.scala:81)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)
24/02/02 12:42:02 INFO DiskBlockManager: Shutdown hook called
24/02/02 12:42:02 INFO ShutdownHookManager: Shutdown hook called
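My reading of the failure, for context: the exception comes from a validation that Spark 3.3+ applies to spark.kubernetes.executor.podNamePrefix (it must look like a Kubernetes DNS label and be at most 47 characters, leaving room for the executor suffix within the 63-character pod-name limit), and the prefix the gateway generates here ("jovyan-" + kernel id + launch suffix) is 60 characters. The sketch below reproduces that check so the prefix can be tested before launch; the function name and the exact regex are my own approximation of Spark's check, not Spark's API.

```python
import re

# Approximation of the check Spark 3.3+ runs on
# spark.kubernetes.executor.podNamePrefix (names here are mine, not Spark's):
# a lowercase DNS-label-style string, at most 47 chars so that the
# "-exec-<n>" executor suffix still fits Kubernetes' 63-char name limit.
DNS_LABEL = re.compile(r"[a-z]([-a-z0-9]*[a-z0-9])?")
MAX_PREFIX_LEN = 47

def is_valid_pod_name_prefix(prefix: str) -> bool:
    """Return True if the prefix would pass Spark 3.3+'s validation."""
    return len(prefix) <= MAX_PREFIX_LEN and DNS_LABEL.fullmatch(prefix) is not None

# The prefix from the log above: 60 characters, so it is rejected.
prefix = "jovyan-1dc796da-10c1-4122-9865-3da2254d0b8d-fc982a8d69d71956"
print(len(prefix), is_valid_pod_name_prefix(prefix))
```

If this is the cause, shortening whatever feeds the prefix (or setting spark.kubernetes.executor.podNamePrefix explicitly to a value of 47 characters or fewer) should let the 3.3+ image start.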