Number of users that can be served on a Linux virtual machine

Hi, leads,

I have an Ubuntu 20.04.4 LTS (GNU/Linux 5.4.0-122-generic x86_64) system with 24.05 GB of memory. It has a single-core Intel Core processor (Haswell, no TSX, IBRS) running at roughly 2.0 GHz.

The workload runs in a Docker container; below is the memory and CPU usage for a single user, as reported by docker stats:
CONTAINER CPU % MEM USAGE / LIMIT
d55d80938ebf 2.07% 176.9MiB / 7.771GiB

I want to know roughly how much memory and CPU 100, 200, or 500 simultaneous users would consume with this configuration.

PS: I am running a deep learning task on this.

If anybody has experience with a similar configuration, I would like to know an estimate of the CPU and Memory usage.

Do you really mean that you have only one core? The fewest cores you can get on a Haswell chip is 2, I think.

JupyterHub itself doesn’t use a lot of resources. Once you have 100 or more users, nearly all of the resource consumption comes from the users’ own workloads, so it’s entirely up to you and your users how many resources to allocate to them. Deep learning tasks are pretty intense, so I think you are going to have a hard time hosting more than a few users on a single core, though I also don’t really understand how you end up with only one core these days.

I’d reserve roughly 256 MB of RAM and 0.2 CPU for JupyterHub overhead. The rest you can only determine from the workload your users are actually going to run: actively training a model can easily take many GB of RAM and all the CPU you have available.
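As a back-of-envelope sketch (not a benchmark), you can scale the single-user docker stats numbers linearly and add the hub overhead. The per-user figures below are taken from the measurement in the original post, and linear scaling is an assumption; active deep learning training will far exceed these idle numbers.

```python
# Back-of-envelope capacity estimate: scale single-user usage linearly
# and add a fixed JupyterHub overhead. ASSUMPTION: every user matches
# the idle single-user measurement (176.9 MiB, 2.07% of one core),
# which a real training job will far exceed.

PER_USER_MEM_MIB = 176.9   # from the single-user docker stats above
PER_USER_CPU = 0.0207      # 2.07% of one core
HUB_MEM_MIB = 256.0        # suggested JupyterHub overhead
HUB_CPU = 0.2

def estimate(n_users):
    """Return (memory in GiB, CPU cores) needed for n_users."""
    mem_gib = (HUB_MEM_MIB + n_users * PER_USER_MEM_MIB) / 1024
    cpus = HUB_CPU + n_users * PER_USER_CPU
    return mem_gib, cpus

for n in (100, 200, 500):
    mem, cpu = estimate(n)
    print(f"{n:>3} users: ~{mem:.1f} GiB RAM, ~{cpu:.1f} CPU cores")
```

Even with these idle per-user numbers, 200 users need around 35 GiB, which is more than the VM's 24 GB, and 100 users need more than two full cores, so the single core is the first bottleneck.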
