I am running JupyterHub on Kubernetes (version 3.3.7) behind an nginx reverse proxy. I only want users to be able to upload files smaller than 10 MB, so I added "client_max_body_size 10M" to nginx.conf. However, this does not prevent users from uploading files larger than 10 MB.
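For reference, the directive is set roughly like this (a simplified excerpt of my nginx.conf; the actual proxy settings are omitted):

http {
    # reject request bodies larger than 10 MB
    client_max_body_size 10M;

    server {
        listen 80;
        location / {
            # proxy_pass to JupyterHub, headers, etc. omitted
        }
    }
}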
From the browser debug console, I can see that the file is split into 1 MB chunks before being sent to the server, which I suspect is why nginx does not block the upload.
I found what looks like the relevant code in JupyterLab, in packages/filebrowser/src/model.ts (presumably the same code path hit when uploading through JupyterHub):
export const CHUNK_SIZE = 1024 * 1024;

async upload(file: File): Promise<Contents.IModel> {
  // We do not support Jupyter Notebook version less than 4, and Jupyter
  // Server advertises itself as version 1 and supports chunked
  // uploading. We assume any version less than 4.0.0 to be Jupyter Server
  // instead of Jupyter Notebook.
  const serverVersion = PageConfig.getNotebookVersion();
  const supportsChunked =
    serverVersion < [4, 0, 0] /* Jupyter Server */ ||
    serverVersion >= [5, 1, 0]; /* Jupyter Notebook >= 5.1.0 */
  const largeFile = file.size > LARGE_FILE_SIZE;

  if (largeFile && !supportsChunked) {
    const msg = this._trans.__(
      'Cannot upload file (>%1 MB). %2',
      LARGE_FILE_SIZE / (1024 * 1024),
      file.name
    );
    console.warn(msg);
    throw msg;
  }

  const err = 'File not uploaded';
  if (largeFile && !(await this._shouldUploadLarge(file))) {
    throw 'Cancelled large file upload';
  }
  await this._uploadCheckDisposed();
  await this.refresh();
  await this._uploadCheckDisposed();
  if (
    this._items.find(i => i.name === file.name) &&
    !(await shouldOverwrite(file.name))
  ) {
    throw err;
  }
  await this._uploadCheckDisposed();
  const chunkedUpload = supportsChunked && file.size > CHUNK_SIZE;
  return await this._upload(file, chunkedUpload);
}
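If I understand the chunking correctly, each chunk ends up as its own PUT request to the Jupyter contents API, so nginx only ever sees request bodies of roughly 1 MB and the 10 MB limit never triggers. Below is a rough sketch of what I think happens on the wire (not the actual JupyterLab code; the fetch call, URL handling, and authentication are simplified assumptions):

const CHUNK_SIZE = 1024 * 1024; // 1 MB, matching the constant above

async function uploadInChunks(baseUrl: string, path: string, file: File): Promise<void> {
  const totalChunks = Math.ceil(file.size / CHUNK_SIZE);
  for (let i = 0; i < totalChunks; i++) {
    const isLast = i === totalChunks - 1;
    const slice = file.slice(i * CHUNK_SIZE, (i + 1) * CHUNK_SIZE);
    const bytes = new Uint8Array(await slice.arrayBuffer());
    // Base64-encode the slice for the contents API.
    let binary = '';
    for (const b of bytes) {
      binary += String.fromCharCode(b);
    }
    // Each chunk is a separate PUT whose body is ~1.4 MB after base64
    // overhead, well under client_max_body_size 10M, so nginx accepts it.
    await fetch(`${baseUrl}/api/contents/${path}`, {
      method: 'PUT',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        type: 'file',
        format: 'base64',
        name: file.name,
        chunk: isLast ? -1 : i + 1, // -1 appears to mark the final chunk
        content: btoa(binary)
      })
    });
  }
}

If that is right, then a request-body limit at nginx can only ever bound the size of each chunk, not the total size of the uploaded file.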
Is there a configuration option to increase the chunk size, or a workaround that only allows users to upload files smaller than 10 MB? Thanks.