Best-practice advice on parallel processing a suite of Zarr stores with Dask and xarray

I set the time chunk to 468 to reach a reasonable chunk size (~30 MB) for data that will eventually be served from a cloud store.
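As a back-of-the-envelope check of where a 468-step time chunk lands in bytes (the spatial grid size and float64 dtype below are assumptions for illustration, not taken from the original dataset):

```python
MiB = 2**20

ntime_chunk = 468      # time steps per chunk, as chosen above
nspace = 100 * 84      # hypothetical spatial grid (lat x lon)
itemsize = 8           # bytes per value, assuming float64

# Uncompressed size of one chunk in bytes
chunk_bytes = ntime_chunk * nspace * itemsize

print(f"{chunk_bytes} bytes ≈ {chunk_bytes / MiB:.1f} MiB per chunk")
```

With these assumed dimensions each chunk comes out at roughly 30 MiB; the same arithmetic applied to the real spatial dimensions explains the choice of 468.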

Isn’t a 30 MB chunk rather small, considering that Dask’s default auto chunk size is 128 MiB and may be increased in a future release (see here)?
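To see how far 30 MB sits from the 128 MiB default, here is a small sketch (reusing the same hypothetical grid dimensions as above) that computes how many of the current chunks fit into the default target, and what the corresponding larger time chunk would be:

```python
MiB = 2**20

ntime_chunk = 468      # current time chunk
nspace = 100 * 84      # hypothetical spatial grid
itemsize = 8           # float64
chunk_bytes = ntime_chunk * nspace * itemsize   # ~30 MiB

target = 128 * MiB     # Dask's default auto chunk-size target

# How many of the current chunks fit under the default target
factor = target // chunk_bytes

# A time chunk that approaches the default while staying a
# whole multiple of the current one
new_ntime_chunk = ntime_chunk * factor
new_chunk_bytes = new_ntime_chunk * nspace * itemsize

print(f"{factor}x larger -> time chunk {new_ntime_chunk}, "
      f"{new_chunk_bytes / MiB:.1f} MiB")
```

Under these assumptions, quadrupling the time chunk to 1872 gives ~120 MiB chunks, much closer to the default; whether that is worth it depends on the access pattern of the eventual cloud consumers (smaller chunks favor partial spatial/temporal reads, larger chunks reduce request count and scheduler overhead).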