Hello everyone,
I am Lukas from the Potsdam Institute for Climate Impact Research, and I am working with CM2.6 on Pangeo.
I was wondering whether there is a way (and I am sure there is) to visualize and analyze oceanic convection within the model. Is there an explicit diagnostic variable to look at, or do I need a different approach?
I hope this is the right place for such questions, and I am looking forward to the discussion.
Hi @lufiedl - welcome to the new forum! Thanks for asking your question here.
The easiest way to diagnose deep convection would be via the mixed layer depth (MLD). Here is a mixed layer depth plot from Argo, which clearly shows the signature of deep convection in the North Atlantic. Source notebook
Unfortunately, I don’t think CM2.6 output mixed layer depth. It would be possible to recalculate it from the 3D T and S data, but that would be very expensive. We should first check whether the MLD is available somewhere; I’ll ping Steve Griffies on this.
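For reference, a minimal sketch of what that recalculation might look like, assuming a simple density-threshold MLD criterion and placeholder names (temp, salt, st_ocean, and the zarr path) rather than the actual CM2.6 catalog entries:

```python
# Rough sketch: recompute MLD from 3D T and S using a 0.03 kg/m^3 density
# threshold relative to 10 m. Variable/coordinate names and the store path
# are hypothetical; treating model potential temperature / practical salinity
# as gsw's CT/SA inputs is an approximation.
import xarray as xr
import gsw

ds = xr.open_zarr("cm26_ocean_3d.zarr")  # hypothetical store

# surface-referenced potential density
sigma0 = xr.apply_ufunc(gsw.sigma0, ds.salt, ds.temp,
                        dask="parallelized", output_dtypes=[float])

# shallowest depth where density exceeds its 10 m value by 0.03 kg/m^3
sigma0_10m = sigma0.sel(st_ocean=10.0, method="nearest")
mld = ds.st_ocean.where(sigma0 > sigma0_10m + 0.03).min("st_ocean")
```

As noted above, doing this over the full 3D fields would be expensive, so it is only worth it if the 2D diagnostics turn out not to exist.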
So I heard from Steve that they did output it. We just have to figure out which files it is in so we can load it into Google Cloud. The original files live on CyVerse. Here is a screenshot of some of the names:
The fields are named “mld” for “mixed layer depth” (per the Levitus definition) and “hblt”, which is the KPP boundary layer thickness. Both are 2D fields, so they are much smaller than the 3D fields.
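Once those fields are in the cloud, visualizing convection directly from mld could be as simple as the sketch below; the store path, variable name, and time selection are illustrative, not the actual catalog entries.

```python
# Sketch: plot late-winter mean MLD from the 2D "mld" field; deep values
# mark convection sites (e.g. the Labrador Sea). Names are assumptions.
import xarray as xr
import matplotlib.pyplot as plt

ds = xr.open_zarr("cm26_ocean_2d.zarr")                 # hypothetical store
mld_march = ds.mld.sel(time=ds["time.month"] == 3).mean("time")
mld_march.plot(vmax=1000)
plt.show()
```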
I do not see mld or hblt on the list Ryan gave. Perhaps we need to transfer them over to Pangeo…?
Or perhaps these fields are in ocean.nc. Can you provide an ncdump -h for that file? Or perhaps they are in ventilate.nc?
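If the file is reachable from a Python environment, an xarray-based equivalent of ncdump -h would be roughly the following sketch (the path is just a placeholder for whatever local copy exists):

```python
# Lightweight alternative to `ncdump -h`: print the header of ocean.nc
# without reading the data.
import xarray as xr

ds = xr.open_dataset("ocean.nc", decode_times=False)
print(ds)                               # dims, coords, variables, attrs
print("mld" in ds, "hblt" in ds)        # quick check for the MLD fields
```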
I could check on the GFDL archive, but the files are huge and have been migrated to tape, so presumably you can check the contents of these files more quickly on Pangeo…
The file list you see is from CyVerse. My workflow for getting them “to Pangeo” is:

1. Download them to a Columbia server using iRODS
2. Open with xarray.open_mfdataset
3. Export to zarr
4. Upload to Google Cloud Storage

Right now it is unfortunately a rather manual process; a rough sketch of steps 2-4 is below.
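For concreteness, here is a minimal sketch of steps 2-4, with a hypothetical file glob, GCP project, and bucket name:

```python
# Sketch of the netCDF -> zarr -> GCS conversion; all names are placeholders.
import xarray as xr
import gcsfs

# open the downloaded netCDF files lazily as one dataset
ds = xr.open_mfdataset("ocean_2d_*.nc", combine="by_coords", chunks={"time": 1})

# write straight to a zarr store on Google Cloud Storage
fs = gcsfs.GCSFileSystem(project="my-gcp-project")        # hypothetical project
store = fs.get_mapper("my-bucket/cm26/ocean_2d.zarr")     # hypothetical bucket
ds.to_zarr(store, consolidated=True)
```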
I will check on the suggested files and report back. It will take several hours to transfer the ~200 GB of netCDF files needed just to do an ncdump. This is why we like zarr and cloud-based storage.
Thank you so much for your effort, @rabernat and @StephenGriffies. It’s good to hear that the data fields for the MLD are available.
Once the ocean.nc files are uploaded, how can I find them in the data catalog?
Unfortunately the download failed because I ran out of space on our server’s scratch disk. This is a rather frustrating problem! I will free up some space and keep trying.
@lufiedl - just wanted to confirm that the data will finish downloading to my server tomorrow. Then it should be another day or two before they are on Google Cloud.
Sorry for the delays and thanks for your patience!
@lufiedl - you will be happy to know that the data are currently uploading to Google Cloud.
Just an interesting point I noticed when preparing this dataset. The total size of the netCDF files for the last 20 years of the simulation was 3.8 TB. The size of the exact same data in zarr format is 1.1 TB, without any loss of data.
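The reduction most likely comes from lossless, per-chunk compression in zarr. A sketch of how one might control the compressor when writing the store, with illustrative file names:

```python
# Sketch: zarr compresses each chunk (losslessly) on write, which is likely
# where the ~3x size reduction comes from. Compressor choice and paths are
# illustrative, not the settings actually used for this dataset.
import xarray as xr
from numcodecs import Blosc

ds = xr.open_mfdataset("ocean_2d_*.nc", combine="by_coords")   # hypothetical files
compressor = Blosc(cname="zstd", clevel=5)
encoding = {v: {"compressor": compressor} for v in ds.data_vars}
ds.to_zarr("ocean_2d.zarr", encoding=encoding, consolidated=True)
```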
Everything worked out extremely well! Again, thank you so much for your effort. So far I have only had a quick look at one or two data points, but I am looking forward to working with the datasets in more detail over the next few weeks.