Reader.load() issues with ATL06

Hello icepyx team,

I am running into a few problems with reader.load().

For the sake of reproducibility: my download query is…

poly = [(-50.5, 64), (-50.5, 63.5), (-49, 63.5), (-49, 64), (-50.5, 64)]

region_a = ipx.Query(product = 'ATL06',
                     spatial_extent = poly,
                     date_range = ['2019-01-01','2020-01-01'],
                     start_time='00:00:00', end_time='23:59:59')

region_a.earthdata_login('<myusername>','<myemailaddress>')

region_a.order_vars.append(var_list= ["h_li","latitude","longitude"])
#print(region_a.order_vars.wanted)
region_a.download_granules('../icesat-2_data/')

Trying to load the data using:

directory = '../icesat-2_data/'
pattern = "processed_ATL{product:2}_{datetime:%Y%m%d%H%M%S}_{rgt:4}{cycle:2}{orbitsegment:2}_{version:3}_{revision:2}.h5"
reader = ipx.Read(data_source = directory, product = "ATL06", filename_pattern = pattern)
reader.vars.append(var_list = ["h_li","latitude","longitude"])
# print(reader.vars.wanted)
reader.load()

This returns the following error trace:

---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
File ~\Miniconda3\envs\icepyx\lib\site-packages\xarray\backends\netCDF4_.py:182, in _nc4_require_group(ds, group, mode, create_group)
    181 try:
--> 182     ds = ds.groups[key]
    183 except KeyError as e:

File ~\Miniconda3\envs\icepyx\lib\site-packages\h5netcdf\utils.py:14, in Frozen.__getitem__(self, key)
     13 def __getitem__(self, key):
---> 14     return self._mapping[key]

File ~\Miniconda3\envs\icepyx\lib\site-packages\h5netcdf\core.py:397, in _LazyObjectLookup.__getitem__(self, key)
    396     key = "_nc4_non_coord_" + key
--> 397 if self._objects[key] is not None:
    398     return self._objects[key]

KeyError: 'land_ice_segments'

During handling of the above exception, another exception occurred:

OSError                                   Traceback (most recent call last)
Cell In [7], line 3
      1 reader.vars.append(var_list = ["h_li","latitude","longitude"])
      2 # reader.vars.wanted
----> 3 reader.load()

File ~\Miniconda3\envs\icepyx\lib\site-packages\icepyx\core\read.py:542, in Read.load(self)
    535 # DevNote: I'd originally hoped to rely on intake-xarray in order to not have to iterate through the files myself,
    536 # by providing a generalized url/source in building the catalog.
    537 # However, this led to errors when I tried to combine two identical datasets because the single dimension was equal.
    538 # In these situations, xarray recommends manually controlling the merge/concat process yourself.
    539 # While unlikely to be a broad issue, I've heard of multiple matching timestamps causing issues for combining multiple IS2 datasets.
    540 for file in self._filelist:
    541     all_dss.append(
--> 542         self._build_single_file_dataset(file, groups_list)
    543     )  # wanted_groups, vgrp.keys()))
    545 if len(all_dss) == 1:
    546     return all_dss[0]

File ~\Miniconda3\envs\icepyx\lib\site-packages\icepyx\core\read.py:682, in Read._build_single_file_dataset(self, file, groups_list)
    680 grp_path = wanted_groups_list[0]
    681 wanted_groups_list = wanted_groups_list[1:]
--> 682 ds = self._read_single_grp(file, grp_path)
    683 is2ds, ds = Read._add_vars_to_ds(
    684     is2ds, ds, grp_path, wanted_groups_tiered, wanted_dict
    685 )
    687 # if there are any deeper nested variables, get those so they have actual coordinates and add them

File ~\Miniconda3\envs\icepyx\lib\site-packages\icepyx\core\read.py:602, in Read._read_single_grp(self, file, grp_path)
    598 try:
    599     grpcat = is2cat.build_catalog(
    600         file, self._pattern, self._source_type, grp_paths=grp_path
    601     )
--> 602     ds = grpcat[self._source_type].read()
    604 # NOTE: could also do this with h5py, but then would have to read in each variable in the group separately
    605 except ValueError:

File ~\Miniconda3\envs\icepyx\lib\site-packages\intake_xarray\base.py:39, in DataSourceMixin.read(self)
     37 def read(self):
     38     """Return a version of the xarray with all the data in memory"""
---> 39     self._load_metadata()
     40     return self._ds.load()

File ~\Miniconda3\envs\icepyx\lib\site-packages\intake\source\base.py:285, in DataSourceBase._load_metadata(self)
    283 """load metadata only if needed"""
    284 if self._schema is None:
--> 285     self._schema = self._get_schema()
    286     self.dtype = self._schema.dtype
    287     self.shape = self._schema.shape

File ~\Miniconda3\envs\icepyx\lib\site-packages\intake_xarray\base.py:18, in DataSourceMixin._get_schema(self)
     15 self.urlpath = self._get_cache(self.urlpath)[0]
     17 if self._ds is None:
---> 18     self._open_dataset()
     20     metadata = {
     21         'dims': dict(self._ds.dims),
     22         'data_vars': {k: list(self._ds[k].coords)
     23                       for k in self._ds.data_vars.keys()},
     24         'coords': tuple(self._ds.coords.keys()),
     25     }
     26     if getattr(self, 'on_server', False):

File ~\Miniconda3\envs\icepyx\lib\site-packages\intake_xarray\netcdf.py:92, in NetCDFSource._open_dataset(self)
     88 else:
     89     # https://github.com/intake/filesystem_spec/issues/476#issuecomment-732372918
     90     url = fsspec.open(self.urlpath, **self.storage_options).open()
---> 92 self._ds = _open_dataset(url, chunks=self.chunks, **kwargs)

File ~\Miniconda3\envs\icepyx\lib\site-packages\xarray\backends\api.py:531, in open_dataset(filename_or_obj, engine, chunks, cache, decode_cf, mask_and_scale, decode_times, decode_timedelta, use_cftime, concat_characters, decode_coords, drop_variables, inline_array, backend_kwargs, **kwargs)
    519 decoders = _resolve_decoders_kwargs(
    520     decode_cf,
    521     open_backend_dataset_parameters=backend.open_dataset_parameters,
   (...)
    527     decode_coords=decode_coords,
    528 )
    530 overwrite_encoded_chunks = kwargs.pop("overwrite_encoded_chunks", None)
--> 531 backend_ds = backend.open_dataset(
    532     filename_or_obj,
    533     drop_variables=drop_variables,
    534     **decoders,
    535     **kwargs,
    536 )
    537 ds = _dataset_from_backend_dataset(
    538     backend_ds,
    539     filename_or_obj,
   (...)
    547     **kwargs,
    548 )
    549 return ds

File ~\Miniconda3\envs\icepyx\lib\site-packages\xarray\backends\h5netcdf_.py:389, in H5netcdfBackendEntrypoint.open_dataset(self, filename_or_obj, mask_and_scale, decode_times, concat_characters, decode_coords, drop_variables, use_cftime, decode_timedelta, format, group, lock, invalid_netcdf, phony_dims, decode_vlen_strings)
    369 def open_dataset(
    370     self,
    371     filename_or_obj,
   (...)
    385     decode_vlen_strings=True,
    386 ):
    388     filename_or_obj = _normalize_path(filename_or_obj)
--> 389     store = H5NetCDFStore.open(
    390         filename_or_obj,
    391         format=format,
    392         group=group,
    393         lock=lock,
    394         invalid_netcdf=invalid_netcdf,
    395         phony_dims=phony_dims,
    396         decode_vlen_strings=decode_vlen_strings,
    397     )
    399     store_entrypoint = StoreBackendEntrypoint()
    401     ds = store_entrypoint.open_dataset(
    402         store,
    403         mask_and_scale=mask_and_scale,
   (...)
    409         decode_timedelta=decode_timedelta,
    410     )

File ~\Miniconda3\envs\icepyx\lib\site-packages\xarray\backends\h5netcdf_.py:181, in H5NetCDFStore.open(cls, filename, mode, format, group, lock, autoclose, invalid_netcdf, phony_dims, decode_vlen_strings)
    178         lock = combine_locks([HDF5_LOCK, get_write_lock(filename)])
    180 manager = CachingFileManager(h5netcdf.File, filename, mode=mode, kwargs=kwargs)
--> 181 return cls(manager, group=group, mode=mode, lock=lock, autoclose=autoclose)

File ~\Miniconda3\envs\icepyx\lib\site-packages\xarray\backends\h5netcdf_.py:132, in H5NetCDFStore.__init__(self, manager, group, mode, lock, autoclose)
    129 self.format = None
    130 # todo: utilizing find_root_and_group seems a bit clunky
    131 #  making filename available on h5netcdf.Group seems better
--> 132 self._filename = find_root_and_group(self.ds)[0].filename
    133 self.is_remote = is_remote_uri(self._filename)
    134 self.lock = ensure_lock(lock)

File ~\Miniconda3\envs\icepyx\lib\site-packages\xarray\backends\h5netcdf_.py:192, in H5NetCDFStore.ds(self)
    190 @property
    191 def ds(self):
--> 192     return self._acquire()

File ~\Miniconda3\envs\icepyx\lib\site-packages\xarray\backends\h5netcdf_.py:185, in H5NetCDFStore._acquire(self, needs_lock)
    183 def _acquire(self, needs_lock=True):
    184     with self._manager.acquire_context(needs_lock) as root:
--> 185         ds = _nc4_require_group(
    186             root, self._group, self._mode, create_group=_h5netcdf_create_group
    187         )
    188     return ds

File ~\Miniconda3\envs\icepyx\lib\site-packages\xarray\backends\netCDF4_.py:188, in _nc4_require_group(ds, group, mode, create_group)
    185             ds = create_group(ds, key)
    186         else:
    187             # wrap error to provide slightly more helpful message
--> 188             raise OSError(f"group not found: {key}", e)
    189 return ds

OSError: [Errno group not found: land_ice_segments] 'land_ice_segments'

I have attempted to remedy this by using h5py.File(f) and the allkeys() function here to iteratively open all the files in the directory, list their keys (and sub-keys), and remove the files that do not have land_ice_segments.
Having done that (removing 2 of the 47 files originally downloaded), the same error message persists.
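For reference, the filtering step can be sketched like this (a minimal version using h5py's built-in visit() rather than the allkeys() helper, with the directory from my query above):

```python
import glob

import h5py


def has_land_ice_segments(path):
    """Return True if any group in the HDF5 file is named land_ice_segments."""
    names = []
    with h5py.File(path, "r") as f:
        f.visit(names.append)  # collect every group/dataset path in the file
    return any(name.split("/")[-1] == "land_ice_segments" for name in names)


# Keep only the granules that contain at least one land_ice_segments group.
keep = [p for p in sorted(glob.glob("../icesat-2_data/*.h5"))
        if has_land_ice_segments(p)]
```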


The documentation warns about duplicate gran_idx values, but I don’t think that is happening here:

rgt_cyc = []
for f in files:
    with h5py.File(f, 'r') as F:
        rgt_cyc.append((F['orbit_info/rgt'][()][0], F['orbit_info/cycle_number'][()][0]))
len(rgt_cyc) == len(list(set(rgt_cyc)))
>>> True

Whilst trying to unpick this, I found reader.load() had no issues when working with these six files:

['../icesat-2_data\\processed_ATL06_20190503212504_05440305_005_01.h5',
 '../icesat-2_data\\processed_ATL06_20190507211645_06050305_005_01.h5',
 '../icesat-2_data\\processed_ATL06_20190527075008_09020303_005_01.h5',
 '../icesat-2_data\\processed_ATL06_20190531074148_09630303_005_01.h5',
 '../icesat-2_data\\processed_ATL06_20190605195242_10470305_005_01.h5',
 '../icesat-2_data\\processed_ATL06_20190609194422_11080305_005_01.h5']

Adding in another file (processed_ATL06_20190326104632_13440203_005_01.h5) brings the error message back. I think this stems from these files not having the same list of gts.

I have tried selectively downloading (using order_vars.append(keyword_list = ['gt3l'])) and run into the same error message.

I think my questions are: (a) what am I doing wrong? and (b) is it possible to combine all of these files without having to selectively throw away some tracks (and, ideally, to extend the initial download query to include the full time series)?

Thank you in advance.

Hi @tlohde,

While running region_a.order_granules(), I noticed a lot of lines like:

NSIDC returned these messages
['Granule 228833223 contained no data within the spatial and/or temporal '
 'subset constraints to be processed',
 'Granule 228825218 contained no data within the spatial and/or temporal '
 'subset constraints to be processed',
...

which seems to indicate some data is missing. So I tried reading the ICESat-2 HDF5 files using datatree.open_datatree and it does look like some of the laser beams are missing, e.g. this one only has 2 (gt3l and gt3r) out of 6 beams:

import datatree

is2dt = datatree.open_datatree(
    "icesat-2_data/processed_ATL06_20190108030055_01630205_005_01.h5",
    engine="h5netcdf",
    phony_dims="access",
)
print(is2dt)
DataTree('None', parent=None)
│   Dimensions:  ()
│   Data variables:
│       *empty*
│   Attributes: (12/49)
│       Conventions:                        CF-1.6
│       Processing Parameters:              This file was gernerated by the ICESa...
│       citation:                           Cite these data in publications as fo...
│       contributor_name:                   Thomas E Neumann (thomas.neumann@nasa...
│       contributor_role:                   Instrument Engineer, Investigator, Pr...
│       creator_name:                       GSFC I-SIPS > ICESat-2 Science Invest...
│       ...                                 ...
│       summary:                            ATL06 provides estimates of the ice-s...
│       time_coverage_duration:             325.0
│       time_coverage_end:                  2019-01-08T03:06:20.000000Z
│       time_coverage_start:                2019-01-08T03:00:55.000000Z
│       time_type:                          CCSDS UTC-A
│       title:                              ATLAS/ICESat-2 L3A Land Ice Height
├── DataTree('METADATA')
│   │   Dimensions:  ()
│   │   Data variables:
│   │       *empty*
│   │   Attributes:
│   │       Description:            ISO19115 Structured Metadata Represented within HDF5
│   │       iso_19139_dataset_xml:  <?xml version="1.0"?>\n<gmd:DS_Series xsi:schemaL...
│   │       iso_19139_series_xml:   <?xml version="1.0" encoding="UTF-8"?>\n<gmd:DS_S...
│   ├── DataTree('AcquisitionInformation')
│   │   ├── DataTree('lidar')
│   │   │       Dimensions:  ()
│   │   │       Data variables:
│   │   │           *empty*
│   │   │       Attributes:
│   │   │           description:  ATLAS on ICESat-2 determines the range between the satellit...
│   │   │           identifier:   ATLAS
│   │   │           pulse_rate:   10000 pps
│   │   │           type:         Laser Altimeter
│   │   │           wavelength:   532 nm
│   │   ├── DataTree('lidarDocument')
│   │   │       Dimensions:  ()
│   │   │       Data variables:
│   │   │           *empty*
│   │   │       Attributes:
│   │   │           edition:          Pre-Release
│   │   │           publicationDate:  12/31/17
│   │   │           title:            A document describing the ATLAS instrument will be prov...
│   │   ├── DataTree('platform')
│   │   │       Dimensions:  ()
│   │   │       Data variables:
│   │   │           *empty*
│   │   │       Attributes:
│   │   │           description:  Ice, Cloud, and land Elevation Satellite-2
│   │   │           identifier:   ICESat-2
│   │   │           type:         Spacecraft
│   │   └── DataTree('platformDocument')
│   │           Dimensions:  ()
│   │           Data variables:
│   │               *empty*
│   │           Attributes:
│   │               edition:          31-Dec-16
│   │               publicationDate:  31-Dec-16
│   │               title:            The Ice, Cloud, and land Elevation Satellite-2 (ICESat-...
│   ├── DataTree('DataQuality')
│   │   │   Dimensions:  ()
│   │   │   Data variables:
│   │   │       *empty*
│   │   │   Attributes:
│   │   │       scope:    NOT_SET
│   │   ├── DataTree('CompletenessOmission')
│   │   │       Dimensions:  ()
│   │   │       Data variables:
│   │   │           *empty*
│   │   │       Attributes:
│   │   │           evaluationMethodType:  directInternal
│   │   │           measureDescription:    TBD
│   │   │           nameOfMeasure:         TBD
│   │   │           unitofMeasure:         TBD
│   │   │           value:                 NOT_SET
│   │   └── DataTree('DomainConsistency')
│   │           Dimensions:  ()
│   │           Data variables:
│   │               *empty*
│   │           Attributes:
│   │               evaluationMethodType:  directInternal
│   │               measureDescription:    TBD
│   │               nameOfMeasure:         TBD
│   │               unitofMeasure:         TBD
│   │               value:                 NOT_SET
│   ├── DataTree('DatasetIdentification')
│   │       Dimensions:  ()
│   │       Data variables:
│   │           *empty*
│   │       Attributes: (12/14)
│   │           VersionID:                   005
│   │           abstract:                    ATL06 provides estimates of the ice-sheet me...
│   │           characterSet:                utf8
│   │           creationDate:                2021-09-05T11:16:31.000000Z
│   │           credit:                      The software that generates the ATL06 produc...
│   │           fileName:                    ATL06_20190108030055_01630205_005_01.h5
│   │           ...                          ...
│   │           purpose:                     ATL06 provides estimates of the ice-sheet me...
│   │           shortName:                   ATL06
│   │           spatialRepresentationType:   along-track
│   │           status:                      onGoing
│   │           topicCategory:               geoscientificInformation
│   │           uuid:                        6abf99e8-3be1-30e0-a623-28f0d12fc9b8
│   ├── DataTree('Extent')
│   │       Dimensions:  ()
│   │       Data variables:
│   │           *empty*
│   │       Attributes:
│   │           eastBoundLongitude:      -41.329960843275046
│   │           northBoundLatitude:      80.00112360641836
│   │           rangeBeginningDateTime:  2019-01-08T03:00:54.862971Z
│   │           rangeEndingDateTime:     2019-01-08T03:05:36.716995Z
│   │           southBoundLatitude:      62.24122894734801
│   │           westBoundLongitude:      -50.388809857370006
│   ├── DataTree('Lineage')
│   │   ├── DataTree('ANC06-01')
│   │   │       Dimensions:  ()
│   │   │       Data variables:
│   │   │           *empty*
│   │   │       Attributes:
│   │   │           description:  MERIT 3 arcsec Digital Elevation Model reformatted into HDF...
│   │   │           fileName:     merit_3as_20200617_001_01.h5
│   │   │           shortName:    ANC06-01
│   │   │           uuid:         3db8ab65-d2ac-37c7-86db-17c762268673
│   │   │           version:      20200617
│   │   ├── DataTree('ANC06-02')
│   │   │       Dimensions:  ()
│   │   │       Data variables:
│   │   │           *empty*
│   │   │       Attributes:
│   │   │           description:  ArcticDEM 32m Digital Elevation Model reformatted into HDF5.
│   │   │           fileName:     arcticdem32m_20190611_001_01.h5
│   │   │           shortName:    ANC06-02
│   │   │           uuid:         ce07ef72-0bf4-353b-8475-fb568b029905
│   │   │           version:      20190611
│   │   ├── DataTree('ANC06-03')
│   │   │       Dimensions:  ()
│   │   │       Data variables:
│   │   │           *empty*
│   │   │       Attributes:
│   │   │           description:  REMA Antarctica 100m Digital Elevation Model filled and ref...
│   │   │           fileName:     atl06rema100m_20190628_001_01.h5
│   │   │           shortName:    ANC06-03
│   │   │           uuid:         3e0a81bd-bbd5-35fa-b868-3254f9355b7f
│   │   │           version:      20190628
│   │   ├── DataTree('ANC17')
│   │   │       Dimensions:  ()
│   │   │       Data variables:
│   │   │           *empty*
│   │   │       Attributes:
│   │   │           description:  Land Ice Height SNR Significance Table
│   │   │           fileName:     anc17_snr_f_20170718_001_01.h5
│   │   │           shortName:    ANC17
│   │   │           uuid:         52fe4165-bac0-3dc9-a9b3-921a591a7dc5
│   │   │           version:      20170718
│   │   ├── DataTree('ANC19')
│   │   │       Dimensions:  ()
│   │   │       Data variables:
│   │   │           *empty*
│   │   │       Attributes:
│   │   │           description:  TAI to UTC leapsecond file retrieved from ftp://maia.usno.n...
│   │   │           fileName:     tai_utc_2017.dat
│   │   │           shortName:    ANC19
│   │   │           uuid:         7c66d365-278a-31f7-8fe4-9c80e2f012e5
│   │   │           version:      001
│   │   ├── DataTree('ANC25-06')
│   │   │       Dimensions:  ()
│   │   │       Data variables:
│   │   │           *empty*
│   │   │       Attributes:
│   │   │           description:  HDF5 template file that defines the organization and defaul...
│   │   │           fileName:     ANC25-06_20210831171721_045_01.h5
│   │   │           shortName:    ANC25-06
│   │   │           uuid:         d29cdb37-801e-3a00-a034-d20c51047ad6
│   │   │           version:      045
│   │   ├── DataTree('ANC26-06')
│   │   │       Dimensions:  ()
│   │   │       Data variables:
│   │   │           *empty*
│   │   │       Attributes:
│   │   │           description:  HDF5 template file that defines the organization and defaul...
│   │   │           fileName:     ANC26-06_20210831171731_045_01.h5
│   │   │           shortName:    ANC26-06
│   │   │           uuid:         2093f4f5-2274-3520-8824-cae25668b08b
│   │   │           version:      045
│   │   ├── DataTree('ANC28')
│   │   │       Dimensions:  ()
│   │   │       Data variables:
│   │   │           *empty*
│   │   │       Attributes:
│   │   │           description:  DTU Mean Sea Surface re-referenced to the WGS84 ellipsoid.
│   │   │           fileName:     dtu13_20180705_001_01.nc
│   │   │           shortName:    ANC28
│   │   │           uuid:         56f47040-a72e-3109-99c2-bc1658e6dda4
│   │   │           version:      20180705
│   │   ├── DataTree('ANC36-06')
│   │   │       Dimensions:  ()
│   │   │       Data variables:
│   │   │           *empty*
│   │   │       Attributes:
│   │   │           description:  ISO 19139 XML file containing Series-level metadata informa...
│   │   │           fileName:     ANC36-06_20210831170309_005_01.xml
│   │   │           shortName:    ANC36-06
│   │   │           uuid:         7e1fe2fa-7a92-39a3-9f24-630dca51625a
│   │   │           version:      005
│   │   ├── DataTree('ANC38-06')
│   │   │       Dimensions:  ()
│   │   │       Data variables:
│   │   │           *empty*
│   │   │       Attributes:
│   │   │           description:  ISO 19139 XML file containing DataSet-level metadata inform...
│   │   │           fileName:     ANC38-06_20210831170314_005_01.xml
│   │   │           shortName:    ANC38-06
│   │   │           uuid:         f2325ab1-962d-3bf1-a1ef-7e66bdb9ffb3
│   │   │           version:      005
│   │   ├── DataTree('ATL03')
│   │   │       Dimensions:  ()
│   │   │       Data variables:
│   │   │           *empty*
│   │   │       Attributes: (12/15)
│   │   │           description:   ICESat-2 ATLAS L2A Global Geolocated Photon data products.
│   │   │           end_cycle:     2
│   │   │           end_geoseg:    671084
│   │   │           end_orbit:     1751
│   │   │           end_region:    5
│   │   │           end_rgt:       163
│   │   │           ...            ...
│   │   │           start_geoseg:  555757
│   │   │           start_orbit:   1751
│   │   │           start_region:  5
│   │   │           start_rgt:     163
│   │   │           uuid:          6050c1de-3d15-3ad2-9f5a-b7b6d56cef26
│   │   │           version:       005
│   │   ├── DataTree('ATL09')
│   │   │       Dimensions:  ()
│   │   │       Data variables:
│   │   │           *empty*
│   │   │       Attributes: (12/15)
│   │   │           description:   ICESat-2 ATLAS L3A atmosphere data products.
│   │   │           end_cycle:     2
│   │   │           end_geoseg:    2007126
│   │   │           end_orbit:     1751
│   │   │           end_region:    14
│   │   │           end_rgt:       163
│   │   │           ...            ...
│   │   │           start_geoseg:  12
│   │   │           start_orbit:   1751
│   │   │           start_region:  1
│   │   │           start_rgt:     163
│   │   │           uuid:          8672b328-3422-345d-b8b9-418830801f5d
│   │   │           version:       005
│   │   └── DataTree('Control')
│   │           Dimensions:  ()
│   │           Data variables:
│   │               *empty*
│   │           Attributes:
│   │               description:  Text-based keyword=value file generated automatically withi...
│   │               fileName:     CTL_atlas_l3a_is_007739532.ctl
│   │               shortName:    CNTL
│   │               version:      1
│   ├── DataTree('ProcessStep')
│   │   ├── DataTree('Browse')
│   │   │       Dimensions:  ()
│   │   │       Data variables:
│   │   │           *empty*
│   │   │       Attributes:
│   │   │           identifier:          atlas_brw
│   │   │           processDescription:  Browse processing is performed for each granule SIPS...
│   │   │           runTimeParameters:   CTL_atlas_l3a_is_007739532.ctl
│   │   │           softwareDate:        Jul 28 2021
│   │   │           softwareTitle:       Creates ATLAS HDF5 browse files
│   │   │           softwareVersion:     Version 2.5
│   │   │           stepDateTime:        2021-09-05T11:19:48.000000Z
│   │   ├── DataTree('Metadata')
│   │   │       Dimensions:  ()
│   │   │       Data variables:
│   │   │           *empty*
│   │   │       Attributes:
│   │   │           identifier:          atlas_meta
│   │   │           processDescription:  Metadata information is processed by the metadata ut...
│   │   │           runTimeParameters:   CTL_atlas_l3a_is_007739532.ctl
│   │   │           softwareDate:        Jul 28 2021
│   │   │           softwareTitle:       Creates ATLAS XML metadata files
│   │   │           softwareVersion:     Version 4.5
│   │   │           stepDateTime:        2021-09-05T11:19:59.000000Z
│   │   ├── DataTree('PGE')
│   │   │       Dimensions:  ()
│   │   │       Data variables:
│   │   │           *empty*
│   │   │       Attributes:
│   │   │           ATBDDate:            12/04/2019
│   │   │           ATBDTitle:           Algorithm Theoretical Basis Document (ATBD) For Land...
│   │   │           ATBDVersion:         N/A
│   │   │           documentDate:        Feb 2020
│   │   │           documentation:       ATLAS Science Algorithm Software Design Description ...
│   │   │           identifier:          atlas_l3a_is
│   │   │           processDescription:  Computes surface heights for each beam, along and ac...
│   │   │           runTimeParameters:   CTL_atlas_l3a_is_007739532.ctl
│   │   │           softwareDate:        Jul 28 2021
│   │   │           softwareTitle:       ASAS L3A Icesheet PGE
│   │   │           softwareVersion:     Version 4.5
│   │   │           stepDateTime:        2021-09-05T11:16:31.000000Z
│   │   └── DataTree('QA')
│   │           Dimensions:  ()
│   │           Data variables:
│   │               *empty*
│   │           Attributes:
│   │               identifier:          atl06_qa_util
│   │               processDescription:  QA processing is performed by an external utility on...
│   │               runTimeParameters:   CTL_atlas_l3a_is_007739532.ctl
│   │               softwareDate:        Jul 28 2021
│   │               softwareTitle:       ATL06 QA Utility
│   │               softwareVersion:     Version 4.5
│   │               stepDateTime:        2021-09-05T11:19:46.000000Z
│   ├── DataTree('ProductSpecificationDocument')
│   │       Dimensions:  ()
│   │       Data variables:
│   │           *empty*
│   │       Attributes:
│   │           ShortName:        ATL06_SDP
│   │           characterSet:     utf8
│   │           edition:          v4.3
│   │           language:         eng
│   │           publicationDate:  Feb 2020
│   │           title:            ICESat-2-SIPS-SPEC-4260 - ATLAS Science Algorithm Stand...
│   ├── DataTree('QADatasetIdentification')
│   │       Dimensions:  ()
│   │       Data variables:
│   │           *empty*
│   │       Attributes:
│   │           abstract:      An ASCII product that contains statistical information on ...
│   │           creationDate:  2021-09-05T11:19:46.000000Z
│   │           fileName:      ATL06_20190108030055_01630205_005_01.qa
│   └── DataTree('SeriesIdentification')
│           Dimensions:  ()
│           Data variables:
│               *empty*
│           Attributes: (12/19)
│               VersionID:                         005
│               abstract:                          ATL06 provides estimates of the ice-sh...
│               characterSet:                      utf8
│               credit:                            The software that generates the ATL06 ...
│               format:                            HDF
│               formatVersion:                     5
│               ...                                ...
│               purpose:                           ATL06 provides estimates of the ice-sh...
│               resourceProviderOrganizationName:  National Aeronautics and Space Adminis...
│               revisionDate:                      2016-06-09
│               shortName:                         ATL06
│               status:                            onGoing
│               topicCategory:                     geoscientificInformation
├── DataTree('ancillary_data')
│       Dimensions:              (phony_dim_0: 1)
│       Dimensions without coordinates: phony_dim_0
│       Data variables:
│           atlas_sdp_gps_epoch  (phony_dim_0) datetime64[ns] ...
│           data_end_utc         (phony_dim_0) |S27 ...
│           data_start_utc       (phony_dim_0) |S27 ...
│           end_delta_time       (phony_dim_0) datetime64[ns] ...
│           granule_end_utc      (phony_dim_0) |S27 ...
│           granule_start_utc    (phony_dim_0) |S27 ...
│           start_delta_time     (phony_dim_0) datetime64[ns] ...
│       Attributes:
│           Description:  Contains information ancillary to the data product. This ma...
│           data_rate:    Data within this group pertain to the granule in its entirety.
├── DataTree('gt3l')
│   │   Dimensions:  ()
│   │   Data variables:
│   │       *empty*
│   │   Attributes:
│   │       Description:         Contains subgroups organized by Ground Track (gt1l, ...
│   │       atlas_beam_type:     strong
│   │       atlas_pce:           pce3
│   │       atlas_spot_number:   5
│   │       atmosphere_profile:  profile_3
│   │       groundtrack_id:      gt3l
│   │       sc_orientation:      Backward
│   └── DataTree('land_ice_segments')
│           Dimensions:    (phony_dim_1: 1)
│           Dimensions without coordinates: phony_dim_1
│           Data variables:
│               h_li       (phony_dim_1) float32 ...
│               latitude   (phony_dim_1) float64 ...
│               longitude  (phony_dim_1) float64 ...
│           Attributes:
│               Description:  The land_ice_height group contains the primary set of deriv...
│               data_rate:    Data within this group are sparse.  Data values are provide...
├── DataTree('gt3r')
│   │   Dimensions:  ()
│   │   Data variables:
│   │       *empty*
│   │   Attributes:
│   │       Description:         Contains subgroups organized by Ground Track (gt1l, ...
│   │       atlas_beam_type:     weak
│   │       atlas_pce:           pce3
│   │       atlas_spot_number:   6
│   │       atmosphere_profile:  profile_3
│   │       groundtrack_id:      gt3r
│   │       sc_orientation:      Backward
│   └── DataTree('land_ice_segments')
│           Dimensions:    (phony_dim_2: 1)
│           Dimensions without coordinates: phony_dim_2
│           Data variables:
│               h_li       (phony_dim_2) float32 ...
│               latitude   (phony_dim_2) float64 ...
│               longitude  (phony_dim_2) float64 ...
│           Attributes:
│               Description:  The land_ice_height group contains the primary set of deriv...
│               data_rate:    Data within this group are sparse.  Data values are provide...
└── DataTree('orbit_info')
        Dimensions:         (sc_orient_time: 1)
        Coordinates:
          * sc_orient_time  (sc_orient_time) datetime64[ns] 2019-01-07T23:30:00
        Data variables:
            sc_orient       (sc_orient_time) int8 ...
        Attributes:
            Description:  Contains data that are common among all beams for the granu...
            data_rate:    These parameters are constant for a given granule.
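To tabulate this across the whole directory, one hypothetical way to list which beam groups each subsetted file actually kept (using plain h5py rather than datatree) would be:

```python
import glob

import h5py


def beam_groups(path):
    # Ground-track groups are named gt1l, gt1r, gt2l, gt2r, gt3l, gt3r;
    # return whichever of them survived the subsetting.
    with h5py.File(path, "r") as f:
        return sorted(k for k in f.keys() if k.startswith("gt"))


for path in sorted(glob.glob("icesat-2_data/*.h5")):
    print(path, beam_groups(path))
```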

So to answer your questions:

You’re not necessarily doing anything wrong, since the subsetter is doing its job (removing laser tracks that aren’t needed). The unfortunate thing is that sometimes 2 lasers are returned, sometimes 6, and the current icepyx reader can’t handle the inconsistency. Maybe using region_a.order_granules(subset=False) could allow you to keep all 6 laser tracks, but that’s a question for @JessicaS11 perhaps.

Depends on how you want to combine them :smiley: If you briefly describe your scientific goal (i.e. what you’re trying to do with the ICESat-2 elevation data), it’ll be easier for us to advise on the best way. At the moment, I can only guess that you want to look at elevation change over time or something?

Thanks for bringing this to our attention, @tlohde (and with a really great and thorough report! So appreciated!) and thanks for your response, @weiji14! I’m excited to see how it looks reading IS2 data in with DataTree and look forward to exploring that more.

To the issue at hand:

@weiji14 got it exactly right that the icepyx reader cannot yet handle the inconsistency in number of lasers returned. We solved a similar issue a few months back, but had not yet encountered this version of the issue.

One short-term solution (as suggested by @weiji14) would be to simply order data without subsetting. This might be a viable solution given your relatively small number of granules, though then you will have to manually subset the data later in your workflow. A better solution is obviously for icepyx to fix the bug to handle these cases automatically (I’ve opened a GitHub issue here). If you’re able/interested/willing, we’d be happy to help you through the process of implementing a bug fix (no worries if you’ve never contributed to software before!).

Hello, and thank you @weiji14 and @JessicaS11 for your prompt and helpful replies.

The intention, at the moment, is a little vague. But yes, broadly speaking: surface elevation change over time, ideally capturing as much seasonality as possible. Having looked at elevations and dh_dt from ATL14 and ATL15, which do a pretty good job in my study area, I first want to see what the underlying data look like, with a view to (potentially) interleaving it with the CryoSat-2 EOLIS Point product.


Re: ordering w/o subsetting:

region_a.order_vars.append(var_list= ["h_li","latitude","longitude"])
region_a.download_granules('../download/', subset=False)

# and then ...

pattern = "processed_ATL{product:2}_{datetime:%Y%m%d%H%M%S}_{rgt:4}{cycle:2}{orbitsegment:2}_{version:3}_{revision:2}.h5"
directory = '../download/'
reader = ipx.Read(data_source = directory, product = "ATL06", filename_pattern = pattern)
reader.vars.append(var_list = ["h_li","latitude","longitude"])
ds = reader.load()

This returns the same error.

I have slightly improved my earlier attempt, only reading in files that have ‘land_ice_segments’ in all six beams. This allows 37 of the 47 files to be opened together with reader.load().
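The check now looks something like this (a sketch, with the six-beam list hard-coded):

```python
import glob

import h5py

BEAMS = ("gt1l", "gt1r", "gt2l", "gt2r", "gt3l", "gt3r")


def has_all_beams(path):
    # A file passes only if every ground track has a land_ice_segments group.
    with h5py.File(path, "r") as f:
        return all(f"{gt}/land_ice_segments" in f for gt in BEAMS)


good = [p for p in sorted(glob.glob("../icesat-2_data/*.h5"))
        if has_all_beams(p)]
```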


Re: helping implement a fix: I am interested and willing, but not wholly sure how capable I am (I haven’t contributed to software before, but I'm happy to give it a go with some guidance).