RasterioIOError when using CDSE STAC API

Hi all,

I need someone’s help to understand what is going wrong in my workflow!

I am trying to read Sentinel-2 data through the CDSE STAC API and the stackstac Python library, applying a snow-classification workflow as done here

S3_ENDPOINT = "eodata.dataspace.copernicus.eu"

ACCESS_KEY = ""
SECRET_KEY = ""

import os
os.environ["AWS_S3_ENDPOINT"] = S3_ENDPOINT
os.environ["AWS_ACCESS_KEY_ID"] = ACCESS_KEY
os.environ["AWS_SECRET_ACCESS_KEY"] = SECRET_KEY

from shapely.geometry import shape
from shapely.geometry.polygon import Polygon

geometry = {"type": "Polygon",
            "coordinates": [[[-56.055536, -12.63809],
                             [-56.055536, -12.523493],
                             [-55.88178, -12.523493],
                             [-55.88178, -12.63809],
                             [-56.055536, -12.63809]]]}

bounds = shape(geometry).bounds

import pystac_client

CDSE_URL = "https://stac.dataspace.copernicus.eu/v1"
cat = pystac_client.Client.open(CDSE_URL)
cat.add_conforms_to("ITEM_SEARCH")

start_dt = "2025-07-01"
end_dt = "2025-07-30"

from shapely import to_geojson
import json

params = {
    "collections": ["sentinel-2-l1c"],
    "intersects": geometry,
    "datetime": f"{start_dt}T00:00:00Z/{end_dt}T23:59:59Z",
}

items = list(cat.search(**params).items_as_dicts())
print(f"Number of STAC items returned: {len(items)}")

import rioxarray
import stackstac

stack = stackstac.stack(
    items=items,
    resolution=(0.00025, 0.00025),
    bounds_latlon=bounds,
    epsg=4326,
    gdal_env=stackstac.DEFAULT_GDAL_ENV.updated(
        {
            "GDAL_NUM_THREADS": -1,
            "GDAL_HTTP_UNSAFESSL": "YES",
            "GDAL_HTTP_TCP_KEEPALIVE": "YES",
            "AWS_VIRTUAL_HOSTING": "FALSE",
            "AWS_HTTPS": "YES",
        }
    ),
)

stack.load()
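(As an aside: GDAL itself can retry transient HTTP failures. GDAL_HTTP_MAX_RETRY and GDAL_HTTP_RETRY_DELAY are standard GDAL configuration options that could be merged into the gdal_env dict above; the values below are illustrative guesses, not tuned recommendations.)

```python
# Illustrative only: extra GDAL config options that could be merged into the
# gdal_env passed to stackstac.stack above, so GDAL retries failed HTTP
# requests itself before rasterio ever raises.
RETRY_OPTS = {
    "GDAL_HTTP_MAX_RETRY": "5",    # retry a failed request up to 5 times
    "GDAL_HTTP_RETRY_DELAY": "2",  # base delay in seconds between retries
}
```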

The code works well, but when applying it to a long time series (the idea is to run it for the whole Sentinel-2 era) I started to get errors like this when loading the lazy dataset into memory. A random example:

RuntimeError: Error reading Window(col_off=1024, row_off=1024, width=226, height=676) from 's3://eodata/Sentinel-2/MSI/L1C_N0500/2023/01/03/S2B_MSIL1C_20230103T143729_N0510_R096_T19HCB_20240811T114613.SAFE/GRANULE/L1C_T19HCB_A030438_20230103T144503/IMG_DATA/T19HCB_20230103T143729_B08.jp2': RasterioIOError('Read failed. See previous exception for details.')

See the open issue in the CDSE forum: https://forum.dataspace.copernicus.eu/t/stac-api-rasterioioerror/4884

This appears on random dates. I inserted a loop to retry the data loading; after some retries it sometimes works, sometimes not, and I have the impression that the more dates I process, the more frequent the error becomes. So I suspect an issue linked to some access limitation (rate limiting).
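A minimal sketch of such a retry loop with exponential backoff (load_with_backoff is a hypothetical helper, not part of stackstac; in the workflow above, load_fn would be something like `lambda: stack.load()`):

```python
import random
import time

def load_with_backoff(load_fn, max_attempts=6, base_delay=2.0):
    """Call load_fn(); on failure, sleep with exponential backoff + jitter and retry.

    Hypothetical helper for illustration. stackstac surfaces read errors as
    RuntimeError, so that is what we catch here.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return load_fn()
        except RuntimeError as exc:
            if attempt == max_attempts:
                raise  # give up after the last attempt
            delay = base_delay * 2 ** (attempt - 1) + random.uniform(0, 1)
            print(f"attempt {attempt} failed ({exc}); retrying in {delay:.1f}s")
            time.sleep(delay)
```

The jitter spreads out retries so many parallel workers do not hammer the endpoint in lockstep, which matters if the failures really are rate-limiting.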

If someone could give me some hints, I would really appreciate it!!

Thanks in advance

Valentina

It works for me with the generic GDAL CLI, FWIW. Sometimes a read might fail for network reasons, I guess; maybe this gives you confidence that it can work:

AWS_ACCESS_KEY_ID=$CDSE_KEY AWS_SECRET_ACCESS_KEY=$CDSE_SECRET AWS_VIRTUAL_HOSTING=FALSE AWS_S3_ENDPOINT=eodata.dataspace.copernicus.eu \
gdal raster clip --window 1024,1024,226,676 /vsis3/eodata/Sentinel-2/MSI/L1C_N0500/2023/01/03/S2B_MSIL1C_20230103T143729_N0510_R096_T19HCB_20240811T114613.SAFE/GRANULE/L1C_T19HCB_A030438_20230103T144503/IMG_DATA/T19HCB_20230103T143729_B08.jp2 outtest.tif

0...10...20...30...40...50...60...70...80...90...100 - done in 00:00:41.

gdal raster info --stats outtest.tif

Driver: GTiff/GeoTIFF
Files: outtest.tif
Size is 226, 676
...
Band 1 Block=226x19 Type=UInt16, ColorInterp=Gray
  Minimum=1506.000, Maximum=6580.000, Mean=3627.399, StdDev=768.807
  Metadata:
    STATISTICS_MINIMUM=1506
    STATISTICS_MAXIMUM=6580
    STATISTICS_MEAN=3627.3989959156
    STATISTICS_STDDEV=768.80652000362
    STATISTICS_VALID_PERCENT=

I haven’t yet found a version of the Python code that works, but I’m not familiar with stackstac.

Hi Michael,

Thanks a lot for your answer! I tried your code and it works. However, for another date that was causing problems, it also fails:

HTTP response code on https://eodata.dataspace.copernicus.eu/eodata/Sentinel-2/MSI/L1C_N0500/2022/02/02/S2A_MSIL1C_202202… : 429
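(HTTP 429 is "Too Many Requests", i.e. the server is rate-limiting the client rather than the file being broken, which matches the suspicion about access limitations. A 429 response may carry a Retry-After header saying how long to wait; a small sketch of reading it when making raw HTTP requests yourself — the header handling is standard HTTP, nothing CDSE-specific, and the function name is made up for illustration:)

```python
def retry_after_seconds(headers, default=60.0):
    """Return how long to wait after an HTTP 429, from the Retry-After header.

    Retry-After may be a number of seconds (the HTTP-date form is not handled
    in this sketch); fall back to a default when the header is absent or odd.
    """
    value = headers.get("Retry-After")
    if value is None:
        return default
    try:
        return float(value)
    except ValueError:
        return default
```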

Could you please confirm that this date is not working for you as well?

Thank you so much for your help!