Is there some maximum size/extent of the ROI when requesting Landsat-C2-L2 tiles from MPC?

I am using the sits package in R. My ROI covers roughly 5,500 sq km (72 x 77 km). The function sits_cube() completes, returning a tibble of available tiles, but when I try to run sits_cube_copy(), I get:

Error: ! .check_raster_cube_files: Invalid data cube - missing files

I assume that MPC is enforcing some quota, since my workflow completes successfully on smaller ROIs; I have acquired tiles from several other, smaller locations.
Are the MPC extent limitations documented somewhere?

Thanks

Replies: 4 comments · 2 replies

Could you share the exact code you are trying to run?


Hi @ghidalgo3
Here's the code snippet that causes the "Invalid data cube" error:

`source <- "MPC"
collection <- "LANDSAT-C2-L2"
bands <- c("GREEN", "SWIR16", "CLOUD")
from_date <- "2008-06-01"
to_date <- "2018-05-31"
aoi <- st_read("okavango_delta.gpkg", layer = "aoi_c")

LS_cube <- sits::sits_cube(
source = source,
collection = collection,
bands = bands,
roi = aoi,
start_date = from_date,
end_date = to_date
)
`
If the extent of the area of interest (aoi) is larger than about 75 km in width or height, the error occurs; if it is smaller, the download runs cleanly.
75 km / 30 meters per pixel = 2,500 pixels, so I assume 2,500 pixels is an enforced limit on download size. Is this correct?

The whole project is hosted at: this repo

Thanks, Micha

@rolfsimoes

Hi Micha,

Thank you for reporting this.

I tried to reproduce the error with a large area, but I could not: my process worked fine and downloaded an area larger than your 75 km limit.

My download took many hours, and SITS successfully renewed the MPC access tokens when needed.

This suggests the problem is most likely related to the version of SITS you are using; a newer version probably fixed the limit you found.

I suggest the following action plan for you:

  1. Update SITS
    The first thing to do is update your SITS package to the newest version from CRAN; this is the simplest way to fix the problem. Use this command in R:

    install.packages("sits")

    After you update, please try your original code again.

  2. Alternative strategy if the issue continues
    Reduce the total size of the request by asking for data year by year. This avoids any single large request that the MPC API might still reject. Make sure you copy all the data into the same local directory. Once all the annual data is copied to the local folder, SITS can open the full 10-year data cube from your local disk using the

    sits_cube(source = "MPC", ..., data_dir = "your_local_folder")

    command. See more details on local cubes in the SITS documentation: ?sits_cube
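The year-by-year strategy can be sketched as follows. This is a minimal sketch, not a definitive recipe: the source, collection, bands, and date range are taken from the original snippet, the `aoi` object is assumed to be loaded with sf::st_read() as above, and the output folder name and loop bounds are illustrative.

```r
# Sketch: download the 2008-2018 archive one year at a time,
# copying every annual chunk into the same local directory.
out_dir <- "./landsat_local"   # one shared folder for all years
dir.create(out_dir, showWarnings = FALSE)

for (year in 2008:2017) {
  cube_year <- sits::sits_cube(
    source     = "MPC",
    collection = "LANDSAT-C2-L2",
    bands      = c("GREEN", "SWIR16", "CLOUD"),
    roi        = aoi,
    start_date = paste0(year, "-06-01"),
    end_date   = paste0(year + 1, "-05-31")
  )
  sits::sits_cube_copy(cube_year, output_dir = out_dir)
}

# Once every year has been copied, open the full cube locally:
local_cube <- sits::sits_cube(
  source     = "MPC",
  collection = "LANDSAT-C2-L2",
  data_dir   = out_dir
)
```

Because every sits_cube_copy() call writes into the same output_dir, the final local cube spans the whole ten-year period even though no single MPC request did.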

Let us know if the update fixes the issue!

@ghidalgo3

I second partitioning the search into annual chunks and then merging the results client-side. I'm not familiar with this sits library, but I know that enumerating large numbers of STAC items through the STAC API is prone to failures if the service experiences extreme load during your enumeration.

If you need to enumerate all of the STAC items for Landsat over a period of time, I suggest you instead use the stac-geoparquet exported version of the collection to bypass the API altogether: https://planetarycomputer.microsoft.com/docs/quickstarts/stac-geoparquet/

This may not be easy to do in R; I understand if that's not an option.


Dear @ghidalgo3 @micha-silver

@rolfsimoes and I are part of the development team of the R sits package. As @rolfsimoes suggested in his reply, @micha-silver can break his request into small chunks, per year and per band, and copy each year-band combination to a local disk. Then he can use sits to create a regular data cube at a specified interval. In this way, he reduces the size of the individual requests made to MPC. This strategy works well and fits within the MPC restrictions.
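A hedged sketch of this per-year, per-band workflow, ending with a regular cube built from the local copies. The collection, bands, and date range come from the thread; `aoi` is assumed to be loaded as in the original snippet, and the folder names, period, and resolution passed to sits_regularize() are illustrative choices, not prescribed values.

```r
# Sketch: request one band for one year at a time, then build a
# regular data cube from the accumulated local files.
out_dir <- "./landsat_local"
dir.create(out_dir, showWarnings = FALSE)

for (year in 2008:2017) {
  for (band in c("GREEN", "SWIR16", "CLOUD")) {
    chunk <- sits::sits_cube(
      source     = "MPC",
      collection = "LANDSAT-C2-L2",
      bands      = band,
      roi        = aoi,
      start_date = paste0(year, "-06-01"),
      end_date   = paste0(year + 1, "-05-31")
    )
    sits::sits_cube_copy(chunk, output_dir = out_dir)
  }
}

# Open the local files as one cube, then regularize to a fixed interval
local_cube <- sits::sits_cube(
  source = "MPC", collection = "LANDSAT-C2-L2", data_dir = out_dir
)
regular_cube <- sits::sits_regularize(
  cube       = local_cube,
  period     = "P1M",   # monthly interval (illustrative)
  res        = 30,      # native Landsat resolution in meters
  output_dir = "./landsat_regular"
)
```

Splitting by band as well as by year keeps each individual MPC request small, at the cost of more (but more reliable) requests.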


Thanks @gilbertocamara . I will try shorter time ranges, with a large AOI, and report back.
