
I got a GitHub mail,

Your total cache usage exceeds upcoming caching limits

later it says

You are receiving your email because your consumption exceeds 10GB within a 24 hour period

I cannot think of an action in any repo that would use so much cache.
Why can't GitHub tell me which repo it is? It's not possible to go through all the repos I have across all organizations; there are around 100.

How can I find out which repo uses the most cache?




This comment was marked as off-topic.

@kepstin

@RoshanLimbu123 this looks like an AI slop response? You can't just provide instructions on how to check each repo manually then say this is how to do it "without checking each one manually." Also, the link isn't real.

@a4z

yes, this is the same answer I got when I asked ChatGPT :-)

@queenofcorgis

@RoshanLimbu123:
We’ve clarified our stance on using generative AI tools like ChatGPT within our Community via this announcement. Please review the guidelines to ensure your post meets them as failure to adhere to those rules can result in action taken by our moderator team.  You can read our updated Code of Conduct and the announcement for more details. Thank you for helping us maintain an authentic and beneficial space for everyone.

Comment options

Check cache usage in individual repositories

GitHub recently added a way to inspect caches:

Go to a repo → Settings → Actions → Caches.

There you’ll see all cache entries, their sizes, and when they were last used.

Sort by size to quickly see the largest offenders.

@kepstin

This does not answer the question. How do you find out which repo the excessive cache use is in without individually checking each one?

@a4z

thanks for trying to respond; unfortunately, this answer is as unhelpful as the previous AI-generated one

Comment options

You can check this with the GitHub CLI (replace rdtechie, which is my account, with your own):

gh repo list rdtechie --json nameWithOwner -q '.[].nameWithOwner' | xargs -n1 gh cache list -R

It will cycle through your repositories and show the caches in each one (press q to advance to the next repository).
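
With many repositories, the interactive pager gets tedious. As a rough sketch (assuming `gh` is installed and authenticated; `OWNER` is a placeholder, and `bytes_to_gb` / `cache_totals` are hypothetical helper names, not gh built-ins), you can sum each repo's cache size via the REST API and rank them non-interactively:

```shell
# bytes_to_gb: convert a byte count to GB with two decimals (pure helper).
bytes_to_gb() {
  awk -v b="$1" 'BEGIN { printf "%.2f", b / (1024 * 1024 * 1024) }'
}

# cache_totals: print "total_gb <TAB> repo" for every repo of OWNER,
# largest first. Only reads the first page of caches per repo.
cache_totals() {
  gh repo list "$1" --limit 1000 --json nameWithOwner -q '.[].nameWithOwner' |
  while read -r repo; do
    total=$(gh api "/repos/$repo/actions/caches" \
      -q '[.actions_caches[].size_in_bytes] | add // 0')
    printf '%s\t%s\n' "$(bytes_to_gb "$total")" "$repo"
  done | sort -rn
}

# Usage: cache_totals OWNER
```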

@a4z

Yes that works, thanks a lot!

@kepstin

Hmm. This didn't return any results on any of my personal repositories, so I had to separately check each organization that I'm a member of. Certainly an improvement over needing to check each repo individually, but it would still be great if Github could provide this information directly in the email instead…

Edit: Turns out that the cause of the alert for me was that I had a repo using a workaround for the missing support for updating an existing cache which worked by uploading a new cache with a timestamp in the cache key on each build (and using a restore key that matches the most recently created cache). I was relying on GitHub automatically expiring caches, but a fair number of them have built up. Guess I might have to do some gh cli stuff in my build job to manually remove old caches…

Answer selected by a4z
Comment options

  1. Use GitHub CLI to List Caches per Repo

GitHub’s cache system is per repository, and you can query it with the GitHub CLI (gh).
If you have gh installed, run this in your terminal:

# Replace ORG with your organization name.
# This loops through all repos in the org and lists cache sizes.
gh repo list ORG --limit 1000 --json nameWithOwner \
  --jq '.[].nameWithOwner' | while read -r repo; do
    echo "🔍 Checking $repo..."
    gh api "/repos/$repo/actions/caches" \
      --jq '.actions_caches[] | {repo: "'"$repo"'", id: .id, size_in_gb: (.size_in_bytes/1024/1024/1024), key: .key}'
done

This will:

Go through every repo in your org

Fetch its Actions caches

Show each cache’s key and its size in GB

You can then sort by size and instantly spot the repo eating 10GB+ per day.
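
To actually sort by size, emit tab-separated rows and pipe them through `sort`. A sketch, assuming an authenticated `gh` and `ORG` as a placeholder; `rank_caches` and `list_org_caches` are made-up helper names:

```shell
# rank_caches: sort "size<TAB>repo<TAB>key" rows numerically, biggest first,
# keeping the top N (default 20). Pure text filter, no network access.
rank_caches() {
  sort -rn | head -n "${1:-20}"
}

# list_org_caches: one row per cache across every repo in ORG.
list_org_caches() {
  gh repo list "$1" --limit 1000 --json nameWithOwner -q '.[].nameWithOwner' |
  while read -r repo; do
    gh api "/repos/$repo/actions/caches" \
      -q '.actions_caches[] | [(.size_in_bytes|tostring), "'"$repo"'", .key] | @tsv'
  done
}

# Usage: list_org_caches ORG | rank_caches 10
```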

  2. Manual Way (If You Don’t Want CLI)

You can also manually check caches in the GitHub web UI (but not scalable for 100 repos):

Go to Repo → Settings → Actions → Caches

You’ll see cache keys, last accessed date, and sizes.

But with 100+ repos, CLI is 100x faster.

  3. Bonus: Automate Deletion of Huge Caches

Once you find the culprit repo, you can remove unnecessary caches from the CLI too:

Delete all caches for a repo:

gh cache delete --all -R ORG/REPO

Or delete by cache key if you only want to clear specific ones (the REST endpoint requires the key as a query parameter):

gh api --method DELETE "/repos/ORG/REPO/actions/caches?key=your-cache-key"
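
If you would rather prune only stale entries than delete everything, the per-cache endpoint (DELETE /repos/OWNER/REPO/actions/caches/{cache_id}) can be driven from a loop. A sketch, assuming GNU `date` and an authenticated `gh`; `iso_days_ago` and `delete_stale_caches` are hypothetical names:

```shell
# iso_days_ago: ISO-8601 UTC timestamp N days in the past (GNU date).
iso_days_ago() {
  date -u -d "$1 days ago" +%Y-%m-%dT%H:%M:%SZ
}

# delete_stale_caches: delete caches in OWNER/REPO not accessed in N days.
# ISO-8601 timestamps compare correctly as plain strings.
delete_stale_caches() {
  repo=$1; days=$2
  cutoff=$(iso_days_ago "$days")
  gh api "/repos/$repo/actions/caches" --paginate \
    -q ".actions_caches[] | select(.last_accessed_at < \"$cutoff\") | .id" |
  while read -r id; do
    echo "deleting cache $id from $repo"
    gh api --method DELETE "/repos/$repo/actions/caches/$id"
  done
}

# Usage: delete_stale_caches ORG/REPO 7
```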

  4. Prevention

You might also want to check your workflows:

Are you caching node_modules, target, vendor, or other huge folders unnecessarily?

Maybe you’re using a cache key that changes on every run (leading to many unique caches).
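
If a unique key per run is intentional (for example, to refresh the cache on every build), pair it with restore-keys so old entries at least get reused before they expire. A minimal actions/cache sketch; step name and key prefix are illustrative, not from the original thread:

```yaml
- name: Cache cargo build artifacts
  uses: actions/cache@v4
  with:
    path: target
    # A key containing the run id is unique on every build,
    # so each run uploads a new cache entry...
    key: cargo-${{ runner.os }}-${{ github.run_id }}
    # ...while restore-keys falls back to the most recent prefix match.
    restore-keys: |
      cargo-${{ runner.os }}-
```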

@a4z

Thanks a lot for the extended explanation!
Found the repo; it was in an org of a company where I am a member (maybe even an admin).

It's not node_modules, it's Rust dependencies, terrible stuff, size-wise

Comment options

I'm still stuck on solving this one. I have gone through every single org listed on my Organizations page and none of them has a repo with a cache anywhere near 10GB in size.

What can I do next?

Comment options

If you have a GitHub Enterprise org, you can find the cache overview by size at https://github.com/organizations/orgname/settings/actions/caches

@pcolmer

Thank you - I found it that way. The gh command did NOT reveal that information.

Comment options

The mail was received here as well, and it's suspicious to say the least. Within our org with multiple admins, only one received the mail, and the links in the mail go to app.github.media.

That, along with the urgency, the typos, and no communication from GitHub before this: it's probably phishing. Ignore.

@a4z

No, that's a legitimate GitHub mail, with DKIM and SPF, proper return addresses, and no wrong links.

@hmeine

It also raised flags with me – app.github.media not being a known domain to me, I refrained from clicking any links therein. There's also zero personalisation in the mail; that the repo / org in question is missing is the main issue discussed here, but the greeting is also just "Hello", so the mail basically screams "ignore/delete me!"

@a4z

that is the answer I got from AI when I threw in the mail header


Thanks for sharing the full header — that helps.

Quick analysis:

Return-Path: bounce@resources.github.com → legit GitHub mail infrastructure.

SPF: pass for resources.github.com.

DKIM: pass for github.com and also for us.en25.com.

DMARC: pass for github.com.

From: GitHub no-reply@github.com matches DKIM.

ARC/Google checks: also pass.

👉 So the mail really comes from GitHub (or their mailing vendor Oracle Eloqua, en25.com).
The weird part is only the unsubscribe link pointing to app.github.media. That’s not a standard GitHub domain, but it seems to be part of their email marketing setup (very likely a branded link-tracking / unsubscribe domain operated by Eloqua).

So:

The header itself is clean, no spoofing.

The app.github.media links are probably safe redirectors GitHub uses for tracking and unsubscribes.

Still: if you don’t trust it, don’t click directly — instead, log in to GitHub → Settings → Notifications and manage subscriptions there.


anyhow, strange mail, maybe their AI went crazy :-)

Comment options

Is there any information about billing for repos which exceed 10GB?

Ideally I would like the behavior to act as before: just auto-delete any cache once usage exceeds the 10 GB limit. From my understanding this will no longer be possible after October 15th. Is anyone interpreting it differently?

@CodeCasterNL

See my reply above; the mail is weird. The documentation still mentions:

GitHub will remove any cache entries that have not been accessed in over 7 days. There is no limit on the number of caches you can store, but the total size of all caches in a repository is limited to 10 GB. Once a repository has reached its maximum cache storage, the cache eviction policy will create space by deleting the caches in order of last access date, from oldest to most recent.

If you exceed the limit, GitHub will save the new cache but will begin evicting caches until the total size is less than the repository limit. The cache eviction process may cause cache thrashing, where caches are created and deleted at a high frequency. To reduce this, you can review the caches for a repository and take corrective steps, such as removing caching from specific workflows. See Managing caches.

@audunsolemdal

I agree that the docs seemingly not being updated together with no announcements is strange. I still don't believe this to be phishing as the sender email address appears legit and I have received emails pointing to app.github.media in the past.

I notice that most no-reply emails from GitHub have the sender noreply@github.com, while this email has no-reply@github.com, though.

@kepstin

Reading through everything again, I'm wondering if there is actually any action required by this email? It almost seems like a poorly worded sales pitch.

They already automatically evict stuff from cache when the size exceeds 10 GB (although it's possible they'll be more aggressive about that going forwards?), and the size of cache available for free is not changing. The only new information provided in this email is the fact that you will be able to pay to increase the size of cache on a per-repository basis.

As long as they don't automatically extend your cache size and start billing you, it should be possible to ignore the email and everything will continue working the same as it already does.

Comment options

You can do this with the GitHub CLI. For example:
gh repo list --json nameWithOwner -q '.[].nameWithOwner' | xargs -n1 gh cache list -R

Here’s what it does:
gh repo list fetches all of your repositories.
--json nameWithOwner -q '.[].nameWithOwner' extracts the full repo names.
xargs -n1 gh cache list -R runs gh cache list for each repository.

Comment options

My question here is: how can our organization increase the cache size? Is there some sort of setting for this? So far I have only found documentation on listing and deleting caches, but no mention of how to actually increase the cache size, whether to go above 10 GB or to keep it there and avoid additional billing.

Any help?

Labels: General · Question