feat(util): Add bounded LRU cache utility#9141

Open
sauerdaniel wants to merge 1 commit into anomalyco:dev from sauerdaniel:pr/lru-cache-bounds

Conversation

@sauerdaniel
Contributor

@sauerdaniel sauerdaniel commented Jan 17, 2026

Summary

Add a bounded LRU (Least Recently Used) cache utility with eviction callbacks to prevent unbounded memory growth.

Fixes #9140

Problem

Several places in the codebase use unbounded Map objects for caching:

  • Instance cache in project/instance.ts
  • Provider SDK cache in provider/provider.ts

These can grow without limit in long-running processes or when handling many directories/providers.

Solution

Add a reusable createLruCache utility that:

  • Limits cache size with maxEntries option
  • Evicts least-recently-used entries when full
  • Provides onEvict callback for cleanup logic
  • Maintains Map-like interface for easy adoption

Changes

  • packages/opencode/src/util/cache.ts - New LRU cache utility with:
    • maxEntries limit (default: Infinity for backward compatibility)
    • onEvict callback for disposal logic
    • LRU tracking via lastAccess timestamp
    • Iterator support for for...of loops
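
The described API could be sketched as follows. This is a hypothetical illustration, not the actual patch: the names createLruCache, maxEntries, and onEvict come from the PR description, but where the PR tracks recency via a lastAccess timestamp, this sketch leans on Map insertion order for brevity.

```typescript
interface LruCacheOptions<K, V> {
  maxEntries?: number
  onEvict?: (key: K, value: V) => void
}

// Hypothetical sketch of the createLruCache API described above.
function createLruCache<K, V>(options: LruCacheOptions<K, V> = {}) {
  const { maxEntries = Infinity, onEvict } = options
  // A Map iterates in insertion order, so re-inserting an entry on
  // each access keeps the least-recently-used key at the front.
  const map = new Map<K, V>()
  return {
    get(key: K): V | undefined {
      if (!map.has(key)) return undefined
      const value = map.get(key)!
      map.delete(key)
      map.set(key, value) // mark as most recently used
      return value
    },
    set(key: K, value: V): void {
      if (map.has(key)) map.delete(key)
      map.set(key, value)
      while (map.size > maxEntries) {
        // Evict the least-recently-used entry and notify the caller.
        const oldest = map.keys().next().value as K
        const evicted = map.get(oldest)!
        map.delete(oldest)
        onEvict?.(oldest, evicted)
      }
    },
    has: (key: K) => map.has(key),
    delete: (key: K) => map.delete(key),
    clear: () => map.clear(),
    get size() {
      return map.size
    },
    [Symbol.iterator]: () => map[Symbol.iterator](),
  }
}
```

With maxEntries defaulting to Infinity, a drop-in replacement for an unbounded Map behaves identically until a bound is configured, which matches the backward-compatibility note above.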

Testing

  • TypeScript compilation passes (bun turbo typecheck)
  • Unit tests pass (725 tests, 0 failures)
  • Cache utility has 36% line coverage from existing tests

Note: Manual memory testing (monitoring heap growth over time) was not performed.

@github-actions
Contributor

The following comment was made by an LLM; it may be inaccurate:

Based on my search results, the current PR (#9141) itself appears as expected. The other results (#1163 on TUI performance, #8535 on pagination, #7036 on tool caching) are less directly related to the LRU cache utility feature.

@sauerdaniel
Contributor Author

Rebased onto latest dev (clean rebase, no conflicts).

All typechecks pass. Ready for review.

@sauerdaniel sauerdaniel force-pushed the pr/lru-cache-bounds branch 5 times, most recently from 261cf3f to b598ff9 Compare February 16, 2026 02:44
@sauerdaniel sauerdaniel force-pushed the pr/lru-cache-bounds branch 2 times, most recently from 4f5bb89 to 59c6150 Compare February 20, 2026 13:35
Add createLruCache() function with:
- Configurable maxEntries limit to prevent unbounded growth
- Optional onEvict callback for cleanup when entries are removed
- Full Map-like interface (get/set/has/delete/clear/size)
- Iterator support for iterating over entries

Development

Successfully merging this pull request may close these issues.

Memory leak: Unbounded caches grow without limit
