LogEventInfo - Changed LayoutCache from dictionary to linked list #6025
base: dev
Conversation
Note: Other AI code review bot(s) detected. CodeRabbit has detected other AI code review bot(s) in this pull request and will avoid duplicating their findings in the review comments. This may lead to a less comprehensive review.

Walkthrough

Refactors LogEventInfo's internal layout caching from a locked Dictionary to a lock-free singly linked list using Interlocked.CompareExchange. Adds a private LayoutCacheLinkedNode type and updates the Add and TryGet cache paths to use list insertion and traversal. Public API and exception behavior are unchanged.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    autonumber
    participant Caller
    participant LogEventInfo
    participant Layout
    participant Cache as LayoutCache (internal)
    Caller->>LogEventInfo: TryGetCachedLayoutValue(layout)
    LogEventInfo->>Cache: Traverse linked nodes
    alt Match found
        Cache-->>LogEventInfo: Return cached value
        LogEventInfo-->>Caller: Value
    else Not found
        LogEventInfo-->>Caller: Miss
    end
```

```mermaid
sequenceDiagram
    autonumber
    participant Caller
    participant LogEventInfo
    participant Interlocked
    participant Cache as LayoutCache (internal)
    Caller->>LogEventInfo: AddCachedLayoutValue(layout, value)
    LogEventInfo->>Cache: Read current head
    loop Retry until success
        LogEventInfo->>Interlocked: CompareExchange(head, newNode{Next=head}, head)
        alt CAS success
            Interlocked-->>LogEventInfo: Swapped
            LogEventInfo-->>Caller: Done
        else CAS failed
            Interlocked-->>LogEventInfo: No swap
            LogEventInfo->>Cache: Reload head
        end
    end
```
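To make the two flows above concrete, here is a minimal, self-contained C# sketch of the same scheme: lookup walks the singly linked list using reference equality on the key, and insertion prepends a node via an Interlocked.CompareExchange retry loop. The class and member names are illustrative only and are not NLog's actual internals.

```csharp
using System.Threading;

// Minimal sketch of a lock-free, prepend-only layout-value cache (illustrative, not NLog code).
internal sealed class LayoutValueCacheSketch<TKey, TValue> where TKey : class
{
    private sealed class Node
    {
        public readonly TKey Key;
        public readonly TValue Value;
        public Node? Next;
        public Node(TKey key, TValue value) { Key = key; Value = value; }
    }

    private Node? _head;

    public bool TryGet(TKey key, out TValue? value)
    {
        // Walk the list from the current head; keys are matched by reference.
        // (The review comments below discuss using Volatile.Read here for stricter visibility.)
        for (var node = _head; node != null; node = node.Next)
        {
            if (ReferenceEquals(node.Key, key))
            {
                value = node.Value;
                return true;
            }
        }
        value = default;
        return false;
    }

    public void Add(TKey key, TValue value)
    {
        var newNode = new Node(key, value);
        var head = _head;
        while (true)
        {
            newNode.Next = head;
            // Publish the node only if the head has not changed in the meantime.
            var seen = Interlocked.CompareExchange(ref _head, newNode, head);
            if (ReferenceEquals(seen, head))
                return;     // CAS succeeded
            head = seen;    // another writer won the race; retry against the new head
        }
    }
}
```

A failed CompareExchange only means another thread prepended first, so the loop re-reads the head and retries; no locks are taken on either path.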
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~25 minutes
Pre-merge checks and finishing touches
❌ Failed checks (1 warning)
✅ Passed checks (2 passed)
Actionable comments posted: 0
🧹 Nitpick comments (2)
src/NLog/LogEventInfo.cs (2)
72-72: Strengthen memory visibility on lock-free cache reads (optional)

To guarantee readers always see fully published nodes across CPUs, consider volatile semantics for the head and volatile reads during traversal.
Option A (simplest): make the head volatile.
```diff
- private LayoutCacheLinkedNode? _layoutCache;
+ private volatile LayoutCacheLinkedNode? _layoutCache;
```

Option B: use Volatile.Read on head and next in TryGet.
```diff
- var nodeHead = _layoutCache;
+ var nodeHead = Volatile.Read(ref _layoutCache);
  while (nodeHead != null)
  {
      if (ReferenceEquals(nodeHead.Key, layout))
      {
          value = nodeHead.Value;
          return true;
      }
-     nodeHead = nodeHead.Next;
+     nodeHead = Volatile.Read(ref nodeHead.Next);
  }
```

Also applies to: 637-646
814-825: Invalidate layout cache when message/parameters change (avoid stale values)

If Message/Parameters/FormatProvider mutate after precalc, cached layout outputs can become stale. Consider clearing the cache in ResetFormattedMessage.
```diff
  private void ResetFormattedMessage(bool rebuildMessageTemplateParameters)
  {
      if (_messageFormatter is null || _messageFormatter.Target is ILogMessageFormatter)
      {
          _formattedMessage = null;
+         // Cached layout values may depend on message/parameters/format-provider
+         Volatile.Write(ref _layoutCache, null);
      }
```

Please confirm this matches the intended lifecycle (i.e., LogEventInfo is not mutated after layouts start precalculating). If mutation never happens post-precalc, this can be skipped.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
src/NLog/LogEventInfo.cs (3 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
src/NLog/LogEventInfo.cs (1)
src/NLog/Layouts/SimpleLayout.cs (1): Layout (57-487)
🔇 Additional comments (2)
src/NLog/LogEventInfo.cs (2)
625-633: CAS prepend loop looks correct

Lock-free insertion is correct; a failed CompareExchange simply retries against the new head. No correctness issues spotted.
841-852: Node type is minimal and appropriate

Immutable key/value with a single Next pointer fits the prepend pattern. LGTM.
Pull Request Overview
Replaces Dictionary-based layout caching with a linked list implementation to reduce memory allocations for small cache sizes. The change aims to optimize performance when caching 1-4 layout outputs by eliminating the upfront allocation overhead of Dictionary objects.
- Replaced `IDictionary<Layout, object?>` with a custom `LayoutCacheLinkedNode` linked list
- Implemented lock-free concurrent operations using `Interlocked.CompareExchange`
- Added a new `LayoutCacheLinkedNode` nested class for the linked list structure
The Dictionary needs to allocate 3 objects upfront, and it additionally allocates 2 larger arrays when inserting more than 3 items.

The linked list reduces allocations when only 1-4 layout outputs need to be cached, and lookups should still be fast with fewer than 10 cached layout outputs.

Since the implicit caching of layout output on Layout.Render() has been removed and caching now only happens on precalculate, the linked list should not grow aggressively with many duplicate entries.
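As a rough, hypothetical illustration of that allocation argument (not part of this PR), the sketch below compares bytes allocated when caching four entries in a Dictionary versus four prepended list nodes. It assumes GC.GetAllocatedBytesForCurrentThread(), available on .NET Core 3.0 and later, and the Node type here is an illustrative stand-in for LayoutCacheLinkedNode.

```csharp
using System;
using System.Collections.Generic;

// Illustrative allocation comparison only; actual numbers vary by runtime and platform.
internal static class AllocationSketch
{
    private sealed class Node
    {
        public readonly object Key;
        public readonly object? Value;
        public Node? Next;
        public Node(object key, object? value) { Key = key; Value = value; }
    }

    private static void Main()
    {
        // Cache four entries in a Dictionary (keys are allocated inside the measured region).
        long before = GC.GetAllocatedBytesForCurrentThread();
        var dict = new Dictionary<object, object?>();
        for (int i = 0; i < 4; i++)
            dict[new object()] = "value";
        long dictBytes = GC.GetAllocatedBytesForCurrentThread() - before;

        // Cache four entries as prepended linked-list nodes (keys allocated the same way).
        before = GC.GetAllocatedBytesForCurrentThread();
        Node? head = null;
        for (int i = 0; i < 4; i++)
            head = new Node(new object(), "value") { Next = head };
        long listBytes = GC.GetAllocatedBytesForCurrentThread() - before;

        Console.WriteLine($"Dictionary: ~{dictBytes} bytes, linked list: ~{listBytes} bytes");
    }
}
```

The intent is only to make the shape of the difference visible (dictionary object plus its internal arrays versus one small node per cached entry), not to benchmark NLog itself.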