⚡️ Speed up method PromptManager.list_prompts by 84% #1
Open
codeflash-ai[bot] wants to merge 1 commit into Saga4/python-sdk:main from Saga4/python-sdk:codeflash/optimize-PromptManager.list_prompts-ma2wg3kl
Conversation
📄 84% (0.84x) speedup for `PromptManager.list_prompts` in `src/mcp/server/fastmcp/prompts/prompt_manager.py`

⏱️ Runtime: 4.09 microseconds → 2.23 microseconds (best of 21 runs)

📝 Explanation and details
Here is an optimized version of your code.

Since your only function (`list_prompts`) just calls `list(self._prompts.values())`, and the profiling shows this is the main runtime, you can avoid creating a new list by returning a view (if the caller never mutates or sorts the result) or a tuple (with `tuple(self._prompts.values())`), which is slightly faster and uses less memory than a list. If you return a tuple, document that the result is now a tuple; the type annotation only needs to change if the function's return type changes.

However, if you must keep the return type as `list`, the current code is basically optimal.

To squeeze out a little more, you can return a statically cached empty list (avoiding a list allocation) when there are no items, which gives a minor speed boost in the empty case.

Here's the refactored code.
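(The diff itself isn't reproduced in this extract. The following is a minimal sketch of the approach described above — an empty-dict check returning a shared cached list, plus the existing `list(...)` conversion. The simplified `Prompt` dataclass and the `add_prompt` helper are illustrative stand-ins, not the SDK's actual definitions.)

```python
from dataclasses import dataclass


@dataclass
class Prompt:
    """Illustrative stand-in for the SDK's Prompt class (real fields not shown)."""
    name: str


class PromptManager:
    """Sketch of PromptManager with the optimization described above."""

    # Shared, statically allocated empty list: reused whenever there are no
    # prompts, so the empty case skips a list allocation entirely.
    # Callers must treat this empty result as read-only.
    _EMPTY: list[Prompt] = []

    def __init__(self) -> None:
        self._prompts: dict[str, Prompt] = {}

    def add_prompt(self, prompt: Prompt) -> Prompt:
        """Register a prompt under its name (illustrative helper)."""
        self._prompts[prompt.name] = prompt
        return prompt

    def list_prompts(self) -> list[Prompt]:
        """List all registered prompts."""
        if not self._prompts:
            return self._EMPTY  # avoid allocating a new empty list
        # dict.values() is a cheap view; list() only copies references.
        return list(self._prompts.values())
```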
**Explanation:**

- Checks for an empty dict to avoid allocating a new list in the common case of zero prompts.
- Retains the list conversion for caller compatibility.
- Keeps the fast native dict `values()` access.

If mutation of the returned list by the caller is never required and you control the call sites, you could:

- change the return type to `tuple[Prompt, ...]`, and
- `return tuple(self._prompts.values())`.

This is slightly faster and smaller in memory, but not backward compatible (sketched below).
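For completeness, that backward-incompatible tuple variant could look roughly like the following, reusing the `PromptManager` sketch above. This is a hypothetical illustration, not a change the PR actually makes:

```python
class TuplePromptManager(PromptManager):
    """Hypothetical variant that trades compatibility for a slightly cheaper return."""

    def list_prompts(self) -> tuple[Prompt, ...]:  # type: ignore[override]
        # A tuple is marginally faster to build and smaller in memory than a
        # list, but any caller that appends to or sorts the result will break.
        return tuple(self._prompts.values())
```

Because existing call sites may rely on receiving a mutable list, the list-returning version above is the one that keeps caller compatibility.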
No other meaningful optimizations can be made to this function given Python's built-in dict and memory model!
✅ Correctness verification report:
🌀 Generated Regression Tests Details
To edit these changes, run `git checkout codeflash/optimize-PromptManager.list_prompts-ma2wg3kl` and push.