Add Limits for pre-allocation resource control #58
Open
lilith wants to merge 1 commit into pedrocr/rawloader:master
Conversation
Add a `Limits` struct with three optional fields:

- `max_pixels`: width × height limit
- `max_side`: per-axis limit
- `max_input_bytes`: input buffer size limit

All fields are `Option<u64>`. Unset fields impose no limit and add zero overhead: when no dimension limit is set, `decode_with_limits` skips the metadata probe entirely and has the same cost as `decode()`. When a dimension limit IS set, `decode_with_limits` uses the existing `dummy=true` path to run a metadata-only parse, checks dimensions against the limits, and only allocates the pixel buffer if the file fits. Rejected files never allocate.

No decoder code, macros, or signatures are touched. The internal `alloc_image!` 500M/50k panic still acts as a backstop for the existing `decode()` API.

Public API additions:

- `rawloader::Limits`
- `rawloader::decode_with_limits(reader, &limits)`
- `rawloader::decode_file_with_limits(path, &limits)`
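The shape described above can be sketched as a standalone struct. The type and field names (`Limits`, `max_pixels`, `max_side`, `max_input_bytes`) come from the PR; the `dims_ok` and `input_ok` helpers here are hypothetical illustrations of the checks, not the crate's actual methods.

```rust
/// Sketch of the Limits struct from the PR. All fields default to None,
/// meaning "no limit".
#[derive(Default)]
struct Limits {
    max_pixels: Option<u64>,
    max_side: Option<u64>,
    max_input_bytes: Option<u64>,
}

impl Limits {
    /// Hypothetical helper: true when probed dimensions fit the limits.
    /// Unset fields (None) impose no constraint.
    fn dims_ok(&self, width: u64, height: u64) -> bool {
        if let Some(max) = self.max_pixels {
            if width.saturating_mul(height) > max {
                return false;
            }
        }
        if let Some(max) = self.max_side {
            if width > max || height > max {
                return false;
            }
        }
        true
    }

    /// Hypothetical helper: the cheap input-size check.
    fn input_ok(&self, len: u64) -> bool {
        self.max_input_bytes.map_or(true, |max| len <= max)
    }
}

fn main() {
    // Mirrors the internal 500M-pixel / 50k-per-side backstop values.
    let limits = Limits {
        max_pixels: Some(500_000_000),
        max_side: Some(50_000),
        ..Default::default()
    };
    assert!(limits.dims_ok(8_000, 6_000)); // 48 MP: fits
    assert!(!limits.dims_ok(60_000, 10)); // one side exceeds max_side
    assert!(Limits::default().dims_ok(1u64 << 40, 1u64 << 40)); // no limits set
    assert!(limits.input_ok(1_000)); // no input-size limit set
}
```

The all-`Option` design is what makes the zero-overhead claim hold: a default `Limits` never triggers the probe path.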
Summary
Adds caller-configurable resource limits without touching any decoder code. Every field is optional — unset limits add zero overhead.
How it works
When `limits` has any dimension limit set, `decode_with_limits` reuses the existing `dummy=true` path: it parses all the metadata, allocates only a 1-element placeholder vec, and returns a `RawImage` with `width`/`height` populated. This is the same code that fuzzing uses via `decode_dummy`.

The flow:

1. `decode_unsafe(buffer, true)`: full metadata parse, no pixel allocation
2. Check the probed dimensions against the limits
3. `decode_unsafe(buffer, false)` only if the dimensions fit

Rejected files never allocate the pixel buffer. Metadata parsing happens twice, but it is cheap (microseconds) compared to the actual decode.
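The two-phase flow above can be sketched as follows. This is a hypothetical, self-contained stand-in: the real `decode_unsafe` lives inside rawloader's `RawLoader`, and the hard-coded dimensions here simulate what a metadata parse would return.

```rust
/// Minimal stand-in for rawloader's RawImage.
struct RawImage {
    width: u64,
    height: u64,
    data: Vec<u16>,
}

/// Stub of the internal decode path. dummy=true parses metadata only and
/// allocates a 1-element placeholder buffer instead of the pixel data.
fn decode_unsafe(_buffer: &[u8], dummy: bool) -> RawImage {
    let (width, height) = (4_000u64, 3_000u64); // would come from parsed metadata
    let data = if dummy {
        vec![0u16; 1] // placeholder: no real pixel allocation
    } else {
        vec![0u16; (width * height) as usize] // full pixel buffer
    };
    RawImage { width, height, data }
}

/// Sketch of the two-phase flow, reduced to a single max_pixels limit.
fn decode_with_limits(buffer: &[u8], max_pixels: Option<u64>) -> Result<RawImage, String> {
    if let Some(max) = max_pixels {
        // Phase 1: metadata-only probe; the pixel buffer never exists.
        let probe = decode_unsafe(buffer, true);
        if probe.width.saturating_mul(probe.height) > max {
            return Err(format!("{}x{} exceeds pixel limit", probe.width, probe.height));
        }
    }
    // Phase 2: full decode, reached only when the dimensions fit
    // (or when no limit was set, in which case the probe is skipped).
    Ok(decode_unsafe(buffer, false))
}

fn main() {
    // A tight limit rejects before the pixel buffer is ever allocated.
    assert!(decode_with_limits(&[], Some(1_000_000)).is_err());
    // With no limit set, this behaves like a plain decode.
    let img = decode_with_limits(&[], None).unwrap();
    assert_eq!(img.data.len(), (img.width * img.height) as usize);
}
```

Note that the rejection path returns before the second `decode_unsafe` call, which is where the claim "rejected files never allocate the pixel buffer" comes from.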
When all dimension limits are `None`, the probe is skipped entirely and `decode_with_limits` has the same cost as `decode()`. Only `max_input_bytes` (a cheap size check) is applied in that case.

What this doesn't change

- `decode`, `decode_file`, `decode_dummy`, and `decode_unwrapped` are untouched
- No changes to `decode_threaded` or the packed decode functions
- The `alloc_image_plain!` 500M/50k panic still acts as a backstop for code paths that don't go through `decode_with_limits`

Public API additions

- `rawloader::Limits`
- `rawloader::decode_with_limits(reader, &limits)`
- `rawloader::decode_file_with_limits(path, &limits)`
Diff
140 insertions / 1 deletion across 2 files (`src/decoders/mod.rs`, `src/lib.rs`). The 1 deletion is an unrelated single-line clippy fix (`decoders::RawLoader::new()` → `RawLoader::new()`) that was preventing a clean compile on master.