Add Limits for pre-allocation resource control #58

Open
lilith wants to merge 1 commit into pedrocr:master from lilith:limits

Conversation

@lilith lilith commented Apr 9, 2026

Summary

Adds caller-configurable resource limits without touching any decoder code. Every field is optional — unset limits add zero overhead.

use rawloader::Limits;

let limits = Limits {
    max_pixels: Some(50_000_000),
    ..Limits::default()
};
let image = rawloader::decode_file_with_limits("photo.cr2", &limits)?;

How it works

When limits has any dimension limit set, decode_with_limits reuses the existing dummy=true path: it parses all of the metadata, allocates only a 1-element placeholder vec, and returns a RawImage with width/height populated. This is the same code path that fuzzing already exercises via decode_dummy.

The flow:

  1. Probe: decode_unsafe(buffer, true) — full metadata parse, no pixel allocation
  2. Validate: check dimensions against limits
  3. Real decode: decode_unsafe(buffer, false) only if dimensions fit

Rejected files never allocate the pixel buffer. Metadata parsing happens twice but it's cheap (microseconds) compared to the actual decode.

When all dimension limits are None, the probe is skipped entirely and decode_with_limits has the same cost as decode(). Only max_input_bytes (a cheap size check) is applied in that case.
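The probe-validate-decode flow above can be sketched as follows. This is an illustrative sketch, not the actual rawloader internals: Limits, RawImage, and decode_unsafe are stubbed here so the control flow is runnable on its own, with decode_unsafe standing in for the real metadata/pixel decode.

```rust
// Hedged sketch of the probe -> validate -> decode flow.
// All types and functions here are illustrative stand-ins.

struct Limits {
    max_pixels: Option<u64>,
    max_side: Option<u64>,
    max_input_bytes: Option<u64>,
}

struct RawImage {
    width: usize,
    height: usize,
    data: Vec<u16>,
}

// Stand-in for the real decoder: dummy=true parses metadata only and
// allocates a 1-element placeholder vec instead of the pixel buffer.
fn decode_unsafe(_buffer: &[u8], dummy: bool) -> RawImage {
    let (width, height) = (6000, 4000); // pretend these came from metadata
    let data = if dummy { vec![0] } else { vec![0; width * height] };
    RawImage { width, height, data }
}

fn dims_fit(img: &RawImage, limits: &Limits) -> bool {
    let (w, h) = (img.width as u64, img.height as u64);
    if let Some(max) = limits.max_pixels {
        if w * h > max { return false; }
    }
    if let Some(max) = limits.max_side {
        if w > max || h > max { return false; }
    }
    true
}

fn decode_with_limits(buffer: &[u8], limits: &Limits) -> Result<RawImage, String> {
    // max_input_bytes is a cheap size check, applied unconditionally.
    if let Some(max) = limits.max_input_bytes {
        if buffer.len() as u64 > max {
            return Err("input larger than max_input_bytes".into());
        }
    }
    // The probe only runs when a dimension limit is actually set.
    if limits.max_pixels.is_some() || limits.max_side.is_some() {
        let probe = decode_unsafe(buffer, true); // 1. metadata-only parse
        if !dims_fit(&probe, limits) {           // 2. validate dimensions
            return Err("dimension limit exceeded".into());
        }
    }
    Ok(decode_unsafe(buffer, false))             // 3. real decode
}
```

Note how a rejected file returns at step 2 before the pixel buffer ever exists, and how the all-None case falls straight through to the real decode.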

What this doesn't change

  • Existing decode, decode_file, decode_dummy, decode_unwrapped are untouched
  • No decoder code, no macros, no decode_threaded, no packed functions
  • The internal alloc_image_plain! 500M/50k panic still acts as a backstop for code paths that don't go through decode_with_limits

Public API additions

pub struct Limits {
    pub max_pixels: Option<u64>,
    pub max_side: Option<u64>,
    pub max_input_bytes: Option<u64>,
}

pub fn decode_with_limits(reader: &mut dyn Read, limits: &Limits) -> Result<RawImage, RawLoaderError>;
pub fn decode_file_with_limits<P: AsRef<Path>>(path: P, limits: &Limits) -> Result<RawImage, RawLoaderError>;
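Because every field is an Option, Limits::default() (all None) combined with struct update syntax lets callers set only the limits they care about. A minimal self-contained illustration (the struct is re-declared here for the sketch; in real use it comes from rawloader, and the specific limit values are arbitrary examples):

```rust
#[derive(Default, Debug, Clone, PartialEq)]
struct Limits {
    max_pixels: Option<u64>,
    max_side: Option<u64>,
    max_input_bytes: Option<u64>,
}

// Set only max_side and max_input_bytes; max_pixels stays None (unlimited).
fn example_limits() -> Limits {
    Limits {
        max_side: Some(30_000),                   // per-axis cap
        max_input_bytes: Some(512 * 1024 * 1024), // 512 MiB input cap
        ..Limits::default()
    }
}
```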

Diff

140 insertions / 1 deletion across 2 files (src/decoders/mod.rs, src/lib.rs).

The 1 deletion is an unrelated single-line clippy fix (decoders::RawLoader::new() -> RawLoader::new()) that was preventing a clean compile on master.

Add a Limits struct with three optional fields:
- max_pixels: width × height limit
- max_side: per-axis limit
- max_input_bytes: input buffer size limit

All fields are Option<u64>. Unset fields impose no limit and add zero
overhead — when no dimension limit is set, decode_with_limits skips the
metadata probe entirely and has the same cost as decode().

When a dimension limit IS set, decode_with_limits uses the existing
dummy=true path to run a metadata-only parse, checks dimensions against
the limits, and only allocates the pixel buffer if the file fits.
Rejected files never allocate.

No decoder code, macros, or signatures are touched. The internal
alloc_image! 500M/50k panic still acts as a backstop for the existing
decode() API.

Public API additions:
- rawloader::Limits
- rawloader::decode_with_limits(reader, &limits)
- rawloader::decode_file_with_limits(path, &limits)