Conversation

@WendellAdriel (Contributor) commented Nov 30, 2025

As mentioned in #57973, there are some issues with the concurrency handling in Http::batch and Http::pool that were not caught before.

Core Issue

The concurrency parameter wasn't being respected because all HTTP promises were created upfront before reaching Guzzle's EachPromise. This caused all requests to execute in parallel regardless of the concurrency limit.
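
For illustration, here is a rough sketch of that difference using plain Guzzle rather than the framework's internals (assumptions: guzzlehttp/guzzle is installed and httpbin.org is reachable; the variable names are made up):

```php
use GuzzleHttp\Client;
use GuzzleHttp\Promise\Each;

$client = new Client();
$urls = array_fill(0, 3, 'https://httpbin.org/delay/5');

// Eager (the buggy shape): every promise already exists before the
// concurrency limit is applied, so all three transfers are queued up front
// and progress in parallel anyway.
$eager = array_map(fn ($url) => $client->getAsync($url), $urls);
Each::ofLimit($eager, 1)->wait(); // still finishes in roughly 5 seconds

// Lazy (the fixed shape): a generator creates each promise only when the
// limiter asks for the next one, so at most one request is in flight.
$lazy = (function () use ($client, $urls) {
    foreach ($urls as $url) {
        yield $client->getAsync($url);
    }
})();
Each::ofLimit($lazy, 1)->wait(); // roughly 15 seconds
```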

Example of the problem:

Http::batch(fn ($batch) => [
    $batch->get('https://httpbin.org/delay/5'),
    $batch->get('https://httpbin.org/delay/5'),
    $batch->get('https://httpbin.org/delay/5'),
])->concurrency(1)->send();
// Expected: ~15 seconds | Actual: ~5 seconds ❌

Solution Approach

Instead of creating a separate DeferredRequest class (which had maintenance overhead), this PR extends PendingRequest itself with optional deferred execution capability:

  1. Added deferred mode to PendingRequest:

    • New $deferredRequests and $deferredKey properties track deferred state
    • New setDeferred() method enables deferred mode
    • HTTP verb methods (get, post, etc.) check for deferred mode
    • When deferred, methods return a DeferredPromise that stores closures
  2. Introduced DeferredPromise class:

    • Lightweight wrapper that stores the request closure in the pool/batch array
    • Supports promise chaining (.then(), .otherwise())
    • Captures transformations and applies them when the promise executes
    • No breaking changes to return types
  3. Updated Pool and Batch classes:

    • Create PendingRequest instances directly with deferred mode enabled
    • No separate DeferredRequest class needed
    • Generator-based promise creation yields promises on-demand (see the sketch below)
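
To make the generator point concrete, here is a minimal, self-contained sketch of the shape the sending logic takes with this change. It is illustrative only; names such as $deferredRequests are assumptions, not the framework's actual identifiers.

```php
use GuzzleHttp\Client;
use GuzzleHttp\Promise\EachPromise;

$client = new Client();
$concurrency = 1;

// Stand-in for the deferred requests collected by the pool/batch closure:
// each entry is a closure that only dispatches its request when invoked.
$deferredRequests = [
    'first' => fn () => $client->getAsync('https://httpbin.org/delay/5'),
    'second' => fn () => $client->getAsync('https://httpbin.org/delay/5'),
    'third' => fn () => $client->getAsync('https://httpbin.org/delay/5'),
];

// The generator yields promises on demand, so a request is only created
// (and therefore only started) when EachPromise has a free concurrency slot.
$generator = (function () use ($deferredRequests) {
    foreach ($deferredRequests as $key => $makePromise) {
        yield $key => $makePromise();
    }
})();

$results = [];

(new EachPromise($generator, [
    'concurrency' => $concurrency,
    'fulfilled' => function ($response, $key) use (&$results) {
        $results[$key] = $response;
    },
    'rejected' => function ($reason, $key) use (&$results) {
        $results[$key] = $reason;
    },
]))->promise()->wait();

// With $concurrency = 1 the three delayed requests run back to back
// (~15 seconds total) instead of all at once.
```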

Key Benefits

  • No breaking changes - Pool::as() and Batch::as() return PendingRequest (or DeferredPromise for HTTP verbs)
  • Full method chaining - withMiddleware(), withHeaders(), timeout(), etc. all work automatically
  • Promise chaining support - .then() and .otherwise() work on deferred requests
  • Low maintenance - Only 6 HTTP verb methods needed explicit deferred checks
  • Clean architecture - Deferred execution is an internal implementation detail

Example Usage

All existing code continues working, plus promise chaining now works:

// Method chaining works
Http::pool(fn ($pool) => [
    $pool->as('api')->withHeaders(['X-Token' => 'secret'])->get('https://api.example.com'),
    $pool->withTimeout(10)->post('https://api.example.com/data', ['key' => 'value']),
]);

// Promise chaining works
Http::pool(fn ($pool) => [
    $pool->get('https://api.example.com')->then(fn ($response) => $response->json()),
    $pool->get('https://api.example.com')->otherwise(fn ($e) => ['error' => $e->getMessage()]),
]);

// Concurrency properly controlled
Http::batch(fn ($batch) => [
    $batch->get('https://httpbin.org/delay/5'),
    $batch->get('https://httpbin.org/delay/5'),
    $batch->get('https://httpbin.org/delay/5'),
])->concurrency(1)->send(); // Now takes ~15 seconds ✅

@WendellAdriel (Contributor, Author)

@taylorotwell @cosmastech please take a look at this and let me know your thoughts

@WendellAdriel (Contributor, Author)

BTW, there are still some failing tests that I need to address; I just wanted to open the discussion so you can see what I found and what I was thinking of doing to fix it.

@cosmastech (Contributor)

> @taylorotwell @cosmastech please take a look at this and let me know your thoughts

I had tried this sort of approach before, but I think I switched because it didn't work with chained promises. Like $pool->get('laravel.com')->then(fn ($response) => $response->json())

Not sure if it works with multiple chained calls either (like setting the user agent and then calling get()).

@WendellAdriel (Contributor, Author)

> > @taylorotwell @cosmastech please take a look at this and let me know your thoughts
>
> I had tried this sort of approach before, but I think I switched because it didn't work with chained promises. Like $pool->get('laravel.com')->then(fn ($response) => $response->json())
>
> Not sure if it works with multiple chained calls either (like setting the user agent and then calling get()).

The tests that were failing were related to that, I think, like this one:

$responses = $this->factory->pool(fn (Pool $pool) => [
    $pool->withMiddleware($middleware)->post('https://example.com', ['hyped-for' => 'laravel-movie']),
]);

But I managed to fix it in my last commit by handling the method chaining issues in the DeferredRequest class. If you want to run some more tests with this approach to check, that would be great, @cosmastech!

@WendellAdriel (Contributor, Author)

@taylorotwell @cosmastech I'll be away from home for a while and won't be able to work on this any more today; however, if any other changes are needed, I can take a look at them tomorrow.

Please let me know your thoughts on the fix.

@jonnywilliamson (Contributor)

@cosmastech I was wondering if this is what you experienced as well?

Hi there! First off, thanks for all the work on this feature, it's been really useful!

I've recently been switching loads of my old gnarly Guzzle pool code over to the new Http::batch and Http::pool concurrency, and I ran into a specific behavior regarding promise chaining that I wanted to clarify.

  • Are chained promises supported in the requests used by the Batch / Pool methods?

Is it expected behavior that the Batch returns the result of the initial request only, ignoring the return value of any .then() chain attached to it?

I have created a reproduction test case below. It demonstrates that while the chained request (URL 2) does execute successfully (proved by capturing it in a side-effect collection), the $responses array returned by Http::batch contains the response from the first URL, effectively discarding the result of the chain.

Is this the intended design?

<?php

use Illuminate\Http\Client\Batch;
use Illuminate\Http\Client\Request;
use Illuminate\Support\Facades\Http;

it('ignores the promise chain and returns the result of the initial request in the batch', function () {
    Http::preventStrayRequests();

    Http::fake([
        'http://localhost/first'  => Http::response('You hit the first url', 200),
        'http://localhost/second' => Http::response('You hit the second url', 200),
    ]);

    // Capture internal chain results to prove execution
    $insideChainResults = collect();

    // 2. Execute Batch
    // We are creating a chain: Request 1 -> then -> Request 2
    // Intuitively, we might expect the result to be Request 2.
    // However, Laravel's Batch system looks at the request object created by ->as(),
    // NOT the promise returned by the closure array.
    $responses = Http::batch(fn (Batch $batch) => [
        $batch->as('request_1')
            ->get('http://localhost/first')
            ->then(function () use ($insideChainResults) {
                $response = Http::get('http://localhost/second');
                $insideChainResults->push($response->body());

                return $response;
            }),

        $batch->as('request_2')
            ->get('http://localhost/first')
            ->then(function () use ($insideChainResults) {
                $response = Http::get('http://localhost/second');
                $insideChainResults->push($response->body());

                return $response;
            }),

        $batch->as('request_3')
            ->get('http://localhost/first')
            ->then(function () use ($insideChainResults) {
                $response = Http::get('http://localhost/second');
                $insideChainResults->push($response->body());

                return $response;
            }),
    ])->send();

    // 3. Assertions

    // PROOF: The body matches the FIRST url, not the SECOND.
    // This proves the .then() chain was effectively ignored by the Batch results collector.
    expect($responses['request_1']->body())->toBe('You hit the first url');
    expect($responses['request_1']->body())->not->toBe('You hit the second url');

    expect($responses['request_2']->body())->toBe('You hit the first url');
    expect($responses['request_3']->body())->toBe('You hit the first url');

    // PROOF 2: The chain DID execute and fetch the second URL
    // We expect 6 total requests: 3 for 'first', 3 for 'second'
    Http::assertSentCount(6);

    // Verify specifically that the second URL was called 3 times
    $secondUrlCalls = Http::recorded(fn (Request $request) => $request->url() === 'http://localhost/second');
    expect($secondUrlCalls)->toHaveCount(3);

    // Verify the data was captured inside the chain
    // This proves the inner logic ran and received the correct data, even though the batch threw it away.
    expect($insideChainResults)->toHaveCount(3);
    expect($insideChainResults->every(fn ($body) => $body === 'You hit the second url'))->toBeTrue();
});

@jonnywilliamson (Contributor)

[Screenshot: CleanShot 2025-12-03 at 10 10 02]

Ah, you have to be kidding...

A PR exactly about this was just released into the framework 9 hours ago. Like..... I can't even... 😆

@shaedrich (Contributor)

@jonnywilliamson And then there's also #57973

@WendellAdriel (Contributor, Author)

@shaedrich @jonnywilliamson this PR was a PoC to show @cosmastech my point of view on how to fix this. Most probably the PR that will get merged is @cosmastech's, the one shared by @shaedrich.

We're waiting for feedback from Taylor on both before moving forward.

@cosmastech (Contributor)

@jonnywilliamson

> A PR exactly about this was just released into the framework 9 hours ago. Like..... I can't even... 😆

😃 I came across that when I was working on a blog post about using Promises. Hope the change helps you in untangling those pooled Guzzle requests.

@WendellAdriel (Contributor, Author)

@cosmastech I was just playing around with an alternative solution to avoid the breaking changes you mentioned. I ended up with an approach similar to yours, with some differences, but the result is the same. It was mostly a way for me to check how I could avoid introducing that breaking change xD

I see there are some failing tests; however, they don't seem related to this implementation.

@jonnywilliamson (Contributor) commented Dec 3, 2025

> 😃 I came across that when I was working on a blog post about using Promises. Hope the change helps you in untangling those pooled Guzzle requests.

This is excellent, and I'll need to read it a couple of times to digest it, but just this morning, having batch/pool able to deal with chained promises has already made things super fast.

Once this concurrency issue gets fixed, it's going to be wonderful.

I also agree with the definition I saw in the other thread: with, say, 10 requests and a concurrency of 5, the 6th request should be allowed to start as soon as ANY of the first 5 finishes, not after all 5 have finished.

Really appreciate the work you're putting into this area. It's a massive part of my app, and one where I had to do unspeakable things with the code editor to make it work with raw Guzzle promises. I'm not proud of myself.... 😆

@taylorotwell (Member)

So, what's the status of things here? @WendellAdriel @cosmastech

@cosmastech (Contributor) commented Dec 4, 2025

> So, what's the status of things here? @WendellAdriel @cosmastech

@taylorotwell

I'm partial to my implementation because who doesn't love their own children? But they both achieve the same ends.

They operate the way you had described before. For example, you have 6 requests and a concurrency of 3. If request 1 is slow but 2 and 3 are fast, it will immediately progress to requests 4 and 5 while it waits for 1 to finish. It will always have 3 requests in-flight.

@WendellAdriel (Contributor, Author)

Hey @taylorotwell
As @cosmastech mentioned, both implementations achieve the same results.

I haven't benchmarked them yet, though, to see whether there's any difference in performance. If you want, I can do that. If you think it's not needed (because they are similar), I'd say either implementation can be merged, since either one will fix the current issue.

It comes down to your preference between the two approaches.

I'm happy to discuss any additional details if needed.

@jonnywilliamson (Contributor)

@WendellAdriel @cosmastech
I'm very invested in this feature and cannot wait for it to be resolved!!

So let me get this straight, just so I know I have this correct in my mind...

It seems like both approaches have effectively converged on the same engine: Generators feeding Guzzle's EachPromise.

This effectively solves the original issue (everything firing at once) and ensures we get "Sliding Window" concurrency (where a new request starts precisely when a previous one finishes) rather than "Batching" (waiting for a group to finish).

Since the .then() chaining bug was fixed separately in #57967, the choice between these two PRs seemingly comes down to Architecture vs. API Consistency:

PR #57977 (Wendell)

  • Strategy: Modifies PendingRequest internals to support a "deferred" state.
  • The Trade-off: It keeps the syntax fluid but changes the return type of the pool verbs (get, post, etc.) to a custom DeferredPromise class (instead of a standard Guzzle Promise). This might affect strict type-checks or custom macros expecting the original interface; in fact, I think it might affect some of my code, but I can update that easily enough.

PR #57973 (Cosmas)

  • Strategy: Wraps the execution logic externally (via LazyPromise or similar wrapper).
  • The Trade-off: It attempts to keep PendingRequest "pure" / stateless regarding the pool execution, aiming for higher isolation at the cost of an extra wrapper layer (a rough sketch of that wrapper idea follows below).
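
For comparison only, here is a minimal sketch of what a lazy wrapper along those lines could look like, assuming it simply delays creating the real promise until first use. This is not the actual code from #57973; the names LazyPromiseSketch and unwrap() are made up.

```php
use GuzzleHttp\Promise\PromiseInterface;

// Hypothetical sketch of a lazy wrapper; not the implementation from #57973.
final class LazyPromiseSketch
{
    private ?PromiseInterface $promise = null;

    public function __construct(
        // Closure that builds and dispatches the real request when invoked.
        private \Closure $factory,
    ) {
    }

    // The underlying request is only created (and therefore only sent) the
    // first time something asks for the real promise.
    public function unwrap(): PromiseInterface
    {
        return $this->promise ??= ($this->factory)();
    }

    // Chaining proxies through to the real promise, creating it on demand.
    public function then(?callable $onFulfilled = null, ?callable $onRejected = null): PromiseInterface
    {
        return $this->unwrap()->then($onFulfilled, $onRejected);
    }

    public function wait(bool $unwrap = true): mixed
    {
        return $this->unwrap()->wait($unwrap);
    }
}
```

In that shape, the wrapper (rather than a live promise) is what the pool's generator yields, so the real request only starts when the pool pulls and unwraps it.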

Does that sound about right? If so, it really just comes down to the boss' preference!

@cosmastech (Contributor) commented Dec 5, 2025

> It seems like both approaches have effectively converged on the same engine: Generators feeding Guzzle's EachPromise. […]
>
> Does that sound about right? If so, it really just comes down to the boss' preference!

That sounds about right. The change in this PR adds a new class which breaks expectations about the docblock's return type.

It also feels a little harder to reason about the setDeferred stuff, which seems like a responsibility of the Pool/Batch rather than of the PendingRequest itself. Its being a public method might lead people to try to use it themselves, which I don't think it's intended for. It also keeps a reference to the pooled requests both inside the PendingRequest and inside the Pool/Batch, which seems like it could be problematic (though I can't immediately think of a case where it would be, so maybe not).

But again, I spent much more time with my own code than this, so I recognize I have a natural bias to feel that one is clearer than the other.

Mine has the shortcoming that the LazyPromise isn't fully fleshed out: resolve, reject, and cancel all just throw an exception, and maybe those should be considered more carefully. I also moved FluentPromise to a different namespace, which I can certainly revert if needed. Given that FluentPromise was released just a few days ago, I doubt it's going to break much for people, but it still has the potential to be a breaking change.
