
I am building a Laravel web app that performs some long-running queries and uses a couple of APIs (both internal and external). I am having a hard time figuring out why I can't handle requests in parallel. To shed some light on my issue, here is a high-level overview of my system/problem via an example:

  • Page loads
  • AJAX request fired on page load which GETs a BigQuery result set (long-running query), cleans the data, and runs a Python clustering algorithm that creates an image and returns the path to that image to the web app (a rough sketch of this endpoint follows the list)
    • Long-running (~15 seconds)
    • Will max out the CPU while running the Python clustering (at times)
  • AJAX request which queries an external API for some information and simply displays it
    • Short-running (~1-2 seconds)
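
Roughly, the long-running endpoint looks like this (the route name, fetchBigQueryResultSet(), toCsv(), and cluster.py are simplified stand-ins for my actual code):

    // Rough sketch of the long-running endpoint; helper names are placeholders.
    Route::get('/cluster-image', function () {
        // 1. Run the long BigQuery query (placeholder for the real client call)
        $rows = fetchBigQueryResultSet();

        // 2. Clean the rows and write them where the Python script can read them
        $csvPath = storage_path('app/cluster-input.csv');
        file_put_contents($csvPath, toCsv($rows));

        // 3. Run the clustering script; it prints the path of the generated image
        $imagePath = trim(shell_exec('python cluster.py ' . escapeshellarg($csvPath)));

        // 4. Return the image path to the page
        return response()->json(['image' => $imagePath]);
    });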

The issue is that my AJAX requests are not being handled in parallel. The first one is received and the web app does not begin the other until the first is complete. I've checked the network tab in Chrome dev tools and both requests are being made in parallel but the web server is not handling them in parallel.

I cannot determine whether this is a configuration issue with PHP, Artisan, or Laravel, or whether I have a whole other problem on my hands. I've done some testing with two simple route closures: one that simply returns a string, and another that returns a string after sleep(10). When I call both with AJAX, the instantly returning route does not respond until the long-running request is served (after sleeping).
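
Here are the two test routes, for reference:

    // Two test route closures: /fast returns immediately, /slow sleeps first.
    Route::get('/fast', function () {
        return 'fast';
    });

    Route::get('/slow', function () {
        sleep(10); // stand-in for the long-running work
        return 'slow';
    });

When both are requested at the same time, /fast does not come back until /slow has finished sleeping.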

TL;DR: It's clear both AJAX calls are being fired and received in parallel, but how can I have my Laravel web app handle the requests in parallel (concurrently)?

Kevin Eger
  • Is this site on a production server or local Homestead? There is a MaxClients configuration in Apache that you can check: http://stackoverflow.com/a/1430890/2951316 – Can Celik Mar 13 '16 at 18:32
  • Just running locally with `php artisan serve`. – Kevin Eger Mar 13 '16 at 18:43
  • So you're not testing it from different clients? Please take a look at the part that says "Will the requests be queued?" in the last link I provided. The answer lists three possible reasons for your issue. – Can Celik Mar 13 '16 at 18:48
  • From what I remember, sessions block requests. So if you are using sessions and one request takes some time to process, the new request will have to wait until the previous request is done. – Jamie Jun 09 '16 at 22:26
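
The session blocking mentioned in the last comment refers to PHP's native file session handler: session_start() holds an exclusive lock on the session file until the request ends or the lock is released, so a second request carrying the same session ID has to wait. A minimal plain-PHP sketch of releasing that lock early (this describes native PHP sessions; Laravel ships its own session layer, so it may or may not be the cause here):

    <?php
    session_start();                        // acquires the per-session file lock

    $userId = $_SESSION['user_id'] ?? null; // read whatever the request needs

    session_write_close();                  // release the lock so later requests
                                            // from the same browser can proceed

    sleep(10);                              // stand-in for the long-running work
    echo 'done';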

1 Answer


For HTTP requests that might take a while, use Laravel's job structure: dispatch the work as a job and let either the built-in queue drivers or a third-party queue service process it. Laravel doesn't process requests in parallel on its own, which is why jobs exist.

Your problem is similar to the following thread:
handle multiple post requests to same url Laravel 5

API Docs:
https://laravel.com/docs/5.1/queues#configuration
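
A minimal sketch of that approach on Laravel 5.1, with a hypothetical job class name (the handle() body is where the BigQuery/clustering work would go):

    <?php

    namespace App\Jobs;

    use Illuminate\Contracts\Queue\ShouldQueue;
    use Illuminate\Queue\InteractsWithQueue;
    use Illuminate\Queue\SerializesModels;

    // Hypothetical job: the slow BigQuery + clustering work runs here and is
    // processed by a queue worker (php artisan queue:work) instead of inside
    // the HTTP request. "Job" is the base class from the default 5.1 skeleton.
    class GenerateClusterImage extends Job implements ShouldQueue
    {
        use InteractsWithQueue, SerializesModels;

        public function handle()
        {
            // long-running query and Python clustering go here
        }
    }

Dispatching it from a controller (the base controller in 5.1 includes the DispatchesJobs trait) lets the HTTP response return immediately; the front end can then poll for the finished image:

    $this->dispatch(new \App\Jobs\GenerateClusterImage());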

Nyankou