
I want to run this piece of code concurrently. There is a text file in the directory named id.txt; each of its lines holds the value that gets substituted into the Cookie header below. If I run this by opening 10 cmd tabs, the process gets slow. Please help me figure out how I can run this concurrently, or share any other suggestions you have.

// Read the current line number from index.txt and fetch that cookie from id.txt.
$i   = (int) trim(file("index.txt")[0]);
$id  = file("id.txt")[$i];
$idd = preg_replace('/\s+/', '', $id);
file_put_contents("index.txt", $i + 1);
echo $idd;

while (true) {

    $url8   = "https://www.example.com/";
    $header = array(
        "Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8",
        "Accept-Language: en-US,en;q=0.5",
        "Content-Type: application/x-www-form-urlencoded",
        "Content-Length: 0",
        "Cookie: $idd"
    );

    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url8);
    curl_setopt($ch, CURLOPT_HTTPHEADER, $header);
    curl_setopt($ch, CURLOPT_USERAGENT, "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:81.0) Gecko/20100101 Firefox/81.0");
    curl_setopt($ch, CURLOPT_HEADER, 1);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_NOBODY, 1);
    $out = curl_exec($ch);
    curl_close($ch); // free the handle each iteration instead of leaking it

    echo "\n";
    echo $currentTime = date('h i s', time());

    // Exit once the returned headers contain "done".
    if (preg_match('/done/', $out)) {
        die(' done');
    }
}
  • Instead of using curl, are you interested in using guzzlehttp to handle your requests? Then you can use Guzzle to send requests concurrently. – bhucho Nov 22 '20 at 09:51
  • @bhucho instead of guzzling cpu and resources, you could use `curl_multi_*`, https://www.php.net/manual/en/function.curl-multi-init.php (a rough sketch of this follows these comments) – YvesLeBorg Nov 22 '20 at 09:56
  • @YvesLeBorg I've used curl multi, but some requests didn't get processed. – Sanjeev Singh Nov 22 '20 at 11:29
  • @bhucho please help me with how I can do it through guzzlehttp :) – Sanjeev Singh Nov 22 '20 at 17:56
  • For that there are a few requirements: first you need [composer](https://getcomposer.org/) (so that you can install Guzzle), then you need to run composer init, which will initialize a composer.json file, and then add the Guzzle package using it. Let me know once you have done up to this step. – bhucho Nov 22 '20 at 18:29
  • ... that loop should never exit, as CURLOPT_NOBODY should prevent curl from downloading the body, resulting in curl_exec() returning an empty string (or false), but *NEVER* `done` – hanshenrik Nov 22 '20 at 19:41
  • @bhucho I've installed Composer and tried the package; it's working fine, but how can I run the requests concurrently? – Sanjeev Singh Nov 23 '20 at 08:06
  • @hanshenrik I need the Location value from the response headers, and once that condition is fulfilled the code will exit automatically. – Sanjeev Singh Nov 23 '20 at 08:12
  • @SanjeevSingh you can try out this answer: [Using guzzle to perform batch requests](https://stackoverflow.com/questions/64919582/using-guzzle-to-perform-batch-requests/64924881#64924881). There the concurrency rate is 5 (meaning 5 requests will be sent concurrently; I would recommend not using more than 6); see the Guzzle sketch after these comments. – bhucho Nov 23 '20 at 08:19
  • Use the [guzzle docs](https://docs.guzzlephp.org/en/stable/quickstart.html) to learn more about which function does what. – bhucho Nov 23 '20 at 08:21
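
For reference, the native `curl_multi_*` route mentioned above would look roughly like the sketch below. It reuses the URL, headers, and "done" check from the question and assumes every line of id.txt is one Cookie value; it is a sketch under those assumptions, not a drop-in replacement.

<?php
// Rough sketch of the curl_multi_* approach: one easy handle per cookie line,
// all driven by a single multi handle so the requests run concurrently.
$ids = array_map('trim', file('id.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES));

$mh      = curl_multi_init();
$handles = [];

foreach ($ids as $i => $idd) {
    $ch = curl_init('https://www.example.com/');
    curl_setopt_array($ch, [
        CURLOPT_HTTPHEADER     => ["Cookie: $idd"],
        CURLOPT_USERAGENT      => 'Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:81.0) Gecko/20100101 Firefox/81.0',
        CURLOPT_HEADER         => true,  // keep headers so "done" / Location can be matched
        CURLOPT_NOBODY         => true,
        CURLOPT_RETURNTRANSFER => true,
    ]);
    curl_multi_add_handle($mh, $ch);
    $handles[$i] = $ch;
}

// Drive all transfers until every handle has finished.
do {
    $status = curl_multi_exec($mh, $running);
    if ($running) {
        curl_multi_select($mh); // wait for activity instead of busy-looping
    }
} while ($running && $status === CURLM_OK);

foreach ($handles as $i => $ch) {
    $out = curl_multi_getcontent($ch);
    if (preg_match('/done/', $out)) {
        echo "#$i done\n";
    }
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);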
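
If you go the Guzzle route described in the comments, a minimal sketch of the Pool approach could look like the following. It assumes guzzlehttp/guzzle has been installed with Composer, that every line of id.txt holds one Cookie value, and that you want to inspect the Location header of each response; the callbacks and file layout here are illustrative, not part of the original question.

<?php
// Minimal sketch (assumptions noted above): send one request per line of
// id.txt, at most 5 at a time, using GuzzleHttp\Pool.
require 'vendor/autoload.php';

use GuzzleHttp\Client;
use GuzzleHttp\Pool;
use GuzzleHttp\Psr7\Request;

// Disable redirect following so the Location header stays visible,
// mirroring the header-only cURL request in the question.
$client = new Client([
    'base_uri'        => 'https://www.example.com/',
    'allow_redirects' => false,
]);

$ids = array_map('trim', file('id.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES));

$requests = function (array $ids) {
    foreach ($ids as $idd) {
        yield new Request('GET', '/', [
            'Cookie'     => $idd,
            'User-Agent' => 'Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:81.0) Gecko/20100101 Firefox/81.0',
        ]);
    }
};

$pool = new Pool($client, $requests($ids), [
    'concurrency' => 5, // as recommended in the comments, keep this small
    'fulfilled'   => function ($response, $index) {
        // Inspect the Location header (or the status) for the "done" condition.
        echo "#$index -> " . $response->getStatusCode() . ' ' . $response->getHeaderLine('Location') . "\n";
    },
    'rejected'    => function ($reason, $index) {
        echo "#$index failed: " . $reason->getMessage() . "\n";
    },
]);

$pool->promise()->wait(); // block until every request has completed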

1 Answer


As I see it, it gets slow because all the instances of the script access the same file simultaneously, so there is a delay in obtaining the handle needed to access the file. You should use different filenames (e.g. index1.txt, index1337.txt) and avoid writing to the same file from multiple instances.
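
A minimal sketch of that idea, assuming each cmd tab starts the script with its own instance number (e.g. `php script.php 1`, `php script.php 2`) and that each indexN.txt has been seeded with a different starting line; the argument handling and file names are illustrative, not from the original code:

<?php
// Hypothetical sketch: each process gets its own index file, so no two
// instances ever write to the same file. id.txt is only read, never written.
// Usage: php script.php 1   (next cmd tab: php script.php 2, and so on)
$instance  = isset($argv[1]) ? (int) $argv[1] : 1;
$indexFile = "index{$instance}.txt";

// Per-instance cursor into the shared id.txt (seed each indexN.txt beforehand).
$i   = (int) trim(file($indexFile)[0]);
$id  = file('id.txt')[$i];
$idd = preg_replace('/\s+/', '', $id);
file_put_contents($indexFile, $i + 1);
echo $idd;
// ...the while(true) cURL loop from the question continues unchanged, using $idd.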

I hope my answer helped :)

edshot
  • Then I'll have to save the cookie in a different txt file? But that takes too much time to set up and then fetch into the Cookie header. – Sanjeev Singh Nov 23 '20 at 08:08