The curl_multi_exec() function executes the requests asynchronously. The first call launches the requests and subsequent calls poll their status. But those subsequent calls do not block - i.e. the thread of execution will spin around that loop as fast as it can, stealing CPU cycles which could be used more productively elsewhere.
A better solution is to inject a [u]sleep() in the loop:
$running && usleep(50000);
Further, in a general purpose library, it would be helpful to limit how long the script will wait for responses. Currently, if a response is delayed, the script will not exit cleanly - it will bomb out when it hits the time limit defined for PHP execution.
Since integer arithmetic is faster than dealing with times and clocks, the duration doesn't have to be exact, and curl_multi_exec() will return relatively quickly...
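Putting the pieces of this comment together, here is a minimal sketch of such a loop: a usleep() between polls and an iteration-count budget standing in for a clock-based deadline (integer arithmetic, as suggested above). The function name, the test URLs, and the 50 ms / 5 s figures are illustrative choices, not a canonical API.

```php
// Sketch: poll curl_multi_exec() with a usleep() between passes and give
// up after a fixed number of iterations instead of reading the clock.
function fetch_all(array $urls, int $timeout_seconds = 5): array
{
    $mh = curl_multi_init();
    $handles = [];
    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout_seconds);
        curl_multi_add_handle($mh, $ch);
        $handles[$url] = $ch;
    }

    // 50 ms per poll, so the iteration budget approximates the timeout
    // without calling microtime() on every pass.
    $polls_left = $timeout_seconds * 20;
    do {
        curl_multi_exec($mh, $running);
        if ($running) {
            usleep(50000); // yield the CPU instead of spinning
        }
    } while ($running && --$polls_left > 0);

    $results = [];
    foreach ($handles as $url => $ch) {
        $results[$url] = curl_multi_getcontent($ch); // '' or null on failure
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $results;
}
```

Note that curl_multi_select() is another way to avoid busy-waiting; usleep() is simply the approach this comment proposes.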
riccardo castagna - 2018-09-12 23:02:43 - In reply to message 2 from Colin McKinnon
And yes, I already knew about this solution (https://lampe2e.blogspot.com/2015/03/making-stuff-faster-with-curlmultiexec.html). I had tested it before, along with many others, but they all delivered performance that was unsatisfactory by my standards.
Personally, I use this class a lot to fetch several individual files from different servers, and so far it has always done that job very well.
It is a different matter if the class is used to fetch, from different servers, whole web pages that have other resources to load inside them (CSS files, JS files...). In that case, since the requests are asynchronous and simultaneous, the responses get mixed up with one another.
It is my opinion that curl_multi should not be used for resources that have other resources to load within them; I think it was not born for this, but for other uses.
riccardo castagna - 2018-09-13 09:33:05 - In reply to message 4 from riccardo castagna
I can add that curl_multi can also be used for requests for whole web contents that have other contents to load inside them (for example complete web pages or APIs), but in this case we must take precautions.
It is necessary to make an inventory of all the internal resources of each web content you want to receive through the multi-curl. Then request, through curl_multi, both the main resource and all the internal resources it contains. Finally, in the main resource, use (for example) str_replace to remove the references to the internal resources that would otherwise have to be loaded, and re-insert in their place the versions we inventoried and fetched in the multi-curl requests. This way you have complete control of all resources.
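The re-insertion step described above could look something like this: once the main page and its sub-resources have come back from the curl_multi batch, each original tag is swapped for the fetched body with str_replace. The helper name and the tag strings here are hypothetical, for illustration only.

```php
// Sketch: splice fetched sub-resources back into the main page.
function inline_resources(string $html, array $fetched): string
{
    // $fetched maps each original tag, exactly as it appears in the page,
    // to the markup that should replace it (e.g. the CSS wrapped in <style>).
    foreach ($fetched as $tag => $body) {
        $html = str_replace($tag, $body, $html);
    }
    return $html;
}

// Usage: replace an external stylesheet reference with its fetched content.
$page = '<head><link rel="stylesheet" href="main.css"></head>';
$inlined = inline_resources($page, [
    '<link rel="stylesheet" href="main.css">' => '<style>body{margin:0}</style>',
]);
```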
Fortunately, programming in PHP is not philosophy, the art of contradiction: we have the possibility of running tests, and tests prove the validity of something and tell us whether it works or not.
riccardo castagna - 2018-09-14 13:21:23 - In reply to message 2 from Colin McKinnon
Dear Sir, I find it ethically incorrect to evaluate something without first having tested it. I find it improper to belittle the work of others (your comment was: "that loop is not right, will not scale / v. inefficient") and to take the opportunity only to advertise your book: https://lampe2e.blogspot.com/2015/03/making-stuff-faster-with-curlmultiexec.html
Before arriving at my solution I did a lot of hard performance testing and CPU measurement on solutions proposed by others, including the one proposed by you, which in my opinion is not good.
When I make my work public I am giving something of value to others for free, and I accept criticism only when it has a real foundation. I am not an idiot, and I understood perfectly what your intentions were:
to sell a few more copies of your book by trampling on others without real arguments.