When updating WordPress, this error appears:
Error message: cURL error 28: Connection timed out after 10001 milliseconds

Resolution: curl the site using the private IP rather than the public IP. From inside the private network, each server or device is known only by its private IP address and is always referenced using that address.

Ref: https://www.the-art-of-web.com/system/iptables-nat/

Also, on a cPanel server behind a firewall, you can check the NAT mappings in the file /var/cpanel/cpnat
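For example (using hypothetical addresses: assume the public IP is 203.0.113.10 and the domain is example.com), you can look up the private address in the NAT map and point curl at it without touching /etc/hosts:

```shell
# /var/cpanel/cpnat maps "private_ip public_ip", one pair per line.
# Pull out the private IP for our (hypothetical) public address:
priv_ip=$(awk '$2 == "203.0.113.10" {print $1}' /var/cpanel/cpnat)

# --resolve makes curl connect to that IP while still sending the
# real domain in the Host header and SNI:
curl --resolve example.com:443:"$priv_ip" https://example.com/
```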

[stextbox id="warning"]Error: SSL read: error:00000000:lib(0):func(0):reason(0), errno 104[/stextbox]

Getting a cURL error from a PHP script? Try running curl without PHP.

SSH to the server and find the script:

cd /var/www/vhosts/path/to/script
nano script.php

Find the call in the script and execute it from the command line:

curl --verbose https://api-internal.script.com

Output:

* About to connect() to api-internal.script.com port 443
* Trying 54.183.xxx.xxx... Connection refused
* couldn't connect to host
* Closing connection #0
curl: (7) couldn't connect to host

It looks like the connection is being refused by the remote server. Fetching the URL directly from an SSH session takes PHP out of the equation: since this is the response you get from curl alone, the problem lies with the remote server, not with your PHP code.
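To confirm where the refusal happens, you can also test raw TCP reachability without curl at all, using bash's built-in /dev/tcp (the host and port below are the hypothetical endpoint from the example above):

```shell
# Try to open a TCP connection to port 443; success means the port is reachable.
if timeout 5 bash -c 'cat < /dev/null > /dev/tcp/api-internal.script.com/443' 2>/dev/null; then
    echo "port open"
else
    echo "connection refused or timed out"
fi
```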

Curl can be helpful in testing many things including web sites.

See if curl is installed

Using ssh:

[root@localhost root]# which curl

This tells you whether the system has the curl binary installed. To use curl from PHP, however, you also need libcurl and the curl PHP extension. To see if it's enabled, simply call:

phpinfo();

in a PHP file and check the output: it lists all active extensions (and more). Search (Ctrl-F) for curl in that output.
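The same checks can be run from the shell in one go (assuming the php CLI binary is on the PATH):

```shell
which curl                    # path to the curl binary, if installed
php -m | grep -i curl         # prints "curl" if the PHP extension is loaded
php -r 'var_dump(extension_loaded("curl"));'   # bool(true) when usable from PHP
```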

Check a site load time:

time curl -s http://www.coldriverdata.com > /dev/null

Output:

real    0m0.191s
user    0m0.004s
sys     0m0.000s
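The time command measures the whole curl process; curl's own -w (write-out) option can break the request down further. The variables below (time_namelookup, time_connect, time_total) are standard curl write-out variables:

```shell
# Print curl's internal timers instead of timing the whole process:
curl -s -o /dev/null \
     -w 'dns: %{time_namelookup}s  connect: %{time_connect}s  total: %{time_total}s\n' \
     http://www.coldriverdata.com
```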

Stress test a Site:

Curl's syntax lets you specify sequences and sets of URLs. Say, for example, we want to run a load stress test against this site; we can run:

curl -s "http://coldriverdata.com?[1-1000]"

This will make 1000 calls to coldriverdata.com, i.e.

http://coldriverdata.com?1
http://coldriverdata.com?2
http://coldriverdata.com?3
...
http://coldriverdata.com?1000

So if you want to stress test your web application and it won't complain about being fed an extra parameter, 10,000 calls can be made like this:

curl -s "http://yourapp.com/your_page_to_test.php?[1-10000]"

Multiple Pages

Easy: just add each page to the command line.

curl -s "http://yourapp.com/page1.php?[1-1000]" "http://yourapp.com/page2.php?[1-1000]"

Or even…

curl -s "http://yourapp.com/page{1,2}.php?[1-1000]"

Timing

Using the time command we can get a view of our performance:

time curl -s "http://yourapp.com/page{1,2}.php?[1-1000]"

real 0m0.606s
user 0m0.009s
sys 0m0.008s

Simulating simultaneous users

OK, this is great for sending a whole batch of calls one after the other, but what about simultaneous calls? For this we can place the curl calls in a script and run them in the background, e.g. my_stress_test.sh:

#!/bin/bash
# Launch several identical curl runs in the background and record their PIDs.
FAIL=0
pidlist=""
for i in 1 2 3 4 5 6 7; do
    curl -s "http://yourapp.com/page{1,2}.php?[1-1000]" > /dev/null &
    pidlist="$pidlist $!"
done

# Wait for each background job and count any that exit non-zero.
for job in $pidlist; do
    echo $job
    wait $job || let "FAIL+=1"
done

if [ "$FAIL" == "0" ]; then
    echo "SUCCESS!"
else
    echo "EPICFAIL! ($FAIL)"
fi

Then run

chmod +x my_stress_test.sh
time ./my_stress_test.sh

NOTE:

This does not simulate user behaviour exactly: a browser downloads not only the page but also all attached images, JavaScript, stylesheets, etc. You could simulate this too by adding those URLs to the curl command.
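For example, listing the page together with its assets (hypothetical filenames) in one invocation lets curl fetch them over a reused connection, which roughly approximates a browser's page load:

```shell
# Fetch the page and its assets in one go; curl reuses the
# connection for consecutive URLs on the same host:
curl -s "http://yourapp.com/page1.php" \
     "http://yourapp.com/style.css" \
     "http://yourapp.com/logo.png" > /dev/null
```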