500 server error after specific number of cURL executions


Kurrel


Hi there,

 

I'm writing an application that pulls from two separate web-based APIs and collates the data into a single array. Once done, it enters a loop that cycles through the array and executes a cURL call for each record. This worked perfectly for a controlled set of 10 artificial records, so I've started testing with live data.

 

On running the code, I eventually get the following:

Internal Server Error

The server encountered an internal error or misconfiguration and was unable to complete your request.

 

I'm not sure how to debug this and so I'm hoping I can find some help here.

I'm running WAMP, and the php.ini contains the following:

max_execution_time = 0 ; Maximum execution time of each script, in seconds

max_input_time = 0 ; Maximum amount of time each script may spend parsing request data
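For what it's worth, WAMP ships more than one php.ini, so a quick sanity check at the top of the script (standard PHP calls only) confirms the values the script actually sees:

	set_time_limit(0);	// runtime equivalent of max_execution_time = 0
	print("max_execution_time: ".ini_get('max_execution_time')."<br>");
	print("max_input_time: ".ini_get('max_input_time')."<br>");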

 

Through testing, I have discovered the following:

The script can complete exactly 35 iterations of the loop, in under 2 minutes.

Any number greater than this causes the error.

If I skip the first 20 cURL submits, the loop then runs from record 21 to record 55 before failing, so I do not believe the 36th record itself is the problem.

If I put a sleep(5) command into the loop, I can still do exactly 35 iterations and no more.

If I put a sleep(10) command into the loop, it fails at fewer than 35 iterations. To pin down whether elapsed time rather than iteration count is the limit, I want to instrument the loop as sketched below.
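A minimal timing sketch for that check; $records is a placeholder for my collated array, and the loop body is the cURL call shown further down:

	$start = microtime(true);
	foreach ($records as $i => $pointval) {	// $records stands in for the real data array
		// ... cURL call for this record goes here ...
		printf("Iteration %d finished at %.1f seconds<br>", $i + 1, microtime(true) - $start);
	}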

 

$action = "post";
$url    = "http://url.url.url/api/v1.0/udo_Position/create"; // I've replaced the actual URL out of necessity

// set user agent {
	print("Setting CURL options, using URL ".$url."<br>");
	$cd = curl_init();
	curl_setopt($cd, CURLOPT_HEADER, 1);
	curl_setopt($cd, CURLOPT_URL, $url);
	curl_setopt($cd, CURLOPT_USERPWD, 'username:password');
	curl_setopt($cd, CURLOPT_RETURNTRANSFER, 1);	// Don't send return value to screen
	curl_setopt($cd, CURLOPT_USERAGENT, "Firefox/2.0.0.1"); // spoofing Firefox 2.0
	curl_setopt($cd, CURLOPT_VERBOSE, true);
// }

if ($action == 'get') {
	// nothing further to set for a GET
} elseif ($action == 'post') {
	print("POST set;<br>");
	print("<pre style='font-family:verdana;font-size:13'>");
	print_r($pointval);
	print("</pre>");
	$pointval["save"] = "Save"; // The AddNew step form requires this.

	curl_setopt($cd, CURLOPT_POSTFIELDS, $pointval);
} else {
	print("Invalid usage. Please see example<br>");
	return;
}

$reply = curl_exec($cd);

$http_status = curl_getinfo($cd, CURLINFO_HTTP_CODE);
print("<pre style='font-family:verdana;font-size:13'>");
print_r($http_status);
print("</pre>");

print("Here is the reply from the AddNew function: <br>");
print("<pre style='font-family:verdana;font-size:13'>");
print_r($reply);
print("</pre>");

if (curl_error($cd)) {
	print("Error: ".curl_error($cd)."<br>");
} else {
	print("Insert Successful.<br>"); // only report success when curl reported no error
}

curl_close($cd);

 

I am uncertain whether it is a PHP, Apache, or cURL issue at this point. I have another script that pushes far more cURL calls to the same system, but it pulls from only one external resource instead of two. Any help or suggestions would be appreciated.


1. I didn't include the loop, just the cURL call that is inside it.

2. I'm almost certain it's returned via my server. It's a full-on white screen without me printing out the response... is there any further way to make sure of this?

3. Unfortunately, the script is pretty specific to APIs I know on all three systems. However, you have got me thinking. I'm going to load a large set of dummy data and remove one, then the other, and finally both external calls from the script. If one of them is breaking, that'll be a big step in the right direction.


The only reason I can think that it would be coming from your server is if it's a threading issue. I'm not sure how cURL forks requests, so it may be that it's flooding your server's connection threads. If that's the case you'll need someone else to help you out, and probably the same person will point out that I have used completely erroneous language in my attempt to describe an issue I know very little about ;D

 

Do let us know how you get on with the data testing though.


So, filling 50 rows with data and running just the set of cURL processes results in the same timeout. 35, 36 and 37 records succeed, but 39 does not, so I am able to send a few more than before.

 

I'm now sure it must be a timeout somewhere, but figuring out where remains elusive.


While you're waiting for a curl expert ... check the basics:

 

Did you check the Apache error log?

 

Could there be a memory leak in curl causing the php memory use to continuously increase?

 

Do you have error_reporting on so you can see any PHP warnings, notices, or errors?

 


Hi there,

 

I've checked the Apache error logs and they contain details of recent restarts. I can see errors I'd expect from other things if I look a week or two back, but nothing related to this.

 

How would I check for a memory leak? I've watched Task Manager for a while but it doesn't give very detailed data.

 

I have error reporting set to the following:

error_reporting  =  E_ALL & ~E_NOTICE & ~E_STRICT

Is there a way to check these details despite the 500 error (seeing as it dominates the screen)?
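One option might be routing PHP's messages to a log file so the 500 page can't hide them; a php.ini sketch, with an example path:

	log_errors = On
	error_log = "c:/wamp/logs/php_error.log"	; example path - any file PHP can write to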


After reading up on it, I've changed error_reporting to

error_reporting  =  E_ALL & E_NOTICE & ~E_STRICT

 

It's providing some more details to work with.

 

Thanks for that tip, even if it doesn't answer THIS problem it's useful to know about.


For development, I usually use error_reporting(-1); (or E_ALL) so I see ALL of the messages.
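For example, at the top of the script (nothing beyond the standard calls):

	error_reporting(-1);	// report every current and future error level
	ini_set('display_errors', '1');	// render them in the browser while developing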

 

On the memory issue, you can use memory_get_usage, which "Returns the amount of memory, in bytes, that's currently being allocated to your PHP script. " Now that I think about it, though, there is a specific error that is printed when PHP exceeds the memory limit. But you could check the value before and/or after each loop cycle to see how much it is jumping.
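A minimal sketch of that check inside the loop; the $i counter is just illustrative:

	print("Iteration ".$i.": ".memory_get_usage()." bytes in use, peak ".memory_get_peak_usage()."<br>");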

 

I would think that a 500 error would put something in the Apache error log.

 

This may be an issue with PHP/Apache not releasing the socket right away when the curl session is closed. This might be causing you to exceed the allowed number of open file descriptors (which is an O/S level setting). See if there is a "Keep Alive" setting (or something) in the curl options and whether you can reduce the time or turn it off.
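The closest knobs I know of in PHP's cURL binding are these; whether they actually release the sockets sooner is an assumption you'd have to test:

	curl_setopt($cd, CURLOPT_FORBID_REUSE, true);	// close the connection once this request completes
	curl_setopt($cd, CURLOPT_FRESH_CONNECT, true);	// don't reuse a cached connection
	curl_setopt($cd, CURLOPT_HTTPHEADER, array('Connection: close'));	// ask the server to close its end too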

 

Have you considered/tried moving the curl_init() and curl_close() calls outside the loop? I don't know how much of a rewrite this would require (or if it would even work), but I wonder if reusing the same curl handle would help.
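A rough sketch of that restructuring, with a hypothetical $records array standing in for the loop you didn't post:

	$cd = curl_init();	// one handle for the whole run
	curl_setopt($cd, CURLOPT_URL, $url);
	curl_setopt($cd, CURLOPT_USERPWD, 'username:password');
	curl_setopt($cd, CURLOPT_RETURNTRANSFER, 1);

	foreach ($records as $pointval) {	// $records is hypothetical
		$pointval["save"] = "Save";
		curl_setopt($cd, CURLOPT_POSTFIELDS, $pointval);
		$reply = curl_exec($cd);	// the handle (and its connection) is reused
		if ($reply === false) {
			print("Error: ".curl_error($cd)."<br>");
		}
	}

	curl_close($cd);	// closed once, after the loop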


1 month later...

I was having the same problem with my cURL calls and suspected it was a GoDaddy issue. I doubled post_max_size within my php5.ini file and it works now.
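For reference, the change was a one-liner along these lines in php5.ini (16M is simply double PHP's 8M default):

	post_max_size = 16M	; doubled from the 8M default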

 

Perhaps this could work for your issue, or you may have found a solution a long time ago.

