
Cron job help


cmgriffing

Recommended Posts

I have some cron jobs that pull XML and some webcam images from their respective source sites (NOAA, DOT) and save them to the local server. This is done to ease the strain on the external sites in the event of a user spike on our website.

 

Now, the code works most of the time, but as you can see I don't have any error handling. Sometimes one of the webcam images fails to load, and at other times I have seen the weather XML feed fail, producing some unaesthetic PHP errors on the site until the cron job runs again a half hour later.

 

My questions:

-What's the best way to make it try again if it fails?

 

-Are the set_time_limit() statements necessary? I added them as a keep-alive.

 

-I also put the sleep() statements there to space things out, since there are six webcam scripts that run.

 

Anyway, I would appreciate any suggestions.

 

Thanks

-Chris

 

[attachment deleted by admin]


Alright, here is my code in php tags:

 

This is the forecast cron job.

<?php
set_time_limit(60);
//Stevens Weather
copy("http://www.wrh.noaa.gov/forecast/xml/xml.php?duration=96&interval=6&lat=47.75&lon=-121.09", "stvns_wthr.xml");
sleep(20);
set_time_limit(60);
copy("http://www.wrh.noaa.gov/forecast/xml/xml.php?duration=96&interval=6&lat=47.43&lon=-121.41", "snoq_wthr.xml");
sleep(20);
set_time_limit(60);
copy("http://www.wrh.noaa.gov/forecast/xml/xml.php?duration=96&interval=6&lat=47.28&lon=-120.42", "mssn_wthr.xml");
sleep(20);
set_time_limit(60);
copy("http://www.wrh.noaa.gov/forecast/xml/xml.php?duration=96&interval=6&lat=45.34&lon=-121.72", "baker_wthr.xml");
sleep(20);
set_time_limit(60);
copy("http://www.wrh.noaa.gov/forecast/xml/xml.php?duration=96&interval=6&lat=46.9&lon=-121.51", "crstl_wthr.xml");
?>
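
The five copy() calls above differ only in the lat/lon pair and the output filename, so one way to tighten this up is a loop over a small array. This is just a sketch, not a drop-in replacement; the function name fetch_forecasts is illustrative, and the URL/filenames are taken from the script above.

```php
<?php
// Sketch: loop over the lat/lon pairs instead of repeating copy()/sleep().
// fetch_forecasts() is an illustrative name, not from the original script.
function fetch_forecasts(array $forecasts)
{
    foreach ($forecasts as $file => $loc) {
        set_time_limit(60); // reset the execution clock for each fetch
        $url = "http://www.wrh.noaa.gov/forecast/xml/xml.php"
             . "?duration=96&interval=6&lat={$loc['lat']}&lon={$loc['lon']}";
        @copy($url, $file); // @ keeps a failed fetch from printing a warning
        sleep(20);          // space out the requests to be polite to NOAA
    }
}

fetch_forecasts(array(
    'stvns_wthr.xml' => array('lat' => 47.75, 'lon' => -121.09),
    'snoq_wthr.xml'  => array('lat' => 47.43, 'lon' => -121.41),
    'mssn_wthr.xml'  => array('lat' => 47.28, 'lon' => -120.42),
    'baker_wthr.xml' => array('lat' => 45.34, 'lon' => -121.72),
    'crstl_wthr.xml' => array('lat' => 46.9,  'lon' => -121.51),
));
?>
```

Adding a new location is then a one-line change to the array rather than three more lines of boilerplate.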

 

Here is one of the webcam scripts.

<?php
set_time_limit(60);
//Stevens Webcams
copy("http://www.stevenspass.com/Stevens/SiteAssets/_ftp/webcam/stevenspass.jpg", "stevenspass.jpg");
sleep(10);

set_time_limit(60);
copy("http://www.stevenspass.com/Stevens/SiteAssets/_ftp/webcam/stevenspass2.jpg", "stevenspass2.jpg");
sleep(10);

set_time_limit(60);
copy("http://images.wsdot.wa.gov/us2/Stevens/sumteast.jpg", "sumteast.jpg");
sleep(10);

set_time_limit(60);
copy("http://images.wsdot.wa.gov/us2/stvldg/sumtwest.jpg", "sumtwest.jpg");
?>

 

So my questions again are:

 

1) Are the set_time_limit() statements necessary to keep the script from timing out, given that I have no access to php.ini to change the timeout length?

 

2) How should I implement error handling that retries fetching a file if it fails? My first guess is a try/catch, but I haven't used those before.
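
One thing worth knowing here: copy() does not throw an exception on failure, so a try/catch won't help. It returns false and emits a warning, which means checking the return value in a loop is the usual approach. Below is a sketch of such a wrapper; the function name fetch_with_retry and the $max_tries/$pause defaults are my own choices, not anything from the scripts above.

```php
<?php
// Sketch of a retry wrapper. copy() returns false (with a warning) on
// failure rather than throwing, so we test the return value directly.
// fetch_with_retry(), $max_tries, and $pause are illustrative choices.
function fetch_with_retry($url, $dest, $max_tries = 3, $pause = 5)
{
    for ($try = 1; $try <= $max_tries; $try++) {
        set_time_limit(60);        // reset the timeout before each attempt
        if (@copy($url, $dest)) {  // @ suppresses the warning on failure
            return true;           // success: stop retrying
        }
        sleep($pause);             // wait a bit before the next attempt
    }
    error_log("Failed to fetch $url after $max_tries tries");
    return false;                  // all attempts failed
}
?>
```

A further safeguard against the on-site PHP errors you describe would be to copy into a temporary file and rename() it over the old one only when the fetch succeeds, so a failed run never clobbers the last good copy.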

 

Thanks.

-Chris

