My script keeps running, even after I close Firefox.


zeonos


I have a script that downloads a lot of images from a website it crawls, but whenever I start a crawl, it keeps downloading images.

Even if I close Firefox, it continues to download images from the site.

How can I stop it?


I did a Google search and found a post on another forum that may be relevant to your problem:

Normally, if you're sending output, this doesn't happen: if the webserver detects a disconnect, the PHP process is also stopped (which is why we have functions like register_shutdown_function to make sure something still gets done, unless you encounter a fatal error, of course).

 

However, as HTTP is stateless, the only point in time the webserver will detect a disconnect is when it is trying to send content (see also the remarks at ignore_user_abort; the default is false, which is what you want). Depending on the context of the request, a workable kludge in big loops is to send non-displayed data to the user; in HTML that could mean sending whitespace and flushing. This output can end up in any buffer, though (PHP's, the server's, other places in the network), so detection is still not 100% reliable, but that is about unavoidable. A sane time limit to avoid infinite looping, raised only for requests that actually need it, is about the best you can do.

 

Of particular note is the following:

However, as HTTP is stateless, the only point in time the webserver will detect a disconnect is when it is trying to send content (see also the remarks at ignore_user_abort; the default is false, which is what you want). Depending on the context of the request, a workable kludge in big loops is to send non-displayed data to the user; in HTML that could mean sending whitespace and flushing.

 

In layman's terms, this means that PHP normally does stop executing when a user aborts (i.e. closes the browser or tab); however, the server will only detect this abort when it tries to send data to the user (and fails). Since you didn't post any code, I can't really be of more help, sorry.
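The kludge described in the quote can be sketched roughly as follows. This is a minimal sketch, not a drop-in fix: `do_chunk_of_work()` is a hypothetical placeholder for one unit of your real work (here it's an empty stub so the loop can run at all), and the loop count is arbitrary.

```php
<?php
// Minimal sketch of the "send whitespace and flush" kludge.

ignore_user_abort(false);  // the default: let PHP stop when the client disconnects
set_time_limit(300);       // sane upper bound so a stuck loop cannot run forever

function do_chunk_of_work(int $i): void {
    // hypothetical stand-in for one unit of real work (e.g. downloading one image)
}

for ($i = 0; $i < 10; $i++) {
    do_chunk_of_work($i);

    // Push a byte the browser won't render; if the client is gone,
    // the failed write lets the webserver flag the request as aborted.
    echo ' ';
    flush();

    if (connection_aborted()) {
        // clean up and stop instead of crawling on forever
        break;
    }
}
```

Note that without the `echo`/`flush`, `connection_aborted()` can stay false indefinitely, because the server never attempts a write and so never notices the disconnect.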


The code is pretty much:

Take a target URL.

Crawl the URL for threads.

Crawl the threads for image links.

For each image link, check if it exists; if it does, download it.

When it's done with all the links, it moves on to the next target URL in the database.
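With the abort check from the earlier reply folded in, those steps might look roughly like this. Every helper here (`next_target_url()`, `crawl_threads()`, `crawl_image_links()`, `image_exists()`, `download_image()`) is a hypothetical stand-in stubbed with dummy data, not your actual code:

```php
<?php
// Sketch of the crawl loop with a disconnect check after each download.
// All helpers are hypothetical stubs standing in for the real crawler.

function next_target_url() {
    static $urls = ['http://example.com/a', 'http://example.com/b'];
    return array_shift($urls);          // stand-in for "next URL from the database"
}
function crawl_threads($url)     { return ["$url/t1", "$url/t2"]; }
function crawl_image_links($t)   { return ["$t/img1.jpg"]; }
function image_exists($link)     { return true; }  // stand-in for the existence check
function download_image($link)   { /* stand-in for the real download */ }

$downloaded = 0;
while ($url = next_target_url()) {
    foreach (crawl_threads($url) as $thread) {
        foreach (crawl_image_links($thread) as $link) {
            if (image_exists($link)) {
                download_image($link);
                $downloaded++;
            }

            // Without any output, the server never notices the browser
            // closed; flush a byte so a disconnect can actually be detected.
            echo ' ';
            flush();
            if (connection_aborted()) {
                exit;  // stop the crawl instead of running on
            }
        }
    }
}
```

The key point is that the check runs between downloads: each iteration writes something to the client, giving the webserver a chance to notice the closed connection and stop the script.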

