
Getting script to insert 40,000 rows without stopping


RAELWEB


Hi all,

 

I am managing a social network and am writing a script to allow the admin to private message all users on the site. Even setting the max execution time to unlimited within the file does not allow us to insert all 40,000 rows into the 'message' table; it stops after several thousand rows. Our hosts tell us setting this globally is a bad idea/unstable.

 

Does anyone have some advice on the best way to do this? No email notifications need to be sent, so it is purely a matter of allowing the script to run its full course without stopping. I did consider a cron job, but without emails, doing it in 'batches' seems a bit unnecessary, so I'm clearly missing something. Any and all advice would be most welcome!

 

Richard


If your code that inserts 40,000 rows is taking more than a few seconds, there is probably something wrong with your code, and you would need to post it in order to get help with it. I'm going to guess you are putting a query inside of a loop. You should be using either a multi-row INSERT or an INSERT ... SELECT statement.
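For example, here is a rough sketch of the multi-row INSERT approach. Since you haven't posted your code, the table and column names (users, messages, user_id, body) are invented for illustration:

<?php
// Hypothetical sketch: build one VALUES tuple per user, then send them
// to MySQL in chunks of 1,000 rows per INSERT statement.
$result = mysql_query("SELECT id FROM users");

$values = array();
while ($row = mysql_fetch_assoc($result)) {
    $userID  = (int) $row['id'];
    $values[] = "($userID, 'You have a new message.')";
}

// One INSERT per 1,000 rows instead of one query per row.
foreach (array_chunk($values, 1000) as $chunk) {
    mysql_query("INSERT INTO messages (user_id, body) VALUES " . implode(',', $chunk));
}
?>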


You could also create a separate message system with a table where you store the broadcast message. When a user logs in, it checks that table for new messages. If there's a new one that isn't in their inbox yet alongside the regular messages, it adds it to their inbox at that moment. This concept is a bit rough around the edges and would need some smoothing to prevent duplicates and other small issues that come to mind, but it would help reduce the overhead on your server from trying to do 40k+ inserts.
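To make that concrete, here's a rough sketch (all table and column names are invented for illustration): the admin message is stored once in a broadcasts table, and a query at login copies any broadcast the user hasn't received yet into their inbox.

<?php
// Hypothetical schema assumed for this sketch:
//   CREATE TABLE broadcasts (id INT AUTO_INCREMENT PRIMARY KEY, body TEXT NOT NULL);
//   CREATE TABLE broadcast_seen (broadcast_id INT, user_id INT,
//                                PRIMARY KEY (broadcast_id, user_id));

$userID = (int) $_SESSION['user_id'];

// At login, copy every broadcast this user hasn't received into their inbox.
// The LEFT JOIN ... IS NULL filters out broadcasts already delivered.
mysql_query("INSERT INTO inbox (user_id, body)
             SELECT $userID, b.body
             FROM broadcasts b
             LEFT JOIN broadcast_seen s
                    ON s.broadcast_id = b.id AND s.user_id = $userID
             WHERE s.broadcast_id IS NULL");

// Mark them as seen; the composite primary key plus INSERT IGNORE
// prevents duplicate rows on the next login.
mysql_query("INSERT IGNORE INTO broadcast_seen (broadcast_id, user_id)
             SELECT b.id, $userID FROM broadcasts b");
?>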

 

Also, your hosting provider may be right: it could be a bad idea. The hardware that makes up your server might not be able to handle a large set of queries like that, and it may be causing the MySQL server to crash. I have had similar issues in the past with some of my clients' hosting companies, where too many queries going on at once bottleneck the server and choke it to a halt.

 

Alternatively, you could go along the lines you currently are. If you're running the insert in a dynamic loop, you could try keeping a count, $i = 0;, incrementing it on every pass of the loop with $i = $i + 1;, and when $i reaches, I dunno, say 5,000, put a sleep/pause in the script for a few seconds to allow the server to catch up, then reset the count to 0 and continue. Although I will say that's a fairly piss-poor alternative, more so as you say you're building a social network: you only have 40k now, but if your service flourishes and becomes the next FB, for example, then you're eventually going to have millions, and when you reach a couple hundred thousand you're going to start having slight issues with your database in general unless you handle it correctly now, up to and including load balancing between multiple SQL servers.
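Purely as a sketch of that counter-and-pause idea (the table and column names are invented, and the batch size and sleep length are guesses you'd want to tune):

<?php
// Hypothetical sketch: insert row by row, but pause every 5,000 rows
// so the server can catch up.
set_time_limit(0); // affects only this script, not the global setting

$result = mysql_query("SELECT id FROM users");
$i = 0;

while ($row = mysql_fetch_assoc($result)) {
    $userID = (int) $row['id'];
    mysql_query("INSERT INTO messages (user_id, body) VALUES ($userID, 'You have a new message.')");

    $i = $i + 1;
    if ($i >= 5000) {
        sleep(2); // let the server breathe for a couple of seconds
        $i = 0;   // reset the count and continue
    }
}
?>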

 

 


Thank you everyone for your responses so far, they're much appreciated.

 

If your code that inserts 40,000 rows is taking more than a few seconds, there is probably something wrong with your code, and you would need to post it in order to get help with it. I'm going to guess you are putting a query inside of a loop. You should be using either a multi-row INSERT or an INSERT ... SELECT statement.

 

That's an interesting point. I don't have the code to hand at the moment, but it is indeed a loop: literally just getting the user IDs from the DB in one query, looping through them, and for each ID inserting a new row in the mail table with a single query, with the unique user ID in one of the columns. So is there a way to insert 40,000 rows (each with a unique user ID) without a separate query for each one? I'll look into the two things you suggested online myself, but any further advice would be great. Of course, when I'm back at the office I can post the exact code as well.

 

Thanks again guys.

 

 


This is only a suggestion.  It would need to be tested.

 

Suggestion: let MySQL do all the work.

 

1. Create a temp table that holds the message.

2. Insert the message to the temp table.

3. Use an INSERT ... SELECT statement to populate the mail table.

4. Temp table destroys itself when the script ends.

 

<?php
// 1. Create the temp table. Note that `from` is a reserved word in MySQL,
//    so it has to be wrapped in backticks (or renamed, e.g. to `sender`).
$tmp = 'CREATE TEMPORARY TABLE message (
    subject VARCHAR(50) NOT NULL
    , content TEXT NOT NULL
    , `from` VARCHAR(50) NOT NULL
    , ts TIMESTAMP DEFAULT CURRENT_TIMESTAMP)';

mysql_query($tmp) or die(mysql_error());

// 2. Insert the message into the temp table once.
$sql = "INSERT INTO message (subject, content, `from`) VALUES ('New Message','You have a new message.','Administrator')";

mysql_query($sql) or die(mysql_error());

// 3. Populate the mail table: the cross join against `user` produces one
//    copy of the message per user. `to` is also a reserved word.
$put_to_mail_table = "INSERT INTO mail (subject, content, `from`, ts, `to`) SELECT message.subject, message.content, message.`from`, message.ts, user.id FROM message, user";

mysql_query($put_to_mail_table) or die(mysql_error());

// 4. The temporary table is dropped automatically when the connection closes.
?>


It would really help if you showed the fraking code. Here is an example of how to insert all the records in a single query statement. Of course, none of the field names will match your exact situation, since you have provided no code to work from.

-- INSERT ... SELECT takes no VALUES clause; $message must be a quoted
-- (and escaped) string literal.
INSERT INTO privatemessages
(senderID, receiverID, message)
SELECT $adminID AS senderID, userID AS receiverID, '$message' AS message
FROM users
WHERE userID <> $adminID

