
Insert data into 100 million rows without 500 error?


casualventures


So when I execute this, it runs for about 130k rows and then gives a 500 error. Any ideas how to avoid this and get it to complete?

 

$i = 10000000;
while ($i <= 99999999) {
    $public = $i;
    $value  = rand(10000000, 99999999);
    mysql_query("INSERT INTO yummy_table (public, value) VALUES ('$public', '$value')") or die(mysql_error());
    $i++;
}
echo "done";
echo "done";


Why wouldn't you just generate each row at the moment you need it? Insert a row with the next public value and a freshly generated random value on demand; the result will be the same as pre-filling the whole table.

 

To make your existing code run several times faster, you would need to use a multi-value INSERT query -

 

INSERT INTO yummy_table (public, value) VALUES (x,y),(x,y),(x,y),...

 

You would build a query string with as many (x,y) value groups as will fit (MySQL limits the size of each query string via the max_allowed_packet setting.) By putting 10k - 100k rows into each query, you reduce the number of queries by the same factor.
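As a minimal sketch of that approach - using PDO instead of the deprecated mysql_* functions, and a helper name (buildInsert) and chunk size chosen here for illustration - the idea is to build each multi-value statement as one string and execute it once per chunk:

```php
<?php
// Build one multi-value INSERT statement covering the public values
// $start..$end, each paired with a random 8-digit value as in the
// original script. Table and column names are taken from the thread.
function buildInsert(int $start, int $end): string
{
    $values = [];
    for ($i = $start; $i <= $end; $i++) {
        $values[] = '(' . $i . ',' . rand(10000000, 99999999) . ')';
    }
    return 'INSERT INTO yummy_table (public, value) VALUES ' . implode(',', $values);
}

// Insert the full range in chunks of 10k rows, one query per chunk
// (connection details are placeholders - adjust for your server):
// $pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
// for ($start = 10000000; $start <= 99999999; $start += 10000) {
//     $pdo->exec(buildInsert($start, min($start + 9999, 99999999)));
// }
?>
```

With 10k rows per statement this turns 90 million queries into 9,000, which also keeps each statement comfortably under a default max_allowed_packet.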
