
php undefined offset


eo92866


hi... i'm getting undefined offset notices for array offsets 28 through 46 with a query in this format.

 

                $sql="INSERT INTO tableName (name1, name2, name3, name4... this goes all the way to name46) VALUES ('".$data[0]."', '".$data[1]."', '".$data[2]."', '".$data[3]."' ...this goes all the way to name46)";

                $sqlData = mysql_query($sql);

 

 

 

i have read that i can prevent the notices by doing the following, but it's not working for me:

 

                $sql="INSERT INTO tableName
                                                (
                                                'name1' => "isset(.$data[0].) ? $data[0] : 'default value'",
                                                'name2' => "isset(.$data[1].) ? $data[1] : 'default value'",
                                                'name3' => "isset(.$data[2].) ? $data[2] : 'default value'",
                                                'name4' => "isset(.$data[3].) ? $data[3] : 'default value'",
#this goes all the way to name46
					)";

                $sqlData = mysql_query($sql);

 

can anyone please assist me?
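For what it's worth, the isset()/ternary check has to run as PHP code before the SQL string is built, not inside it. Below is a minimal sketch along those lines, reusing the tableName / name1-46 placeholders from the post above; the 'N/A' default is just a stand-in value.

                // make sure offsets 0-45 all exist, and build "name1, ..., name46", before the query string is assembled
                $cols = array();
                for ($c = 0; $c < 46; $c++) {
                    $data[$c] = isset($data[$c]) ? $data[$c] : 'N/A';   // 'N/A' is a placeholder default
                    $cols[]   = 'name' . ($c + 1);
                }

                $sql = "INSERT INTO tableName (" . implode(', ', $cols) . ") VALUES ('" . implode("', '", $data) . "')";
                $sqlData = mysql_query($sql);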


the information is unique... the real column names aren't just name1-46, naturally.

 

i am placing csv data into mysql...

 

$handle = fopen("$value", "r");

$row = 1;

$i = 0;

        while ($i<10000)
                {
                $data = fgetcsv($handle, 1000, ",");
#plus prior code here...


The only way a count() of the $data array would give that high of a value is if there are no new-lines in the file or they are not being seen by php.

 

Perhaps you could post an example of what the csv file structure looks like (how many values per line do you expect to get?)
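One quick way to rule out the "new-lines not being seen" case (old Mac-style \r line endings, for example) is to turn on auto_detect_line_endings and count the fields on the first line. A rough sketch, assuming the same $value file path from the snippet above:

ini_set('auto_detect_line_endings', true);  // helps when the file uses \r-only line endings (this setting is deprecated in PHP 8.1+)
$handle = fopen($value, "r");               // $value = one csv path, as in the snippet above
$first  = fgetcsv($handle, 1000, ",");
if ($first !== FALSE) {
    echo count($first) . " fields on the first line\n";   // should print 46 if the line endings are being seen
}
fclose($handle);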


the csv file has 46 columns and over 10000 rows. these columns consist of fields that can be null values, but mainly have text, some numbers and some special characters like @, etc. i expect to get anywhere between 8000 and 10000 characters per row/line.

 

an example of the data:

 

US      9200        This is the first day.        382700          Above-12        etc...

 

does this help?


so all of the code up to the mysql insert is:

 


$files = glob("/my/file/path/US_*.csv");

// just to see what is there
#print_r($files);


$max_loop = 1;
$count = 0;

//loop for iterating $files into array $file and then into $value
foreach($files as $file => $value)
        {
        //increment the loop by 1
        $count++;
        //set the variable $count to variable $max_loop
        if ($count==$max_loop)
                {
                //create a break, because now have made the counter equal to 1
                break;
                //get the contents of $value
                $contents = file_get_contents($value);
                }
        }

// just to see what is there
#print_r($contents);


// place $value into fopen... to be opened and into array $handle, the "r" stands for read only
$handle = fopen("$value", "r");

$row = 1;

$i = 0;

        while ($i<10000)
                {
                $data = fgetcsv($handle, 1000, ",");
//undefined offset error on $sql
                $sql="INSERT INTO tableName (name1, name2, name3, name4... this goes all the way to name46) VALUES ('".$data[0]."', '".$data[1]."', '".$data[2]."', '".$data[3]."' ...this goes all the way to name46)";

                $sqlData = mysql_query($sql);

 

Notice: Undefined offset: 28 in /myfile.php on line 52

 

 


You are likely reading past the end of the file with your while ($i<10000) logic, or you have lines that don't contain the expected number of csv values.

 

The php.net page for fgetcsv contains an example of how to read until the end of the file is reached, and someone already suggested using count($data) to find out how many pieces of data were actually read from each line.


Before you attempt to process the data, make sure that you are even reading the data. The following is the fgetcsv example from the php.net documentation, wrapped in a foreach() loop that gets all the matching file names out of the folder of your choice -

<?php
$files = glob("myfilepath/US_*.csv");
foreach ($files as $file) {
	echo "File: $file<br />\n";
	$row = 1;
	if (($handle = fopen($file, "r")) !== FALSE) {
		while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
			$num = count($data);
			echo "<p> $num fields in line $row: <br /></p>\n";
			$row++;
			for ($c = 0; $c < $num; $c++) {
				echo $data[$c] . "<br />\n";
			}
			// form and execute the query here...
			//$sql="INSERT INTO tableName (name1, name2, name3, name4... this goes all the way to name46) VALUES ('".$data[0]."', '".$data[1]."', '".$data[2]."', '".$data[3]."' ...this goes all the way to name46)";
			//$sqlData = mysql_query($sql);
		}
		fclose($handle);
	} // end fopen()
} // end foreach()
?>


if count($data) is not equal to the expected value, you should display the file name/line number and data you did get from the file for that line.

 

You either have a row in a file that doesn't have the expected number of csv values, OR you have a new-line as part of the data and the fgetcsv function has stopped reading the line at that point. You might also have data that contains commas but is not enclosed in double-quotes.
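A minimal sketch of that kind of check, assuming 46 expected fields and the $file / $row variables from the example above, placed inside the while(fgetcsv...) loop:

$expected = 46;
if (count($data) != $expected) {
    // report which file/line came up short and what was actually read, then skip it
    echo "File $file, line $row: expected $expected fields, got " . count($data) . "<br />\n";
    print_r($data);
    continue;   // don't build an INSERT from a short row
}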


i think i have a combination of:

 

a row in a file that doesn't have the expected number of csv data

 

 

a new-line as part of the data and the fgetcsv function has stopped reading the line at that point

 

 

which brings me to a question... would it be better to use a different process for handling the data instead of fgetcsv? like one of the functions listed on the fgetcsv page? or am i completely missing something?

 


If some of your data in the file does not contain the expected number of fields, your code needs to deal with that, by either ignoring the whole line or filling the missing values with a known value when it builds the query. If you have data that contains new-lines, as long as that piece of data is enclosed in double-quotes, the new-line will be treated as data and not the end of the line.

 

Computers only do what their code tells them to do. No matter what code you use, it must handle the data or you must fix the data so that it always matches what the code expects.
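Either choice is a one-liner once the row has been read. A rough sketch, assuming 46 columns and the 'N/A' filler used elsewhere in this thread, inside the while(fgetcsv...) loop:

// option 1: skip a short row entirely
if (count($data) < 46) {
    continue;
}

// option 2: pad the missing trailing values with a known placeholder instead
$data = array_pad($data, 46, 'N/A');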


i went back and reviewed the data... all the fields have characters, either with the needed information or with N/A.

 

and in my sql query... i'm looking for 47 positions, so all of the field names i'm inserting into should have a value placed into the array.

 

but now i'm just lost and confusing myself a bit more... because i feel i'm at the same spot i started from with my original code, just gone about differently and not as elegantly.


$k      = array_keys($data);
$fields = array();
for($i = 0; $i < 47; $i++)
{
   if ( !in_array($i, $k) )
   {
      // fill any missing offset with a placeholder value
      $data[$i] = 'N/A';
   }
   // append each column name (note the [])
   $fields[] = 'name' . $i;
}

$query  = "INSERT INTO tableName ( " . implode(', ', $fields) . " ) VALUES ( '" . implode("', '", $data) . "' )";
$result = mysql_query($query) or trigger_error(mysql_error());
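One caution with building the VALUES list by implode(): any value that contains a quote will break the query, and this data is free text. With the mysql_* extension used in this thread, the values can be escaped first, roughly:

// escape every value before it is concatenated into the SQL string
// (assumes a mysql connection is already open, as mysql_query requires)
$data = array_map('mysql_real_escape_string', $data);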


thank you...

 

i'm still not getting this to work properly... i believe my issue is that i'm not placing the field headings into $fields (i've attempted placing all of the field heading names into $fields with no success). i'm receiving a bad argument warning at the implode and a Notice: Column count doesn't match value count at row 1.

 

$files = glob("/my/file/path/US_*.csv");

$handle = fopen("$files", "r");

#               ini_set('auto_detect_line_endings', true);
                $data = fgetcsv($handle, 1000, ",");
                $k = array_keys($data);
                $fields = array();
                        for($i = 0; $i < 47; $i++)
                                {
                                if ( !in_array($i, $k) )
                                        {
                                        $data[$i] = 'N/A';
                                        }
                                $fields = 'name' . $i;
                                }

        $query  = "INSERT INTO tableName ( " . implode(', ', $fields) . " ) VALUES ( '" . implode("', '", $data) . "' )";
        $result = mysql_query($query)or trigger_error(mysql_error());

                fclose($handle);
                print "Import done";

mysql_close($con);
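Two things in this last snippet would likely produce exactly those messages: glob() returns an array of paths, so fopen("$files", "r") tries to open the literal string "Array" rather than a file, and $fields = 'name' . $i overwrites a string on every pass instead of appending to an array, which is what makes implode() complain about a bad argument (and the resulting empty column list then triggers the column-count error). A minimal correction of those two lines, assuming only the first matching file is wanted:

$handle = fopen($files[0], "r");   // glob() gave an array, so open one element (or loop over $files as in the earlier example)

// ...

$fields[] = 'name' . $i;           // append each column name to the array instead of overwriting a string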
