
This is my array:

array(4) {
  [0]=>
  array(500000) {
    ["1234"]=>
    array(3) {
      ["fileName"]=>
      string(10) "monkey.jpg"
      ["path"]=>
      string(20) "animales/monkey.jpg"
      ["dateTime"]=>
       string(19) "2016-10-12 19:46:25"
    }
    ["3456"]=>
    array(3) {
      ["fileName"]=>
      string(9) "horse.jpg"
      ["path"]=>
      string(19) "animales/horse.jpg"
      ["dateTime"]=>
       string(19) "2016-10-12 19:46:25"
    }
    .... and many more...
  }
 ... and many more...
}

I want to store the content into my database:

$sql = "INSERT INTO files (id,fileName,path,dateTime) values(?,?,?,?) ";

foreach($array as $key => $value){
    if(is_array($value)){
        foreach($value as $key => $v){
            foreach($v as $k => $item){
                if(is_array($v)){
                    $s = str_replace("\\","",$v['dateTime']);
                    $d = strtotime($s);
                    $dateTime = date('Y.m.d H:i:s', $d);  
                    $q->execute(array($key,$v['fileName'],$v['path'],$dateTime));
                }
            }
        }
    }
}

My problem is that I have over 500,000 entries, so my system crashes. I think it is because there are so many loops nested inside each other. Is there a way to read the content with only one loop, or some other, faster way?

Note: `$array` is a spliced array, created like this: `$array[] = array_splice($original_array, 0, count($original_array));` I actually did that to make the system faster.

peace_love
    *Notice:* Your last for loop is redundant. – Joas May 21 '18 at 08:27
  • Instead of inserting data in a loop, build an array and insert 100 rows at a time. It will execute faster. Also, increase your execution time in php.ini – Mitul May 21 '18 at 09:09
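
A minimal sketch of the batching idea from the comment above, assuming a PDO connection in $pdo; the insertBatch helper and the batch size are illustrative, not from the question:

$batchSize = 100;
$rows = [];
foreach ($array as $chunk) {
    foreach ($chunk as $id => $file) {
        // dateTime is already a 'Y-m-d H:i:s' string, usable as-is for a DATETIME column
        $rows[] = [$id, $file['fileName'], $file['path'], $file['dateTime']];
        if (count($rows) === $batchSize) {
            insertBatch($pdo, $rows);
            $rows = [];
        }
    }
}
if ($rows) {
    insertBatch($pdo, $rows); // flush the remainder
}

function insertBatch(PDO $pdo, array $rows) {
    // Build one multi-row INSERT: (?,?,?,?),(?,?,?,?),...
    $placeholders = implode(',', array_fill(0, count($rows), '(?,?,?,?)'));
    $stmt = $pdo->prepare("INSERT INTO files (id,fileName,path,dateTime) VALUES $placeholders");
    $stmt->execute(array_merge(...$rows)); // flatten the rows into one flat parameter list
}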

2 Answers


Please have a look at this answer:

MYSQL import data from csv using LOAD DATA INFILE

You should convert your data to CSV and rely on LOAD DATA INFILE.

Note that you will need to upload the CSV file to your MySQL server to rely on this MySQL functionality.
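
A sketch of that approach, assuming a PDO connection in $pdo opened with PDO::MYSQL_ATTR_LOCAL_INFILE => true; the /tmp/files.csv path is an assumption:

// Dump the array to a CSV file first.
$fh = fopen('/tmp/files.csv', 'w');
foreach ($array as $chunk) {
    foreach ($chunk as $id => $file) {
        fputcsv($fh, [$id, $file['fileName'], $file['path'], $file['dateTime']]);
    }
}
fclose($fh);

// Then let MySQL bulk-load it in a single statement.
$pdo->exec("LOAD DATA LOCAL INFILE '/tmp/files.csv'
            INTO TABLE files
            FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
            LINES TERMINATED BY '\\n'
            (id, fileName, path, dateTime)");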

koalaok

Using serialize or json_encode is probably the way to go. This way, you won't have to traverse all the elements and handle slashes, because everything becomes a string that can later be read back using PHP's json_decode or unserialize functions.
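
For example, a sketch of the JSON variant; the file_dumps table with its single data column is hypothetical:

// Store each chunk of the array as one JSON string.
$stmt = $pdo->prepare('INSERT INTO file_dumps (data) VALUES (?)');
foreach ($array as $chunk) {
    $stmt->execute([json_encode($chunk)]);
}

// Reading it back later needs no slash handling at all.
$json  = $pdo->query('SELECT data FROM file_dumps LIMIT 1')->fetchColumn();
$files = json_decode($json, true); // associative array again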

Side note: please use meaningful variable names so that the people helping you won't have to figure out what you meant, e.g.:

foreach($v as $k => $item){

is a bit worse than

foreach($fileCollection as $fileIterator => $fileDetails){

If you REALLY need to traverse all the data and store each file property in a separate column, all you need is two foreach loops (one for the collection, and one for each file).

foreach ($globalCollection as $fiveHundredThousandRows) {
    foreach ($fiveHundredThousandRows as $fileIterator => $fileData) {
        // Convert the dateTime string once per row, then insert.
        $timestamp = strtotime($fileData['dateTime']);
        $q->execute(array($fileIterator, $fileData['fileName'], $fileData['path'], date('Y.m.d H:i:s', $timestamp)));
    }
}
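
For completeness, the question never shows how $q is prepared; a sketch assuming PDO, with a single transaction around the loop so MySQL does not commit once per row:

$q = $pdo->prepare('INSERT INTO files (id,fileName,path,dateTime) VALUES (?,?,?,?)');
$pdo->beginTransaction();
// ... run the two foreach loops above here ...
$pdo->commit();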
Jan Myszkier
  • I replaced my code with yours, but I only get a blank page. Have not figured out why yet – peace_love May 21 '18 at 09:41
  • `$globalCollection[0]`: this is where I use my array: `$array[0]`? – peace_love May 21 '18 at 09:49
  • Found what the problem was. It was something with the date. – peace_love May 21 '18 at 09:51
  • Will check now, if it works with huge file and let you know – peace_love May 21 '18 at 09:52
  • Hi, it works, but now I only get the first block of the array, with the key 0: `[0]=> array(500000) {..}` My other keys `[1]=> array(500000) {..}, [3]=> array(245) {..}` are not inserted – peace_love May 21 '18 at 10:02
  • Ah, my bad then. You WILL need that additional second foreach to iterate over those array elements, where each element is 500000 items long. Updating the answer in a moment... – Jan Myszkier May 21 '18 at 10:17
  • Would it be better for performance to loop through the first block's key, insert it into the database, and after that start the next loop? Meaning, treat the keys separately somehow – peace_love May 21 '18 at 10:25