This is my array:
array(4) {
  [0]=>
  array(500000) {
    ["1234"]=>
    array(3) {
      ["fileName"]=>
      string(10) "monkey.jpg"
      ["path"]=>
      string(20) "animales/monkey.jpg"
      ["dateTime"]=>
      string(19) "2016-10-12 19:46:25"
    }
    ["3456"]=>
    array(3) {
      ["fileName"]=>
      string(9) "horse.jpg"
      ["path"]=>
      string(19) "animales/horse.jpg"
      ["dateTime"]=>
      string(19) "2016-10-12 19:46:25"
    }
    ... and many more ...
  }
  ... and many more ...
}
I want to store the contents in my database:
$sql = "INSERT INTO files (id, fileName, path, dateTime) VALUES (?, ?, ?, ?)";
$q = $pdo->prepare($sql);

foreach ($array as $chunk) {            // one chunk per array_splice() call
    if (!is_array($chunk)) {
        continue;
    }
    foreach ($chunk as $id => $v) {     // $id => ["fileName", "path", "dateTime"]
        // strip escaping backslashes and convert to MySQL DATETIME format
        $s = str_replace("\\", "", $v['dateTime']);
        $dateTime = date('Y-m-d H:i:s', strtotime($s));
        $q->execute(array($id, $v['fileName'], $v['path'], $dateTime));
    }
}
My problem is that I have over 500,000 entries, so my system crashes. I think it is because of the loops nested inside each other. Is there a way to read the content with only one loop, or some other, faster approach?
Note: $array is a spliced array created like this: $array[] = array_splice($original_array, 0, count($original_array)); — I did that to try to make the system faster.
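One thing I have been wondering about: would wrapping the inserts in a transaction help? A rough sketch of what I mean (assuming $pdo is the PDO connection the prepared statement $q comes from; the batch size of 10,000 is an arbitrary guess, not something I have measured):

```php
<?php
$sql = "INSERT INTO files (id, fileName, path, dateTime) VALUES (?, ?, ?, ?)";
$q = $pdo->prepare($sql);

$pdo->beginTransaction();
$i = 0;
foreach ($array as $chunk) {
    foreach ($chunk as $id => $v) {
        // normalize the stored date string into MySQL DATETIME format
        $s = str_replace("\\", "", $v['dateTime']);
        $dateTime = date('Y-m-d H:i:s', strtotime($s));
        $q->execute(array($id, $v['fileName'], $v['path'], $dateTime));

        // commit in batches so a single huge transaction doesn't build up
        if (++$i % 10000 === 0) {
            $pdo->commit();
            $pdo->beginTransaction();
        }
    }
}
$pdo->commit();
```

The idea being that each row would no longer pay for its own implicit commit. I have not benchmarked this, so I don't know if it addresses the crash or only the speed.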