What is a reasonable time to load a CSV file into a 2-dimensional array in memory, where the number of columns is fixed (406) and the number of rows is about 87,000? In Perl it takes about 12 seconds from either hard disk (SATA) or SSD. Other languages are fine if the speed can be greatly improved. I expected the time to be much less! The size on disk of the referenced CSV file is 302 MB.
A snippet of the relevant Perl is below:
while (my $iline = <$CSVFILE>)
{
    chomp($iline);
    my @csv_values = split /,/, $iline;
    # Use $csv_values[0], which is the CODE/label, as the hash key.
    my $hashname = $csv_values[0];
    $Greeks{$hashname} = [@csv_values];  # Store a reference to a copy of the array.
}
In the loop above, the majority of the time is consumed in the split and in the hash-key insertion lines.
I tried a similar test in Python (not my strong suit), and the performance was much, much worse. FYI: the CPU is an Intel 3.2 GHz i7-3930K with 32 GB RAM on a 64-bit OS (Windows 10) for the referenced performance figures. Thanks for constructive ideas!
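For reference, the Python test I tried was roughly along these lines (a minimal sketch, not my exact script; the `load_csv` name and the tiny in-memory sample are placeholders standing in for the real 302 MB file):

```python
import csv
import io

def load_csv(fh):
    """Load CSV rows into a dict keyed by the first column,
    mirroring the Perl %Greeks hash (assumed structure)."""
    table = {}
    for row in csv.reader(fh):
        # row[0] is the CODE/label, used as the key; the whole
        # row (including the key) is kept as the value, as in Perl.
        table[row[0]] = row
    return table

# Tiny in-memory sample standing in for the real file on disk.
sample = io.StringIO("AAPL,1.0,2.0\nMSFT,3.0,4.0\n")
table = load_csv(sample)
```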