Issue
I’m writing a PHP program that needs to list the values of a large CSV file ordered by string length.
I’m currently using:
if (($process = fopen($CSV_file, "r")) !== FALSE)
{
    $CSV_array = array();
    // fgetcsv's second argument is the maximum line length (0 = unlimited);
    // the delimiter goes in the third argument.
    while (($CSV_line = fgetcsv($process, 0, ";")) !== FALSE)
    {
        $CSV_array = array_merge($CSV_array, $CSV_line);
    }
    fclose($process);
}
usort($CSV_array, function($a,$b) {return mb_strlen($b) - mb_strlen($a);});
Xdebug tells me that usort takes almost half the total running time of the program (not shown here). Do you know of any way to optimize this piece of code?
Solution
Without changing the code too much, a first step would be a form of memoization, also known as the decorate-sort-undecorate (Schwartzian transform) pattern:
$array = array_map(function ($str) { return array($str, mb_strlen($str)); }, $array);
usort($array, function($a, $b) { return $b[1] - $a[1]; });
$array = array_map('current', $array);
This way mb_strlen is computed once per string, instead of being called twice on every one of the O(n log n) comparisons usort performs.
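A further step in the same direction, sketched here as an assumption rather than part of the original answer: precompute the lengths into a parallel array and let array_multisort do the ordering entirely in C, so no userland comparator is invoked at all. The sample values are hypothetical stand-ins for $CSV_array:

```php
<?php
// Hypothetical sample data standing in for the CSV values.
$CSV_array = ["pear", "fig", "banana", "kiwi"];

// Decorate: compute each string's length exactly once.
$lengths = array_map('mb_strlen', $CSV_array);

// Sort both arrays by length, longest first. array_multisort runs
// in C, avoiding a PHP-level comparator call per comparison.
array_multisort($lengths, SORT_DESC, SORT_NUMERIC, $CSV_array);

print_r($CSV_array); // longest string ("banana") comes first
```

Note that array_multisort reindexes the value array, which is harmless here since the original code only needs the ordered list.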
Answered By – deceze
Answer Checked By – Katrina (BugsFixing Volunteer)