As already discussed in other answers, the best strategy is to always merge the two items with minimal cost in each iteration. So the only remaining question is how to efficiently take the two smallest items each time.
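For reference, that strategy can be sketched directly with a min-heap before any further optimization. This is my own baseline sketch (the names `MergeCostBaseline` and `CostOfMergeHeap` are mine, and it assumes .NET 6+ for `PriorityQueue`), not NetMage's version:

```csharp
using System;
using System.Collections.Generic;

static class MergeCostBaseline
{
    // Always merge the two smallest remaining items; a min-heap yields
    // them in O(log n) per merge, O(n log n) overall.
    public static long CostOfMergeHeap(IEnumerable<int> items)
    {
        var pq = new PriorityQueue<int, int>(); // .NET 6+ assumed
        foreach (var x in items)
            pq.Enqueue(x, x);

        long total = 0;
        while (pq.Count > 1)
        {
            var sum = pq.Dequeue() + pq.Dequeue(); // two current minima
            total += sum;                          // pay the merge cost
            pq.Enqueue(sum, sum);                  // merged pile goes back
        }
        return total;
    }
}
```

For example, for the items {2, 3, 4, 6} this merges 2+3=5, then 4+5=9, then 6+9=15, for a total cost of 29.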
Since you asked for best performance, I shamelessly took the algorithm from NetMage and modified it to speed it up by roughly 40% for my test case (thanks and +1 to NetMage).
The idea is to work mostly in place on a single array.
Each iteration increases the starting index by 1 and moves elements within the array to make room for the sum from the current iteration.
public static long CostOfMerge2(this IEnumerable<int> psrc)
{
    long total = 0;
    var src = psrc.ToArray();
    Array.Sort(src);
    var i = 1;
    int length = src.Length;
    while (i < length)
    {
        // merge the two smallest remaining items
        var sum = src[i - 1] + src[i];
        total += sum;
        // find insert position for sum among the remaining items
        var index = Array.BinarySearch(src, i + 1, length - i - 1, sum);
        if (index < 0)
            index = ~index;
        --index;
        // shift items that come before insert position one place to the left
        if (i < index)
            Array.Copy(src, i + 1, src, i, index - i);
        src[index] = sum;
        ++i;
    }
    return total;
}
I tested with the following calling code (switching between CostOfMerge and CostOfMerge2), with a few different values for the random seed, the element count, and the maximum value of the initial items.
static void Main(string[] args)
{
    var r = new Random(10);
    var testcase = Enumerable.Range(0, 400000).Select(x => r.Next(1000)).ToList();
    var sw = Stopwatch.StartNew();
    long resultCost = testcase.CostOfMerge();
    sw.Stop();
    Console.WriteLine($"Cost of Merge: {resultCost}");
    Console.WriteLine($"Time of Merge: {sw.Elapsed}");
    Console.ReadLine();
}
Result for the shown configuration with NetMage's CostOfMerge:
Cost of Merge: 3670570720
Time of Merge: 00:00:15.4472251
My CostOfMerge2:
Cost of Merge: 3670570720
Time of Merge: 00:00:08.7193612
Of course the exact numbers are hardware-dependent, and the difference may be bigger or smaller depending on many factors.