
I have a C# class that stores information on a stack to be used by other pieces of the application later. The information is currently stored in a class (without any methods) and consists of some ints, shorts, floats and a couple of boolean values. I can be processing 10-40 of these packets every second - potentially more - and the information on the stack is removed when it's needed by another part of the program; however, this isn't guaranteed to happen at any definite interval. The information also has a chance of being incomplete (I'm processing packets from a network connection). Currently I have this represented as such:

public class PackInfo
{
    public bool active;
    public float f1;
    public float f2;
    public int i1;
    public int i2;
    public int i3;
    public int i4;
    public short s1;
    public short s2;
}

Is there a better way this information can be represented? There's no chance of the stack getting too large (most of the information will be cleared if it starts getting too big), but I'm worried that there will be a needless amount of memory overhead involved in creating so many instances of a class that acts as little more than a container for this information. Even though this is neither a computationally complex nor a memory-intensive task, I don't see it scaling well should it become either.

Micheal Hill

1 Answer


This sounds like a good place to use a generic Queue for storing these; I'm assuming you're handling these "messages" in order.
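
For illustration, here is a minimal sketch of that idea (the PacketBuffer wrapper and its method names are just assumptions for the example, not part of your code):

using System.Collections.Generic;

public class PacketBuffer
{
    // FIFO buffer so packets are handled in the order they arrived.
    private readonly Queue<PackInfo> packets = new Queue<PackInfo>();

    // Called by the network-reading code for each parsed packet.
    public void Add(PackInfo info)
    {
        packets.Enqueue(info);
    }

    // Called by the consuming code; returns false when nothing is waiting.
    public bool TryGetNext(out PackInfo info)
    {
        if (packets.Count > 0)
        {
            info = packets.Dequeue();
            return true;
        }

        info = null;
        return false;
    }
}

If the producer and consumer run on different threads, you'd want to add locking or use ConcurrentQueue<PackInfo> instead, but the idea is the same.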

As for the overhead of instantiating these classes, I don't think that instantiating 10-40 per second will have any visible impact on performance; the actual processing you do afterwards is likely to be a much better candidate for optimization than the cost of instantiation.

Also, I would recommend optimizing only when you can actually measure a performance problem in your application; otherwise you might be wasting your time on premature optimization.
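
As a rough sketch of what measuring might look like (the loop size is just an assumption based on your stated 40 packets per second, sustained for a minute), you could time the allocations directly before worrying about them:

using System;
using System.Diagnostics;

class AllocationTiming
{
    static void Main()
    {
        var sw = Stopwatch.StartNew();

        // Simulate a minute of traffic at the high end of the stated rate (40 packets/second).
        for (int i = 0; i < 40 * 60; i++)
        {
            var info = new PackInfo { active = true, f1 = 1.0f, i1 = i };
        }

        sw.Stop();
        Console.WriteLine("Created {0} packets in {1} ms", 40 * 60, sw.ElapsedMilliseconds);
    }
}

If the number that comes out is tiny compared to the time you spend actually processing packets, there's nothing to optimize here.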

Matthew
  • I wasn't so worried about the current implementation, though I was curious whether it would be measurable. I was more concerned about how it would scale - will garbage collection reuse the memory allocated for these objects, or will that bog it down? This answer satisfied my curiosity though, thanks! – Micheal Hill Oct 10 '12 at 06:20
  • Garbage collection only frees memory when the system needs to; it might not collect old objects until a few thousand instances of your object are in memory if you have a high-memory system. As for re-using memory, in effect it sort of already does: the .NET framework calls Windows APIs to allocate memory to the process in large chunks (probably using `VirtualAlloc`), and it is then up to the process to segment that memory in a meaningful way. When garbage collection runs, it doesn't necessarily return all of that memory to the OS. – Matthew Oct 10 '12 at 13:48