
Should I worry about memory fragmentation with std::vector? If so, are there ways to help prevent it? I can't always predict that my programs will be running on a PC; they may also be running on embedded devices or game consoles, so I won't always be able to rely on virtual memory.

Then again, I believe it would be more efficient to use a dynamically sized array rather than a static one, so that memory is only allocated when needed. It would also simplify my programs' design. Are there ways to achieve this efficiently?

Thanks for any advice!

AutoBotAM
  • There's an optional allocator template parameter that you can specify to get tighter control over how memory allocations are made. – Mark Ransom Oct 31 '11 at 15:38

6 Answers


The answer to your worries may be std::deque. It gives you a similar interface to that of std::vector, but works better with fragmented memory, since it allocates several small arrays instead of a large one. It is actually less efficient than std::vector in some aspects, but for your case it may be a good trade-off.

Gorpik
  • can you cite a source for the claims you make? Preferably a standard reference? **Update** found a start myself: http://www.gotw.ca/gotw/054.htm – sehe Oct 31 '11 at 15:46
  • @sehe: I'm afraid the standard tends not to enforce implementations, in order to allow for unknown optimisations. It does not even force `std::vector` to be implemented using an internal array, though I highly doubt that anybody could implement it otherwise. But usually `std::deque` is implemented using small arrays. See http://www.cplusplus.com/reference/stl/deque/, for instance, for a short library-independent discussion. – Gorpik Oct 31 '11 at 15:58

If your vector will be reallocated many times then yes, it can cause memory fragmentation. The simplest way to avoid that is to use std::vector::reserve() if you more or less know how big your array can grow.

You can also consider using std::deque instead of vector, so you won't have problem with memory fragmentation at all.

Here is a topic on Stack Overflow that may interest you: what-is-memory-fragmentation.

Piotr Kukielka

std::vector is only as good as new: it simply handles the underlying memory allocation for you. A couple of things you can do, assuming you don't want to write a whole new `new` handler:

Pre-allocate vectors with reserve(), or resize() them, if you know what their eventual size will be; this avoids wasteful memory copies as they grow.

If you are going to be using the vector again with the same size, it's better to keep it and refill it than to delete it and recreate it.

Generally on embedded targets if you know the memory requirements it's best to statically allocate all the memory at the start and divide it up yourself - it's not like another user is going to want some.

Martin Beckett

You should worry about performance and efficiency only when your profiler tells you to (you can be that profiler, but you have to measure, not guess).

Things you can do:

  1. pre-allocate capacity:

     std::vector<int> x(1000); // size() is 1000
    
     std::vector<int> y;
     y.reserve(1000); // size() is 0, capacity is 1000
    
  2. use a custom allocator

The first option is clearly the quick win; the second is more involved, and I only recommend it when your heap profiler tells you that fragmentation is causing problems.

For heap profiling, I suggest

sehe

One good way to minimize repeated memory allocation and reallocation calls with std::vector is to make liberal use of std::vector::reserve() if you have some idea of how many elements your vector will be using. That will preallocate capacity and prevent the resizing of the internal array the vector is maintaining as you add elements via push_back().

Jason

No, std::vector guarantees contiguous storage. You can use vector::reserve() to avoid reallocations as the vector size increases though.

bames53
  • That's precisely why you do need to worry about memory fragmentation. If new can't find, e.g., a spare 4k contiguous block, it can't allocate a vector of 1000 ints. – Martin Beckett Oct 31 '11 at 15:38
  • How would that even relate to heap fragmentation? Heap fragmentation does not depend on the layout of a class; it depends on the allocation patterns and the allocation policy/algorithms. – sehe Oct 31 '11 at 15:41
  • I take 'worrying about memory fragmentation' to mean 'worrying that the memory being accessed is fragmented,' because accessing non-contiguous memory means you probably aren't getting the performance benefits from prefetching and the like. – bames53 Oct 31 '11 at 16:50