
I made a tiny experiment; the code is as follows:

    vector<char> *p = new vector<char>[1024];
    for (size_t i = 0; i < 1024; i++) 
    {
        (*p++).resize(1024 * 1024); // 1 MB each, 1 GB in total
    }

    sleep(5);
    cout << "start to clear" << endl;
    for (size_t i = 0; i < 1024; i++)
    {
        vector<char> tmp;
        tmp.swap(*p++);
    }
    delete [] p;

    cout << "clear over!" << endl;
    sleep (5);

Here, the memory usage is still 1 GB. Why? Thank you very much.

Raymond
  • Question: Where are you resetting `p` to point to the first vector? – Xeo Jul 01 '11 at 02:15
  • Sorry, I don't understand your question. Reset `p` to the first vector? I think with `p = new X[n]`, `p` points to the first position by default. – Raymond Jul 01 '11 at 02:20
  • But then you incremented p 1024 times. – Benjamin Lindley Jul 01 '11 at 02:22
  • Yes, but after the first loop, and before doing the second, you don't set `p` back to the first position, but leave it as it was at the end of the first loop (where you reserve). – Xeo Jul 01 '11 at 02:22
  • Yes, Xeo, you are right, and thanks, Benjamin Lindley. Now the program behaves correctly. Thank you! – Raymond Jul 01 '11 at 02:24

2 Answers


In most implementations, memory isn't returned to the OS immediately; instead it is put on a "free list", because acquiring memory from the OS is often far more expensive than walking such a free list. That is most likely why you still see the 1 GB of memory, wherever you are checking it.

Also, in your code I don't see where you reset `p` after resizing all the vectors, so in the second loop you are basically swapping the empty vector with memory past the end of the array, memory that doesn't belong to you.

Xeo
  • To elaborate: the C/C++ library malloc/new routines, when they need more memory, have to ask the OS itself to grant the process that memory. On UNIX, they'll call `sbrk(2)` (that's the `sbrk(void*)` function in man-page section 2). On most systems, a process's reported memory is a high watermark, with the least recently used pages likely to be written to swap (disk) if system memory gets low. On Linux, large allocations can actually be returned to the operating system. Further reading: http://stackoverflow.com/questions/5994543/problem-usage-memory-in-c – Tony Delroy Jul 01 '11 at 04:59

You didn't reset p to its initial value between the two loops. The second loop doesn't clear the vectors you allocated; it tramples random memory past the block that was allocated for the original p.

I suggest you use (*(p + i)), (p + i)->, or p[i] instead of (*p++). Or even better, use vector<vector<char> >. And instead of swapping with a temporary vector, use the clear() member function.

Edit: Here are two implementations:

    vector<char>* p = new vector<char>[1024];
    for (size_t i = 0; i < 1024; i++) {
        p[i].resize(1024 * 1024);
    }
    sleep(5);
    cout << "start to clear" << endl;
    for (size_t i = 0; i < 1024; i++) {
        p[i].clear();
    }
    delete [] p;
    cout << "clear over!" << endl;
    sleep(5);

    vector<vector<char> > p(1024);
    for (size_t i = 0; i < 1024; i++) {
        p[i].resize(1024 * 1024);
    }
    sleep(5);
    cout << "start to clear" << endl;
    for (size_t i = 0; i < 1024; i++) {
        p[i].clear();
    }
    cout << "clear over!" << endl;
    sleep(5);

Frigo
  • `clear` doesn't free the allocated memory. Swapping with an empty vector does. – Xeo Jul 01 '11 at 02:19
  • `std::array<vector<char>, 1024>` is a good option if you know the size. – Neil G Jul 01 '11 at 02:27
  • As Xeo says, `clear()` doesn't do what your answer suggests... all it does is call destructors for the existing objects and return the `size()` to 0... the capacity and memory usage remain unchanged. See Herb Sutter's old GOTW: http://www.gotw.ca/gotw/054.htm – Tony Delroy Jul 01 '11 at 05:03
  • Never had memory problems with clear() under G++; maybe it is implementation dependent? – Frigo Jul 01 '11 at 16:13