
I am currently following a book from Springer called "Guide to Scientific Computing in C++", and one of its exercises on pointers reads as follows:

"Write code that allocates memory dynamically to two vectors of doubles of length 3, assigns values to each of the entries, and then de-allocates the memory. Extend this code so that it calculates the scalar product of these vectors and prints it to screen before the memory is de-allocated. Put the allocation of memory, calculation and de-allocation of memory inside a for loop that runs 1,000,000,000 times: if the memory is not de-allocated properly your code will use all available resources and your computer may struggle."

My attempt at this is:

for (long int j = 0; j < 1000000000; j++) { 

    // Allocate memory for the variables
    int length = 3;
    double *pVector1;
    double *pVector2;
    double *scalarProduct;
    pVector1 = new double[length];
    pVector2 = new double[length];
    scalarProduct = new double[length];

    for (int i = 0; i < length; i++) { // loop to give values to the entries
        pVector1[i] = (double) i + 1;
        pVector2[i] = pVector1[i] - 1;
        scalarProduct[i] = pVector1[i] * pVector2[i];
        std::cout << scalarProduct[i] << " " << std::flush; // print scalar product
    }
    std::cout << std::endl;

    // deallocate memory
    delete[] pVector1;
    delete[] pVector2;
    delete[] scalarProduct;
}

My problem is that this code runs, but it is inefficient: it takes over a minute before I terminate it, and I would expect the de-allocation of memory to be much faster than that. I assume I am misusing the de-allocation, but I haven't found a proper way to fix it.

ao_martinv
  • Instead of calling the allocator over and over again, try declaring `std::vector` on the outside of the loop, and just call `vector.resize()` to set the number of elements and `vector.clear()` instead of `delete[]` (see the sketch below these comments). Also, if you know the length is always `3`, there is no need to allocate dynamically -- just use arrays. – PaulMcKenzie May 16 '20 at 12:41
  • Why do you assume it should be faster, and why do you assume the problem is with the allocation and deallocation of memory? You're also outputting numbers and flushing the buffer, which is much more time-consuming than the allocations. – Sami Kuhmonen May 16 '20 at 12:43
  • @PaulMcKenzie If you read the question, there's a clear explanation of why it was done this way. – Sami Kuhmonen May 16 '20 at 12:44
  • Sami Kuhmonen: I should have explained that in the question. When I comment out the lines that print (both `cout`s), it still takes over a minute to run. PaulMcKenzie, thanks for the input; I will try that and see if it works. Still, I was hoping for a method using arrays and new/delete, since that is what the book covers in the chapter containing this exercise. – ao_martinv May 16 '20 at 12:49
  • @ao_martinv -- The `vector` also uses dynamic allocation, but it is smart about it. It dynamically allocates memory up front, and only needs to reallocate if the number of entries is greater than the capacity. Also, `clear()` would or should simply set the size member to `0` instead of issuing a call to `delete []`, since `double` is trivially destructible (no need to issue calls to a destructor). So the only overhead is memory allocation for a larger capacity. Note that you could do all of this bookkeeping yourself, so if you can't use vector, just be smarter in how you're using `new[]`. – PaulMcKenzie May 16 '20 at 13:00
  • Your code does exactly what it is supposed to: run for a long time without crashing your computer by running out of memory. The book might be a bit dated, as it assumes you cannot allocate more than 72,000,000,000 bytes before crashing. You can test this by removing the deletes, hence leaking the memory. – Surt May 16 '20 at 13:16
  • Printing 3 billion values and performing 4 billion flushes is bound to take a while. Are you using compiler optimisations? – Alan Birtles May 16 '20 at 13:22
  • @Surt thank you for the explanation. That makes sense, and you are right: memory usage does blow up when the deletes are removed. If you want, you can post it as the answer to the question so that I can mark it as accepted. – ao_martinv May 17 '20 at 23:42
  • @PaulMcKenzie thank you for the explanation. I have recently used vectors a little bit, but I am still struggling with the idea of pointers and memory allocation, and these clarifications make everything clearer. – ao_martinv May 17 '20 at 23:45
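As a minimal sketch of the `std::vector` approach PaulMcKenzie suggests (my own code, not from the book, assuming C++11; the printing is dropped because, as the comments note, the flushes dominate the runtime, and the scalar product is accumulated into a single sum rather than stored in a third array):

#include <vector>

int main() {
    const int length = 3;
    std::vector<double> v1, v2; // declared once, outside the loop

    for (long int j = 0; j < 1000000000; j++) {
        v1.resize(length); // after the first iteration this reuses the existing capacity
        v2.resize(length);

        double scalarProduct = 0.0;
        for (int i = 0; i < length; i++) {
            v1[i] = (double) i + 1;
            v2[i] = v1[i] - 1;
            scalarProduct += v1[i] * v2[i]; // the scalar product is a single sum
        }

        v1.clear(); // sets the size to 0; in practice the capacity (and memory) is kept
        v2.clear();
    }
    return 0;
}

With the allocation hoisted out of the loop, no `new[]`/`delete[]` pair runs per iteration, which is the point of the suggestion.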

1 Answer


Your code does exactly what it is supposed to: run for a long time without crashing your computer by running out of memory. The book might be a bit dated, as it assumes you cannot allocate more than 72,000,000,000 bytes before crashing. You can test this by removing the deletes, hence leaking the memory.
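For reference, a sketch of the leak test described above: the loop from the question with the three `delete[]` calls removed (run at your own risk, since it leaks 3 * 3 * 8 = 72 bytes per iteration, i.e. the full 72,000,000,000 bytes over a complete run):

// WARNING: deliberately leaks memory; watch a system monitor and
// kill the process before it exhausts your RAM.
for (long int j = 0; j < 1000000000; j++) {
    int length = 3;
    double *pVector1 = new double[length];
    double *pVector2 = new double[length];
    double *scalarProduct = new double[length];

    for (int i = 0; i < length; i++) {
        pVector1[i] = (double) i + 1;
        pVector2[i] = pVector1[i] - 1;
        scalarProduct[i] = pVector1[i] * pVector2[i];
    }
    // no delete[] calls here: each iteration leaks its 72 bytes
}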

Surt