Hi
I'm using the list and vector containers, and I need an exact account of the total memory my program uses. I know that these containers add linkage pointers and other per-element overhead. How can I find out exactly how much per element for each container?
Suppose I insert 100 integers into a list<int> and 100 into a vector<int>: exactly how much heap memory is being used? Is there a way to find out?
When I submit my program as a job (qsub), qstat reports an outrageously large value in the v_mem field, so that is not helping.
But that does not give an exact count of the total memory used.
I have checked that with Valgrind's Massif tool, and it too reports extra memory usage, like qstat does (although still within reason).
Well, capacity() just reports a number of elements, nothing in bytes, so it does not take into account the overhead required by each container.
Well, it's actually the size in terms of the number of elements, as the link you sent says: "Return value: The size of the currently allocated storage capacity in the vector, measured in the number of elements it could hold."
I assume you are aware that if an element contains pointers to data allocated elsewhere, the vector doesn't account for the data they point to; each pointer just counts as its own size (usually 4 bytes on a 32-bit system).
Here's a bit from Wikipedia about vectors:
http://en.wikipedia.org/wiki/Vector_%28STL%29
A typical vector implementation consists, internally, of a pointer to a dynamically allocated array,[2] and possibly data members holding the capacity and size of the vector. The size of the vector refers to the actual number of elements, while the capacity refers to the size of the internal array.
This program I just wrote suggests (to me) that the "overhead" may be 20 bytes, i.e. the sizeof of the vector itself, which stays constant across push_back()'s:
When I submit my program as a job (qsub), qstat reports an outrageously large value in the v_mem field, so that is not helping.
How much is "outrageously"? The system may also be including stack space (say, 1 MiB) and static data (code and other things).
Try this:
1. Start program.
2. Pause program (usleep() is not a bad idea).
3. Measure size and note the value.
4. Resume program and allocate structure.
5. Pause program.
6. Measure size and compare with previous value.
This is actually what I usually do when I want a quick check for memory leaks in code with deterministic deallocations: if the size doesn't decrease when it was expected to, there's a leak.
Maybe there is a way by replacing the new operator with something more customizable (or using a custom allocator)? I think a vector will make a series of dynamic allocations as it grows, not just one allocation somewhere, so sizeof(), size(), capacity(), etc. won't help in that case.
Or maybe the simplest way would be to write a minimal program that has no memory leaks, allocate your vector dynamically, and see what Valgrind reports as lost (but the result will also depend on how the allocator manages its memory).
I don't understand the confusion here; it is plain and simple. Say you have a vector<int> named MyV:
1. sizeof(MyV) is the size of the vector's own data members; e.g., if it held a single int* as its data member, it would be equivalent to sizeof(int*).
2. MyV.size() is the number of elements in the vector.
3. MyV.capacity() is the number of elements the vector can hold before automatic or manual reallocation is needed.
4. MyV.size() * sizeof(int) is the number of bytes of memory your elements are taking up.
5. MyV.capacity() * sizeof(int) is the number of bytes of memory dynamically allocated by the vector.
6. MyV.capacity() * sizeof(int) + sizeof(MyV) is how many bytes the vector uses in total, including its own internal data members and the memory it has allocated for your elements. This is most likely what you want.