Subj : Re: largest memory block
To   : borland.public.cpp.borlandcpp
From : maeder
Date : Thu Mar 24 2005 06:23 pm

"Mohsen" writes:

> It is a problem for an application of mine right now. I have to
> deal with huge arrays, say of 1000000 elements or more. Using an
> array of the following type makes the growing process slow:
>
> TArray Nodes(100, 0, 100);
>
> Because the application has to make 9999 reallocations to
> accommodate 1000000 elements. Consider another example:
>
> TArray Nodes(100000, 0, 100000);
>
> When reallocating, the occupied memory is 300000 elements, because
> it needs to keep the original array and also allocate a new block
> of 200000 elements to copy the previous array into. And if it
> cannot do that, the program crashes.

[Assuming you don't know the eventual array size from the beginning,
or you'd initially allocate the right number of elements.]

The ISO C++ Standard requires std::vector's push_back to take
amortized constant time, which in practice forces implementations to
grow the buffer geometrically (e.g. by doubling) rather than by a
fixed increment. If I understand you right, that would help you both
performance- and space-wise.