Subj : Re: largest memory block
To   : borland.public.cpp.borlandcpp
From : Ed Mulroy [TeamB]
Date : Thu Mar 24 2005 08:42 am

You should consider re-examining the program's basic structure.  Using a
preprogrammed container type and keeping everything in memory with a very
large data set has consequences.  A design that is efficient and performs
well for a data set of one size and complexity may not be that way for one
of another size and complexity.

Consider how the information is organized and what resources are
available.  Here are examples of things that might be worth considering:

Win32 offers the concept of a memory mapped file, in which the operating
system's ability to efficiently handle mixed real and virtual memory is
made available to the programmer.

The creators of a preprogrammed class such as TArrayAsVector (what you are
using) made design decisions based on what they assumed its typical use
would be.  If you have the source, it might be worth deriving a customized
version whose allocation scheme is more in line with what you need.

That everything should be in one array is only a concept.  You might
consider an implementation via a wrapper class whose public interface
looks much like what you use now but whose internals differ.

Specifically, why should everything be in one array?  As a crude example,
why wouldn't it work if all entries starting with 'A' were in one array,
those starting with 'B' in another, and so on?  When a reallocation is
needed, its size would be related to the current size of the 'A', 'B' or
whatever vector, not to the size of the entire data set.  Should one of
these reallocations fail, you then have a place in the wrapper to trap it
and decide to do something else to handle the situation.

I mention this not as some kind of deep science but as a patch that
preserves the program's array orientation while sidestepping the
reallocation problem.  (Rough sketches of the memory-mapped file idea and
the per-letter wrapper appear after the quoted message below.)

. Ed

> Mohsen wrote in message
> news:4242891e$1@newsgroups.borland.com...
>
>> Have you determined that this is a problem in an application of
>> yours, or are you afraid it may become one in the future?
>
> It is a problem for an application of mine right now.  I have to deal
> with huge arrays, say of 1,000,000 elements or more.  Using an array of
> the following type makes the growing process slow:
>
>     TArray Nodes(100, 0, 100);
>
> because the application has to make 9,999 reallocations (growing by 100
> elements at a time) to accommodate 1,000,000 elements.  Consider another
> example:
>
>     TArray Nodes(100000, 0, 100000);
>
> When reallocating, the occupied memory is 300,000 elements, because it
> needs to keep the original array and also allocate a new block of
> 200,000 elements into which to copy the previous one.  And if it cannot
> do that, the program crashes.
>
>> I don't understand this.  What is less efficient than what here?
>
> Comparing the previous examples shows which is less efficient.  Reducing
> the array size and its growth delta increases the number of
> reallocations, while reversing that reduces the number of reallocations
> and increases the possibility of crashing.  I think I can clarify what
> my problem is: in my mind, allocating the largest possible value for the
> array size can be more efficient if we know that value in advance.
>
> Mohsen
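
Here is a minimal sketch of the memory-mapped file idea.  It uses the
Win32 calls CreateFile, CreateFileMapping and MapViewOfFile.  The Node
record type, the element count and the file name are placeholders made up
for the example, not anything taken from your program.

// Back a large array with a memory mapped file instead of one huge
// in-memory allocation.  The OS pages pieces in and out as they are
// touched, so no single giant new/malloc (or reallocation) is needed.
#include <windows.h>

struct Node                 // stand-in for whatever one element really is
{
    long   id;
    double value;
};

int main()
{
    const DWORD count = 1000000;                 // one million elements
    const DWORD bytes = count * sizeof(Node);

    // Create (or reopen) a scratch file big enough to back the whole array.
    HANDLE file = CreateFile("nodes.dat", GENERIC_READ | GENERIC_WRITE,
                             0, 0, OPEN_ALWAYS, FILE_ATTRIBUTE_NORMAL, 0);
    if (file == INVALID_HANDLE_VALUE)
        return 1;

    // The mapping reserves the space; CreateFileMapping extends the file
    // to 'bytes' if it is smaller than that.
    HANDLE mapping = CreateFileMapping(file, 0, PAGE_READWRITE, 0, bytes, 0);
    if (!mapping)
    {
        CloseHandle(file);
        return 1;
    }

    Node *nodes = (Node *) MapViewOfFile(mapping, FILE_MAP_ALL_ACCESS,
                                         0, 0, bytes);
    if (nodes)
    {
        nodes[999999].id = 42;   // use it exactly like an in-memory array
        UnmapViewOfFile(nodes);
    }

    CloseHandle(mapping);
    CloseHandle(file);
    return 0;
}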
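
And here is a rough sketch of the per-letter wrapper.  std::vector is used
only as a stand-in for TArrayAsVector, and the Entry type, the
BucketedArray class name and its members are all invented for
illustration.  The point is that each bucket reallocates on its own, and a
failed allocation can be trapped inside Add() instead of crashing the
program.

#include <vector>
#include <string>
#include <cctype>
#include <new>
#include <iostream>

struct Entry
{
    std::string name;
    double      value;
};

class BucketedArray
{
public:
    // Add() returns false instead of crashing: only the one small bucket
    // has to reallocate, and a failure there is caught and reported.
    bool Add(const Entry &e)
    {
        try
        {
            Bucket(e.name).push_back(e);
            return true;
        }
        catch (const std::bad_alloc &)
        {
            return false;     // the place to "decide to do something else"
        }
    }

    unsigned long Count() const
    {
        unsigned long n = 0;
        for (int i = 0; i < 27; ++i)
            n += buckets[i].size();
        return n;
    }

private:
    // 'A'..'Z' map to buckets 0..25, everything else goes to bucket 26.
    std::vector<Entry> &Bucket(const std::string &name)
    {
        int c = name.empty() ? 0 : std::toupper((unsigned char) name[0]);
        int i = (c >= 'A' && c <= 'Z') ? c - 'A' : 26;
        return buckets[i];
    }

    std::vector<Entry> buckets[27];
};

int main()
{
    BucketedArray a;

    Entry e;
    e.name  = "Alpha";
    e.value = 1.0;

    if (!a.Add(e))
        std::cout << "allocation failed, falling back to something else\n";

    std::cout << a.Count() << " entries stored\n";
    return 0;
}

Swapping TArrayAsVector (or any other container) in for the buckets keeps
the same outward behaviour; only the Bucket() dispatch and the failure
handling in Add() matter for the idea.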