Subj : A memory allocation method - questions
To   : comp.programming
From : iouswuoibev
Date : Fri Aug 19 2005 03:05 pm

In the attempt to produce a method for storing data I have come up with a certain (flawed) method of allocating memory. I don't know what this method is called or if it even has a name, so I'm hoping you can help me. I'd also like to know if anybody can see a way around the flaws it presents.

The method is thus: You have a pool of memory, and within this pool you allocate elements. The elements are only permitted to be of a certain fixed size (variable "element_size") particular to any one individual memory pool. Each memory pool has the variable "next_entry" attributed to it. Initially, the value of next_entry is 0 (zero). It also has the variable "max" attributed to it, which is also initialized to 0.

When allocating an element within the pool, so long as next_entry is 0, the next element is placed at the offset max * element_size within the pool. Max is then incremented by 1. Whenever an element gets removed, the variable next_entry is set to that element and the CONTENT of that element is set to the former value of next_entry. From then on, whilst next_entry is non-zero, whenever a new element is allocated, the element pointed to by next_entry is reused: next_entry is set to the value stored inside that element, and the data is then stored in that element.

Ok, so that's the method (let me know if it needs clarifying). My first question is, as I said, what is this called? And my second question pertains to the inherent flaws and limitations. These are: that the elements must be of a predetermined size, that the concept of a pool imposes limitations on how much can be stored, and that there is a huge potential for fragmentation.

This method should always be faster than scanning the pool with a loop to find a free slot, but could the tradeoff of speed in favour of compactness ever be desirable? Is there a way to avoid the fragmentation?

In attempting to avoid the data pool imposing limitations on how much can be stored, I considered an expanding and contracting data pool. But this could end up in a huge wastage of memory. For instance, what if I allocated 1024 elements in sequential order, and then removed the first 1023? The pool would not be able to contract, because the 1024th element would be stopping it. And the only immediately apparent solution to this problem that I can see would be to use a loop to maintain compactness, which would then defeat the whole purpose behind this method, which is its speed.
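
To make the description concrete, here is a minimal sketch in C of the scheme as I understand it: a fixed-size-element pool whose freed slots form a free list threaded through the elements themselves. All the names here (pool_t, pool_init, pool_alloc, pool_free) are my own for illustration; only element_size, max, and next_entry come from the description above. I've used NULL rather than 0 as the "empty" value for next_entry, since element addresses are never NULL, whereas an index of 0 would clash with a legitimate first slot.

#include <stddef.h>
#include <stdlib.h>

typedef struct pool {
    unsigned char *base;    /* start of the pool's memory            */
    size_t element_size;    /* fixed size of every element           */
    size_t capacity;        /* total number of elements in the pool  */
    size_t max;             /* count of slots ever handed out        */
    void *next_entry;       /* head of the free list; NULL if empty  */
} pool_t;

int pool_init(pool_t *p, size_t element_size, size_t capacity)
{
    /* Every slot must be able to hold (and align) the free-list
     * link, so round element_size up to a multiple of a pointer. */
    if (element_size < sizeof(void *))
        element_size = sizeof(void *);
    element_size = (element_size + sizeof(void *) - 1)
                   & ~(sizeof(void *) - 1);

    p->base = malloc(element_size * capacity);
    if (p->base == NULL)
        return -1;
    p->element_size = element_size;
    p->capacity = capacity;
    p->max = 0;
    p->next_entry = NULL;
    return 0;
}

void *pool_alloc(pool_t *p)
{
    if (p->next_entry != NULL) {
        /* A freed slot exists: reuse it, and restore next_entry
         * from the link that pool_free() stored inside it. */
        void *e = p->next_entry;
        p->next_entry = *(void **)e;
        return e;
    }
    if (p->max < p->capacity) {
        /* No freed slot: hand out the slot at max * element_size,
         * then increment max. */
        return p->base + p->element_size * p->max++;
    }
    return NULL;            /* pool exhausted */
}

void pool_free(pool_t *p, void *e)
{
    /* "Removing" an element: store the former next_entry in the
     * element's first bytes, then point next_entry at it. */
    *(void **)e = p->next_entry;
    p->next_entry = e;
}

Both pool_alloc and pool_free are O(1), which is the whole attraction; the price is exactly the limitations listed above (fixed element size, fixed capacity, and fragmentation once frees arrive out of order).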