Subj : Re: largest memory block
To   : borland.public.cpp.borlandcpp
From : Mohsen
Date : Sat Mar 26 2005 07:34 am

> I am familiar with the problem. Years ago I was involved in large-scale
> structural analysis on IBM mainframes with what is now regarded as
> trivial amounts of memory. That necessitated not only streaming data
> and constants to disk, but also 'logical transients', analogous to
> but not exactly the same as overlays.
>
> The idea that because something is used in a computation it therefore
> should always be in memory is not necessarily correct.
>
> Computations are defined in ways that separate them into phases or
> steps. What is not needed in the current step is a candidate to be on
> disk. You are already swapping large portions of code and data out to
> disk as a consequence of Windows' virtual memory implementation.
>
> The net result is that you must provide enough memory in the machine
> to handle the calculations. If you cannot, then you must alter the
> architecture of your program so that it can operate in whatever
> machine environment is realistic for its application. Problem
> partitioning is an elemental requirement for all large array
> calculations.
>
> At some point, railing about the limitations on grabbing large hunks
> of memory with impunity becomes, at best, counterproductive. The
> hardware and operating system you have selected have limitations.
> Accept that and configure your algorithms to work best within them.
>
> Ed

Dear Ed,

That does not really mean I want to keep everything in memory. Anyway, I think you are well-meaning. I am looking at the problem from a structural point of view, and you are looking at it from a computational point of view. I will try to reconsider the basics of my program and use my resources more efficiently.

Mohsen
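
P.S. To make the "problem partitioning" idea concrete for myself, I put together a minimal sketch of processing a large array in fixed-size chunks streamed from a file instead of allocating one huge block. The file name, chunk size, and the running sum are only illustrative assumptions, not anyone's actual code from this thread; it is just one way to keep only the current step's data in memory.

    // Minimal sketch (assumptions: data is a flat file of doubles named
    // "matrix_data.bin"; the reduction is a simple sum of squares).
    #include <fstream>
    #include <vector>
    #include <iostream>

    int main()
    {
        const std::size_t kChunkElems = 1u << 20;      // ~8 MB of doubles per pass
        std::vector<double> buffer(kChunkElems);

        std::ifstream in("matrix_data.bin", std::ios::binary);  // hypothetical data file
        if (!in) {
            std::cerr << "cannot open matrix_data.bin\n";
            return 1;
        }

        double sumOfSquares = 0.0;
        while (in) {
            // Read at most one chunk; gcount() tells us how many bytes actually arrived.
            in.read(reinterpret_cast<char*>(buffer.data()),
                    static_cast<std::streamsize>(buffer.size() * sizeof(double)));
            const std::size_t got = static_cast<std::size_t>(in.gcount()) / sizeof(double);

            // Work only on the part of the problem that is currently in memory.
            for (std::size_t i = 0; i < got; ++i)
                sumOfSquares += buffer[i] * buffer[i];
        }

        std::cout << "sum of squares = " << sumOfSquares << '\n';
        return 0;
    }

The same pattern should apply to any step of the analysis whose data does not fit in a single allocation: the peak working set is bounded by the chunk size rather than by the full problem size.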