Subj : huge data BC3.1 techniques
To   : borland.public.cpp.borlandcpp
From : Walter Briscoe
Date : Sun Jul 20 2003 08:52 am

I am trying to port some code which uses large quantities of memory. I am
using Borland C++ 3.1 as a representative 16-bit compiler. (My code is
bigger than the BC 2.0 compiler can support.) I currently have it hosted on
Windows 2000 Professional SP3.

The first problem is that the debugger (TDW.EXE) used by the Windows IDE
(BCW.EXE) fails because it tries to access the mouse directly. I assume
there is no workaround for that. I currently use the DOS IDE (BC.EXE),
which is limited to an 80x50 screen. Would I do better with a more modern
16-bit compiler? (I have 16-bit support in 4.0, 4.51 and 5.01.)

Enough of that. The code currently fails when it tries to use large
quantities of data. An array of 1,800,000 bytes and another of 450,000
bytes are the main problems. The far heap is limited to 640K, and there
seems to be no way to control its size. I read that large data model
programs may use up to 16MB of data. How? I believe that an individual
data object of 64K or more has to be addressed as a huge object to avoid
address wrap. I infer that a huge object would wrap at 4MB.

I would value either direct knowledge or URLs which show solutions to such
large data problems.

--
Walter Briscoe
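
P.S. For concreteness, this is the sort of allocation I have been
attempting for the smaller of the two arrays. I am assuming that
farmalloc() from <alloc.h> and a huge pointer are the right tools here;
corrections welcome.

    /* Sketch: allocate 450,000 bytes on the far heap and walk it with a
     * huge pointer so that segment arithmetic is normalised past 64K. */
    #include <stdio.h>
    #include <alloc.h>

    #define N 450000UL      /* the smaller problem array */

    int main(void)
    {
        char huge *p;
        unsigned long i;

        printf("far heap free: %lu bytes\n", farcoreleft());

        p = (char huge *) farmalloc(N);     /* far heap block, > 64K */
        if (p == NULL) {
            printf("farmalloc(%lu) failed\n", N);
            return 1;
        }

        for (i = 0; i < N; i++)             /* huge arithmetic, no wrap */
            p[i] = (char) (i & 0xFF);

        printf("p[70000] = %d\n", p[70000]); /* index beyond 64K */

        farfree((void far *) p);
        return 0;
    }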