Newsgroups: comp.os.msdos.programmer
Path: utzoo!utgpu!news-server.csri.toronto.edu!rpi!think.com!snorkelwacker.mit.edu!bloom-picayune.mit.edu!athena.mit.edu!pshuang
From: pshuang@athena.mit.edu (Ping-Shun Huang)
Subject: Re: Porting software to the PC, 64K data structure barrier.
In-Reply-To: ts@uwasa.fi's message of 29 Jun 91 20:37:04 GMT
Message-ID: <PSHUANG.91Jun30005831@w20-575-108.mit.edu>
Sender: news@athena.mit.edu (News system)
Organization: Massachusetts Institute of Technology
References: <1972@contex.contex.com> <1991Jun29.203704.8443@uwasa.fi>
Date: Sun, 30 Jun 91 04:58:36 GMT
Lines: 22

In article <1991Jun29.203704.8443@uwasa.fi> ts@uwasa.fi (Timo Salmi) writes:

 > In TP one has to go around the 64K limitation by a judicious use of
 > pointers (I have a FAQ collection to cover how).  But doesn't Turbo C
 > have what is called a huge model (a rhetorical question).

I think the following table is correct:

MEMORY MODELS (IBM-PC C-compilers nomenclature?)
~~~~~~~~~~~~~
Tiny		near data	near code	[note: same 64Kb segment]
Small		near data	near code
Compact		far data	near code
Medium		near data	far code
Large		far data	far code
Huge		far data	far code	[note: 64Kb+ data objects OK]

[Note: near means the total must fit in one 64Kb segment; far means the total
can exceed 64Kb, but (except in huge) no single object may exceed 64Kb.]
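As a sketch of why huge matters (Borland Turbo C syntax; farmalloc()/farfree()
come from Borland's <alloc.h>, and the `huge' keyword is a Borland/Microsoft
extension, not ANSI C -- details may differ on your compiler):

/* Allocating and walking a >64Kb object.  Compile in the huge
 * model, e.g.:  tcc -mh bigarr.c
 * A plain far pointer would wrap within its 64Kb segment; huge
 * pointers are normalized so arithmetic can cross segment
 * boundaries. */
#include <stdio.h>
#include <alloc.h>

int main(void)
{
    long i;
    char huge *buf = (char huge *) farmalloc(100000L);

    if (buf == NULL) {
        fprintf(stderr, "not enough far heap\n");
        return 1;
    }
    for (i = 0; i < 100000L; i++)   /* spans more than one segment */
        buf[i] = (char)(i & 0xFF);
    farfree(buf);
    return 0;
}

(In the large model the same farmalloc() call may succeed, but indexing past
64Kb through a non-huge pointer silently wraps around.)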

--
Above text where applicable is (c) Copyleft 1991, all rights deserved by:
UNIX:/etc/ping instantiated (Ping Huang) [INTERNET: pshuang@athena.mit.edu]
