Newsgroups: comp.sys.mac.programmer
Path: utzoo!sq!dak
From: dak@sq.sq.com (David A Keldsen)
Subject: Re: THINK C library & opening files - HELP!?!
Message-ID: <1990Sep5.021203.19864@sq.sq.com>
Organization: SoftQuad Inc.
References: <12866@june.cs.washington.edu> <9898@goofy.Apple.COM> <1990Sep3.223651.7921@maths.tcd.ie>
Date: Wed, 5 Sep 90 02:12:03 GMT
Lines: 51

tim@maths.tcd.ie (Timothy Murphy) writes:

[deleted question that I'm not answering]

>Another question: is there any way of #defining outside the program --
>as eg from the command line in Unix or MS-DOS
>cc -Dxyz prog.c

No.  Put such defines in one file, #include as needed.

>While I've got your attention,
>is the 32k data limit due to Mac internals,
>or to THINK C?

Both.  It turns out to be a "reasonable choice" to address this data
as a 16-bit signed offset from a base register (A5, on the Mac), and
that is the choice the developers of THINK C made.  It's pretty common.

>Last question (for the moment):
>I find malloc()...free()
>doesn't actually free the space the *first* time it is called,
>but does on subsequent occasions.
>Has anyone else noticed this?
>Eg

>#include <stdio.h>
>#include <stdlib.h>
>main()
>{
>  int i;
>  char *cp;
>  for (i=0; i<100; i++) {
>    cp = malloc(1000);
>    printf("cp = %p\n", cp);
>    free(cp);
>  }
>}

malloc() doesn't take an int, it takes a size_t as its argument.
Look it up in the manual.  So your first malloc() reads the 1000 off
the stack, plus some other number for the high half of the size_t --
you aren't actually asking for 1000 bytes.  You see a lot of this when
porting code to THINK C from various Unix environments.  A more
typical symptom is a surprising "out of memory" error when you are
just *sure* that you've got plenty.

-- 
// David A. 'Dak' Keldsen:  dak@sq.com or utai!sq!dak
// "Well, I had resolved to be less offended by human nature, but I think
//  I blew it already." -- Calvin and Hobbes

