Newsgroups: comp.os.minix
Path: utzoo!utdoe!david
From: david@doe.utoronto.ca (David Megginson)
Subject: Re: definition of NULL
Message-ID: <1991Jun26.130923.5896@doe.utoronto.ca>
Organization: Dictionary of Old English Project - U of Toronto
References: <1991Jun25.172459.1142@amc.com>
Date: Wed, 26 Jun 1991 13:09:23 GMT


So, it looks like this. If you use a non-ANSI compiler on a system
that has 16-bit ints and 32-bit pointers (i.e. Minix ST), then

	#define NULL (0)

will break just about half the Unix code in existence, because,
without prototypes, the compiler does not know that it must widen
the 0 to 32 bits when it is passed as a function argument. On the
other hand,

	#define NULL ((void *)0)

will break very little, since NULL should never be used as an integer
anyway.

I agree that

	#define NULL (0)

is probably better C style, but I would not be surprised if Minix 68K
users prefer the other so that they can compile programs like cdungeon
using 16-bit compilers. We will, of course, make the change only in
our local copies, and we promise not to whine when it breaks something
in the native Minix code :-)


David

-- 
////////////////////////////////////////////////////////////////////////
/  David Megginson                      david@doe.utoronto.ca          /
/  Centre for Medieval Studies          meggin@vm.epas.utoronto.ca     /
////////////////////////////////////////////////////////////////////////
