Subj : Re: Any help on why?
To   : borland.public.cpp.borlandcpp
From : Bob Gonder
Date : Thu Jul 08 2004 11:29 am

Wayne A. King wrote:
>On Wed, 07 Jul 2004 12:58:52 -0700, Bob Gonder wrote:
>
>>My docs (BCB) show LX as uint64. So don't know about "intended".
>
>BCB is not topical in this particular newsgroup. It's what happens with
>BC++ compilers up to 5.02 that's germane. As I recall, the OP is targeting
>16-bit embedded systems and no Borland DOS compiler supports the
>extended integer types.

The OP reports that LX works for his 5.02 app.

>>Also firmly disagree on "integer types only". You can use %X on
>>ANYTHING that goes on the stack, and has the matching size for the
>>prefix to X. I can't see Any way for that to mess up.
>
>If the syntax defies the language spec, the compiler writer may do
>ANYTHING with it when parsing the source code. This includes ignoring
>one or the other of the type specifier(s) or arguments. In the case of %LX,
>if not specifically supported as an implementation-specific language
>extension, the implementation *may* ignore the L altogether if it's not
>appropriate, or apply it in unknown or unpredictable ways.

Now you're going down the other track. Of course you can only use the
_sizes_ that are implemented. But what you pack into that _size_ argument
on the stack can be Anything (that fits).

>Again, you're
>trying to guess what the compiler implementors are *always* going to do.
>If %X applied to a non-integer type is not specifically supported by the
>language spec or by a compiler extension, the implementor may freely
>and arbitrarily choose to ignore that arg altogether. It's dealer's choice.

I'd like to know _how_ a printf runtime implementor, whose only input is
a string and an unknown bunch of binary data on the stack, can
distinguish between an apple and an orange.
I can see if you tell him to decode a float, and give him something that
bitwise is illegal for a float, that he will have a fit, but int types
have no underlying format, and X takes all ranges of data (in the
supported byte-wise sizes). How is he going to "know" that I put garbage
in my 32-bit parameter? He can't. Why would he care if I did? His job is
to hex those 32 bits. Nothing more, nothing less. printf ain't type-safe
C++, Wayne. It's the wild and wooly world of C.

>>Then they shouldn't have placed 'cast' and 'union' in the language.
>>They encourage bit-fiddling.
>
>Which is why many C/C++ language gurus deprecate their use,

Which is why I don't work for so-called gurus.

>>It's difficult to write operating systems in C without bit fiddling.
>
>The OP is just starting to learn the language.

Which is also why you shouldn't lie to him and tell him he _can't_ use
%X on non-ints. Patently, he _can_ (even though you gnash your teeth at
the thought of it), but he needed to get his sizes right.

>I doubt that he will be
>writing an OS using C any time soon. ;-)

Point was, C is made for bit-fiddling. If you don't want to bit-fiddle,
use C++ or Java, or just about anything else. Just don't condemn a C
programmer for using the language to his benefit (or should that be
bending the language to his will?)

>The rationale behind going
>to C (and other higher level languages such as PLUS - Programming
>Language for Univac/Unisys Systems) when writing OSes was precisely
>to make the OS code portable across vendor's hardware offerings. The
>more bit ops incorporated, the more portability suffers.

Yep, but you can't do it without the bit-fiddling.

>You're an anachronism. ;-))

And proud of it.

>>Depends I guess, on if you want to use it "as designed" in the 60/70's
>>or "as designed" in the 90's.
>
>As designed according to the C90 spec here and elsewhere,
>until Borland releases a C compiler which supports the C99
>spec.
>Of course, one is free to use documented and officially
>supported features which are an extension to the language.
>But even there, trying to use such features in a way which deviates
>from their intended functionality is usually ill-advised.

See, there's that 'intended' word again. I don't see anywhere where the
designer's "intentions" are discussed. The specs say that X decodes an
int. Fine. My belief is that the designer didn't obfuscate the matter by
saying that "X decodes any data whose size is int" because it's too
wordy, and not succinct. So, _I_ think that this usage _was_ intended.
And I don't see how it could be otherwise. (see also next)

>>I still don't think of C as "designed" in that manner.
>
>What do you think the ANSI Standard for the C language was
>written for, if not as a design specification?

To keep Microsoft in line? It's a _compiler_ spec. It's not a program
spec. (It does tell the programmer what to expect the compiler to do
with his code, but he is free to ignore or pervert the usage if he can
and so chooses.) Knowing that X takes an int and hexes it, a programmer
with a brain can figure out that placing something on the stack that
looks like an int (has the size of an int) will be decoded to hex (like
an int).

>Warning mantissa.c(15): printf: %X requires integral argument (arg 2)
>in function main

"Warning" because the programmer _probably_ made an error, but also
"Warning" because the programmer _might_ know what he's doing.
Otherwise, it'd be an "Error".

>printf() and its companion functions (sprintf() and fprintf()) can be
>one of the most prolific sources of problems

Yes, they "can" be, as can just about anything else if you don't know
what you're doing.