Subj : Re: Any help on why?
To   : borland.public.cpp.borlandcpp
From : waking
Date : Fri Jul 09 2004 12:28 am

On Thu, 08 Jul 2004 10:29:37 -0700, Bob Gonder wrote:

>The OP reports that LX works for his 5.02 app.

Again a misuse of specifiers. "L" is intended for use with long
doubles, which have a size of 80 bits. Using it with doubles, which
have a size of 64 bits, results in undefined behavior. If that
undefined behavior just happens to appear to give you the result you
were hoping for, it is still bad programming practice. The side
effects are unknown and not always immediately apparent (e.g.,
possible corruption of adjacent memory or the stack, incorrect
parsing/processing of subsequent arguments, etc.).

>you can only use the _sizes_ that are implemented. But what you pack
>into that _size_ argument on the stack can be Anything (that fits).

See above. You're recommending using a wrong size modifier. I clearly
alluded to the parsing of the source code. The parser can tell from
the source that a type mismatch is present, and freely choose to
ignore it, alter it, pass it unchanged, etc. The effects of
mismatching are unpredictable.

You appear to be working on the assumption that the compiler will
always blindly apply the specifiers without regard for the type of
the actual matching arguments. While an implementor *may* do so, it
is not guaranteed, nor is it a reliable assumption. If it just
happens to "work" in a specific instance, it's blind luck, not clever
programming.

>I'd like to know _how_ a printf runtime implementor, whose only input
>is a string and an unknown bunch of binary data on the stack, can
>distinguish between an apple and an orange.

The parser can distinguish an integer from a float type. It's looking
at the source code, not the stack.

>int types have no underlying format, and X takes all ranges of data

Not entirely correct. Int formats are sensitive to endian issues and
sign bits. X is specifically intended for use with ints.
If you pass a float to it, the parser can detect that. (As shown in
the warnings from CLint++.) If the parser chooses to process it
anyway as raw bits, you got lucky. But the parser may also choose to
silently ignore it, or assume an int of a different size, etc. The
parser may be written to determine correctly the sizes of integer
arguments when processing X, and may assume a default size if passed
an illegal type. The effects are unpredictable, undefined and
unreliable.

>printf ain't type-safe C++,

That's no excuse for abusing it.

>>Which is why many C/C++ language gurus deprecate their use,
>
>Which is why I don't work for so-called gurus.

Do I detect an attitude problem? You can choose to be a professional
C/C++ programmer or an undisciplined hacker. But if you choose to be
the latter, don't pass yourself off as the former or denigrate those
who strive to bring order out of chaos.

>Which is also why you shouldn't lie to him :

"lie"?? That implies deliberate deception and impugns my integrity.
Personal insults are not welcome in these newsgroups.

>and tell him he _can't_ use %X on non-ints.

Fine. If you need it qualified: He can't use %X on non-ints and:

(1) expect reliable, consistent, predictable results
(2) expect to do so with impunity
(3) expect any degree of portability in his code
(4) be considered a competent C programmer

>C is made for bit-fiddling. If you don't want to bit-fiddle, use C++
>or Java, or just about anything else.

I wouldn't say it was *made* for that purpose. That it lends itself
to that end may be seen as either a strength or a weakness of the
language, depending on one's goals, needs and mindset.

>Just don't condemn a C programmer for using the language to his
>benefit

More like using loopholes and exploiting weaknesses.

>(or should that be bending the language to his will?)

In this case, I'd say the latter.
When using features of the C Standard Library in a way and for a
purpose other than as designed and intended by the language spec and
the compiler implementor, it's not just "bending", it's perverting
and quite possibly "breaking".

>>You're an anachronism. ;-))
>
>And proud of it.

I was sure it was a deliberate choice.

>The specs say that X decodes an int.

Correct.

>My belief is that the designer didn't obfuscate the matter by saying
>that "X decodes any data whose size is int" because it's too wordy,
>and not succinct.

Language lawyers who craft formal specs are hardly renowned for their
succinctness. Such specs are noted for their verbosity.

>So, _I_ think that this usage _was_ intended.

I suspect you're alone in this world on that assumption.

>It's a _compiler_ spec. It's not a program spec. (It does tell the
>programmer what to expect the compiler to do with his code, but he
>is free to ignore or pervert the usage if he can and so chooses.)

Sure. Give him enough rope and he'll shoot himself in the foot.
Perverting the language may have its appeal for quick and dirty
programs (programmers?), but it is not a practice to be encouraged
widely. Neophyte C programmers should be taught and encouraged to use
the language as designed and in a safe and reliable manner. After
they've become competent in the "correct" use of the language, they
can then safely explore aberrations from a solid base to which they
can (and should) return as soon as possible.

>Knowing that X takes an int and hexes it, a programmer with a brain
>can figure out that placing something on the stack that looks like
>an int (has the size of an int) will be decoded to hex (like an int)

Are you suggesting that compiler writers don't have brains? Passing a
non-int to a function such as printf when only an int is expected
presumes that the implementation has been coded to determine
correctly the size of *all* types. It may not have been designed to
do that.
It may use a default size if the arg passed is not an integer type.
This may or may not happily coincide with the incorrect type of arg
being passed. For efficiency, C library functions are "bare bones"
implementations. They place the burden of responsibility for correct
usage squarely on the programmer.

>"Warning" because the programmer _probably_ made an error,

Most likely.

>"Warning" because the programmer _might_ know what he's doing.

Most unlikely.

>Otherwise, it'd be an "Error".

Right. Passing incorrect specifiers to printf won't cause a compile
to abort. It *will* usually cause the program to malfunction when
run. Passing the incorrect number of arguments or specifiers will
also pass undetected in most compilers. That doesn't make it
"cleverness" on the part of the programmer.

--
Wayne A. King
(waking@idirect.com, Wayne_A_King@compuserve.com)