[HN Gopher] How were the signs for logical and arithmetic operat...
       ___________________________________________________________________
        
       How were the signs for logical and arithmetic operators decided?
        
       Author : edent
       Score  : 62 points
       Date   : 2023-09-28 18:36 UTC (4 hours ago)
        
 (HTM) web link (retrocomputing.stackexchange.com)
 (TXT) w3m dump (retrocomputing.stackexchange.com)
        
       | crabbone wrote:
        | A lot of math symbols are accidental and owe their existence
        | to the circumstances of writing, book publishing, typewriters
        | or, later, computer keyboards. Very few have any kind of
        | history that helps explain their meaning.
       | 
        | Just a few examples: the "lambda" in "lambda calculus" was
        | originally the caret sign, which, by convention, was used on
        | typewriters because those didn't have Greek letters. Whether
        | Alonzo Church actually wanted the Greek letter or the caret
        | symbol is not known (and he himself probably wouldn't have
        | cared), but if you look at the manuscript, it uses "^".
       | 
        | The minus sign was an attempt to create a sign "opposite to
        | plus", whereas the plus sign was a contraction of Latin "et"
        | (the ampersand is another way in which Latin "et" made it
        | into modern typesetting). Multiplication was also an attempt
        | to manipulate the plus sign. And so was division (the
        | horizontal line between two dots, which subsequently mutated
        | into just two dots, making the ":" sign).
       | 
        | The asterisk, which is used plenty in different mathematical
        | contexts, was actually a punctuation sign. Early Roman
        | writing didn't use spaces between words; instead, dots were
        | put between them. It was _the only_ punctuation sign the
        | Romans used, so no commas, periods, question marks etc. All
        | that stuff came almost a thousand years later.
       | 
        | A bunch of logical signs were invented by Frege and later
        | popularized / amended by Wittgenstein. These were completely
        | artificial and didn't derive from anything. I don't think the
        | authors ever explained why they chose any specific shapes; my
        | guess would be that they were looking for something visually
        | distinct enough for the small alphabet they created, and
        | arbitrarily assigned symbols to operations.
        
         | joe_the_user wrote:
         | Interesting post but I think you're wrong in the following
         | sentence:
         | 
         |  _A bunch of logical signs were invented by Frege and later
         | popularized / amended by Wittgenstein._
         | 
          | Jan von Plato goes into this in The Great Formal Machinery
          | Works. Frege's notation never became popular - it involved
          | lines drawn from one part of an expression to another (!)
          | and is fairly incomprehensible. A more or less modern
          | system of symbols for mathematical logic was created by
          | Russell and Whitehead in Principia Mathematica, extending
          | the symbols of Peano (but using the _ideas_ of Frege). The
          | Russell and Whitehead syntax (the "there exists" and "for
          | all" symbols) was taken up by Hilbert and the Göttingen
          | school, which crafted a lot of the basic theories of
          | mathematical logic. Von Plato doesn't refer to Wittgenstein
          | influencing notation and I don't think he did - he was a
          | popular but idiosyncratic philosopher, not a mathematician.
         | 
         | Von Plato's history of mathematical logic in Great Formal
         | Machinery... is fascinating.
         | 
          | Edit: Wikipedia describes Frege's notation. You have to
          | draw lines connecting expressions, an approach that quickly
          | gets unwieldy. I believe that only Frege used Frege's
          | notation. Wittgenstein refers to Frege but that's it.
         | 
         | https://en.wikipedia.org/wiki/Begriffsschrift
        
           | crabbone wrote:
            | To the best of my knowledge, Wittgenstein lived with
            | Russell for quite some time and was a very influential
            | force in the writing of the Principia. I believe I saw a
            | lot of prototypes of the modern logical symbols in his
            | Tractatus.
            | 
            | I also believe that Russell traveled to meet Frege in his
            | quest to write the book. He definitely saw his work even
            | before it was published and had private conversations
            | with him face-to-face. Not sure how much of the invented
            | language made it into Principia, but the author was
            | definitely familiar with Frege's work.
        
       | 082349872349872 wrote:
       | I had been guessing * made its way into character sets because of
       | its business use as a line fill for printed checks. (for those of
       | you not alive in the 20th century, see
       | https://en.wikipedia.org/wiki/Cheque )
       | 
       | Edit: then again, it's in the 0x28-0x2F range along with + - and
       | /, which suggests it was already thought of as mathematical as
       | early as 1963...
       | 
       | Edit 2: it doesn't seem to occur in ITA-2, and in FIELDATA it's
       | alongside punctuation like & $ ( and %.
        
         | adrian_b wrote:
          | Perhaps for the reason proposed by you, the IBM typewriters
          | had "*" but they did not have "×" (U+00D7 MULTIPLICATION
          | SIGN).
         | 
          | This has forced the choice of "*" for multiplication in
          | Fortran, in 1956. All the other languages that use "*" have
          | taken it from Fortran. There are only a few exceptions,
          | like Algol 60 and APL\360, which used "×", because they did
          | not care about what the standard IBM printers supported.
         | 
         | The ASCII encoding is irrelevant, it was chosen more than a
         | decade later and the earlier character sets had encoded "*" in
         | completely other locations.
        
       | junon wrote:
       | I asked a question about the origins of the XOR operator a while
       | back.
       | 
       | Ken Thompson was emailed by the accepted answerer and provided a
       | response, in fact!
       | 
       | https://softwareengineering.stackexchange.com/questions/3313...
        
         | totoglazer wrote:
         | Also for ampersand!
         | https://softwareengineering.stackexchange.com/questions/2520...
        
       | vincent-manis wrote:
       | Bob Bemer, one of the principals of the development of ASCII,
       | argued that the backslash should be incorporated so that "and"
       | ([?]) could be written as /\ and "or" ([?]) as \/. The logic
       | signs are derived from the Latin ac (or atque, "and") and vel
       | ("or"), respectively.
        
       | bazoom42 wrote:
       | Before ASCII, languages didn't necessarily mandate what
       | characters to use for operators and other symbols. They defined
        | an abstract syntax, but left it to the implementation for a
        | particular machine what characters to use for a given symbol,
       | because of the large variation in keyboards and character sets
       | across machines.
       | 
       | I believe Algol and BCPL did it like that, and C was one of the
       | first languages to specify the exact characters to use for its
       | symbols.
        
         | nine_k wrote:
         | More than that; ALGOL-68, for instance, allowed for translating
         | keywords into different national languages, IIRC.
        
       | [deleted]
        
       | adrian_b wrote:
       | Most comments there do not go far enough back in time but refer
       | only to later languages that have inherited the operators from
       | other earlier languages.
       | 
        | Regarding the logical operators, in mathematics, many decades
        | before the first programming languages, the symbols were
        | "∧" (U+2227 LOGICAL AND) and "∨" (U+2228 LOGICAL OR), with
        | either "~" (U+223C TILDE OPERATOR) or "¬" (U+00AC NOT SIGN)
        | for negation.
        | 
        | Many early programming languages, e.g. Algol 60, CPL, IBM
        | APL\360, used these (Algol 60 used "¬"; McCarthy, CPL and APL
        | used "~").
       | 
       | IBM FORTRAN IV (1962) was meant to be used with IBM printers with
       | poor character sets, so it used ".AND.", ".OR.", and ".NOT.".
       | 
        | The next IBM language, IBM NPL (December 1964), replaced the
        | FORTRAN IV keywords with "&", "|" and "¬" (because these
        | symbols were included in the IBM EBCDIC character set, while
        | "∧" (U+2227 LOGICAL AND) and "∨" (U+2228 LOGICAL OR) were
        | not).
       | 
       | The next year NPL was rebranded as PL/I and all the languages
       | that use "&" and "|" have taken them directly or indirectly from
       | PL/I.
       | 
       | The languages B and C have taken their symbols and keywords from
       | 3 sources, BCPL, PL/I and Algol 68. The logical operators were
       | taken from PL/I.
       | 
        | By the time of B, ASCII had been standardized, and it did not
        | include "¬" (U+00AC NOT SIGN).
        | 
        | Because of this, B had to replace "¬" in the PL/I operators
        | with an unused ASCII symbol, "!".
       | 
       | In B, "&" and "|" were ambiguous, depending on the context they
       | were interpreted as either bit string operators or as "McCarthy
       | AND" and "McCarthy OR".
       | 
        | C resolved the ambiguity by adding "&&" and "||", and then
        | added "~" for bit string NOT, which had already been used for
        | this purpose in mathematics and in earlier programming
        | languages.
        
         | fanf2 wrote:
          | The classic ∧ ∨ symbols from mathematical logic have been
          | used in a few languages, often spelled /\ \/
          | 
          | After CPL they turn up in Miranda (as the weirdly
          | asymmetrical & \/) and in Occam
         | 
         | I thought there were more, but I can't find them right now.
        
         | [deleted]
        
       | bee_rider wrote:
       | Fortran has .and. and .or. for logical operators, which is pretty
       | easy to remember.
       | 
       | & seems pretty obvious, of course they couldn't have known at the
       | time that logical operations should probably be given priority
       | over bitwise ones.
       | 
        | | is an odd symbol. They don't really justify it in the Stack
        | Exchange answer, just mention where it came from.
        | 
        | Unix pipe, conditional probability, and logical or. I guess a
        | vertical line is naturally going to be very popular though.
        
         | adrian_b wrote:
         | The early character sets that were available in teletypes and
         | line printers were not selected with the purpose of writing
         | mathematical notation, but with the purpose of writing
         | "business correspondence".
         | 
         | Because of this, the authors of most early programming
         | languages, with the exception of APL\360 and of some European
         | programming languages, which were not restricted to the IBM
         | business-oriented character sets, had to replace the
         | traditional mathematical symbols with whatever "business"
         | characters were more suitable.
         | 
         | There is nothing odd about "|". It was included in the
         | "business" symbols, for drawing tables. Among the few available
         | symbols, it was doubtless the most appropriate choice for "OR".
        
       | 082349872349872 wrote:
       | In an ideal world, of course, symmetric operators would have
       | symmetric symbols, non-symmetric operators asymmetric symbols ...
       | and antisymmetric operators what kinds of symbols?
        
       ___________________________________________________________________
       (page generated 2023-09-28 23:00 UTC)