https://retrofun.pl/2023/12/18/was-basic-that-horrible-or-better/ RetroFun.PL

Posted on 18 December 2023 * By ikari * In Uncategorized

Was BASIC that horrible or... better?

"Everything was fine until BASIC entered the picture." Edsger Dijkstra, in a hallucination of 2023's AI

"Simplicity is the ultimate sophistication." Leonardo da Vinci

"It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration." Edsger Dijkstra, How do we tell truths that might hurt?, 18 June 1975. Selected Writings on Computing: A Personal Perspective, Springer-Verlag, 1982. ISBN 0-387-90652-5.

[image]

Edsger Dijkstra, a renowned computer scientist, famously made this last controversial statement in a bullet-point list of inconvenient truths that he published in 1975. Dijkstra was known for his strong opinions on programming languages, and he believed that the simplicity and lack of structured programming principles in BASIC could hinder students from developing a strong foundation in programming. If we dig into the reality that caused this statement - which made me feel personally attacked at some point ;) - to be born, it turns out to be a pretty fascinating rabbit hole (as everything in computing is, if you stay curious). And to separate that personal feeling from facts, let me assure you that I will clearly separate facts from opinions in posts like this!

How strong is the famous quote?

Dijkstra's opinion was so strong and caused so much offense that today the Generative Pre-trained Transformers seem to associate him with enough hatred for the language to imagine quotes such as "Everything was fine until BASIC entered the picture", something he never actually said.
However, one could argue that the second quote is contradictory here, as the language BASIC is itself, for the same reason, quite simple. That should make it a much better tool, allowing the programmer to focus on the goal. Like modern crippled Golang ;-). Does it mean BASIC is too simple? Were all the popular alternatives better?

It is important to note that Dijkstra's statement was made in a different era, when BASIC was one of the few widely accessible programming languages. Today, there are newer, more powerful programming languages and resources available that can help students develop strong programming skills regardless of their prior exposure to BASIC, supporting the view that it was a hyperbole used to prove a point. The statement was also just a bullet point in a bigger list of similarly negative declarations, such as "PL/I - 'the fatal disease' - belongs more to the problem set than to the solution set." (remember? PL/I was the language behind Multics, a project that - by failing to meet the schedule - gave us Unix) or, even harsher, "The use of COBOL cripples the mind; its teaching should, therefore, be regarded as a criminal offense."

Going back to the "is it too simple?" question, we can look at BASIC's evolution in three eras:

* The 70s, when it made him so resentful
* The 80s and 90s, when his statement made home computer users so resentful
* BASIC today

BASIC throughout the years

The 70s

The flavor of BASIC most likely criticized by Dijkstra is Dartmouth BASIC, the original (!) version of the BASIC programming language, from 1964. According to Wikipedia, it was designed by two professors at Dartmouth College, John G. Kemeny and Thomas E. Kurtz. With the underlying Dartmouth Time Sharing System (DTSS), it offered an interactive programming environment to all undergraduates as well as the larger university community. It wasn't the BASIC most older computer users worldwide know. In fact, it wasn't even an interpreted language, but a compiled one!
[BASIC-compiler-Dartmouth]

A note to the beginner: Compiled languages, like most, but not all, of the ones popular today (C, C++, Go, Rust), are those where a program cannot be directly executed right after it was written. Instead, it has to go through a process of checking all the references between files and modules (including answering questions such as: where do they come from? do they exist? are all the values of the correct type?) and translating each instruction into a set of lower-level instructions directly executable by the computer's main processor. This process is known as compilation. It can take a while, from a few seconds for a moderately sized code base to many hours (Chromium, the core part of the Chrome browser, would take 6-8 hours to compile on a 4-core machine). This means a few things - first of all, it's always a two-step process, not as interactive as running commands one by one. You have to wait for the program to check all the code for syntax errors and compile first, which comes with the benefit of a basic (sic!) error check before you even execute the code.

Interpreted languages, or rather programming language interpreters, are the other way to execute code. Here, traditionally, each instruction from the source code is translated to executable machine code only as the execution reaches that particular instruction. This, in turn, means some errors may go unnoticed until we reach a given point in the code, such as referencing a variable that doesn't exist, or one of a type incompatible with the way we want to interact with it. A common benefit, however, is that interpreted languages are usually less strict about the type of a variable, and most constructs can operate just as happily with integer numbers as with floating point (the ones with a fractional part), or even dynamically adapt when it's a string (text).
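Both behaviors are easy to see in a couple of lines; here is a minimal Python sketch (Python standing in for interpreted languages in general, with illustrative names):

```python
# The same function happily accepts integers, floats, and strings:
def double(value):
    return value * 2

print(double(21))    # 42
print(double(1.5))   # 3.0
print(double("ha"))  # haha

# And an error hiding in a branch is only noticed when execution
# actually reaches it:
def report(flag):
    if flag:
        return undefined_variable  # NameError, but only if flag is True
    return "ok"

print(report(False))  # runs fine; the broken line was never reached
```

A compiler for a statically checked language would reject the bad reference before the program ever ran; the interpreter happily executes everything around it.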
Python, for example, extends this concept to all objects with an idea called: if it looks like a duck and quacks like a duck, then it probably is a duck. And you can then use it in all places in the code where a duck would be expected.

With time, the distinction between compiled and interpreted code has gotten blurry, and the performance gap has tightened over the years. Traditionally interpreted languages, such as JavaScript, nowadays use techniques like Just-In-Time compilation, meaning a fragment of the code is indeed actually compiled before execution, and can even be automatically re-compiled with more optimization tricks if it executes often. Some languages, like Java or C#, have also been considered a bit of both worlds - the compilation stage translates their source code to so-called "byte-code", which is much lower-level, but doesn't execute directly on the hardware - instead, those simplified instructions are translated to one or more hardware instructions at runtime.

While Dartmouth BASIC was a compiled language, it didn't bring all the downsides of having an extra compilation step before execution. With many other compiled languages of the 80s on home computers - single-tasking machines, of which we usually had "only" one - that step probably included the need to exit the compiler, load the program, see it fail, load back the compiler, load our files, fix the problem, compile again... all of this lengthening the update cycle significantly. Dartmouth did better. At any time, when the program was in memory, you could use SAVE to save it from being forgotten when you finished working with the computer, and RUN to compile and execute it right away. The RUN command is familiar to users of our later 8-bit home computer BASIC environments, in which it instructs the computer to start interpreting the code line by line.
It also had the most recognizable feature of BASIC - each line begins with a line number, so even if you don't have an editor available, you can add lines at arbitrary positions between the existing ones (hence also the convention to number them in increments of 10, rather than 1, 2, 3... - it gives you the chance to insert some more code between lines 10 and 20, if needed). Why would you not have an editor? It's not that they didn't exist; even documentation for languages from the 60s mentions a few text editors. The problem was that editors require resources, and for decades computer software really wanted to spare every kilobyte it didn't need, and leave it free for user programs. Or games.

The first version of the language was extremely limited compared to any later popular version of BASIC. The only supported keywords apart from math functions were: LET, PRINT, END, FOR...NEXT, GOTO, GOSUB...RETURN, IF...THEN, DEF, READ, DATA, DIM, and REM. This means very basic flow control, I/O, comments, and basic arrays. The last feature is not trivial, so it was one of the few features that made it more useful than assembly language. Variable names were limited to a single letter or a letter followed by a digit (26 + 26*10 = 286 possible variable names), which made the programs much harder to read (by a human being) than they should be.

By 1975 the language had reached its Sixth Edition. User input was added, and a number of math functions were there (along the lines of ABS, LOG, RND, SIN). Did it allow "normal" (longer) variable names? No trace of such a feature. Did it allow full commands (statements) in IF...THEN? Also no! It was only a conditional GOTO statement, meaning you had to write it as IF A>0 THEN 100, where 100 is the line to execute if the condition is met.
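The 286-name figure is easy to verify by enumeration; a quick Python sketch:

```python
import string

# Legal Dartmouth BASIC variable names: a single letter,
# or a letter followed by a single digit.
names = list(string.ascii_uppercase) + [
    letter + digit
    for letter in string.ascii_uppercase
    for digit in string.digits
]
print(len(names))  # 26 + 26*10 = 286
```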
I can see where the rage against the machine running BASIC was coming from when I consider the classical "guessing game" example (the computer picks a random number, and the user is supposed to guess it, being given hints like "too large" or "too small"):

100 REM GUESSING GAME
110
120 PRINT "GUESS THE NUMBER BETWEEN 1 AND 100."
130
140 LET X = INT(100*RND(0)+1)
150 LET N = 0
160 PRINT "YOUR GUESS";
170 INPUT G
180 LET N = N+1
190 IF G = X THEN 300
200 IF G < X THEN 250
210 PRINT "TOO LARGE, GUESS AGAIN"
220 GOTO 160
230
250 PRINT "TOO SMALL, GUESS AGAIN"
260 GOTO 160
270
300 PRINT "YOU GUESSED IT, IN"; N; "TRIES"
310 PRINT "ANOTHER GAME (YES = 1, NO = 0)";
320 INPUT A
330 IF A = 1 THEN 140
340 PRINT "THANKS FOR PLAYING"
350 END

src: The example comes directly from Dartmouth College.

As you see, the code looks braindead simple, with conditional jumps (IF G = X THEN 300) that require you to jump along, taking your focus with you, and that do not support ELSE statements. In this simple example, an ELSE is realized like in assembly - by omission, just continuing on. If you didn't get the number right and didn't make the jump in line 190, the number you provided is either too small, in which case you jump from line 200 to line 250, or you just follow along to line 210, because after ruling out G = X and G < X, only G > X remains.

All the code looks very flat compared to today's standards. This is because most BASIC implementations, not only in the 60s and 70s but also some in the 80s and 90s, did not handle unexpected whitespace very well. And you already know the reason why you don't see indented blocks within IF...THEN blocks... there's nothing to indent if you can't put commands in the THEN clause, and you can't use multiple ones even in the BASIC versions where you can put one. This is cheating, but let's peek into 1982 and check out Commodore 64 code:

10 PRINT"HELLO, HOW OLD ARE YOU",
20 INPUT A
30 IF A > 30 THEN
40 PRINT"THAT'S A GOOD AGE FOR A RETROFUN.PL VISITOR!"
50 END

What may happen if a 25-year-old user executes it:

RUN
HELLO, HOW OLD ARE YOU? 25
THAT'S A GOOD AGE FOR A RETROFUN.PL VISITOR!

While 25 is definitely a good age for a RetroFun.pl visitor, we can see that line 40 does not execute only when the condition A > 30 is met, nor is there any error telling us that our THEN block was effectively empty. Any code structure is going to be based on GOTO, GOSUB (that's a GOTO that can RETURN to where it was called from) and more "flat" lines of code. (Oh, I actually cheated twice - also by using END, which ends the program in a way that may have tricked you into thinking it has anything to do with the IF statement.)

One more elephant in the room

The same source provides an example of how to plot a bell curve (amazingly simple to calculate in this math-heavy, university-ready BASIC):

100 REM PLOT A NORMAL DISTRIBUTION CURVE
110
120 DEF FNN(X) = EXP(-(X^2/2))/SQR(2*3.14159265)
130
140 FOR X = -2 TO 2 STEP .1
150 LET Y = FNN(X)
160 LET Y = INT(100*Y)
170 FOR Z = 1 TO Y
180 PRINT " ";
190 NEXT Z
200 PRINT "*"
210 NEXT X
220 END

You might have noticed that the language doesn't have any keywords making it easy to actually plot anything on the screen (as in: light up a pixel), and the example also keeps to text mode. But it's not a missing feature of the language - or rather, not a necessary but missing feature. If you asked about it, they would reply with... What screen?

Math professor and future Dartmouth president John Kemeny looks over a program written by his daughter Jennifer Kemeny '76 using the Teletype computer terminal at their home. (Photo by Adrian N. Bouchard / courtesy of Rauner Special Collections Library)

Andrew Behrens '71 checks the output from his Model 35 Teletype. IBM punch cards are visible in the background. (Photo by Adrian N. Bouchard / courtesy of Rauner Special Collections Library)

The elephant in the room is the computer the size of an elephant.
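Before we move on, it's worth seeing what the guessing game from earlier looks like with structured control flow. This is a minimal Python sketch, not a faithful port: guesses are passed in as a list so the example stays self-contained, instead of coming from user input. Every branch is explicit, and no line numbers or GOTOs are needed:

```python
def guessing_game(target, guesses):
    """Play one round; return the number of tries and the hints given."""
    hints = []
    for tries, guess in enumerate(guesses, start=1):
        if guess == target:
            hints.append(f"YOU GUESSED IT, IN {tries} TRIES")
            return tries, hints
        elif guess < target:
            hints.append("TOO SMALL, GUESS AGAIN")
        else:
            hints.append("TOO LARGE, GUESS AGAIN")
    return len(guesses), hints

tries, hints = guessing_game(42, [50, 25, 42])
print(tries)     # 3
print(hints[0])  # TOO LARGE, GUESS AGAIN
```

The if/elif/else chain replaces three numbered jump targets, and the whole "too small / too large / got it" logic reads top to bottom in one place.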
Dartmouth BASIC was, like the system it worked on (DTSS), operated from remote terminals, which challenge even today's definition of a lightweight terminal (you'd assume a simple device with a keyboard and a screen... oh, and lightweight), or a terminal as in the 80s. It's essentially a heavy desk with a keyboard and a remotely-controlled typewriter (no, not a dot matrix printer). That's where the "Teletype" name comes from.

[Teletype-IMG_7299]

Competition (though also considered bad)

Is it a programming language? Yes. Is it a nice one? No. Is it better than the competition? Well, Dijkstra in the same paper referred to PL/I, COBOL, APL (a wonderful mix of praise and mockery: "APL is a mistake, carried through to perfection. It is the language of the future for the programming techniques of the past: it creates a new generation of coding bums."), and FORTRAN ("hopelessly inadequate"). By the way, if "PL" rings a "PL/SQL" bell for you, yes, there is some similarity between these two, but they are not the same language (the acronym stands for a different name, too: "Programming Language One" vs "Procedural Language" in SQL). Even a simple for loop is a bit different (see below). On the other hand, both languages use keywords like BEGIN and END instead of curly braces, which makes them more similar to each other, but also to Pascal or Ada (the last two are a separate new world we can explore).

-- PL/SQL:
FOR i IN 1..10 LOOP
  DBMS_OUTPUT.PUT_LINE(i);
END LOOP;

-- PL/I:
DO I = 1 TO 10;
  PUT SKIP LIST(I);
END;

Interestingly, a comparison of PL/I, COBOL and Fortran was published in December 1967 in the PL/I bulletin, issue 5.

[image]

They compared the languages using good criteria: difficulty to learn and difficulty to use - the latter measured by the number of statements needed to achieve a particular goal - plus applicability in scientific and business cases.
However, due to the small publication size (a letter to a bulletin), the results are more quoted than presented; statistics for the number of statements are given without the source code. Unsurprisingly (it's a PL/I bulletin, after all), PL/I was considered superior in some areas, and at least just as good in others. It aimed to combine the best features of the other two. In reality it also had problems keeping up with their individual development, which may be the reason for its smaller adoption. Some PL/I example code from the time can be found in the same issue (see the PL/I bulletin archive), but the scan quality makes it hard to embed or quote in a post. However, I looked up an example from October 1976, from the compiler documentation, and, in my opinion, it shows how much more structure the code had (note: indentation, complex if-else, procedures, procedure arguments, and the fact that arguments can have the same name as variables outside, shadowing them - not possible in BASIC at the time). By the way, keywords are case-insensitive.

A: PROCEDURE;
   DECLARE S CHARACTER (20);
   DCL SET ENTRY(FIXED DECIMAL(1)),
       OUT ENTRY(LABEL);
   CALL SET (3);
E: GET LIST (S,M,N);
B: BEGIN;
     DECLARE X(M,N), Y(M);
     GET LIST (X,Y);
     CALL C(X,Y);
C:   PROCEDURE (P,Q);
       DECLARE P(*,*), Q(*), S BINARY FIXED EXTERNAL;
       S = 0;
       DO I = 1 TO M;
         IF SUM (P(I,*)) = Q(I) THEN GO TO B;
         S = S+1;
         IF S = 3 THEN CALL OUT (E);
         CALL D(I);
B:     END;
     END C;
D:   PROCEDURE (N);
       PUT LIST ('ERROR IN ROW ', N, 'TABLE NAME ', S);
     END D;
   END B;
   GO TO E;
   END A;
OUT: PROCEDURE (R);
   DECLARE R LABEL, (M,L) STATIC INTERNAL INITIAL (0),
           S BINARY FIXED EXTERNAL, Z FIXED DECIMAL(1);
   M = M+1;
   S = 0;
   IF M ...

[...]

... END IF, for example. And we could have variables of any name length (however, some sources hinted that the one-letter ones are the fastest, because they have a static memory address; was it true?), so it shouldn't be so bad, right? It was horrible.
Solving challenges from day 1 (solution) or day 2 (solution) was fun, even - or especially - when trying to be fairly memory-efficient, and a little dirty in the approach. But then comes day 3, where we work with a 140x140 matrix of characters. Some of them form numbers, some are symbols, and everything else is filled with . as blank space. Task one is to find all numbers that are adjacent to a symbol (next to it, above or below, or diagonally). One possible way to do this is by loading three lines into memory at a time, processing the middle one as the "current" one, and scrolling your way through the dataset without ever keeping more than three lines in memory. Then, keeping in mind the lack of more advanced string manipulation functions like "split", "find" or "indexOf", not to mention regular expressions, we can process such a line character by character. If it's a digit, we're inside a number. Update the number digit by digit (n = 10*n + digit), add it to the known numbers when we run out of digits, and check for adjacency of a symbol.

It all sounds simple, if you can abstract your logic into functions, or some kind of smaller modules. But once the code grows a little bigger and - due to lack of memory, utilities, and all that stuff - you want to keep a few more flags for your happy little state machine, things get complicated. The day 3 solution that works is 210 lines long. In a 2020s IDE, and a 20XXs programming language, that'd be small and easy to maintain. But at that point, having all variables global makes them hard to track and reuse correctly, not being able to have functions or procedures makes it harder to know inputs and outputs (GOSUB solves only half of the problem), and it feels like you spend more time on the lower-level end of each operation than on actually solving the main problem.
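For contrast, the character-by-character scan described above takes only a few lines in a modern language. A Python sketch (the grid handling and symbol-adjacency check are left out, and the function name is mine; it shows just the digit-by-digit number extraction):

```python
def numbers_in_line(line):
    """Scan character by character, building each number digit by digit
    (n = 10*n + digit), exactly as the BASIC version has to."""
    found = []
    n, start = 0, None
    for i, ch in enumerate(line):
        if ch.isdigit():
            if start is None:
                start = i
            n = 10 * n + int(ch)
        elif start is not None:   # we just left a run of digits
            found.append((n, start, i - 1))
            n, start = 0, None
    if start is not None:         # a number ran to the end of the line
        found.append((n, start, len(line) - 1))
    return found

print(numbers_in_line("..35..633."))  # [(35, 2, 3), (633, 6, 8)]
```

The start/end positions are what the adjacency check would later use; in BASIC the same logic works, but with global variables instead of a tidy return value.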
When I tried to run it with BBC BASIC for CP/M, it also turned out that even the part reading one line from the file would need a rewrite to a more low-level, byte-by-byte approach, as the GET$#... TO ... keyword is not supported.

The determined coders

These difficulties don't mean sophisticated programs or games couldn't be created for these computers in their native BASIC! The demo program of the Amstrad CPC is written entirely in BASIC, except for the "Roland in Time" game fragment, showing off graphics, music, spreadsheet and word processing functions.

Amstrad CPC demo program, written 90% in BASIC (except for the game fragment shown)

Or check out a more modern (2006) demo, written entirely in Atari BASIC for the Atari XL/XE (feel free to use 2x speed):

These demos feature something you haven't seen before! Graphics (and sound)! More or less advanced, but we see things in color, we see them animate, and in some cases we can also hear sounds and music. This would not have been possible in the dialects of BASIC designed to run remotely and print on a teletype. By moving the computer physically into the user's room, the industry opened the door to multimedia and more dynamic entertainment.

Note: not all BASICs at the time had graphics keywords. Commodore 64's, for example, did not, but some BASIC-coded graphic effects could leverage the way it printed text characters in color, and the fact that it supported sprites - graphics that could be overlaid on top of regular screen content without extra copying operations!

I'm an artist!

Enabling interaction richer than text prompts and answers is more than just that technical difference, more than an item on the spec list. Every language that makes it easy to create art opens up your creativity, invites experimentation, and gives you control and means of expression. And it did just that, two seconds away from toggling the POWER switch.
Many examples, including type-ins, combined mathematical functions with PLOT (draw a point) or LINE (draw a line) for surprising, mesmerizing artistic effects. Below is a (10x speedup) capture of the "Crystals" demo written in Atari BASIC on the Atari XL/XE.

Crystals demo (BASIC) on Atari XL/XE

These capabilities, allowing one to create procedural art on their own, are what I think is the biggest advantage and the biggest impact of the BASIC language on the average computer user. The full source code isn't even big; see the full listing under the link below:

fragment of source code of the Crystals program for Atari

The magic of visual programming is what can be attractive to a non-nerd, and what introduced basic coding to the masses in the 80s and 90s. I have put this risky hypothesis in the post's title: that BASIC might have done something better than most languages. This is the thing BASIC did better - easy access to the computer's fundamental graphical and musical capabilities. The commands were simple, yet the simplicity invited experimentation and building upon them. Manuals for computers at that time also often taught you the math behind some shapes, explaining how and why a circle can be expressed as (x, y) = (r*sin(t), r*cos(t)), and therefore how to draw it with a LINE. Of course, modern and widely popular languages support all of that too (why would you remove something that works?); one of the closest examples in terms of simplicity and popularity would be the JavaScript canvas. Sample code and effect, on the left for JavaScript, on the right for BASIC:

const canvas = document.getElementById("myCanvas");
const ctx = canvas.getContext("2d");
ctx.beginPath();
ctx.arc(100, 75, 50, 0, 2 * Math.PI);
ctx.stroke();

ORIGIN 100,75:r=50
FOR i=0 TO 360:PLOT r*sin(i),r*cos(i):NEXT

[image] [image-1]

The BASIC example seems simpler, even though it doesn't have a keyword/function for arcs or circles, and it is the result of typing in the code on the right directly after booting the 8-bit computer.
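The parametric circle the manuals taught can be checked without any screen at all; a small Python sketch (the function name is illustrative) confirming that every generated point satisfies x^2 + y^2 = r^2:

```python
import math

# Circle via the parametric form (x, y) = (r*sin(t), r*cos(t)),
# mirroring the one-line BASIC PLOT loop (t in degrees, 0..360).
def circle_points(r, steps=360):
    return [
        (r * math.sin(math.radians(t)), r * math.cos(math.radians(t)))
        for t in range(steps + 1)
    ]

pts = circle_points(50)
print(len(pts))  # 361
print(all(abs(x * x + y * y - 50 * 50) < 1e-9 for x, y in pts))  # True
```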
The example on the left - smoother, higher resolution, and much faster to paint - is on the other hand longer, and... incomplete. It refers to an element myCanvas within an HTML document that is not even part of the example (so we need at least one more line, somewhere, and a web browser). Most languages will have many ways to paint things onto the screen, and there will be dozens of libraries to choose from - this is for better and for worse. If the language doesn't support something with the built-ins, it means both flexibility and the need to find the best tool for the job, a way to refer to it, possibly install it or bundle it with your program, and so on - these problems don't exist if you work with a simple, but highly integrated environment. You could say BASIC helped to learn the basics. Once you master those, you usually want to move on to a more powerful toolkit, no matter what the machine is.

Today

This gives us two aspects of "today" to look at:

* How is BASIC doing today? How are its competitors doing? What other languages are the most popular?
* What gives a similar user experience today, but is more modern, easier to use, and just as fun?

The others are not dead yet

Surprisingly, none of the languages mentioned previously for the 70s, 80s and 90s is dead.

* PL/I is still supported and maintained by, as it always has been, IBM. Here's a PL/I manual from 2019.
* Nordea is still hiring COBOL developers. Desperately, I presume.
* Fortran is alive and somewhat kicking. The latest spec is Fortran 2023.
* APL is more of a curiosity - a language where a valid statement may look like {↑1 ⍵∨.∧3 4=+/,¯1 0 1∘.⊖¯1 0 1∘.⌽⊂⍵} is not easily adopted, and wins the hearts of hobbyists and mathematicians more than average programmers.

The top 10 programming languages have changed since the 70s (or 80s). According to IEEE, the top 10 of 2023 are: Python, Java, C++, C, JavaScript, C#, SQL, Go, TypeScript, and... HTML (huh?
the name literally says it's a Markup Language). For now, let's just note the presence of C (started in 1972 but not officially published until 1978, so Dijkstra could not rant about it). Visual Basic ended up in 24th place; note that Microsoft decided in 2020 not to develop the language further. Fortran is still there, in 27th place, COBOL in 34th, and Ada in 36th. Last, but not least, Pascal/Delphi is mentioned in 45th place.

Neither is BASIC

BASIC didn't stop in its evolution either, of course. The notable mentions below show the language is still valued for simplicity and ease of use with hardware:

* QuickBASIC, which was included in MS-DOS, keeping an easily accessible programming language at hand.
* Visual Basic (with Visual Basic .NET, VBScript, and office-package scripting languages such as Microsoft's Visual Basic for Applications and OpenOffice Basic that evolved from it) - Microsoft's continuation of the BASIC lineage over the years. You can trace its evolution from a beginner-friendly language to a full-featured professional development tool. However, in 2020 Microsoft announced that no further development of the language is planned.
* FreeBASIC - an open source project, and very feature-rich.
* BBC BASIC - this one has a long history! It's an evolution of the language created for the BBC Micro computer in 1981.
Supports a massive number of operating systems today (from 80s CP/M computers, through the Raspberry Pi, to Android and iOS). The site is also strong on the documentation side.
* Small Basic by Microsoft - intended for learning programming, "even by kids", which suggests it focuses on the simple "fun" features that BASIC brought. It also brings in turtle graphics, a concept introduced by the LOGO language, which offered a new, relative approach to drawing things on the screen.
* BASIC dialects such as B4X can even run on microcontrollers - versions of BASIC have been created for Arduino and other microcontroller boards, bringing back the spirit of early PC BASICs.
  + PBASIC - a commercial BASIC variant created specifically for Parallax's BASIC Stamp microcontrollers. Known for its accessibility and approachable documentation.

But what would have the same effect today?

While the BASIC name may not have the popularity it once held in the home computing era, the language's legacy lives on in various forms today. A great place to start and see immediate, interesting effects with little code would be the Processing language (also available in JS as p5.js). It provides simple APIs for drawing graphics, animations, and visualizations in an immediate way reminiscent of classic BASIC interpreters. Take a look at the example: Recursive Tree / Examples / Processing.org - the language has features that BASIC was lacking, allowing you to properly structure the code, and it has super easy-to-use commands to draw things on the screen, or even render 3D objects. Another neat example with some sound, a simple concept, yet playful: p5.js Web Editor | BUBBLE WORDS (p5js.org)

The practice that Processing is an example of is called "creative coding", and Processing is as good at it as BASIC was in the 80s. Check out the Bull's eye demo below, or Floating In Space. I would highly recommend it for creative and fun experiments, but there are other options too, of course.
Let's mention at least a few. Scratch, the colorful block-based programming language designed for kids, carries the torch of BASIC's legacy perhaps better than any other modern tool. By using visual blocks that snap together like puzzle pieces rather than typed syntax, it removes a major early barrier to coding creativity that existed even in simplified BASIC versions. Themed graphics, animation, and sound libraries make exploration even more fun for budding young programmers. Just as BASIC and early home computers created a gateway for many tech pioneers, Scratch aims to foster that same experimental spirit, no matter a child's prior access to technology or education. Its online community also connects peers to share and remix projects - a markedly more social approach than the solo BASIC tinkering of the past.

If you're more comfortable with programming in general and don't mind using more commands to achieve your result, as the price of more flexibility and portability, consider JavaScript. While not strictly BASIC-derived, JS has a very loose, dynamic style that echoes some qualities of the language, and the fact that it's so widely used for interactive web apps connects to BASIC's interactive nature.

Now and then

Modern tools like Processing and p5.js for creative coding projects have inherited BASIC's focus on accessibility and rapid visualization for beginners. Inspired by Java and JavaScript respectively, they provide simple APIs for drawing graphics, animations, and visualizations in an immediate way reminiscent of classic BASIC interpreters. Scratch carries on BASIC's mantle for introducing young students to programming in a fun and intuitive environment.
Even outside the realm of purely educational tools, JavaScript itself, despite no direct lineage from BASIC, has a flexible, beginner-friendly coding style that echoes some of BASIC's most famous qualities. The fact that JavaScript powers most interactive websites and web apps today mirrors how BASIC enabled new realms of software interactivity in the early PC era. And for those yearning for BASIC's glory days on microcomputers, modern microcontroller boards like the Arduino often have custom BASIC interpreters and compilers created by the community to control hardware projects. So while it evolves across new platforms, BASIC's accessibility and focus on rapid iteration persist in the DNA of many modern coding tools.

While Dijkstra's inflammatory criticism of BASIC was controversial, his quote sparked discussion that influenced the growth of computer science education and programming language design. The history of BASIC illustrates how strongly opinions can differ regarding the best way to balance simplicity and power when creating tools for novice programmers. In its early days, BASIC favored ease of use over advanced capabilities, though over time it evolved by incorporating more features without compromising approachability. Modern BASIC dialects aim to offer a gentle starting point along with the capability to take on more complex coding. There are still debates around finding the right equilibrium to serve programmers across the skill spectrum. However, the differing perspectives pushed the field forward. In the end, a diversity of languages can coexist, fitting different needs. The intensity of Dijkstra's viewpoints sheds light on how passionately programmers care about building the best systems for their peers to create software magic and unlock human potential. While his criticism was extreme, it opened valuable dialogue.
See also

BASIC

* The Dartmouth College documentary about BASIC for the 50th anniversary (2014) highlights other key features of the system, such as time-sharing (BASIC was actually used concurrently by multiple users!): BASIC at 50 - Dartmouth College 2014 documentary
* Example Dartmouth BASIC manuals, the first and 4th editions of the language: basicmanual_1964, BASIC_4th_Edition_Jan68_text
* BASIC type-ins by Sean McManus: Amstrad CPC 464 664 6128 Basic programming tutorial and games. The Basic Idea (sean.co.uk)

PL/I

* PL/I Bulletin archive at teampli.net
* If you're curious how PL/I was used in Multics source code, here's a sample utility.

COBOL

* Some friendly COBOL examples with modern terminology: 7 cobol examples with explanations. | by Yvan Scher | Medium - I do not condone closed platforms like Medium; Yvan says he moved his blog to his own domain, but that domain doesn't work anymore.

Processing

* Discover - OpenProcessing - example gallery to get inspired

Tags: BASIC, history, PL/I, programming