[HN Gopher] Turing Award goes to Aho and Ullman
___________________________________________________________________
Turing Award goes to Aho and Ullman
Author : jonstewart
Score : 340 points
Date : 2021-03-31 12:43 UTC (10 hours ago)
(HTM) web link (www.nytimes.com)
(TXT) w3m dump (www.nytimes.com)
| pjmorris wrote:
| Back in the dark ages when 'Data Structures and Algorithms' was
| our state-of-the-art textbook, we called the author trio of Aho,
| Hopcroft, and Ullman the 'Three Wise Men.' Glad to see Aho and
| Ullman get more serious recognition.
| a-dub wrote:
| masters of the real "string theory"
| noisy_boy wrote:
| Alfred Aho is the "a" in awk[0] (Peter Weinberger and Brian
| Kernighan being "w" and "k").
|
| [0]: https://en.wikipedia.org/wiki/AWK
| [deleted]
| credit_guy wrote:
| The book "The AWK Programming Language" by Aho, Weinberger and
| Kernighan is a real gem. Chapter 6 is "Little Languages"; it
| shows you how to write interpreters and compilers using AWK. In
| particular, it even shows how to start a bootstrap: you write a
| compiler for a subset of AWK that writes down C code. You don't
| need to read all five preceding chapters; Chapter 1 alone is
| enough.
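|
| To make the idea concrete outside AWK, here is a rough sketch of
| the same trick in Python (illustrative only; per the above, the
| book's version is written in AWK and targets a subset of AWK
| itself): a tiny, made-up "little language" of assignments that
| gets translated into a C program.
|
|     # Sketch: translate lines like "x = 3 * 4" into a C program.
|     # Hypothetical toy language, not the book's example.
|     import sys
|
|     def translate(lines):
|         out = ["#include <stdio.h>", "int main(void) {"]
|         for line in lines:
|             line = line.strip()
|             if not line or line.startswith("#"):
|                 continue          # skip blanks and comments
|             name, expr = (s.strip() for s in line.split("=", 1))
|             out.append(f"    double {name} = {expr};")
|             out.append(f'    printf("{name} = %g\\n", {name});')
|         out.append("    return 0;")
|         out.append("}")
|         return "\n".join(out)
|
|     if __name__ == "__main__":
|         print(translate(sys.stdin.readlines()))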
| dfjsk wrote:
| Generally when a compiler produces C code (or assembly, or
| whatever) it is described as _emitting_ C code. "A compiler
| for a subset of AWK that emits C code"
| [deleted]
| zests wrote:
| Also:
| https://en.wikipedia.org/wiki/Aho%E2%80%93Corasick_algorithm
| smortaz wrote:
| Thank you, Aho & Ullman. Your dragon book got me hooked & started
| a 40-year career in compilers/languages. First job: an Algol68
| compiler on Burroughs hardware, back in the middle ages.
| jonstewart wrote:
| I never took a compilers course in college and came to regret
| that. Ten years later I started reading the Dragon Book, and
| realized how I could trivially modify the NFA regular expression
| search algorithm to solve a problem I'd seen in digital forensics
| and eDiscovery. The writing is extremely clear and easy to
| follow, and I now tell all interns to be sure to take a compilers
| course before they graduate. Well-earned!
| forinti wrote:
| Although I really enjoyed compilers in college, I never thought
| I would build one for practical reasons.
|
| I ended up using the knowledge for two very interesting
| projects. One was a template engine in Java for turning
| variables into clauses in SQL queries. The other was an
| interpreter for PL/SQL that helps to extract procedures and
| functions in order to do partial updates of packages in
| production (the user chooses which procedures to send to
| production and doesn't have to update the whole package).
| 0xfaded wrote:
| My first job had me write a compiler from biology protocol
| descriptions to robot commands. It wasn't Turing complete, but it
| did need to allocate and assign "registers", which were the
| physical spaces available for the 96 well sample plates.
| grok22 wrote:
| Weird, I remember the book as being too dense and cryptic
| without sufficient explanatory text for a compilers course when
| I originally read it... maybe I am misremembering. I will have
| to find a copy and take another look.
| cxr wrote:
| It was just a few days ago when I mentioned how often I see the
| dragon book held up as a punching bag. It seems that at some
| point the book, along with K&R, has become the target for
| unsubstantiated criticism. I've pressed for details, and I've
| seen others do the same, but it never ends in anything concrete
| --just an implicit appeal to the new conventional wisdom that
| they do more harm than good. Reminds me of how "truth" gets
| synthesized on Reddit.
|
| Here's a counterexample: Russ Cox has a series on regular
| expressions, beginning with Regular Expression Matching Can Be
| Simple And Fast <https://swtch.com/~rsc/regexp/regexp1.html>.
| In the last several years, it has become very well-received. In
| it, he lays out the case for Thompson NFAs, and also laments in
| places that the approach is not more widely known, having become
| a sort of lost knowledge. (See also Jonathan Blow's video
| <https://www.youtube.com/watch?v=ZSRHeXYDLko>.) For all
| the criticism of the dragon book, though, you'd think that
| would mean enough people have read it that someone would have
| pointed out that Thompson is cited in Chapter 3, where his
| approach is described in some detail.
|
| I think that what's actually going on is that, like The Art Of
| Computer Programming, the dragon book is widely referenced and
| rarely read. Now, enter the 21st century meme factory.
|
| Having said that, the most recent comment I saw that was being
| casually dismissive of the dragon book--ironically for its
| "second-rate methods" that have "held back" compiler
| development--did motivate me to dig the book out, which in turn
| led to me spending an hour or so chasing down the
| source of McCarthy's "ho, ho, you're confusing theory with
| practice" quip, since the origin story of Lisp's eval is also
| told in Chapter 11 of the dragon book.
| <https://hypothes.is/a/pQs39pAhEeupnAuO5qfOeQ>
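|
| For anyone who hasn't read either source: the core trick is to
| simulate the NFA with a _set_ of current states, so matching
| stays linear in the length of the input. A minimal sketch in
| Python (a hand-built NFA for a(b|c)*d, not a regex-to-NFA
| compiler):
|
|     # Set-of-states NFA simulation, the idea behind Thompson's
|     # linear-time matching. The NFA here is written by hand.
|     def eps_closure(states, eps):
|         stack, seen = list(states), set(states)
|         while stack:
|             s = stack.pop()
|             for t in eps.get(s, ()):
|                 if t not in seen:
|                     seen.add(t)
|                     stack.append(t)
|         return seen
|
|     def nfa_match(start, accept, delta, eps, text):
|         current = eps_closure({start}, eps)
|         for ch in text:
|             nxt = set()
|             for s in current:
|                 nxt |= set(delta.get((s, ch), ()))
|             current = eps_closure(nxt, eps)
|         return accept in current
|
|     # States 0..2, accepting state 2, no epsilon moves needed.
|     delta = {(0, 'a'): [1], (1, 'b'): [1],
|              (1, 'c'): [1], (1, 'd'): [2]}
|     print(nfa_match(0, 2, delta, {}, "abccbd"))  # True
|     print(nfa_match(0, 2, delta, {}, "abx"))     # False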
| jnwatson wrote:
| I took a compilers class around the dragon book. I think the
| valid criticism is the sequence of the book. An undergraduate
| compiler class can only cover perhaps half the book.
|
| IMHO, the students that would benefit from learning
| about AST transformation and code gen vastly outnumber the
| folks that would benefit from learning about finite automata
| transformation (and I took a dedicated course in that).
|
| There simply aren't that many folks working on regex engines.
| criddell wrote:
| > An undergraduate compiler class can only cover perhaps
| half the book.
|
| When I took a compilers course that used this book, the
| professor had his own syllabus and would use the dragon
| book in support of his plan. We jumped around quite a bit
| and certainly didn't use all of the book.
|
| That course was one of my favorites and had a huge impact
| on me. I haven't written any compilers, but have used a lot
| of the concepts (especially state machines) over the years.
| jll29 wrote:
| > There simply aren't that many folks working on regex
| engines.
|
| You'd be surprised: there's even a dedicated annual
| international conference on implementation of automata that
| has been going since 1996:
| <https://link.springer.com/conference/wia>
|
| There are constantly new algorithms, libraries, tools and
| applications around regular languages and regular relations
| (an extension of regular expressions that is more elegant,
| symmetric, and expressive, and that takes a lot of the ugliness
| out of regexes). Simplifying somewhat, finite state
| transducers are NFAs "with output".
|
| See e.g. FOMA at <https://fomafst.github.io/>.
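|
| To make "with output" concrete, here is a toy transducer in
| Python (deterministic, over the alphabet {a, b} only; real
| toolkits like foma handle nondeterminism, composition, weights,
| and so on) that rewrites every "ab" to "X":
|
|     # Each transition consumes one input symbol and emits an
|     # output string; `final` flushes pending output at the end.
|     def run_fst(start, trans, final, text):
|         state, out = start, []
|         for ch in text:
|             state, emit = trans[(state, ch)]
|             out.append(emit)
|         out.append(final.get(state, ''))
|         return "".join(out)
|
|     trans = {
|         (0, 'a'): (1, ''),   # hold the 'a' until we see more
|         (0, 'b'): (0, 'b'),
|         (1, 'a'): (1, 'a'),  # held 'a' was plain; hold new one
|         (1, 'b'): (0, 'X'),  # completed "ab": emit replacement
|     }
|     final = {1: 'a'}         # flush a held 'a' at end of input
|     print(run_fst(0, trans, final, "abaab"))  # -> "XaX"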
| vishnugupta wrote:
| I second this based on my experience. My master's thesis was
| in code generation for Haskell, wherein I spent a fair bit of
| time learning tree pattern matching, AST manipulation,
| register allocation, etc. One can achieve a non-trivial
| speed-up by understanding these different phases of code
| generation.
|
| Another reason, in the context of compilers, is that
| there's not much to be gained by micro-optimizing lexing
| and parsing. After all, code is compiled just once and
| executed multiple times. So you know where much of the fun
| lies. The undergrads unfortunately usually miss out on the
| meatier part of the compiler course.
| fanf2 wrote:
| It used to be the case that lexing was the hottest part
| of a compiler, because it had to touch every byte of
| source and subsequent phases did not have so much data to
| crunch. But that stopped being true in the 1990s as
| optimizing compilers became mainstream technology.
| marktangotango wrote:
| > There simply aren't that many folks working on regex
| engines.
|
| Yes, that's true, but there are A LOT of LeetCode/HackerRank
| problems that are trivially solvable when one has a solid
| grasp of DFA and state machines in general!
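|
| For instance, the "is this string a valid signed integer?"
| family of problems falls out directly once you write down the
| state table. A simplified Python sketch (the actual "Valid
| Number" problem adds decimal points and exponents):
|
|     DELTA = {
|         ('start', 'sign'):   'signed',
|         ('start', 'digit'):  'int',
|         ('signed', 'digit'): 'int',
|         ('int', 'digit'):    'int',
|     }
|     ACCEPT = {'int'}
|
|     def classify(ch):
|         if ch in '+-':
|             return 'sign'
|         return 'digit' if ch.isdigit() else 'other'
|
|     def is_signed_int(s):
|         state = 'start'
|         for ch in s:
|             state = DELTA.get((state, classify(ch)))
|             if state is None:   # no transition: reject
|                 return False
|         return state in ACCEPT
|
|     print(is_signed_int("-42"))  # True
|     print(is_signed_int("4-2"))  # False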
| spekcular wrote:
| > I've pressed for details, and I've seen others do the same,
| but it never ends in anything concrete--just an implicit
| appeal to the new conventional wisdom that they do more harm
| than good.
|
| Really? Merits of the criticism aside (I don't know anything
| about compilers, so I'm not capable of judging), the received
| wisdom seems to be that the Dragon book spends far too much
| time on parsing and not enough time on other, more important
| aspects of modern compiler development.
|
| See, for instance, this blog post by a professor about
| teaching a compilers course:
| http://danghica.blogspot.com/2020/04/teaching-compilers.html.
|
| Or this blog post (and the one it links in the first
| paragraph):
| https://semantic-domain.blogspot.com/2020/02/thought-experim....
| hardwaregeek wrote:
| Exactly. I don't have the book in front of me, but I
| remember its typechecking section being hilariously short
| and imperative-focused. Also, in general the book doesn't
| focus on implementation at all. That's okay in some
| respects; it's probably stayed more relevant and useful due
| to not talking about implementation. But it makes it
| possible to read a bunch of the book and still have no clue
| how to write code to build a compiler.
| wglb wrote:
| They are good books. (I started the first one when writing a
| compiler).
|
| The valid criticisms of them relate to improvements in
| parsing and in optimization, which have progressed
| significantly in the years since they were written.
|
| Fun fact: On the compiler we wrote, Jeanne Musinski, PhD, a
| student of Ullman's, was on our team. She wrote the parser, and
| actually discovered an improvement in error recovery during
| our project and published a paper after conferring with
| Ullman.
| kingaillas wrote:
| As others noted, the Dragon Book spends far too much time on
| parsing, and not nearly enough on optimizations. To be fair,
| the first edition of the book is from 1986. It looks like the
| 2nd edition (2006) adds 3 chapters on data flow analysis and
| parallelization. I don't happen to have that version of the
| book so I don't know how good the material is.
|
| I took compilers in grad school in the early '90s. Even then,
| we mostly skipped parsing because that was considered covered
| in another class (Theory of Computation, a class that covered
| regular expressions, finite automata, Turing machines, P vs
| NP, those sorts of topics).
|
| The professor spent as much time as he could on
| optimizations, since that was his research focus. He then
| went on to write his own compiler book (Engineering a
| Compiler; Cooper & Torczon) so you can compare the table of
| contents
| (https://www.elsevier.com/books/engineering-a-compiler/cooper...)
| and see what current compiler researchers
| feel are important topics to cover in a textbook.
|
| Not throwing the Dragon Book under the bus, but it probably
| is in the "cited widely but rarely read" category as you have
| noted, just from its position as being first/early.
|
| An anecdote from me - I had the occasion to write a parser
| for a work project a few years ago. Rather than reach for
| flex/bison or any theory I had from compiler courses... I
| went with the new-to-me method of a parser combinator. This
| was awesome and I might even describe it as fun. Granted, the
| target language I was parsing was much simpler than a
| programming language, but I hope that is covered in today's
| compilers courses. I remember really tedious parsing homework
| problems back in the day.
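|
| If anyone is curious, the style is roughly this (a bare-bones
| Python sketch, not the library I actually used at work): a
| parser is a function from (text, pos) to (value, new_pos), or
| None on failure, and combinators glue small parsers together.
|
|     import re
|
|     def regex(pattern):
|         rx = re.compile(pattern)
|         def parse(text, pos):
|             m = rx.match(text, pos)
|             return (m.group(), m.end()) if m else None
|         return parse
|
|     def seq(*parsers):                # all in order, or fail
|         def parse(text, pos):
|             values = []
|             for p in parsers:
|                 r = p(text, pos)
|                 if r is None:
|                     return None
|                 v, pos = r
|                 values.append(v)
|             return values, pos
|         return parse
|
|     def many(p):                      # zero or more repetitions
|         def parse(text, pos):
|             values = []
|             while True:
|                 r = p(text, pos)
|                 if r is None:
|                     return values, pos
|                 v, pos = r
|                 values.append(v)
|         return parse
|
|     # "key = number" pairs separated by semicolons
|     ident  = regex(r'\s*[A-Za-z_]\w*')
|     number = regex(r'\s*\d+')
|     pair   = seq(ident, regex(r'\s*='), number)
|     pairs  = seq(pair, many(seq(regex(r'\s*;'), pair)))
|     print(pairs("a = 1; b = 22", 0))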
| rectang wrote:
| > _As others noted, the Dragon Book spends far too much
| time on parsing, and not nearly enough on optimizations._
|
| If you actually want to specialize in writing compilers,
| that may be true.
|
| But most people who study compilers don't write compilers.
| Instead, the time they spend studying compilers will help
| them to better understand _programming language design_,
| which will make them more effective at programming
| _anything_.
|
| And for that, it is far better to spend lots of time on
| parsing than on esoteric compiler optimizations.
|
| > _(Engineering a Compiler; Cooper & Torczon)_
|
| I bought that book, but didn't get much out of it because
| it concentrates on compiler optimizations and seems to be
| truly targeted for an audience of compiler specialists.
| Instead, I chose _Programming Language Pragmatics_ by
| Michael L. Scott, and went through nearly the entire thing
| with a study group over about 10 months.
| the_benno wrote:
| I can confirm that the second edition gives a quite-good
| treatment of data flow analysis. Not as thorough as you'd
| find in a proper program analysis book (where Nielson is
| the usual choice, though the recent Rival/Yi book on
| abstract interpretation is fantastic IMO) but it's well-
| written and at around the right level of detail for a
| compilers course.
|
| Source: PL PhD candidate, used (parts of) it to teach a
| graduate compilers course.
| mcguire wrote:
| My impression (it's been a very long time since my
| compilers class) was that it didn't spend too much time on
| parsing; rather, it spent too much time on how to implement
| parser generators. I've spent a lot of time parsing things
| in my life, but much less on the details of LR
| parsers.
| Taniwha wrote:
| I was doing compiler design back in the late 70s/early
| 80s, and writing your own compiler generator was quite
| common at the time - that only changed when things like
| bison/yacc became available, which not only did an
| excellent job but also allowed one to hack on them and/or
| their output (for example, my Algol68 compiler needed to be
| able to decide how to resolve some
| shift/reduce conflicts on the fly ... that's how you do
| changeable operator precedence)
| wglb wrote:
| > To be fair, the first edition of the book is from 1986.
| It looks like the 2nd edition (2006) adds 3 chapters on
| data flow analysis, parallelization.
|
| So this criticism seems to say "well, it was written too
| early."
| kingaillas wrote:
| Or possibly, "needs to update more often than every 20
| years".
| dave_aiello wrote:
| Couldn't it be said that the first edition of the Dragon
| Book is an indication of how far ahead of their time Aho
| and Ullman were at that point?
| kingaillas wrote:
| Well, it was basically the only book out at that time. I
| remember a few others in the 1990s (Compiler Design in
| C, by Holub, and later, after I graduated, Modern Compiler
| Implementation in C, by Appel).
|
| Hey something cool I found while googling - Holub's book
| is available as a free download from the author!
| (https://holub.com/compiler/)
|
| The Dragon Book has flaws, but it is also a case of
| the pioneer catching all the arrows. But the OP asked
| about unsubstantiated criticisms of the book so I added
| in the one I remember from my courses - mostly too much
| on parsing, not enough on everything else. The 2nd
| compiler course I took didn't even use a textbook, Dragon
| book or otherwise; it was a handful of papers and a
| semester-long project on SSA and associated
| documentation.
| mistrial9 wrote:
| I took a professional development course in C from Alan
| Holub in Berkeley that I think was the base material from
| that book, ages ago. I can say that a roomful of
| talented people worked really, really hard to keep up the
| pace in that course. As with some other classes at UC
| Berkeley proper, surviving the course says something
| about you! It was all in portable C, and rigorous.
| ChuckMcM wrote:
| Agreed, of all the CS classes I took in college, the compilers
| class has been the most useful in my career. Still have the
| copy of the "dragon book" from that class.
| eyeundersand wrote:
| Prof. Ullman's opinion on the Middle East:
|
| https://web.archive.org/web/20120806135022/http://infolab.st...
| pthreads wrote:
| Can someone recommend a good online course in automata theory
| (or even a book)? I tried Ullman's course on edX (or Coursera,
| don't remember). But I found it a bit difficult to follow --
| seemed a bit dry and monotonous, to me at least.
|
| Had the same experience with Ullman's book (first edition at
| least). I have enough theoretical background in CS and am not
| averse to reading dense material.
| joshmarlow wrote:
| I'm assuming this was the Ullman book you took a look at -
| https://www.amazon.com/Introduction-Automata-Languages-Compu...
|
| If not, it's good but pretty dense. If you didn't like that,
| then I would recommend this one -
| https://www.amazon.com/Introduction-Theory-Computation-Micha...
|
| Sipser has a lot less notation and more English explanations of
| the concepts. I picked it up and read most of it after
| graduating - it's pretty easy to follow (though if I recall, I
| think some of the terminology around Turing complete languages
| differed slightly from the Ullman text).
| llimllib wrote:
| I love Sipser; it's one of the few textbooks I kept after
| college
| brudgers wrote:
| My interest in the topic carried me through Ullman's Coursera
| course... the second time, back when such things were free.
|
| Which is only relevant in the sense that what changes is our
| relationship to the material over time. Our interests evolve.
| Same with our expectations.
|
| Ullman wrote the book. It's hard to do better than that in many
| of the ways that matter.
| ricklamers wrote:
| I can highly recommend
| https://youtube.com/playlist?list=PLbtzT1TYeoMjNOGEiaRmm_vMI...
|
| I've deeply enjoyed his videos, and after the course in which we
| covered the material, regexes were all of a sudden extremely
| obvious. I benefit from it almost daily.
|
| Accompanying book I would recommend: Introduction to the Theory
| of Computation by Michael Sipser.
|
| Credits go to TU Delft professors for architecting the course.
| [deleted]
| lordleft wrote:
| Aho was my CS Theory professor at undergrad. He was incredibly
| polite and, in my opinion, quite compassionate towards his
| beleaguered CS students. Cheers to him and Ullman.
| sebastian_z wrote:
| Agreed! (as a former MS student)
| vlads wrote:
| Same. Aho was one of the nicest professors of my undergrad
| studies. He was also one of the few who actually taught well,
| and who placed a heavy emphasis on understanding the basics and
| then being able to build on top of that.
| markus_zhang wrote:
| Contrary to the opinion of many HN posters, I actually believe
| the front-end (lexing, parsing, semantic analysis) is more
| important to most software engineers.
|
| Consider: You don't get to work on the back-end for most of the
| careers out there. But lexing and parsing are very valuable if
| you want to parse something, say a configuration file. These
| kinds of things are more often seen in day-to-day work than code
| generation and optimization.
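|
| Even a quick, throwaway lexer goes a long way for that kind of
| job. A sketch in Python, for a made-up "key = value" config
| syntax (a real, established format should of course use an
| existing parser instead):
|
|     import re
|
|     TOKEN_SPEC = [
|         ('NUMBER',  r'\d+'),
|         ('IDENT',   r'[A-Za-z_][\w.]*'),
|         ('STRING',  r'"[^"]*"'),
|         ('EQUALS',  r'='),
|         ('SKIP',    r'[ \t]+|#[^\n]*'),  # spaces and comments
|         ('NEWLINE', r'\n'),
|     ]
|     MASTER = re.compile(
|         '|'.join(f'(?P<{n}>{p})' for n, p in TOKEN_SPEC))
|
|     def tokenize(text):
|         for m in MASTER.finditer(text):
|             if m.lastgroup not in ('SKIP', 'NEWLINE'):
|                 yield m.lastgroup, m.group()
|
|     print(list(tokenize('timeout = 30  # seconds\nname = "db"\n')))
|     # [('IDENT', 'timeout'), ('EQUALS', '='), ('NUMBER', '30'),
|     #  ('IDENT', 'name'), ('EQUALS', '='), ('STRING', '"db"')]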
| mhh__ wrote:
| > front-end (lexing, parsing, semantic analysis
|
| lexing, parsing, _SEMANTIC ANALYSIS_ (Emphasis added!)
| cjohansson wrote:
| Aho and Ullman have a great book about Parsing, Translating and
| Compiling that I'm currently reading. It's not an either-or
| thing - you can learn about both, and they are related to each
| other.
| jonstewart wrote:
| Agreed. Lexing and parsing might not be the most critical
| aspects of a compiler these days, but they're more generally
| useful.
| chrisseaton wrote:
| I think they should split university courses - a course about
| parsing (no need to even mention compilers) and a course about
| compilers that starts from an AST. Make the former a
| requirement and the latter an elective.
| kingaillas wrote:
| My university did that... sort of. There was a required
| "Theory of Computation" course using the Ullman & Hopcroft
| textbook "Introduction to Automata Theory, Languages, and
| Computation". A newer textbook could also be Sipser's
| "Introduction to the Theory of Computation".
| cestith wrote:
| Under 40 days ago, someone posted to HN a SIGPLAN story about
| teaching compilers from the final phase to the first phase of
| translation.
|
| https://news.ycombinator.com/item?id=26237368
|
| It gives you an AST you can assume is valid and has you do
| the work to rename variables and methods. Then it gives you
| an AST and has you develop your target code (in the case of
| the specific class, LLVM IR). Then the AST is checked for
| valid semantics. Then the course has you check a source
| program for syntactic validity and generate an AST.
|
| It gives people the advantage of understanding what the AST
| is for on a deeper level before deciding how to represent it.
| I think this sort of class and a separate parsing class could
| be stellar together.
| Koshkin wrote:
| As a side note, "lexing" is jargon that has never been in use
| by authors of books on compilers.
| mhh__ wrote:
| The people who write compilers people actually use say lexing
| quite regularly.
| chrisseaton wrote:
| It's just short for lexically analysing. The Dragon book has
| almost a hundred pages on lexical analysis.
| Turing_Machine wrote:
| "Jargon", perhaps. "Never been in use by authors"? Not so.
| The current "international economy edition" of Aho, et al
| uses "lexing" 29 times, according to the search function on
| Amazon. Flex & Bison by John Levine (O'Reilly and Associates)
| uses "lexing" 38 times. And so on.
| ufo wrote:
| I agree, partially. I also think that the front end of a
| compiler is a more broadly useful skill than the back end stuff.
| However, I believe that most people would be better served
| knowing how to work with a recursive descent parser instead of
| a bottom up parser generator, which is something that the
| dragon book spends a lot of time on. For this reason, one of my
| favorite compiler books is munificent's Crafting Interpreters.
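|
| To show how little machinery recursive descent needs, here is a
| small Python sketch for + and * over integers, roughly the shape
| Crafting Interpreters builds up (nested tuples standing in for
| AST classes):
|
|     import re
|
|     def tokenize(src):
|         return re.findall(r'\d+|[+*()]', src)
|
|     class Parser:
|         def __init__(self, tokens):
|             self.tokens, self.pos = tokens, 0
|
|         def peek(self):
|             return (self.tokens[self.pos]
|                     if self.pos < len(self.tokens) else None)
|
|         def eat(self, tok=None):
|             t = self.peek()
|             if t is None or (tok is not None and t != tok):
|                 raise SyntaxError(f"expected {tok!r}, got {t!r}")
|             self.pos += 1
|             return t
|
|         # expr -> term ('+' term)*
|         def expr(self):
|             node = self.term()
|             while self.peek() == '+':
|                 self.eat('+')
|                 node = ('+', node, self.term())
|             return node
|
|         # term -> factor ('*' factor)*
|         def term(self):
|             node = self.factor()
|             while self.peek() == '*':
|                 self.eat('*')
|                 node = ('*', node, self.factor())
|             return node
|
|         # factor -> NUMBER | '(' expr ')'
|         def factor(self):
|             if self.peek() == '(':
|                 self.eat('(')
|                 node = self.expr()
|                 self.eat(')')
|                 return node
|             return int(self.eat())
|
|     print(Parser(tokenize("2*(3+4)")).expr())
|     # -> ('*', 2, ('+', 3, 4))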
| markus_zhang wrote:
| I'm actually reading that book. I'm going back to university
| to study CS and I have a course about Java which is exactly
| the language he uses in the first part of the book. I think
| this would be a good exercise for learning the language as
| well as learning lexing/parsing.
|
| I was blocked by the BNF expression in section 6.1. Not
| exactly sure what I didn't understand, but overall I felt
| like I could only copy what he says but not grasp the
| underlying rules. Maybe I need some primer on BNF first.
| wuuwoo wrote:
| I wholeheartedly disagree. Compiler engineering is ultimately
| about taking data in one representation and converting it to
| another. In practice, much of the job is figuring out what data
| you need access to in order to perform a transformation and
| test that it is sound - the plumbing is usually uninteresting
| and straightforward.
|
| Code generation and optimization are just the task you're
| trying to complete; the actual implementation is not so
| different from most SWE grunt work. But many of the
| abstractions you build when trying to do it are useful to
| understand and use in practice everywhere you write software.
|
| If you've ever worked on an application that ingests data and
| spits out meaningful output, you've written a compiler. Failing
| to understand the shared abstractions is not good engineering.
| neonate wrote:
| https://archive.is/kamx6
| [deleted]
| cjohansson wrote:
| Reading one of their books now - great stuff. Compilers are
| fascinating; I wish I had discovered that earlier.
| jjice wrote:
| Good for them. The Dragon Book is a great compiler guide, despite
| some understandable criticisms I've heard about it not teaching
| bleeding-edge techniques at the time. I'm using it right now for
| my compiler course, and it still holds up to a solid extent. Just
| out of curiosity though, why wasn't Sethi (co-author) included?
| Was he just an author and not involved in as much research as Aho
| and Ullman?
| comeonseriously wrote:
| Maybe because, while HN is talking about The Dragon Book, the
| actual reference[0] refers to
| https://en.wikipedia.org/wiki/Principles_of_Compiler_Design
| which did not include Sethi (although there is a blurb about it
| in the paragraph).
|
| [0]https://awards.acm.org/about/2020-turing
| jjice wrote:
| Thank you for pointing that out. I guess I had tunnel vision
| because of my personal experience.
| gnulinux wrote:
| The original Dragon Book (sometimes referred to as the "Green
| Dragon Book") was written by Aho and Ullman.
| mhh__ wrote:
| The dragon book still has some gems, but "great" might be pushing
| it these days.
| pbhowmic wrote:
| Good question. Why not?
| garfieldnate wrote:
| Is it possible in the future that incremental compilation methods
| will become basic compiler knowledge? I think it would be an
| amazing step forward if the default compiler design would always
| result in IDE-inspectable code. Language Server Protocol is
| amazing, but it's only half the story.
| kazinator wrote:
| I've noticed, amusingly, that my 1988 edition of the book does not
| mention tail calls whatsoever, except in one early chapter where
| they come up in relation to the _implementation_ of some parsing.
|
| Index: [...] Tail recursion 52-53
|
| On pages 52-53, they show how, in a token matching procedure that
| takes no arguments, and returns nothing, the recursive self call
| can be _manually replaced_ by a backwards goto: "[w]e can speed
| up a program by replacing tail recursion by iteration. For a
| procedure without parameters, a tail-recursive call can be simply
| replaced by a jump to the beginning of the procedure."
|
| There is no mention of the possibility of compilers doing this
| for you, or that there are other reasons for tail calls, like
| eliminating stack growth. The rest of the book is mum about tail
| calls.
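|
| For reference, the transformation they describe, done by hand (a
| while loop standing in for the backwards goto, since this sketch
| is in Python rather than the book's pseudo-code):
|
|     def count_down_rec(n):
|         if n == 0:
|             return
|         print(n)
|         count_down_rec(n - 1)   # tail call: nothing happens after
|
|     def count_down_iter(n):
|         while True:             # "jump to the beginning"
|             if n == 0:
|                 return
|             print(n)
|             n = n - 1           # re-bind the argument, don't recurse
|
|     count_down_iter(3)          # same output as count_down_rec(3)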
| [deleted]
| amelius wrote:
| What's today's equivalent of the "Dragon book"? I guess it should
| cover new stuff such as JIT compilation, concurrent garbage
| collection, optimizations that are part of LLVM, and modern type
| systems.
| enriquto wrote:
| Maybe link to the source?
| https://awards.acm.org/about/2020-turing
| shadeslayer_ wrote:
| One of the enduring memories of bumbling through my first few
| months as an undergrad was coming upon a torn, well-read copy of
| the Dragon Book in the institute library. To date, it remains one
| of the best pieces of CS literature I have read.
|
| Seven years down the line, it feels strangely warm to read those
| names again.
| hn_throwaway_99 wrote:
| I've been out of college for a long time. I'm not a
| particularly nostalgic person, but there are a couple books I
| still have from college, and the Dragon Book is still one I
| like to read.
| pvarangot wrote:
| The Dragon book is great! That book opened my mind to how
| compilers "do it" when I was starting college.
|
| That being said, their book on Data Structures and Algorithms
| is severely underrated. It doesn't go into depth like more
| classical texts such as Cormen, and doesn't condense infinite
| knowledge into each paragraph like Knuth does, but it's a very
| nice read that goes from simple data structures all the way to
| secondary-memory structures, and it's really easy to follow.
| nabla9 wrote:
| I love the Dragon Book (Compilers: Principles, Techniques, and
| Tools by Aho, Sethi, Ullman).
|
| https://en.wikipedia.org/wiki/Compilers:_Principles,_Techniq...
| cambalache wrote:
| The prize didn't go to the React creators? Well, maybe next year.
| algo_trader wrote:
| Wikipedia:
|
| Jeffrey Ullman.... was the Ph.D. advisor of Sergey Brin
|
| The greats know how to recognize greats ;)
| rvz wrote:
| > Jeffrey Ullman.... was the Ph.D. advisor of Sergey Brin
|
| And?
|
| What is that supposed to mean?
| chrisseaton wrote:
| A PhD advisor is the academic who guides your PhD work.
| Sergey Brin is a co-founder of Google. They're pointing out
| how connected many people in computer science are.
| rvz wrote:
| > They're pointing out how connected many people in
| computer science.
|
| Your last sentence is mostly relevant to my point.
|
| The parent shouldn't be so easily awed, listing just
| _'one example'_ and then proclaiming that they're 'recognising
| greats'.
|
| Let's have more examples of other 'greats' connected to or
| even advised by Jeffrey Ullman rather than mentioning only
| one _'great'_. Even if it is the co-founder of Google.
| rurban wrote:
| It needed some time. Knuth won it in 1974, Hopcroft already in
| 1986. I wonder why it took so long.
___________________________________________________________________
(page generated 2021-03-31 23:00 UTC)