[HN Gopher] Dynamic programming is not black magic
       ___________________________________________________________________
        
       Dynamic programming is not black magic
        
       Author : qsantos
       Score  : 217 points
       Date   : 2024-01-14 09:42 UTC (13 hours ago)
        
 (HTM) web link (qsantos.fr)
 (TXT) w3m dump (qsantos.fr)
        
       | ctk_25 wrote:
        | Would recommend the DPV Algorithms book and Georgia Tech's
        | lectures on Udacity for graduate algorithms. The way to master
        | dynamic programming is practice, practice, practice solving
        | problems...
        
         | 62951413 wrote:
         | Consider https://www.amazon.com/Dynamic-Programming-Coding-
         | Interviews... for the most pedagogical approach I have seen.
        
       | ulucs wrote:
       | The name "Dynamic Programming" might seem out of place because it
       | doesn't come from programming as a discipline. In this case, it
       | actually refers to something like optimization, similar to linear
       | programming. Dynamic programming is basically a method you can
       | use to solve decision problems with discrete time, i.e picking
       | the optimal sequence {a_t} in order to maximize \sum_t u_t(a_t)
       | (plus constraints). The "dynamic programming" is defining a value
       | function V* where V*(t) = max_{a_t}{ u_t(a_t) + V*(t-1) } which
       | greatly reduces the dimensionality of the optimization problem.
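        | 
        | As a rough Python sketch of that recursion (the action set
        | `actions`, the per-period utility `u(t, a)`, and the horizon `T`
        | are hypothetical placeholders here):
        | 
        |     def solve(actions, u, T):
        |         # V[t] = best total utility over periods 1..t
        |         V = [0.0] * (T + 1)
        |         plan = [None] * (T + 1)
        |         for t in range(1, T + 1):
        |             best = max(actions, key=lambda a: u(t, a) + V[t - 1])
        |             plan[t] = best
        |             V[t] = u(t, best) + V[t - 1]
        |         return V[T], plan[1:]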
        
         | glimshe wrote:
         | Most of the time I hear the term being used by other people
          | (not you :) ), I feel it's for showing off and looking smarter
         | than everybody else - "Look, I used dynamic programming to
         | solve that problem", when in fact they were just employing what
         | I'd consider the natural and intuitive approach for solving the
         | problem. There was nothing they actually "used", besides
         | happening to identify that the problem could be broken into
         | progressively smaller subproblems.
        
           | valval wrote:
           | Well aren't you cynical.
        
           | mk89 wrote:
           | > happening to identify that the problem could be broken into
           | progressively smaller subproblems.
           | 
            | Which is what dynamic programming is about. And no, not
            | everyone is capable of doing that, especially since not all
            | problems solved at work are Leetcode.
            | 
            | Sometimes people really have to spend hours or days to
            | understand how to split a problem. Only then can
            | optimizations be applied or become "obvious".
            | 
            | As usual, solutions are "so obvious" when someone has done
            | a lot of heavy lifting to simplify the problem, improve the
            | context, etc.
        
           | bcrosby95 wrote:
           | Do you feel similarly if someone says they used an iterative,
           | recursive, or greedy algorithm?
           | 
           | Dynamic programming is a whole chapter in most algorithms
           | books. It's not about showing off, it's the name of the
           | technique.
        
         | uoaei wrote:
         | If you go further back, "programming" describes it precisely,
         | and what we call "programming" today is "writing code" which
         | can be subdivided into different kinds of "programming" such as
         | functional, declarative, procedural, etc. But there's a lot
         | more that fits under that umbrella.
        
         | roenxi wrote:
         | In fact, the official line [0] on where the name comes from is
         | quite funny. I shall quote my favourite part:
         | 
         | > [Dynamic] also has a very interesting property as an
         | adjective, and that is it's impossible to use the word dynamic
         | in a pejorative sense. Try thinking of some combination that
         | will possibly give it a pejorative meaning. It's impossible.
         | Thus, I thought dynamic programming was a good name. It was
         | something not even a Congressman could object to.
         | 
         | [0] https://en.wikipedia.org/wiki/Dynamic_programming#History
        
           | smokel wrote:
           | Dynamic typing comes to mind ;)
        
             | kevindamm wrote:
             | Dynamic scoping ;)
        
               | tempodox wrote:
               | You cannot get any more pejorative. Mission accomplished!
        
           | mtlmtlmtlmtl wrote:
            | My dad told me a joke/story once about one of his old
            | psychologist colleagues (call him Dave) arguing with a
            | psychoanalyst* about some psychology thing.
           | 
           | After a while of the discussion going nowhere, the
           | psychoanalyst said something like this. "Not to worry, Dave,
           | I know you're dynamically minded enough that you'll
           | eventually learn to agree with me."
           | 
           | Dave was not pleased.
           | 
           | I guess that was more condescending than pejorative, but oh
           | well.
           | 
            | (Most modern clinical psychologists consider psychoanalysis
            | to be a pseudoscience. A small minority still practice it,
            | even clinically. They don't like each other very much.)
        
             | savolai wrote:
              | Care to offer evidence for the pseudoscience claim? All I
              | could find were debunkings of this claim as a myth and
              | outdated, and to my understanding research in the field
              | has gained momentum in recent decades. I'd love to learn
              | more about the debate, so any pointers are welcome.
             | 
             | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6020924/
             | 
             | https://cosmonautmag.com/2021/10/on-the-scientific-status-
             | of...
             | 
             | " Depending on where your philosophical allegiances lie,
             | Karl Popper might be a friend or an enemy to whatever
             | version of philosophy of science you accept.
             | Falsificationist approaches to science imply that
             | psychoanalysis makes untestable and unfalsifiable claims!
             | Popper made the very same claims about Marxist social
             | theory as well. There are two issues with this approach.
             | One, if falsificationism is true, then all social sciences,
             | and even some natural sciences, are pseudo-scientific. This
             | is an unacceptable conclusion since much of social science
             | is clearly science. Lots of cutting-edge physics whose
             | models do not lend themselves to controlled experimental
             | testing would also lose their status as science. That is
             | also an absurd conclusion. Newtonian mechanics, for
             | example, would never have been accepted as a theory if we
             | use Popper's standards of theory corroboration and
             | falsifiability. The theory offered better ways of
             | explaining phenomenon that previous theories were already
              | decent at predicting reliably. The process of legitimizing
              | Newtonian physics had nothing to do with testability. A
              | good theory of science does not rule out obvious cases of
              | scientific rigor."
        
               | User23 wrote:
               | The burden of proof lies on them that claim it's a "real"
               | science.
        
               | savolai wrote:
               | It appears to me the burden of proof lies on whoever
               | wants to draw the line on what "real" science is. Only
               | natural sciences, then?
        
               | mp05 wrote:
               | https://en.wikipedia.org/wiki/No_true_Scotsman
        
               | mtlmtlmtlmtl wrote:
                | I'm not a psychologist myself, and I know nothing about
                | modern _academic_ psychoanalysis. I've tried to read
                | some, but it was indecipherable to me.
               | 
               | Modern clinical psychoanalysis is a strange field. It
               | went out of vogue with the advent of behavioral and later
               | cognitive psychology in the latter half of the 20th
               | century, as well as psychiatry. Psychoanalysts tend to
               | believe their way is the only way, while more modern
               | psychologists are much more eclectic. They're often
               | extremely critical of the use of medication even in cases
               | where the evidence supporting their use is overwhelming,
               | like stimulants for ADHD.
               | 
               | Psychoanalysts have their own strange nomenclature that's
               | often incompatible with modern developments. So it's hard
               | for other psychologists to even talk to them at all.
               | 
               | Those are some of the reasons psychoanalysis is viewed as
               | pseudoscientific by modern clinical psychologists.
        
               | dahart wrote:
               | https://en.wikipedia.org/wiki/Psychoanalysis#Debate_over_
               | sta...
               | 
               | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5459228/
               | 
               | https://www.bbvaopenmind.com/en/science/research/psychoan
               | aly...
               | 
               | https://link.springer.com/referenceworkentry/10.1007/978-
               | 94-...
               | 
               | https://www.skeptic.org.uk/2021/07/psychoanalysis-
               | science-or...
               | 
               | https://philarchive.org/archive/FERIPA-6
               | 
               | That quote you included seems a little funny. It's not
               | making any direct or compelling argument in favor of
               | psychoanalysis being science, it's just saying well if
               | psychoanalysis isn't science then other disciplines
               | aren't science either, and since that's sometimes not
               | true, then Popper is wrong. It's a slippery argument with
               | some assumptions and big holes, and perhaps most
               | glaringly it is intentionally ignoring the magnitude or
               | degree of scientific experimentation, and attempting to
               | frame the issue as only binary: science or not science.
               | 
               | I'm sure there is some modern psychoanalysis that is
               | scientific, but OTOH it seems like the foundations of
               | psychoanalysis, especially Freud, are certainly
               | problematic from the perspective of science, right?
               | Psychoanalysis has a valid, earned reputation that may
               | take a very long time to fix, even if today's
               | practitioners are being careful and scientific. This
               | could be compared to chiropractic medicine - some of it
               | is valid medicine today, but it definitely came from non-
               | medical, non-scientific origins.
               | 
               | There appear to be at least 2 separate arguments going.
               | One is whether psychoanalysis _can_ be a science, and the
               | other is whether it actually was a science historically.
               | This is why both sides are somewhat right: proponents of
               | psychoanalysis argue that it can be scientific, which is
               | true, and opponents argue that it wasn't scientific in
               | origin and has a troubled past, which is also true.
        
               | savolai wrote:
               | The issue for me is that the popular claim against
               | psychoanalysis seems to be exactly that, binary.
               | 
               | Thanks for bringing nuance and providing quotes!
               | 
               | I've personally benefited tremendously from work with the
                | ego structure and from realizing I can strengthen my
               | will/ego capacities and learn to more and more discern
               | and deny excessive power from my superego - from past
               | learnt protective and restrictive impulses that no longer
               | serve a purpose. This understanding alone seems like a
               | treasure trove that keeps on giving year after year while
               | doing introspective work and self inquiry with others
               | interested in the work.
               | 
               | However I have little understanding how this fits in a
               | modern understanding of psychoanalysis or psychodynamic
               | psychotherapy.
               | 
               | What I do know is that CBT I did younger didn't seem to
               | have nearly sufficient explanatory powers to help me
               | personally. Most of the skills I was offered seemed more
               | or less trivial or perhaps were taught ineffectively.
               | 
               | Another thing that I would have needed was tapping into
               | the resources of my body in the context of therapy and
               | how that links to having capacity to work with myself.
                | CBT seemed obsessed with, um, cognition alone.
        
               | bakuninsbart wrote:
               | Interesting quote, since my mind immediately went to
               | Critical theory, one of the schools of philosophy
                | probably targeted by Popper.
               | 
               | Over the last 200 years, the definition of science has
               | been deliberately narrowed down a lot, mainly to combat
               | pseudo-science and misinformation, but I think partially
               | also to create an in-circle of academics. There are few
               | good definitions of science, and usually they are a bit
               | self-contradictory or insufficient to capture the
                | underlying _goal_ of science: the building of a corpus
                | of verified knowledge. Falsification is a very high
                | standard that cannot always be applied, but it is still
               | a good first test to filter out possible bullshit.
               | 
               | In deductive studies (formal sciences but also often in
               | other fields) we need to have acceptable axioms, and
               | rigorous deductions from these axioms. In inductive
               | studies, you need good data and valid methods to derive
               | meaning from that data. A lot of research actually falls
               | somewhere in-between, and we wouldn't want to outright
               | dismiss it, as many subjects would be too elusive to
               | study at all, but we still have a vested interest in
               | understanding them.
               | 
               | A lot of research, and I would count psycho-analysis in
               | there for the most part, kind of walks the line between
               | pseudo-science and proper deductive or inductive studies.
                | These fields definitely build a corpus of knowledge,
               | which often turns out to be true, but there are fewer
               | good measures of filtering the bullshit from the nuggets
               | of truth, and that substantially devalues the field. The
               | risk, not exclusive but especially pronounced, is that
               | acceptance of research becomes more a measure of
               | eloquence than depth of enquiry.
               | 
               | And fields can be collectively wrong, even in the more
               | rigorous fields that lend themselves to proper testing or
               | deductive reasoning: You mention Newtonian Physics, but
               | there are too many examples to enumerate: Quantum
               | superposition was extremely controversial when first
               | proposed, and poor Lobachevsky was ruthlessly mocked for
               | his perfectly valid development of hyperbolic geometry.
        
               | LudwigNagasena wrote:
               | IMO, modern psychology "walks the line between pseudo-
               | science and proper deductive or inductive studies".
               | Psychoanalysis at best sometimes stumbles in the right
               | direction.
        
               | CamperBob2 wrote:
               | As I understand it, the notion of falsifiability is
               | really more like, "Theory X is acceptable if it can be
               | shown to be false by the use of tools whose principles
               | meet the same standard of falsifiability, even if those
               | tools aren't currently available."
               | 
               | So a theory that can be tested only with later scientific
               | refinements -- say, by increasing measurement precision
               | beyond what's currently available -- is indeed eligible
               | to be classified as science. That criterion would allow
               | for Newtonian mechanics to be accepted in its time, and
               | for things like string theory to be accepted
               | provisionally in ours.
               | 
               | Basically, the ultimate failure of the Newtonian model
               | can't be used as an argument that it should never have
               | been considered valid science.
        
               | patrick451 wrote:
                | Rejecting an assertion because you don't like its
                | conclusion is pretty specious. I fully endorse the notion
                | that any practice which makes unfalsifiable claims is not
                | science. But that doesn't necessarily make such a
                | practice pseudoscience -- this is a false dichotomy.
                | There are disciplines which are neither scientific nor
                | pseudo-scientific.
        
               | mp05 wrote:
               | Well hang on, let me go find my logic book from college
               | to decipher...
        
               | raincom wrote:
                | The demarcation problem is unique to Popper's philosophy
                | of science. That problem doesn't exist in other
                | philosophies of science. The best attack on
                | psychoanalysis comes from the late philosopher of science
                | Adolf Grunbaum's "The Foundations of Psychoanalysis: A
                | Philosophical Critique".
        
             | ordu wrote:
             | _> After a while of the discussion going nowhere, the
             | psychoanalyst said something like this.  "Not to worry,
             | Dave, I know you're dynamically minded enough that you'll
             | eventually learn to agree with me."_
             | 
              | It looks like the psychoanalyst suffered from a
              | professional deformation. Not enough evidence to be sure,
              | but it is the main premise of psychoanalysis that the
              | psychoanalyst knows better than his "patient". A
              | psychoanalyst knows what his patient feels, thinks,
              | wishes, and so on. If the patient doesn't agree, then it
              | is "resistance", and overcoming it is the first-priority
              | task for a psychoanalyst.
             | 
             |  _> I guess that was more condescending than pejorative,
             | but oh well._
             | 
             | The whole statement is condescending, but not because of
             | "dynamically minded". It is "you'll learn to agree with me"
             | that does the trick.
        
           | qsantos wrote:
           | It _is_ a funny quote.
           | 
           | But I would still point out that this is the exact reason
           | "dynamic" does not help if you do not know about the history.
           | Since it can be applied to anything, it does not help you
           | trim down what it can refer to.
        
           | karmakaze wrote:
           | Scripting languages are called 'dynamic' sometimes
           | pejoratively. Dynamic microphones are also inferior in studio
           | recording situations. I could see a poor DP implementation
           | wasting memory being described negatively.
        
         | asimpletune wrote:
          | This is the correct definition of DP.
        
         | eru wrote:
         | Compare also 'Linear Programming'. Or the usage of a 'TV
         | programme' or a 'musical programme'.
        
         | cubefox wrote:
         | You forgot to define u and t.
        
         | layer8 wrote:
         | "Optimization" is similarly misleading (as someone who once
         | took a CS class on "optimization" expecting something very
         | different ;)).
        
           | LudwigNagasena wrote:
           | Optimisation is just function minimisation/maximisation. What
           | did you expect and how was it different?
        
             | hcs wrote:
             | Not speaking for the GP but I'd imagine "optimizing" for
             | performance. I had a professor who would get very irritated
             | when anyone mentioned "optimizing" a program for just this
             | reason, since you weren't finding the _best possible_
             | runtime.
        
               | LudwigNagasena wrote:
               | Oh, makes sense, performance optimisation didn't cross my
               | mind for some reason.
        
         | closeparen wrote:
         | It's interesting how _computing things_ - like optimization
         | problems - used to be much more dominant in terms of what
         | people thought about and did with computers. It feels like most
         | of the time we are just doing data storage and retrieval
         | combined with networking... some computations are embedded
         | within those things, but usually well encapsulated.
        
       | bombela wrote:
        | I really like how this article presents the problem recursively
        | first, then progressively adds caching, and finally reduces the
        | size of the cache to only what is necessary.
       | 
       | I have often made the mistake of trying to get to the dynamic
       | programming solution directly, and either got stuck or had to go
       | through heroic efforts to get it working. I think from now on, I
       | will force myself to go through the steps in order.
        
         | qsantos wrote:
          | From my own experience, being taught dynamic programming
          | straight away makes it more of a puzzle. By going through the
          | steps and explaining _why_ we are using a table, and connecting
          | the concept to caching, I feel like it makes much more sense.
        
       | gligorot wrote:
       | https://web.archive.org/web/20240114111200/https://qsantos.f...
       | 
       | Since it got the hug of death
        
         | wmil wrote:
         | Effective web caching is black magic.
        
         | qsantos wrote:
         | Thanks! I should have planned better for that before posting.
        
       | sethammons wrote:
       | Would it be wrong to just think "memoization" when you hear
       | "dynamic programming"? The missing part may be intelligently
       | breaking up the problem to use memoization?
        
         | ecshafer wrote:
          | Yes, it would be wrong in my book. First, an obvious
          | counterexample: memoization can be used outside of dynamic
          | programming. Conversely, you can do most dynamic programming
          | algorithms by storing the results in a table and searching the
          | table afterwards for the best answer. Memoization is basically
          | a strategy to speed up algorithms.
        
           | sk11001 wrote:
            | > First, an obvious counterexample: memoization can be used
            | outside of dynamic programming
           | 
            | This isn't relevant; the question is whether there are
            | dynamic programming solutions which do not involve
            | memoization.
        
             | eru wrote:
             | Well, dynamic programming also tells you when you can drop
             | your 'memoization' caches.
        
         | JohnKemeny wrote:
         | That's exactly how I teach dynamic programming. First solve it
         | recursively, then add memoization (I call that top-down).
         | 
          | Then notice that recursion and memoization have some overhead,
          | so construct the table bottom-up, remove the recursive call,
          | and voila: dynamic programming.
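          | 
          | With Fibonacci as the usual toy example, the three stages look
          | roughly like this in Python (just a sketch):
          | 
          |     from functools import lru_cache
          | 
          |     # 1. Plain recursion: correct but exponential
          |     def fib_rec(n):
          |         return n if n < 2 else fib_rec(n - 1) + fib_rec(n - 2)
          | 
          |     # 2. Top-down: same recursion plus memoization
          |     @lru_cache(maxsize=None)
          |     def fib_memo(n):
          |         return n if n < 2 else fib_memo(n - 1) + fib_memo(n - 2)
          | 
          |     # 3. Bottom-up: fill the "table" iteratively, no recursion
          |     def fib_iter(n):
          |         a, b = 0, 1
          |         for _ in range(n):
          |             a, b = b, a + b
          |         return a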
        
         | jltsiren wrote:
         | Memoization is a more general technique. It's often little more
         | than caching the results you happened to compute, in case you
         | need them again in the future.
         | 
         | Dynamic programming is systematic memoization. You solve
         | subproblems of increasing size, until you reach a solution to
          | the full problem. "Inductive algorithm" would be kind of an
          | appropriate term, as the typical dynamic programming algorithm
          | is effectively a proof by induction. Unfortunately, that term
          | already has other meanings.
        
           | deaddodo wrote:
           | And "dynamic programming" is an insanely general term that
           | leetcode enthusiasts have pigeonholed into meaning memoized
           | reduction. It can describe dynamic dispatch, code
           | morphing/rewriting, dynamic loading, etc.
           | 
           | I would argue "memoization", while still a bad term, is far
           | more specific and descriptive of what's happening than
           | "dynamic programming".
        
             | greyw wrote:
             | The term "Dynamic programming" comes from mathematics of
             | the 1940s before modern programming languages even existed.
             | It's about a certain way to do optimization. This is also
             | how it is used by "leetcode enthusiasts" so imo they are
             | using it correctly.
             | 
             | Nowadays, you could naturally confuse dynamic programming
             | with the things you have listed.
        
               | nihzm wrote:
               | Exactly! I have already linked it in another thread but
               | see this link below for the mathematical definition of
               | dynamic programming:
               | 
               | https://web.mit.edu/dimitrib/www/DP_Slides.pdf
        
               | deaddodo wrote:
               | > This is also how it is used by "leetcode enthusiasts"
               | so imo they are using it correctly.
               | 
                | I never said they used it incorrectly. In fact, quite
                | the opposite: I tacitly acknowledged the fact that it
                | was used _correctly_. I said they _pigeonholed_ a
                | _generalized phrase_ into a specific use case, e.g. a
                | word/phrase that can be used to describe many similar
                | but distinct concepts in the same field, as you yourself
                | acknowledge.
                | 
                | If you're going to take umbrage at someone's words, it's
                | better not to misunderstand them or misphrase them.
        
               | greyw wrote:
               | Sorry I was not clear enough here. I wanted to just add
               | some historical context.
               | 
                | My point is that dynamic programming was not
                | "pigeonholed" because of "leetcode enthusiasts"; rather,
                | they are just using the original meaning. Modern
                | programming languages referring to "dynamic" things in
                | various circumstances made the term confusing, so you
                | would have to blame modern programming languages rather
                | than "leetcode enthusiasts".
        
               | deaddodo wrote:
                | Sorry, _I_ wasn't clear enough. The term is perfectly
                | fine; it's simply too generalized. So if others want to
                | use other terms to better describe it, I support that, no
                | matter who was first. Otherwise we end up in another
                | "systems programming" situation.
               | 
                | My offhanded remark about "leetcode enthusiasts" is about
                | people trying to _strongarm/gatekeep_ a, frankly, _way
                | too generalized phrase_ to mean a very specific thing.
                | They can call it that all they want. It's a correct
                | phrase for that. Just don't get mad when someone else
                | refers to "code morphing" as "dynamic programming".
               | 
                | > Modern programming languages referring to "dynamic"
                | things in various circumstances made the term
                | confusing, so you would have to blame modern
                | programming languages rather than "leetcode
                | enthusiasts".
               | 
                | Modern programmers and engineers use _the word dictated
                | for the functionality by the language they communicate
                | in_. You're just further reinforcing my point that the
                | phrase is too generalized.
               | 
               | Or, to give you an analogy that might finally make it
               | click. It would be like if I called all "string metric"
               | problems "string programming".
        
           | owlstuffing wrote:
           | > Dynamic programming is systematic memoization.
           | 
            | Exactly! This should have been clarified in the OP.
        
         | kevindamm wrote:
         | There are dynamic programming solutions not based on
         | memoization. See for example finding the longest common
         | substring between two strings. Memoization will not help as
         | much because you only need the table's left and up cells once.
         | 
         | In general, if the problem's sub-problems have a lot of overlap
         | and the optimal subproblem must be part of the optimal overall
         | solution, you've got an opportunity for dynamic programming.
         | Saying memoization is the only kind of dynamic programming is
         | like saying hash tables are the only abstract data type.
        
           | recursive wrote:
           | The left of the up is the up of the left.
        
             | kevindamm wrote:
              | Yeah, I had a momentary misgiving when I hit send and
              | thought (well, the cell _is_ used twice), but in practice
              | it is still very different than what would be considered
              | memoization. You can make it work with just two rows at a
              | time, forgetting all the calculated values of the earlier
              | rows. You can also do it in one row if you work backwards
              | in each row. If anything, it's a cache with a very
              | specific eviction policy.
              | 
              | I had a whole other paragraph where I nearly convinced
              | myself it's the same here for LCS, so I'm just going to
              | stop myself here. If you look up any texts that describe
              | LCS, they don't refer to it as using memoization. Let's
              | not confuse things.
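              | 
              | Concretely, a rough sketch of the longest-common-substring
              | table with only two rows kept around:
              | 
              |     def lcsubstring(s, t):
              |         prev = [0] * (len(t) + 1)  # row i-1
              |         best = 0
              |         for i in range(1, len(s) + 1):
              |             curr = [0] * (len(t) + 1)
              |             for j in range(1, len(t) + 1):
              |                 if s[i - 1] == t[j - 1]:
              |                     curr[j] = prev[j - 1] + 1
              |                     best = max(best, curr[j])
              |             prev = curr
              |         return best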
        
               | n2d4 wrote:
                | > _If you look up any texts that describe LCS they
                | don't refer to it as using memoization._
               | 
               | I challenge that actually, several resources online
               | consider it memoization (I would too):
               | https://www.geeksforgeeks.org/longest-common-subsequence-
               | dp-...
        
               | kevindamm wrote:
               | I had in mind a Data Structures and Algorithms text like
               | what is used to teach comp sci courses with. But I'm not
               | here to denigrate the credibility of Geeks for Geeks. If
               | we're going to cite arbitrary web sites as sources, let's
               | refer to Wikipedia's definition:
               | 
               | """ An optimization technique used primarily to speed up
               | computer programs by storing the results of expensive
               | function calls to pure functions and returning the cached
               | result when the same inputs occur again. """
               | 
               | In LCS you are not saving the result of calling a
               | substring method for (str1, str2, i, j) or any similar
               | recursive method. You're keeping a constant-sized
               | association of index pairs and their accumulated score.
               | 
               | If we're getting technical, I would say that what LCS
               | uses is referred to as tabulation.
        
         | GuB-42 wrote:
          | In my approach to dynamic programming, memoization is step 2
          | of 3.
         | 
         | 1- find a recursive algorithm
         | 
         | 2- memoization
         | 
         | 3- make it iterative / bottom-up
         | 
         | 3b- if possible, optimize for space
         | 
          | Step 3 is the most characteristic part of dynamic programming.
          | If you stopped at step 2, it would still qualify, I think, but
          | it would not be as efficient as it could be.
         | 
         | Another way to think of step 3 would be: memoization is about
         | caching, is there a way to pre-fill the cache?
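          | 
          | For instance, with subset sum ("does any subset of nums add up
          | to target?"), steps 3 and 3b might end up looking like this
          | rough sketch:
          | 
          |     def subset_sum(nums, target):
          |         # reachable[s]: some subset seen so far sums to s
          |         reachable = [False] * (target + 1)
          |         reachable[0] = True
          |         for x in nums:
          |             # go downwards so each number is used at most once
          |             for s in range(target, x - 1, -1):
          |                 if reachable[s - x]:
          |                     reachable[s] = True
          |         return reachable[target]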
        
           | naet wrote:
           | Where I have the most trouble is usually going from recursive
           | to iterative. It's pretty easy for me to understand the
           | recursive + memoization, but when that ends up being taken
           | off the table due to various inefficiencies of recursion I
           | often get a little stuck.
        
           | cubefox wrote:
            | This makes it clear why dynamic programming is not easy. The
            | problem already occurs in step 1 -- we naturally tend to
            | think in loops, not recursion. So before you find a
            | recursive algorithm, you probably come up with a loop-based
            | algorithm first. Then you have to try to convert the loop
            | into recursion, then go back to loops that use caching. And
            | hope that the result is more efficient than the loop
            | algorithm you started with.
        
             | WJW wrote:
             | Speak for yourself, there are dozens (dozens!) of us for
             | which recursion is the more natural way to think about most
             | algorithms. Joking aside, DP is very straightforward in
             | more functional languages like Haskell where recursion is
             | the default way to do iteration anyway.
        
               | cubefox wrote:
               | And I think people don't like functional programming
               | languages because they don't like recursion. Even
               | cookbooks use loops!
        
       | akoboldfrying wrote:
       | It's good that the article points out that DP algorithms are
       | "just" clever ways to cache a recursion. Looking for a recursive
       | solution is certainly the best way to start out looking for a DP
       | solution, in my experience -- if you can find one, memoising it
       | is trivial and may give a big speedup (and may even be faster
       | than a "bottom-up" DP, since it only ever computes solutions that
       | we definitely need).
       | 
       | With enough practice, it's usually possible to come up with a
       | (correct but slow) recursive solution. When turning this into a
       | DP, it doesn't matter if there are large numbers of subproblems
        | in the call tree -- what's important is that there are a
        | relatively small number of _distinct_ subproblems. (Since
        | there's no point caching a result that's only ever needed one
        | time.) And that's where the difficulty tends to lie: Figuring
        | out how to partition the original problem into _few enough
        | distinct subproblems_.
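        | 
        | In Python, functools does the memoising for you once the
        | recursion is in place. A sketch with coin-change counting as a
        | stand-in problem (the names here are made up):
        | 
        |     from functools import lru_cache
        | 
        |     def count_ways(coins, amount):
        |         coins = tuple(coins)
        | 
        |         @lru_cache(maxsize=None)
        |         def ways(i, rest):
        |             # distinct subproblems: (coin index, remaining sum)
        |             if rest == 0:
        |                 return 1
        |             if rest < 0 or i == len(coins):
        |                 return 0
        |             # keep using coins[i], or move on to the next coin
        |             return ways(i, rest - coins[i]) + ways(i + 1, rest)
        | 
        |         return ways(0, amount)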
        
       | hit8run wrote:
       | Cannot establish database connection... It probably is...
        
       | ris58h wrote:
       | > Dynamic Programming is not Black Magic
       | 
       | But who said otherwise?
        
         | mikhailfranco wrote:
         | The No True Scots Straw Man?
        
       | croes wrote:
        | I hope the site is not an example of dynamic programming.
       | 
       | >Error establishing a database connection
        
         | alephnan wrote:
         | I was using Leetcode this week and realized how terrible the
         | software is.
         | 
          | The login flow was broken: it kept signing me out when I
          | tried to click "subscribe to premium". There was no way to
          | give them money, and I was stuck in a loop.
        
         | eru wrote:
          | Dynamic programming doesn't automatically mean you don't have
          | any bugs, or that things will be fast.
        
         | deaddodo wrote:
         | They definitely need to install a caching plugin. Wordpress
         | taking your site down, even on some subpar shared host, should
         | be a thing of the past by now.
        
           | qsantos wrote:
           | Absolutely. I did not bother so far, since I was mostly
           | writing for myself. But I should have before posting here; I
           | know what the hug of death means!
        
         | qsantos wrote:
         | Sorry about that. It looks like the MySQL database did not like
         | the sudden influx of traffic. It's back up!
        
       | bob1029 wrote:
       | If you are seeking black magic, dynamic optimality might be more
       | your speed.
        
       | wiseowise wrote:
       | Awaiting "but you'll never need it at your job" crowd.
        
         | jmull wrote:
         | Do people use it much in their jobs?
         | 
         | I can't remember ever actually using it other than for puzzle-
         | solving (like the excellent Advent of Code problems). I guess
         | it's pretty common to use leet-code type problems for
         | interviews, which maybe counts, but even there hopefully most
         | employers aren't clueless enough to give people DP problems.
         | 
         | I've done some stuff around generating
         | puzzles/levels/challenges for games that maybe could have used
         | it, but it seemed like problems it might be useful for never
         | came up for me... maybe because I found that you get better,
         | funner puzzles and can control the difficulty better if the
         | generator sort of "thinks like" a human solver/player would.
         | 
         | No doubt there are some edge cases, but how much are people
         | really using DP at a job?
        
       | rav wrote:
       | > And many common algorithms are actually just the application of
       | dynamic programming to specific problems, including omnipresent
       | path-finding algorithms such as Dijkstra's algorithm.
       | 
       | Dijkstra's algorithm is an application of dynamic programming? I
       | disagree. In dynamic programming, you tabulate the subproblems to
       | solve, with static interdependencies between them leading to
       | straightforward orders in which to solve the subproblems. In
       | Dijkstra's algorithm, you need to compute the shortest path from
       | s to each vertex, but the order in which you have to visit the
       | vertices is only discovered along the way using a priority queue,
        | so the subproblem interdependencies are not known until you
        | have actually solved the problem.
        
         | qsantos wrote:
         | I totally agree in that I use the same mental model. But, if
         | you look at it as "the shortest path must be the shortest path
         | through one of its neighbors", it can actually be classified as
         | a dynamic programming algorithm!
         | 
         | https://en.wikipedia.org/wiki/Dijkstra%27s_algorithm#Dynamic...
        
           | rav wrote:
           | In dynamic programming, the problem can be solved by solving
           | subproblems, and those subproblems are solved by solving
           | subsubproblems, and there is overlap between these
           | subproblems. This allows us to solve DP problems in two ways,
           | either by recursion with memoization or by iterative table
           | filling.
           | 
           | Although the shortest path problem has some kind of "optimal
           | substructure", the recursive memoized approach doesn't work
           | because there's no set order in which the subproblems can be
           | solved. Instead, you need to compute the shortest paths in
           | order of shortest path length, and the shortest path lengths
           | aren't given ahead of time - those are exactly what
           | Dijkstra's algorithm computes!
           | 
           | It's not enough to call it dynamic programming that "the
           | shortest path must be the shortest path through one of its
           | neighbors", because this fact doesn't immediately lead to an
           | acyclic subproblem dependency graph.
           | 
           | Shortest path on an acyclic graph, and longest path on an
           | acyclic graph, are two problems that can be solved with
            | dynamic programming - but Dijkstra's algorithm solves
           | shortest paths on a different class of graphs that doesn't
           | lend itself to DP.
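            | 
            | For the acyclic case, the subproblem order is just a
            | topological order. A sketch (assuming `topo` is a
            | topological order of the vertices and `graph[u]` lists
            | outgoing `(v, w)` edges):
            | 
            |     import math
            | 
            |     def dag_shortest_paths(topo, graph, source):
            |         # dist[v]: shortest distance from source to v
            |         dist = {v: math.inf for v in topo}
            |         dist[source] = 0
            |         for u in topo:  # subproblems in dependency order
            |             for v, w in graph[u]:
            |                 dist[v] = min(dist[v], dist[u] + w)
            |         return dist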
        
             | kj4211cash wrote:
             | I'm a bit lost in this terminology. But coming from the
             | Operations Research perspective that gave Dynamic
             | Programming its name, Dijkstra's Algorithm is very clearly
             | Dynamic Programming. It's Forward Dynamic Programming as
             | opposed to the much more common Backward Dynamic
             | Programming, if that helps any.
        
         | nihzm wrote:
         | Not quite, actually Dijkstra is a special case of what are
         | called label correcting methods / algorithms (LCA) [1], which
         | come directly from applying dynamic programming to the shortest
         | path problem. LCA apply to all finite graphs with non-negative
          | cycles, and to be specific, Dijkstra's algorithm is an LCA
          | where in step one the node to be removed is chosen according
          | to min_{i \in OPEN} d_i; see [1] for more.
         | 
         | [1]: https://web.mit.edu/dimitrib/www/DP_Slides.pdf (see
         | lecture 3)
        
         | vladimirralev wrote:
        | Dijkstra is definitely dynamic programming, no doubt about
        | it: you are still computing a new state from neighbouring
        | state nodes while ignoring a substantial number of non-
        | neighbouring states.
         | 
         | You just have to accept the more abstract definition of "state"
         | where the state encodes subsets of the graph. The states in
         | Dijkstra are represented by subgraphs and distances that
         | incrementally include more nodes leaving us with a path of
         | states to follow in some order.
         | 
         | It's not much different from the traveling salesman or viterbi
         | in that sense. You come up with some topological order of
         | states in abstract state-space and then follow that topological
         | order to compute the new state based on the adjacent previously
         | computed states and never look back to update already finalised
         | states.
         | 
         | With this more abstract point of view, it's clear Dijsktra is
         | dynamic programming.
         | 
         | There is a whole field of graph dynamic programming problems.
         | And the closely related Markov chain problems.
        
           | rav wrote:
           | > You come up with some topological order of states in
           | abstract state-space
           | 
           | If the state space is "subsets of V", then it's exponentially
           | larger than the set of states actually visited in Dijkstra's
           | algorithm. Dijkstra's algorithm has an invariant that the
           | vertices visited have a smaller distance than the vertices
           | not visited. For vertex sets that adhere to this invariant,
           | there's clearly a topological order, but for arbitrary
           | subsets of V I don't see how this topological order would be
           | defined.
           | 
           | I guess my gripe is that in my view, the framework of dynamic
           | programming is not a useful way to analyze algorithms that
           | explore a small set of states in an exponentially larger
           | state space.
        
       | charlieyu1 wrote:
        | I don't find DP difficult enough to be called black magic. Tree
        | algorithms, on the other hand, are much harder.
        
         | qsantos wrote:
         | Do you mean tree rebalancing algorithms? I have to agree with
         | this. AVL tree insertion is fine enough, but it gets hairy when
         | you get to deletion. And Red-Black trees...
        
           | charlieyu1 wrote:
            | There are more than that; even finding the diameter of a
            | tree is pretty nasty.
        
             | qsantos wrote:
             | Ah yes, but I use Rust, I cannot go back up a tree (-:
        
               | KRAKRISMOTT wrote:
               | No graphs either :(
        
             | n2d4 wrote:
              | It's only nasty to find the diameter of general graphs,
              | right? If you know you have a tree, just root it at a
              | random vertex and recursively compute `max(largest
              | diameter of children, sum of the two biggest heights of
              | children)`.
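              | 
              | A sketch of that (vertices 0..n-1, adjacency lists in
              | `adj`, recursion depth permitting):
              | 
              |     def tree_diameter(adj, root=0):
              |         def dfs(u, parent):
              |             # (height of u's subtree, best diameter in it)
              |             tops, best = [0, 0], 0
              |             for v in adj[u]:
              |                 if v == parent:
              |                     continue
              |                 h, d = dfs(v, u)
              |                 best = max(best, d)
              |                 tops = sorted(tops + [h + 1])[-2:]
              |             return tops[-1], max(best, tops[0] + tops[1])
              |         return dfs(root, -1)[1]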
        
       | Der_Einzige wrote:
        | All problems solved using DP can also be solved using global
        | optimization techniques. A much easier one to implement in an
        | interview than most DP solutions is a simple genetic algorithm.
        | Yes, GAs really do solve any silly interview problem and even
        | have several advantages to boot (e.g. partial solutions if
        | stopped before they're finished).
        | 
        | Anyone interviewing you who won't give you a pass for solving
        | the leetcode problem the "wrong way" without very strong
        | justifications is a fool who themselves shouldn't be working in
        | tech.
        
         | qsantos wrote:
          | Did you encounter practical problems where genetic algorithms
          | work well? So far, the only serious use I made of them was
          | for CodinGame's Mars Lander optimization problem (and they
          | work pretty well there!).
        
         | anon291 wrote:
         | Genetic algorithms are not deterministic, whereas a DP solution
         | is going to be deterministic with very easily calculable bounds
         | on execution time. Comparing the two as you do, in my opinion,
         | shows a fundamental gap in algorithms understanding.
        
       | hxypqr wrote:
        | Most of dynamic programming is just a method of reducing
        | computational complexity by changing the noun objects in first-
        | order logic (or second-order logic, in the advanced version) to
        | walk through the answers of unfinished tasks using completed
        | tasks. Only in very few cases is it necessary to extract and
        | match the completed parts from the unfinished objects in the
        | above process, which often involves optimizing a function
        | f(A,B). However, most of the time, this process is futile.
        
       | marcodiego wrote:
       | I had a very good algorithms professor. He studied at UCLA. His
       | classes about dynamic programming were superb. He started with a
       | problem for which the naive solution was simple and had
       | exponential complexity. Then he broke the problem into smaller
       | problems and reduced the complexity to something polynomial. Then
       | he applied memoisation and the complexity dropped to linear.
       | 
       | I really would like to remember the problems he used.
        
         | Horffupolde wrote:
         | 1. *Fibonacci Sequence*: The classic example where the naive
         | recursive solution has exponential complexity. By storing
         | previously computed values (memoization), the complexity can be
         | reduced to linear.
         | 
         | 2. *Coin Change Problem*: Given different denominations of
         | coins and a total amount, finding the number of ways to make
         | the change. The naive approach is exponential, but dynamic
         | programming reduces it to polynomial complexity.
         | 
         | 3. *Knapsack Problem*: Particularly the 0/1 Knapsack problem,
         | where items with given weights and values must be placed in a
         | knapsack of a fixed capacity to maximize total value. The naive
         | exponential solution can be optimized using dynamic
         | programming.
         | 
         | 4. *Matrix Chain Multiplication*: Determining the most
         | efficient way to multiply a chain of matrices. The problem can
         | be solved in exponential time using a naive approach but
         | becomes much more efficient with dynamic programming.
         | 
         | 5. *Longest Common Subsequence*: Finding the longest
         | subsequence common to two sequences. A classic dynamic
         | programming problem that can be solved in polynomial time.
         | 
         | 6. *Longest Increasing Subsequence*: Finding the length of the
         | longest subsequence of a given sequence such that all elements
         | of the subsequence are sorted in increasing order.
         | 
         | 7. *Shortest Path Problems*: Like the Floyd-Warshall algorithm
         | for finding the shortest paths in a weighted graph with
         | positive or negative edge weights.
         | 
         | 8. *Edit Distance (Levenshtein Distance)*: Finding the minimum
         | number of edits (insertions, deletions, substitutions) needed
         | to change one word into another.
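          | 
          | As a rough Python sketch of (8):
          | 
          |     def edit_distance(a, b):
          |         # dist[i][j]: edits to turn a[:i] into b[:j]
          |         dist = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
          |         for i in range(len(a) + 1):
          |             dist[i][0] = i
          |         for j in range(len(b) + 1):
          |             dist[0][j] = j
          |         for i in range(1, len(a) + 1):
          |             for j in range(1, len(b) + 1):
          |                 sub = dist[i - 1][j - 1]
          |                 if a[i - 1] != b[j - 1]:
          |                     sub += 1
          |                 dist[i][j] = min(dist[i - 1][j] + 1,  # delete
          |                                  dist[i][j - 1] + 1,  # insert
          |                                  sub)  # substitute (or keep)
          |         return dist[len(a)][len(b)]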
        
           | JohnKemeny wrote:
           | I just want to add
           | 
            | 9. _Longest Path in DAGs_: Find a longest path in a
            | directed acyclic graph.
            | 
            | 10. _Weighted Independent Set on a Path_: Given an array of
            | integers, compute the maximum sum of numbers, provided that
            | you may not take two consecutive cells.
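            | 
            | A sketch of (10), keeping only two running values:
            | 
            |     def max_nonadjacent_sum(nums):
            |         take, skip = 0, 0
            |         for x in nums:
            |             # take x (so the previous cell must be skipped),
            |             # or skip x and keep the better of the old two
            |             take, skip = skip + x, max(take, skip)
            |         return max(take, skip)  # empty selection counts as 0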
        
           | manvillej wrote:
            | I am a big fan of the Knight Dialer problem. I wrote an
            | article on it and how you can actually use graph theory to
            | reduce it to an incredibly efficient 4x4 matrix. It was
            | super fun.
        
         | qsantos wrote:
         | I have listed a few in the article. It's pretty common to see
          | them in lectures and practical exercises.
         | 
         | - longest common subsequence
         | 
         | - longest common substring
         | 
          | - line wrap
         | 
         | - subset sum
         | 
         | - partition
         | 
         | - knapsack
         | 
         | You can also have a look at [1] for more ideas.
         | 
         | [1]
         | https://en.wikipedia.org/wiki/Dynamic_programming#Algorithms...
        
         | jakeinspace wrote:
         | In addition to the suggested problems others posted, perhaps it
         | was a scheduling problem? Like, for example, scheduling N
         | overlapping events in time, like course schedules or processes
         | for a CPU. Generally this would be done to optimize something
         | like throughput - I believe that when you start adding special
         | requirements, like "These 2 courses must be taken together",
         | then things get much more complicated and intractable compared
         | to plain-old dynamic programming.
        
         | colordrops wrote:
         | Did he study under Kang at UCLA?
        
           | marcodiego wrote:
            | I don't know, but I know he studied there in the late '90s
            | or early 2000s.
        
       | dahart wrote:
       | > "Bootstrap" is an imaged expression to point to the absurdity
       | and impossibility of a task
       | 
       | This is an archaic definition that I didn't know, and it's fairly
       | interesting. At the bottom of the opening section on the history
       | of "bootstrap" is this comment:
       | 
       | "Critics have observed that the phrase is used to portray unfair
       | situations as far more meritocratic than they really
       | are.[8][9][7] A 2009 study found that 77% of Americans believe
       | that wealth is often the result of hard work.[10] Various studies
       | have found that the main predictor of future wealth is not IQ or
       | hard work, but initial wealth.[7][11]"
       | 
       | Interesting (to me, anyway) because it used "meritocratic" which
       | itself is a word that was coined with a somewhat different
       | meaning than it has today. Meritocracy was originally used to
       | point out that merit itself is an outcome of social advantages,
       | not inherent skill.
       | 
       | https://reagle.org/joseph/pelican/social/the-surprising-soci...
        
         | wibblewobble125 wrote:
         | Amusingly, merit's other meaning is the verb "to be worthy."
        
       | Futurebot wrote:
        | I like the short formulation to explain it: "Dynamic
        | Programming is approximately recursion + memoization +
        | guessing".
        
       | coolThingsFirst wrote:
        | It's black magic because people don't want to admit they don't
        | understand recursion, lol. Fibonacci has always been a bad
        | example to teach recursion. That and Towers of Hanoi are the
        | greatest failures in teaching CS.
        
       | flobosg wrote:
       | One cool application of dynamic programming is the pairwise
       | alignment of nucleotide/protein sequences:
       | 
       | * https://en.wikipedia.org/wiki/Sequence_alignment
       | 
       | * https://en.wikipedia.org/wiki/Needleman%E2%80%93Wunsch_algor...
       | 
       | * https://en.wikipedia.org/wiki/Smith%E2%80%93Waterman_algorit...
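        | 
        | A bare-bones Needleman-Wunsch score computation as a sketch
        | (the match/mismatch/gap scores below are arbitrary
        | placeholders):
        | 
        |     def nw_score(a, b, match=1, mismatch=-1, gap=-2):
        |         # score[i][j]: best global alignment of a[:i] and b[:j]
        |         n, m = len(a), len(b)
        |         score = [[0] * (m + 1) for _ in range(n + 1)]
        |         for i in range(1, n + 1):
        |             score[i][0] = i * gap
        |         for j in range(1, m + 1):
        |             score[0][j] = j * gap
        |         for i in range(1, n + 1):
        |             for j in range(1, m + 1):
        |                 d = match if a[i - 1] == b[j - 1] else mismatch
        |                 score[i][j] = max(
        |                     score[i - 1][j - 1] + d,  # align the pair
        |                     score[i - 1][j] + gap,    # gap in b
        |                     score[i][j - 1] + gap,    # gap in a
        |                 )
        |         return score[n][m]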
        
       | RespectYourself wrote:
       | Good rant about BS terminology. He should've included
       | "responsive" Web pages.
        
       ___________________________________________________________________
       (page generated 2024-01-14 23:00 UTC)