[HN Gopher] Redundancy vs. dependencies: which is worse? (2008)
       ___________________________________________________________________
        
       Redundancy vs. dependencies: which is worse? (2008)
        
       Author : luu
       Score  : 35 points
       Date   : 2023-01-09 23:17 UTC (23 hours ago)
        
 (HTM) web link (yosefk.com)
 (TXT) w3m dump (yosefk.com)
        
       | feoren wrote:
       | Conspicuously absent from this entire article is the word
       | "abstraction".
       | 
       | Oh, sorry, I just said a bad word, didn't I? We're not allowed to
        | use the _a_-word around these parts, or people will think we're
        | being _enterprisey_ and chase us away with pitchforks. What am
       | I, some kinda _enterprise_ programmer?
       | 
       | I can see why doing an "extract to interface" IDE command in the
       | messy situation TFA describes would (rightly!) get this reaction.
       | Of course that's just adding more needless complexity to an
       | already needlessly complex situation. But that's not what good
       | abstraction looks like. For some reason, everyone seems to get it
       | in their head that abstraction is supposed to be some sort of
       | list of features you support. So you take each public class Foo
       | and extract them to an interface IFoo and you have your list of
       | features. Abstraction!
       | 
        | No. Good abstraction describes your _needs_. That's it! If your
       | module needs to do something, but doesn't want to be responsible
       | for the exact implementation of that thing, then you define an
       | abstraction which describes that need. It will be implemented by
       | whoever needs your module.
       | 
       | > You can choose between having modules A and B using a module C
       | doing something, or have them do it themselves. What's your call?
       | 
       | A defines abstractions describing what it needs to do its job,
       | and so does B. Since A and B have different responsibilities,
       | these are two separate pieces of code, even if they look
       | basically the same right now.
       | 
       | Now you have lots of options, and you can vary between them
       | easily depending on what makes sense:
       | 
       | - A and B can implement the abstractions C needs, and then use C.
       | So A -> C and B -> C. ("->" meaning "depends on"). The code
       | exists in isolated corners of A and B so it can be moved easily
       | later.
       | 
       | - C can implement A's and B's abstractions, so C -> A and C -> B.
       | The code exists in isolated corners of C, so it can be moved
       | easily later.
       | 
        | - Leave A, B, and C _completely independent_ of one another.
        | Write a small module _ac_, which provides implementations for A
        | in terms of C, and a small module _bc_, which provides
        | implementations for B in terms of C. Now you have _no
        | dependencies whatsoever_ between your major modules, and you've
        | introduced small adapter modules that you can throw away later
        | (see the sketch after this list).
       | 
       | - Leave A, B, and C _completely independent_ of one another. Your
       | end application handles stitching them together (like above, but
        | _ac_ and _bc_ are just sitting inside your end application).
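        | 
        | A minimal sketch of the adapter option in Go, with invented
        | names (Saver, MemStore, acAdapter) - one possible wiring, not
        | the only one. A declares only the interface it needs, C stays
        | a plain concrete type that knows nothing about A, and a tiny
        | adapter bridges the two:
        | 
        |     package main
        | 
        |     import "fmt"
        | 
        |     // Module A: declares only what it needs.
        |     type Saver interface {
        |         Save(s string) error
        |     }
        | 
        |     // A's real work depends on the abstraction, not on C.
        |     func Publish(dst Saver, msg string) error {
        |         return dst.Save("published: " + msg)
        |     }
        | 
        |     // Module C: a concrete utility, unaware of A.
        |     type MemStore struct{ items []string }
        | 
        |     func (m *MemStore) Put(s string) {
        |         m.items = append(m.items, s)
        |     }
        | 
        |     // Adapter "ac": implements A's need in terms of C.
        |     type acAdapter struct{ store *MemStore }
        | 
        |     func (a acAdapter) Save(s string) error {
        |         a.store.Put(s)
        |         return nil
        |     }
        | 
        |     func main() {
        |         c := &MemStore{}
        |         if err := Publish(acAdapter{c}, "hi"); err != nil {
        |             fmt.Println("save failed:", err)
        |         }
        |         fmt.Println(c.items) // [published: hi]
        |     }
        | 
        | A _bc_ adapter for B would look the same; if C ever gets
        | swapped out, only the adapters change.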
       | 
       | > Rumors tell that [XParam's] original host project uses <5% of
       | its features.
       | 
       | Let the marketing people think in terms of "features". You should
       | be thinking in terms of _needs_. You can spend months
        | implementing features that you don't need. It's much harder to
        | implement a need you don't need. And that's what abstractions
        | are: a description of what you need to do your job.
        
         | Xeoncross wrote:
         | "The bigger the interface, the weaker the abstraction." -
         | https://www.youtube.com/watch?v=PAAkCSZUG1c&t=5m17s
         | 
          | Abstractions are good if 1) they can't be any smaller and 2)
          | they cover well-understood processes that warrant them.
        
           | feoren wrote:
           | That's basically right, but not really for the right reasons.
           | What exactly does "smaller" mean? Rob Pike is saying they
           | have few methods. We're kinda getting to the right ideas
            | without understanding why. It sounds like you (and Rob Pike,
            | from the bit I listened to) are still thinking of
            | abstractions as a _feature list_, and you're saying: instead
           | of having one list of lots of features, have lots of lists of
           | one feature.
           | 
           | Okay, but ... we're still missing the point. It's _not about
            | features_. It's about _needs_.
           | 
           | How do you know whether an abstraction can be "smaller"? What
           | does "smaller" mean? Smaller means: more generic and/or
           | easier to implement. That is _not the same_ as having fewer
           | methods, even if that is often the result. An abstraction can
            | be smaller if you can make it more generic, and/or easier to
           | implement, and have it still do everything you need it to do.
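            | 
            | A small Go illustration of that distinction (CountLines
            | and the *os.File variant are invented for the example):
            | accepting io.Reader instead of *os.File doesn't shrink a
            | feature list, it makes the need more generic and far
            | easier to satisfy, while still doing everything the
            | caller requires.
            | 
            |     package main
            | 
            |     import (
            |         "bufio"
            |         "fmt"
            |         "io"
            |         "strings"
            |     )
            | 
            |     // The "big" version would demand a whole *os.File,
            |     // even though it only ever reads:
            |     //   func CountLines(f *os.File) (int, error)
            | 
            |     // The "smaller" abstraction: io.Reader is more
            |     // generic and easier to implement, yet sufficient.
            |     func CountLines(r io.Reader) (int, error) {
            |         n := 0
            |         sc := bufio.NewScanner(r)
            |         for sc.Scan() {
            |             n++
            |         }
            |         return n, sc.Err()
            |     }
            | 
            |     func main() {
            |         // Any Reader works: files, sockets, strings.
            |         n, _ := CountLines(strings.NewReader("a\nb\nc\n"))
            |         fmt.Println(n) // 3
            |     }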
        
             | Xeoncross wrote:
              | An interface must be as small as possible to fulfill its
              | purpose, no smaller and no larger. It's not about
             | implementation, it's about the mental model of what is
             | expected and why.
             | 
             | Another way to say this is that an interface must be
              | perfectly sized, but perhaps that is too abstract to be
             | useful.
             | 
              | Rob Pike mentioned few methods because that is the only
              | thing interfaces mean in Go - he might have used different
              | terminology in another context, or if speaking about
              | abstraction of systems rather than about interfaces alone.
        
         | macintux wrote:
         | > Oh, sorry, I just said a bad word, didn't I? We're not
         | allowed to use the a-word around these parts, or people will
          | think we're being enterprisey and chase us away with
         | pitchforks. What am I, some kinda enterprise programmer?
         | 
         | What HN discussions have you been reading? I nearly stopped at
         | this preemptive snark.
        
           | JonChesterfield wrote:
           | That would be the style of the post under discussion
        
           | feoren wrote:
           | Perhaps it's just a vocal minority, but I see a lot of
           | backlash against anything that smells like large-scale, top-
           | down, enterprise design. People see a concept implemented
           | badly, so they get it in their heads that the original
           | concept was worthless. This is one major underlying force for
           | things like the NoSQL movement and other cyclic fads.
           | 
           | What HN discussions have I been reading? Let's look back:
           | 
           | "Abstraction is Expensive"
           | [https://news.ycombinator.com/item?id=33895178]
           | 
           | > I wish it was a meme [to hate abstraction]. The things
           | people complained about decades ago are still happening
           | today, and the practitioners are all too eager to turn a 100
           | line program into a 1000 line one, while taking weeks to test
           | if it does what it should. The average loud dev is a GoF
           | fanatic in love with inaccessible reflection.
           | 
           | https://news.ycombinator.com/item?id=33897735
           | 
           | > I've found this "come correct" mindset is used to justify
           | unnecessarily flexible solutions to allow for easier changes
           | in the future ... changes that 90% of the time, never
           | actually materialize.
           | 
           | > [a response:] In other words, when the basic assumptions of
           | that fancy abstraction are just not workable with the future
           | requirements, you're hosed. Worse, now you might need to
           | refactor a lot of code building on this abstraction.
           | 
           | https://news.ycombinator.com/item?id=33686475
           | 
           | > This view follows quite naturally from another aspect of
           | modern programming thinking (and education), which claims
           | many problems are gross and complex, and thus we need
           | abstraction to make them appear simpler.
           | 
           | From an article that hit the front-page of HN, although this
           | is a little unfair, because this guy is a raging asshole.
           | 
           | https://news.ycombinator.com/item?id=33379079
           | 
           | > The Gang of Four on the other hand was an unfortunate turn
           | for the industry. As you say we took it too seriously. In
           | practice very few projects can benefit from the "Factory
           | pattern"
           | 
            | Someone claiming that passing a function that takes 0
            | arguments and returns 1 value is too enterprisey and "very
            | few projects" can
           | benefit from it. Passing functions around is an absolutely
           | essential feature for a programming language, and I can't
           | imagine how you could develop good abstractions (or good code
           | at all) without it. And yes, I've worked in a lot of
           | languages that don't support first-class functions; that's
           | how I came to the conclusion that it's absolutely essential.
           | 
           | This also illustrates my point about how something can be
           | done badly (factory pattern) and it poisons the well for the
           | concept (passing a function).
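            | 
            | A toy Go sketch of that point (Conn, Pool, and newConn
            | are made-up names): once functions are values, the
            | "Factory pattern" collapses into passing a func() Conn.
            | 
            |     package main
            | 
            |     import "fmt"
            | 
            |     type Conn struct{ id int }
            | 
            |     // The whole "factory" is a caller-supplied function.
            |     type Pool struct {
            |         newConn func() Conn
            |         free    []Conn
            |     }
            | 
            |     func (p *Pool) Get() Conn {
            |         if len(p.free) == 0 {
            |             return p.newConn()
            |         }
            |         c := p.free[len(p.free)-1]
            |         p.free = p.free[:len(p.free)-1]
            |         return c
            |     }
            | 
            |     func main() {
            |         next := 0
            |         p := &Pool{newConn: func() Conn {
            |             next++
            |             return Conn{id: next}
            |         }}
            |         fmt.Println(p.Get().id, p.Get().id) // 1 2
            |     }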
           | 
           | https://news.ycombinator.com/item?id=33406880
           | 
           | > ignoring the architectural realities of the hardware is
           | ignoring one of your responsibilities as an engineer to
           | deliver performant software. Now, it's possible to argue that
           | writing performant software is not important ...
           | 
           | The argument in question here is whether every programmer
           | should always know which processor cache every variable they
           | use lives in at all times. I think it's pretty clear this is
           | completely incompatible with even the barest form of
           | abstraction.
           | 
           | https://news.ycombinator.com/item?id=33154568
        
       | dathinab wrote:
        | It's a balance, as most things in life are.
       | 
       | Seeing things in absolutes sets you up for failure, or at least
       | limited success.
       | 
       | ---
       | 
        | As a side note, if we speak about dependencies instead of code
        | reuse in general: dependency pinning and not updating is
        | thoroughly underutilized. Sure, you probably don't want to do
        | it for, say, openssl or your web framework. But reviewing,
        | pinning and forgetting about small self-contained dependencies
        | as long as you don't run into bugs can be a pretty good idea
        | (though it depends a lot on the language, immutability of
        | dependencies, etc.).
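        | 
        | For instance, in Go the go.mod file already records exact
        | versions, so "review, pin and forget" mostly means committing
        | it and then leaving the small entries alone (module paths and
        | versions below are only illustrative):
        | 
        |     module example.com/myapp
        | 
        |     go 1.19
        | 
        |     require (
        |         // small and self-contained: reviewed once, pinned,
        |         // then left alone until a bug shows up
        |         github.com/google/uuid v1.3.0
        | 
        |         // security-critical: keep on the upgrade treadmill
        |         golang.org/x/crypto v0.4.0
        |     )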
       | 
       | Edit: Also making it easy to change code when requirements change
       | doesn't mean adding all kinds of generic abstractions to your
        | code. It means keeping code clean and simple and avoiding implicit
       | logic dependencies enough to make code changes easy. Adding a lot
       | of generic abstractions can make it harder to change things
       | instead of easier.
        
         | alfalfasprout wrote:
         | Pinning and forgetting is also a fantastic way to be riddled
         | with security vulnerabilities in a not so short span of time!
         | And a great way to make it a pain for people to upgrade to
         | newer versions of dependencies since they'll need to jump
         | several major versions in some cases.
        
       | dang wrote:
       | Related:
       | 
       |  _Redundancy vs. dependencies: which is worse? (2008)_ -
       | https://news.ycombinator.com/item?id=15621435 - Nov 2017 (29
       | comments)
       | 
       |  _Redundancy vs. dependencies: which is worse? (2008)_ -
       | https://news.ycombinator.com/item?id=9730492 - June 2015 (39
       | comments)
        
         | thadt wrote:
          | _Redundant:_ ...
          | 
          | FTFY
        
           | jspash wrote:
           | Well that depends...
        
           | dang wrote:
           | Previous discussions add information! Usually, at least.
        
       | bob1029 wrote:
       | I always default to redundancy until I can't stand it anymore, or
       | it has actually caused an issue.
       | 
       | For me, the real devil is the clarity of requirements and
       | intended product roadmap. If the thing you are building is
       | crystalline & eternal with 100% obvious requirements on day #1,
        | then sure. Spend all afternoon in the architecture rabbit hole
        | and build the implementation to match.
       | 
       | I've never worked on a project where you can plan this far ahead.
       | Think about what happens when you allow 2 different developers to
       | work on 2 distinct features that share a (vague) common concept,
       | but then encourage them to independently implement that common
       | concept. Some developers look at this like an inconsistent horror
       | show. I look at it as additional options to choose from.
       | 
       | At the end of the day, you can always come back and de-dupe these
       | common types and re-align implementations accordingly.
        
         | ChrisMarshallNY wrote:
         | This.
         | 
         | A big part of my refactoring is distilling base classes, and
         | protocol defaults (I was just doing exactly that, a few minutes
         | ago, in fact).
         | 
          | I have watched so many dependapocalypses that I have a healthy
          | fear. I no longer have to touch that particular stove-lid to
          | know that it's hot.
         | 
         | There are a number of serious issues with dependencies.
         | 
          | In many cases, they are great. They can give a fairly
          | inexperienced programmer massive power. In many other cases,
          | they are _not_ great. They can give a fairly inexperienced
          | programmer massive power.
         | 
         | But I use them frequently, in my own work. It's called "Modular
         | Programming," and the concept is decades old. I write almost
         | all my modules, and include them as packages, with static
         | linking. Each package tends to be a fairly standalone, atomic
         | software project, with its own lifecycle.
        
           | bob1029 wrote:
           | > They can give a fairly inexperienced programmer, massive
           | power. (x2)
           | 
           | This is a big reason why we use one of the broadest
           | frameworks available in modern tooling today - .NET 6+. It
           | has so many batteries included that pulling in a 3rd party
           | dependency is something rare enough to call a big meeting
           | for.
           | 
           | Additionally, that "dependency" is the same one everyone else
           | uses - not just in our organization - so all that code could
           | be shared with minimal friction.
           | 
           | There are certainly downsides to this - what if Microsoft
           | becomes _even more_ evil? But, we can hypothesize about the
           | end of the world while our competition uses the same trick
           | and steals all of our lunch money...
        
       | yonixw wrote:
        | If you don't have tests that will alert you about de-syncs
        | between 2 components, then it's just a "choose your poison"
        | tradeoff.
        
         | JonChesterfield wrote:
         | Being alerted is good. Not necessarily sufficient though.
         | Dependency A depends on C.v2. Dependency B used to depend on
         | C.v2 but just upgraded to C.v3.
         | 
         | Either your language (or _maybe_ tooling) allows those to
         | coexist or you are now going to have a really bad time and give
         | serious thought to jettisoning one of A or B.
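          | 
          | Go's module system is one example of tooling that lets the
          | two coexist, because a new major version gets a distinct
          | import path. The sketch below is not buildable as-is (the
          | example.com/c paths and Version() are placeholders); it
          | only shows the shape:
          | 
          |     package main
          | 
          |     import (
          |         "fmt"
          | 
          |         cv2 "example.com/c"    // what A still uses (C.v2)
          |         cv3 "example.com/c/v3" // what B moved to (C.v3)
          |     )
          | 
          |     func main() {
          |         // Both major versions link into the same binary
          |         // under different import paths, so A and B can
          |         // each keep the one they were written against.
          |         fmt.Println(cv2.Version(), cv3.Version())
          |     }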
        
       ___________________________________________________________________
       (page generated 2023-01-10 23:01 UTC)