[HN Gopher] The road to hell is paved with good intentions and C...
___________________________________________________________________
The road to hell is paved with good intentions and C++ modules
(2023)
Author : fanf2
Score : 74 points
Date : 2024-05-25 16:42 UTC (6 hours ago)
(HTM) web link (nibblestew.blogspot.com)
(TXT) w3m dump (nibblestew.blogspot.com)
| db48x wrote:
| I stopped paying attention to C++ stuff years ago. Did any of
| these problems ever get fixed?
| kevindamm wrote:
| If by fixed you mean "elaborate measures for specificity and
| saliency but still the same order of undefined behavior &
| bugbears" then yeah? It's gotten a lot more... more.
| db48x wrote:
| Oh good, I'd hate for the industry to have stagnated :P
| Bostonian wrote:
| The author says Fortran modules are easier for build systems to
| handle:
|
| 'A Ninja file is static, thus we need to know a) the compilation
| arguments to use and b) interdependencies between files. The
| latter can't be done for things like Fortran or C++ modules. This
| is why Ninja has a functionality called dyndeps. It is a way to
| call an external program that scans the sources to be built and
| then writes a simple Makefile-esque snippet describing which
| order things should be built in. This is a simple and
| understandable step that works nicely for Fortran modules but is
| by itself insufficient for C++ modules.'
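The "Makefile-esque snippet" mentioned above is, in Ninja's case, a dyndep file written in Ninja's own syntax. A minimal sketch with made-up file names, telling Ninja that foo.o also produces a module file and that bar.o cannot be compiled until that module file exists:

```ninja
# Hypothetical dyndep file emitted by an external scanner tool.
ninja_dyndep_version = 1
build out/foo.o | out/foo.mod: dyndep
build out/bar.o: dyndep | out/foo.mod
```

The scanner regenerates this file whenever sources change, which is why the blog stresses that a source edit should trigger only a rescan, not a full reconfiguration.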
|
| ...
|
| 'Let's start by listing the requirements [for modules]:
|
| All command lines used must be knowable a priori, that is,
| without scanning the contents of source files
|
| Any compiler may choose to name its module files however it
| wants, but said mapping must be knowable just from the compiler
| name and version, i.o.w. it has to be documented
|
| Changing source files must not cause, by itself, a
| reconfiguration, only a rescan for deps followed by a build of
| the changed files
|
| The developer must not need to tell which sources are which
| module types in the build system, it is to be deduced
| automatically without needing to scan the contents of source
| files (i.e. by using the proper file extension)
|
| Module files are per-compiler and per-version and only make sense
| within the build tree of a single project
|
| If module files are to be installed, those are defined in a way
| that does not affect source files with modules that do not need
| installing.
|
| Fortran modules already satisfy all of these.'
| kergonath wrote:
| Fortran modules are great for developers but a bit of a pain
| when setting up a build system (even though it's quite
| straightforward now, it still requires a first pass with
| something like makedepf90 or one of its many equivalents). I
| find it quite a feat that C++ modules, while benefitting from
| the experience of many languages, ended up so broken.
|
| Also, I never really liked cmake but this looks really bad.
| kazinator wrote:
| If your language has modules, like in the syntax and
| semantics, and you're still requiring external tools to hunt
| down dependencies, your module system must somehow be a
| failure.
|
| The idea behind modules is that since they are part of the
| language, the compiler for the language understands modules
| and follows the dependencies. All module-related tooling,
| like incremental builds, is hooked through the compiler. You
| don't use _make_ or external dependency generators.
|
| If someone does need a dependency generator for whatever
| reason, the compiler has to provide that function. You give
| the compiler the proper option to spit out dependencies,
| point it at the root module of the program, and let it do its
| thing.
| kergonath wrote:
| > If your language has modules, like in the syntax and
| semantics, and you're still requiring external tools to
| hunt down dependencies, your module system must somehow be
| a failure.
|
| No argument from me :)
|
| > The idea behind modules is that since they are part of
| the language, the compiler for the language understands
| modules and follows the dependencies. All module-related
| tooling, like incremental builds, is hooked through the
| compiler.
|
| I am not sure I follow. The compiler still compiles one
| file each time it is run, so we need a first pass to
| determine the dependency tree and what can be compiled in
| which order, right? Even if that first pass uses the
| compiler itself (doing only enough parsing to see what is
| needed).
|
| > You give the compiler the proper option to spit out
| dependencies, point it at the root module of the program,
| and let it do its thing.
|
| Nowadays, gfortran can generate Makefile fragments with the
| dependencies, but it was not always the case. This is why
| there are a handful of small tools to do it and makedepf90
| is one of them (it's a small Fortran program with a
| rudimentary parser). There are also other compilers that
| don't do it properly.
| ReleaseCandidat wrote:
| > Nowadays, gfortran can generate Makefile fragments with
| the dependencies, but it was not always the case.
|
 | For quite some years there was no gfortran, only g95.
| awinter-py wrote:
| ughghgh
|
 | every proposal for C/C++ modules feels like it's by someone who
 | has not used a working module system and doesn't believe one is
 | truly possible to create
| IshKebab wrote:
| It's because they're trying to make a module system that is
| compatible with decades of development based around `#include`.
| It's nowhere near as simple as making a new language with
| modules.
|
| I almost feel like they should give up. IMO Rust and Zig have
| made C/C++ a legacy language. Except in some specific areas
| where Rust doesn't have mature libraries (GUIs & games mostly)
| you'd be silly to start a new project in C++.
|
| Migrating existing C++ projects to modules likely isn't worth
| the enormous effort, and the days of starting new C++ projects
| are numbered... so why bother?
| _hl_ wrote:
| There is still an unfathomably huge landscape of problems
| where C++ is the more mature, "right" choice over rust for
| new projects. Just from my (extremely limited and narrow)
| personal experience:
|
| - scientific/numerical computing (super mature)
|
| - domain-specific high performance computing (C++ is for
| better or worse extremely flexible)
|
| - weird CUDA use cases (feels very native in C++)
|
| - game dev as you mention
|
| - probably much much more
|
| C++ to me feels more like a meta-language where you go off to
| build your own kind of high performance DSL. Rust is much
| more rigid, which is probably a good thing for many
| "standard" use cases. But I'm sure there will always be a
| very very long tail of these niche projects where the
| flexibility comes in handy.
| zrules wrote:
| Opencv while not niche is also one of them. It's definitely
| preventing me from going full in on rust. Probably for a
| while.
|
| This Reddit comment captures my reasons pretty well: https:
| //www.reddit.com/r/rust/comments/1arys3z/comment/kqoak...
| pjmlp wrote:
| There are tons of places where Rust and Zig don't have a
| solution, or are not part of an industry standard that builds
| on top of C and C++, like GPGPU, Khronos standards, console
| SDKs, ...
|
| Additionally while they keep being built on top of C++
| compiler infrastructure, C++ isn't going away.
| IshKebab wrote:
| Sure _now_ that is the case. But which do you think will
| happen first - C++ will get working modules or people just
| give up on C++? Definitely not clear cut!
| Conscat wrote:
| How is it that D and Objective-C++ came up with modules that work
| and can be used many years ago, yet C++ with much greater stakes
| fumbled it on almost every level?
| marcosdumay wrote:
| Because the C family is guided by a committee. Even worse, an
| important and respected committee.
| wudangmonk wrote:
 | The issue seems to be the C++ committee; I've never heard
 | anyone complain about C's committee.
| nicoburns wrote:
| The C committee's big problem is being too conservative.
| But you don't hear too many complaints because people just
| use C++.
| flohofwoe wrote:
| I think the reason is more that the goals of the C
| committee align better with the goals of C programmers
| (which is a language that's a tool and not a playground
| for language designers).
|
| Unlike C++, C also remained a simple language (which is
| definitely a side effect of the "conservatism" of the C
| committee).
| edflsafoiewq wrote:
| That's what he means; like the puddle which marvelously
| fits its own hole, anyone for whom C in all of its
| aspects is not very close to optimal has long ago moved
| to another language.
| rightbyte wrote:
 | Wait, what? C's greatest strength IMO is that it hardly
 | changes at all.
 |
 | The mental overhead of a few small things added every tenth
 | year, coming into use 20 years later, is how I like it.
| pjmlp wrote:
| Go check the complaints on anything past C11, or how after
| 50 years WG14 keeps ignoring security issues.
| flohofwoe wrote:
| The "C family" has two committees, one that works fairly well
| (https://www.open-std.org/jtc1/sc22/wg14/) and one that
| doesn't (https://www.open-std.org/jtc1/sc22/wg21/).
| murderfs wrote:
| WG14 has only "worked" fairly well insofar as it did
| absolutely nothing but file the serial numbers off of
| things standardized in C++ and release them. The biggest
| C-original features (type generic macros, annex K) are all
| gigantic boondoggles. Even many of the features that were
| lifted from C++ were poorly thought out (e.g. compatibility
| between <stdatomic.h> and <atomic>).
| imachine1980_ wrote:
 | It's easy: there isn't the same amount of coordination
 | overhead. Most of what's wrong in software today comes in
 | large part from two things: legacy and coordination problems.
 | The need for backwards compatibility means the useful
 | solution isn't possible, so you compromise, and then
 | compromise again because some users can't use your solution,
 | and now you have a solution that satisfies nobody. Add the
 | natural misalignment of any organization, and it's obvious
 | that a focused team with few stakeholders and the willingness
 | and capacity to make breaking changes can do a better job.
 | This isn't a problem particular to programming.
| nine_k wrote:
| Exactly because of much greater stakes. A lot of heavyweights
| involved, and nobody agrees to yield. The result is a
| compromise, aka a solution which is unsatisfactory to all
| parties to an equal degree, as they say.
|
 | Best designs are produced by small, tightly-knit, often one-
| person design teams. Examples: Unix, Clojure, SQLite,
| Boeing-747, Westminster Palace in London.
|
| Sometimes a small team with a cohesive vision keeps on
| attracting like-minded people or matching ideas, and the
| project grows with contributions from a large number of people.
| The key part of every such success is the vetting process.
| Examples: Python, Ruby, FreeBSD and OpenBSD.
|
| Worst designs with most glaring, painful, expensive, even
| tragic shortcomings are produced by large committees of very
| important people from huge, very (self-)important companies,
| agencies, departments, etc. Each of them has an axe to grind
| or, worse, a pet peeve. Examples: PL/I, Algol-68, the Space
| Shuttle (killed two crews because the escape system was removed
| from the quite sane initial design); to a lesser degree, it's
| also HTML, CSS, and, well, C++ to a large degree :( The disease
| has a name [1].
|
| Sometimes a relatively small and cohesive team in a large
| honking corporation produces a nice, cohesive design, like this
| happened to Typescript, certain better parts of CSS, and some
| of the better parts of C++. This may sometimes make a false
| impression that "design by committee" sometimes works.
|
| [1]: https://en.wikipedia.org/wiki/Design_by_committee
| intelVISA wrote:
| Modern C++ would easily be The Perfect Language... if only it
| wasn't designed by committee and thus sabotaged by all those
| Very Important People.
| astrange wrote:
| SQLite is three people, which is probably better than one
| because you do want to bounce ideas off people.
|
| Not sure about Unix... it's gone through a lot at this point
| and obviously it's a committee now.
| xuhu wrote:
| Which category do WASM and JPEG fit into ?
| jokoon wrote:
| Backward compatibility with existing compilers.
|
| And more importantly, not breaking old, old codebases, which is
| a very critical problem.
|
| Obj-C++ and D don't have critical codebases, because there is
| just so much less code. Apple does what they want.
|
 | Like Stroustrup says, when a language is not used, it won't
 | generate problems. I guess his quote is often overlooked, but
 | it's very, very true.
|
| Compilers are a difficult topic.
| shdnx wrote:
| There is no backwards compatibility here to speak of: C++
| modules are a new feature introduced by the C++20 standard.
| There was absolutely no reason to make a mess out of them.
| plorkyeran wrote:
| After a decade of using obj-c modules I'm still unclear on what
| the benefit of them is other than the Swift bridging aspects of
| it. There isn't really any observable difference in either
| compile-time performance or behavior from turning them on and
| off other than that a lot of the rules around inclusions are
| much stricter with modules enabled. The documentation claims
| that they should improve build performance, but AFAICT that
| isn't actually true or the difference is so small as to be
| irrelevant.
| astrange wrote:
| It's somewhat true and it should also improve debugger
| performance, but it takes a very long time to get to the
| point where that can be turned on and anyone sees it.
| (keyword is "explicit modules")
| vlovich123 wrote:
 | What's upsetting is that this feedback was given back when
 | modules were first being standardized: the standards body and
 | compiler devs weren't communicating with the build-system
 | folks. Looks like not a lot has changed.
| Galanwe wrote:
| I don't know of anyone happy with the C++ standard committee.
|
 | Some parts of the new std features were copied from Python,
 | which is good: semantics and naming that a lot of people
 | already know. Yet some weirdo decided to half-copy the naming
 | scheme of APL, and we ended up with functions named "iota".
 | Seriously. What. The.
|
| Don't get me started on features added in C++17 and deprecated
| 2 versions later.
|
 | Don't ask either why they decided to rename universal
 | references to forwarding references, when the former name
 | makes sense and the latter doesn't, just because they were
 | pissed not to have named it first.
|
| All I want now, for the future of C++, is a Circle-like
| approach. I want to be able to selectively disable and replace
| legacy behavior and syntax, per file, so that at last we can
| have the language evolve.
| pjmlp wrote:
| > Don't get me started on features added in C++17 and
| deprecated 2 versions later
|
| They are currently suffering from not having proper field
| experience on many features before setting them into the
| standard.
| vlovich123 wrote:
| And it's not like they're unfamiliar with how rust is doing
| things and yet they still choose to cling to clearly broken
| processes. ISO is a mess.
| dgellow wrote:
| (2023)
| josefx wrote:
| > At first you run into all the usual tool roughness that you'd
| expect from new experimental features. For example, GCC generates
| dependency files that don't work with Ninja.
|
 | To sum up the link behind that statement: the default output
 | format used by the GNU compiler is for GNU Make; a Ninja-
 | compatible format can be generated but is not the default.
 | The blog author considers this default broken; the GNU
 | developers consider the behavior correct.
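For context, a depfile in the Make dialect that `g++ -MD` emits typically looks like the fragment below (paths are illustrative). Ninja's depfile parser accepts only a restricted, command-free subset of Make syntax, which is the kind of place where a format mismatch between what a compiler writes and what Ninja reads can bite:

```make
# Illustrative output of `g++ -MD -MF obj/foo.d -c src/foo.cc`:
# a single dependency rule with no recipe.
obj/foo.o: src/foo.cc src/foo.h src/util.h
```

With `-MP` GCC additionally emits phony targets for each header, so builds do not break when a header is deleted.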
| IshKebab wrote:
| Typical GNU attitude. I'd say the author is correct unless
| there is a good reason to make it incompatible with Ninja.
| dlundqvist wrote:
 | If it can detect the type of compiler, I would think it can
 | pass the correct flags to it. If that is all it is.
| bonzini wrote:
| Ninja is supposed to read dependency files "in Makefile
| syntax", so I would at least try to understand the issue
| before placing blame.
| codelikeawolf wrote:
| > In other words in order to be able to compile this source file
| you first need to parse the source file and all included sources
| until you hit the export declaration and then throw away the
| result.
|
| I might be completely wrong here, but reading this reminded me of
| barrel files [1] in JS/TS projects, which are an absolute
| nightmare. I put together a little visualization a while back
| that demonstrated how much extra work is required to resolve the
| module graph if you forward exports, and it is _orders of
| magnitude_ more than just importing something directly from the
| file where it is exported. If this is also the case for C++
 | modules, I'd like to buy the author a beer or six.
|
| [1] https://basarat.gitbook.io/typescript/main-1/barrel
| Evan-Purkhiser wrote:
| Would love to see the visualization if you have it on hand!
| dang wrote:
| Discussed (a bit) at the time:
|
| _The road to hell is paved with good intentions and C++ modules_
| - https://news.ycombinator.com/item?id=37891898 - Oct 2023 (2
| comments)
| vitus wrote:
| Do we even have a second C++20-standard-compliant module
| implementation? Last I checked, it was just MSVC -- GCC and Clang
| were like 90% but missing a few features.
|
| Looks like still no, although Intel provides yet another partial
| implementation:
| https://en.cppreference.com/w/cpp/compiler_support/20
___________________________________________________________________
(page generated 2024-05-25 23:02 UTC)