Post AzHDmltrnJ4VM4qATo by thephd@pony.social
(DIR) Post #AzHDmk0WpZDDU9wSaO by thephd@pony.social
2025-10-16T17:51:15Z
0 likes, 0 repeats
Finally getting around to listening to the Wookash Podcast where defer is, apparently, mentioned. I guess I should listen to this thing in ful-- oh my god it's two hours?? Well, time for the 2x speed buff.
(DIR) Post #AzHDmltrnJ4VM4qATo by thephd@pony.social
2025-10-16T17:52:53Z
0 likes, 0 repeats
Oh okay nope one person talks WAY too fast for 2x to work, scales down to 1.75x.
(DIR) Post #AzHDmn5bNA3L2lL3PU by thephd@pony.social
2025-10-16T17:53:08Z
0 likes, 0 repeats
Oh god they're talking about defer. Do I really wanna hear this.
(DIR) Post #AzHDmnfPE2h0po5eDY by thephd@pony.social
2025-10-16T17:55:11Z
0 likes, 0 repeats
Oh, okay, so before talking about defer they briefly mentioned modules, and now spans. The Pro-C person says he's not super convinced spans are necessary; the Hates-C person says he wants spans so that there's a better place to do bounds-checking in a dedicated, compiler-driven fashion. Makes sense. That's about my take on it, more than any syntactic gains. (I'm also working on the paper for this, but it's tabled until after defer and other stuff are finished.) They use the word slices, but it's the same thing, spans or slices.
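As a rough illustration of the bounds-checking argument (not the design in my paper — just a minimal sketch with a made-up int_span type): a slice/span carries the pointer and the length together, so there's one dedicated place to do the check.

    #include <stddef.h>
    #include <stdio.h>
    #include <stdlib.h>

    /* Hypothetical slice type: the pointer and the length travel together,
       so a checked access has everything it needs in one place. */
    typedef struct {
        int    *data;
        size_t  len;
    } int_span;

    /* One dedicated spot for a library (or, ideally, the compiler) to bounds-check. */
    static int span_at(int_span s, size_t i) {
        if (i >= s.len) {
            fprintf(stderr, "out-of-bounds access: %zu >= %zu\n", i, s.len);
            abort();
        }
        return s.data[i];
    }

    int main(void) {
        int storage[4] = {1, 2, 3, 4};
        int_span s = { storage, 4 };
        printf("%d\n", span_at(s, 2));  /* fine */
        /* span_at(s, 9); would abort instead of silently reading garbage */
    }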
(DIR) Post #AzHDmoFv2Htqf3Ao88 by thephd@pony.social
2025-10-16T18:44:03Z
1 likes, 0 repeats
"The one C standard I would use is C11. Basically, for the atomics."Yeah, that's my take pretty much as well. I do wish I had a lot of the stuff I was standardizing, though. C89/98 is the most portable, agreed there. Seems like a big reason people struggle on C11 is MSVC.The other guy who loves C basically loves C99 (minus VLAs and other turned-optional-in-C11 optional). Both of them are complaining that C23 is adding too much stuff and "forgetting the essence of C".Lol. Lmao, even!
(DIR) Post #AzHDmouKc2E4gO557Y by thephd@pony.social
2025-10-16T18:45:52Z
1 likes, 0 repeats
"These unelected bureucrats... who do stuff and it trickles down. And that's not how it works... To me, what's important, is who implements what."The people on this podcast... ALMOST get it. It's a push-and-pull, but the implementers DO have ultimate power. But, there IS a reason why MSVC has a 30-year-old GCC extension (typeof) implemented basically a month after C23 was ratified.And that's because we standardized it. Sometimes, we get our legs cut by implementers and can't fix or change things. Other times, you bring people in with standards.It's also a blinkered view to say "well, people have to compile the code from 20 years ago, now". What about the code being written right now? What happens 20 years from now? Will you still be using a 40, 60 year old bit of code with 0 improvements, ever? Nobody acknowledges a middleground, where things DO have to improve for the better, or even tools backporting good things into older versions of the standard. There's plenty of "C89, but these fixes and atomics"-style of compilers.If vendors subset the standard, fine, but there has to be a standard to subset! If there isn't, you get what MSVC did to GCC/Clang source code compat for 30 years: load a shotgun and start fucking blasting it. They also acknowledge this with the enumeration situation."It was broken, C23 fixed it" yeah, man. Now that can be backported!
(DIR) Post #AzHDmpaW5ByCnDolsG by thephd@pony.social
2025-10-16T18:47:43Z
1 likes, 0 repeats
"No one, actually NO-one, is writing pure C. We're all using extensions!" ... So, like. Should the Committee acknowledge that and work to standardize those extensions and come to a middleground, or do we just... keep shooting up with extension-drugs forever?I know stability is tempting, but like. You hate the Committee and the de-jure law, but you acknowledge that NOBODY is writing pure, fully-legal C. Do we just keep doing that for eternity? The other part is vendors are complaining about needing to maintain extensions forever.They aren't happy about the situation they're put in, either! It doesn't seem like it's good to keep having (loosely) defined extensions with sometimes subpar documentation upholding whole industries! But maybe I'm just weird for observing that and trying to change it, hm.
(DIR) Post #AzHDmqN5AconDqXYZc by thephd@pony.social
2025-10-16T19:06:58Z
0 likes, 0 repeats
"We should get rid of declaration matches usage" YEEAAAH!Unfortunately, the Hates-C person is playing spoiler: "That's a new language. That's not C anymore!" And he's right! But ooooooOOOOH we gotta get rid of that shit, it's A W F U L. Worst design decision of any proglang!!!(For Committee Reasons:we will never get rid of it. It will never change. Ugly functions will continue to be jaw-droppingly ugly, declarations will continue to look fucked up. That's just a core tenet of C that not even I would try to change.I'm sorry.)"Yes, I'd remove VLAs, but I never use them in the first place." Unilateral agreement from the Loves-C person too. I would agree, but unfortunately VLAs have straight shooters in the Committee, bro.You try to remove them and a small but very strong subsection of C people WILL bet at your door, big dawg. You gotta be careful, Bill! Your truth too right, your insight too real. They'll kill you!(This is a joke and a play on a meme format: nobody will actually come hurt anyone about VLAs, but removing them after being made optional is a thing. MSVC will likely never ever implement them for moral reasons. And they're already conditional, so there's that.)
(DIR) Post #AzHDmr3ccSqVLmRWsa by thephd@pony.social
2025-10-16T19:11:19Z
1 likes, 0 repeats
"Remove const"OOOF, that's a SCALDING hot take. But it's kind of correct. "read-only" is like an object property, not really a good type-system property. But we don't have a good decl-based/object-based system in C (in fact, it has NO real good object model, which will come up later when talking defer); only compile-time system that exists in C is the type system, alas!"I'd remove const from C, but I'd have to replace it with something else." Yeah. It's too useful to get rid of outright, you'd have to replace it, unfortunately. Odin manages this slightly differently than C does."C is the wrong langauge to try and add immutability too" agreed, 100%. It's also why "functional programming" really struggles here, and why C++ struggled to make a functional library bit without some semi-dire consequences (C++20 filter_view shenanigans, anyone?).(They start talking about how overloaded static is, and they bring up the obscure [static N] bit in the language.)Yeah, that's awful. It's not a good way to do bounds, unfortunately! Being able to lie and not having a pre-packed way of keeping the size and pointer together is NOT good. I'm not saying you should never have the ability to lie or just have a naked pointer and attach a random size to it, but 99% of the time you want the system to carry the size and the pointer together, and it's effectively what you do with malloc since it stores it in either a specific block-size section in their heap or they store the size in some pre-determined place before the pointer they hand back to you, or in a pool somewhere!"I would remove the entire standard library" hard agree! C's standard library is its WEAKEST part. Very awful design, lots of shared, mutable, invisible state. Truly bad, very thread-antagonistic. Blowing it up and starting over would be very good stuff.But then it's not C. :3
(DIR) Post #AzHDmrnLsRQRdbq39s by thephd@pony.social
2025-10-16T19:13:35Z
0 likes, 0 repeats
"Only include memset, memmove, and memcpy" yeah, basically. That's pretty much all C is good for for its stdlib.The printing stuff sucks, the time stuff is dated and sucks, the thread stuff still sucks (I'm working on it), the atomics have some guarantees but not all, the Unicode stuff is now okay for conversions (but lacks support pretty much everywhere else), and so on and so forth. It's all booty-ass-cheeks, so, you know. Blow it up, start all over again, make sure it's multi-threading aware (NO internal locks/shared data), stuff like that!
(DIR) Post #AzHDmsXn5mZXxdZ8Xg by thephd@pony.social
2025-10-16T19:14:24Z
1 likes, 0 repeats
I need these people to talk to the C folks, because SO MANY people in C-land are shooters for null-terminated strings, and they are THE MOST crapshoot GARBAGE in C. But these two get it: BOTH the C lover and the C hater directly describe why the implicit-malloc, 0-terminated, threading-awful, and other such things in the standard library are AWFUL.

(I also had this issue, which is why I proposed SIZED thread attributes, but the Committee blew them all up and wanted null-terminated string ones only. So, even though these two people get it, other C people are EXTREMELY brain-poisoned on this subject and will die with their null-terminated strings despite the various issues that come with them. It's refreshing that even the younger C programmers understand how dogshit null-terminated is, though!)
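A back-of-the-envelope sketch of what "sized" buys you (str_view is a made-up type here, not the attributes proposal): the length travels with the pointer instead of being re-derived by scanning for a terminator, and embedded zero bytes stop being a problem.

    #include <stddef.h>
    #include <stdio.h>
    #include <string.h>

    /* Made-up sized-string type: pointer + length together. */
    typedef struct {
        const char *ptr;
        size_t      len;
    } str_view;

    static str_view sv_from_cstr(const char *s) {
        return (str_view){ s, strlen(s) };   /* scan exactly once, up front */
    }

    int main(void) {
        str_view v = sv_from_cstr("hello, world");
        /* Slicing is just arithmetic: no copies, no writing '\0' into buffers. */
        str_view hello = { v.ptr, 5 };
        printf("%.*s (%zu of %zu bytes)\n",
               (int)hello.len, hello.ptr, hello.len, v.len);
    }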
(DIR) Post #AzHDmtLm5wYSSex3S4 by thephd@pony.social
2025-10-16T19:15:56Z
0 likes, 0 repeats
YEEEAH TALK YO SHIT, MALLOC SUCKS, CALLOC IS AWFUL, REALLOC IS GARBAGE, NO ALIGNMENT, NO SIZES, IT'S A "REALLY BAD API" THAT'S RIGHT BILL TALK TO 'EEEEEEEEEEEEM!!!!!!!!!!!!!!!!!!!!!!!!!!!!

Everyone is agreeing that the whole C standard library design is shit. Good. There's hope for the youth yet.

"Memory management in C is not actually hard, NO! It's malloc that's hard!" Half-agree. Like, this is a completely true statement, but it's missing the "and we don't have good slices and everything's still a pointer".

"Start with the most descriptive name, THEN compress it later." Bill and Ryan will be happy to know that we've adopted this policy! ... But mostly for very new features/headers. Old stuff will still probably have the compressed names! <stdbit.h> has descriptive names, though, as a recent example.

"We would remove most stuff from C23, and only work from C11 or C99/89" I've talked about this a lot previously, but honestly it's fine. This is how most C developers are; they consider the things they care about to be the holy grail, and then EVERYTHING else is FOR LOSERS.
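Concretely, the <stdbit.h> names read like what they do — a quick sketch of the C23 type-generic macros (needs a C23-capable toolchain, e.g. a recent GCC or Clang with -std=c23):

    #include <stdbit.h>
    #include <stdio.h>

    int main(void) {
        unsigned int x = 0x00F0u;
        /* Descriptive names instead of the usual cryptic abbreviations. */
        printf("%u\n", (unsigned)stdc_count_ones(x));      /* population count: 4 */
        printf("%u\n", (unsigned)stdc_trailing_zeros(x));  /* 4 */
        printf("%d\n", (int)stdc_has_single_bit(x));       /* 0: not a power of two */
    }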
(DIR) Post #AzHDmuLSObtXXxoJay by thephd@pony.social
2025-10-16T19:18:24Z
1 likes, 0 repeats
Good vouch for Bill's design chops, though: "In Odin, I have atomic operations, but I don't have atomic types, but I've been considering the types."

This is actually the correct direction for designing atomics. Atomic operations are the fundamental, hardware-mirroring, low-level design. But then you put types on it for the same reason that Ryan advocated for shortened names: it's such a ridiculously common operation, and certain objects will never, ever be modified non-atomically, ever, and anytime you'd want the benefits of non-atomics you want it done safely from the comfort of an optimizing compiler that's proven all the bits necessary to just start doing shit non-atomically where it matters. C++ and C got this wrong in their designs: they started with atomic objects, but needed atomic "accesses" (operations).

Linus also complained about this on the LKML, and C++ fixed it with atomic_ref in C++20. C has no fix for this, because it doesn't have references, and because _Atomic(T)* and _Atomic T* are "pointer to atomic object T" while _Atomic(T*) is just "atomic pointer to object T" (e.g., just the accesses to the pointer are atomic). Not great.

It is, unfortunately, something that is entirely unrepresentable in C at the moment. It needs to be fixed, but it won't be for quite some time, I'm sure, because C is slow as balls at recognizing its mistakes and moving to fix them. Alas!
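A small sketch of the distinction being complained about, using ordinary C11 <stdatomic.h>: the atomic-ness lives in the object's declared type, so you can't retroactively get atomic accesses to a plain object the way C++'s std::atomic_ref allows.

    #include <stdatomic.h>

    int main(void) {
        int plain = 0;
        _Atomic int atomic_obj = 0;

        /* _Atomic(int) *p -- a pointer to an atomic object. Every access
           through p is atomic, but it can only point at objects that were
           declared atomic in the first place. */
        _Atomic(int) *p = &atomic_obj;
        atomic_fetch_add(p, 1);

        /* _Atomic(int *) q -- an atomic pointer to a plain int. Loads and
           stores of q itself are atomic; accesses through *q are not. */
        _Atomic(int *) q = &plain;
        int *raw = atomic_load(&q);
        *raw += 1;                   /* ordinary, non-atomic access */

        /* What C cannot express today: occasional atomic access to `plain`
           without changing its declared type -- the thing std::atomic_ref
           provides on the C++ side. */
        return 0;
    }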
(DIR) Post #AzHDmvJMnroiXlq9ya by thephd@pony.social
2025-10-16T19:27:27Z
1 likes, 0 repeats
"I wouldn't have while, or do while, just for."I mean... I guess? Once you have one, the others are free. This is more nitpicky and honestly a flavor remove than anything, and he says as much, but. Even if I was designing a new language from scratch, it seems so totally.. free? Like it would cost me ~nothing to have while/for/loop in a new language. It's not even worth thinking about. If someone comes up with a fourth one that isn't well-handled by those three, then like, sure man. Spice it up. At some point I'd not have more, but. You know.This is one of those things that generates like 500-post-mailing-threads because it's such low-hanging fruit, so people will immediately start frothing at the mouth and bitching endlessly. It's a real timesuck; being a dictator means you can decide one way or another and then you can just start ignoring everyone who gets bitchy about it. It's not serious enough to matter, but BOY the amount people would bitch for or against it would make for legendary meltdowns.
(DIR) Post #AzHDmwBbYDCbFzDTW4 by thephd@pony.social
2025-10-16T19:34:58Z
0 likes, 0 repeats
"For language design, you do what people expect, not just pull stuff out of your ass." Again, good language design chops from Bill here. There's people who design what they want (and it fucking sucks to work with, like Java, Haskell, etc.) and then there's people who design for their users (C, Odin, C#, Typescript, Python, etc.)."Remove the arrow operator, just make the dot operator do the dereference."I mean. Kinda agree, but not really worth it now that the blood is already in the water. It's more important for C to do this than C++, cause no T&."I would remove uninitialized stuff. Equal-zero by design." Good stuff."Error handling in C?" Some disagreement between Ryan and Bill here. Ryan would might do implicit thread_local stuff, but Bill would do return types and keep it all locally. But Bill figures it's API stuff.Also talks about "partial success". He's not sure things are really cleaned up there.
(DIR) Post #AzHDmx7O5NQI9CFca8 by thephd@pony.social
2025-10-16T19:37:27Z
0 likes, 0 repeats
"Build syste-" YEAH NAH, NOT LISTENING ANYMORE, SKIP!!!!
(DIR) Post #AzHDmy4EYaUj5hmcIy by thephd@pony.social
2025-10-16T19:39:27Z
1 likes, 0 repeats
"My opinion on macros is that they're the only reason C has survived"Buddy you don't know the half of it. There's excel files using C macros as a preprocessing step. It's fucking bananas out there.
(DIR) Post #AzHDmynxoZ4fNXB8aG by asie@mk.asie.pl
2025-10-16T19:44:51.714Z
0 likes, 0 repeats
@thephd@pony.social I used the C preprocessor for inlining code in Lua scripts once. I'm sorry.
(DIR) Post #AzHMKeSUlcr3krhhSK by datenwolf@chaos.social
2025-10-16T20:34:28Z
1 likes, 0 repeats
@thephd *cough* I'm just saying: "X11 Resource Database" IYKYK…