Post 9hMEB56cTH69mBSfHk by grainloom@cybre.space
 (DIR) More posts by grainloom@cybre.space
 (DIR) Post #9hHFJezu2UEVZZAtG4 by grainloom@cybre.space
       2019-03-29T22:34:31Z
       
       1 likes, 0 repeats
       
        I'm starting to see why people like #Ada (the programming language), for decades it was probably a nice alternative for systems programming, but it has so many weird moving parts and complicated rules that I'd rather use Rust. And hopefully in a few years systems programming with dependent types will take off...
       
 (DIR) Post #9hHFXv2zABxLP7PpnE by Eidon@niu.moe
       2019-03-29T22:37:06Z
       
       0 likes, 0 repeats
       
       @grainloom I'm currently learning Ada and I find it a great language. But the great point of Ada is not only the language -- it's the compiler. Gnat is simply amazing -- it helps you like no other compiler I've ever worked with!
       
 (DIR) Post #9hHH4eEOnziglEvNSa by grainloom@cybre.space
       2019-03-29T22:54:14Z
       
       0 likes, 0 repeats
       
       @Eidon Idk, some of its errors have been very misleading, it was not on the level of rustc's helpfulness, which has codes for each error that you can look up for info (instead of having to websearch the error message and trying to guess if what you found is the same error)
       
 (DIR) Post #9hHHFMGTwbvIDz0PtQ by drwho@hackers.town
       2019-03-29T22:56:08Z
       
       0 likes, 1 repeats
       
       @grainloom Ada is still, unfortunately, used for application development.  In a previous life, the satellite trajectory simulator we used was written in Ada, the interpreter of which sat on top of a shim to run on top of Windows XP SP3 (oddly specific OS and patch level checks), with a 32-bit GUI sitting on top.  Thing was a monstrosity.
       
 (DIR) Post #9hHHHcw8vgQwVjZfGa by grainloom@cybre.space
       2019-03-29T22:56:34Z
       
       0 likes, 0 repeats
       
        @Eidon The type system is also horrifically complicated. So much that is done via traits in Rust has special syntax, and worst of all, they drop all pretense of their keyword noise being there to convey information, because.... well just look at it: https://en.wikibooks.org/wiki/Ada_Programming/Generics#Generic_formal_types
       
 (DIR) Post #9hHHNHHsKnmV6UQxjU by grainloom@cybre.space
       2019-03-29T22:57:36Z
       
       0 likes, 0 repeats
       
       @Eidon Ok, that's a bit hyperbolic, but seriously, why does <> mean so many things....
       
 (DIR) Post #9hHHPIQj3a8LE8XINk by grainloom@cybre.space
       2019-03-29T22:57:57Z
       
       0 likes, 0 repeats
       
       @Eidon and why is <> not a keyword like all the other things?
       
 (DIR) Post #9hHHY70Ou15sOtQUnA by grainloom@cybre.space
       2019-03-29T22:59:34Z
       
       0 likes, 0 repeats
       
       @drwho I wonder if it's better than C because of the checks it has.... or worse because who even knows what it's doing and checking.....
       
 (DIR) Post #9hHHdyyBgxbdDRBrsW by grainloom@cybre.space
       2019-03-29T23:00:36Z
       
       0 likes, 0 repeats
       
        @drwho I've heard of people doing formally verified C++ and C, and I think there's a reason they chose those languages instead of Ada. (although C++ is still pretty huge)
       
 (DIR) Post #9hHSxowHnJLHOSxmAi by freakazoid@retro.social
       2019-03-30T01:07:26Z
       
       0 likes, 0 repeats
       
       @grainloom @drwho I suspect we'll see more program extraction from theorem provers rather than dependently-typed systems programming languages. Rust can't help you prove your unsafe modules are correct.
       
 (DIR) Post #9hIC8PqztgB8iQuqHo by grainloom@cybre.space
       2019-03-30T09:33:36Z
       
       0 likes, 0 repeats
       
        @freakazoid @drwho Rust can't really prove too much, since it's not dependently typed, but I've been :thounking: ....what if, instead of doing extraction as a classical build step, the language let you do some introspectiony stuff? You could have a representation of the unsafe function and prove things on that.
       
 (DIR) Post #9hIDI0uhZxiaDNmyqe by Eidon@niu.moe
       2019-03-30T09:46:32Z
       
       0 likes, 0 repeats
       
       @grainloom Yeah the language evolved through various major revisions, with the need to have backward compatibility. Extensions were added for object orientation etc; this required some rethinking. As a result the syntax is not always clean. <> is a placeholder. Often it is used as a type constraint. In generics (similar to C++ templates) it is used to specify the type variable. When you instantiate the generic you have to tell what <> means.
       
 (DIR) Post #9hIJAWFfmPQIoiPKYy by grainloom@cybre.space
       2019-03-30T10:52:24Z
       
       0 likes, 0 repeats
       
        @Eidon that I sort of get, but why does "type Foo is (<>)" mean a discrete type, whereas "type Foo (<>) is private" is a type supporting assignment and basically nothing else
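For contrast, the Rust-traits approach mentioned upthread spells each capability out as a named bound instead of a `<>` form. A minimal sketch (function and names invented for illustration):

```rust
// Roughly Ada's formal "type T is private" (assignment and equality
// visible, nothing else): in Rust the same contract is written as
// explicit trait bounds rather than a special syntactic form.
fn count_equal<T: Clone + PartialEq>(items: &[T], needle: &T) -> usize {
    // Only PartialEq is actually exercised here; Clone stands in for
    // "supports being copied/assigned".
    items.iter().filter(|x| *x == needle).count()
}

fn main() {
    let v = [1, 2, 2, 3];
    println!("{}", count_equal(&v, &2)); // prints 2
}
```

The trade-off being debated: Ada encodes the capability set in the shape of the formal type declaration, Rust names each capability as a trait.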
       
 (DIR) Post #9hIJLA9SHwbtSYZgVE by grainloom@cybre.space
       2019-03-30T10:54:21Z
       
       0 likes, 0 repeats
       
        @Eidon re: errors
        its syntax errors are the worst to be honest, often gnat just reports a missing semicolon where there's clearly a semicolon, when the actual error was that I omitted the "with" from a function parameter in a generic
       
 (DIR) Post #9hIJwmTGmQ5r1JmR9M by Eidon@niu.moe
       2019-03-30T11:01:08Z
       
       0 likes, 0 repeats
       
        @grainloom Honestly I never experienced anything like this. Gnat is usually perfect in its explanations, quite unlike any other compiler I've worked with...
       
 (DIR) Post #9hIK8DC5lUd15MCEJE by grainloom@cybre.space
       2019-03-30T11:03:13Z
       
       0 likes, 0 repeats
       
       @Eidon It's certainly better than gcc.
       
 (DIR) Post #9hIKOnYeKhUYV0elBw by Eidon@niu.moe
       2019-03-30T11:06:12Z
       
       0 likes, 0 repeats
       
       @grainloom I'm still relatively new to the language; I will ask my colleagues and let you know!
       
 (DIR) Post #9hIKbM31vS4EwWr0Yi by grainloom@cybre.space
       2019-03-30T11:08:28Z
       
       0 likes, 0 repeats
       
        @Eidon :blobuwu: altho I suspect the answer is along the lines of "historical accident"
       
 (DIR) Post #9hIKoVqqmtB573Qnrc by Eidon@niu.moe
       2019-03-30T11:10:51Z
       
       0 likes, 0 repeats
       
       @grainloom I don't know; my senpais will though! ^_^
       
 (DIR) Post #9hKnY3FKeuqKkVIyxc by freakazoid@retro.social
       2019-03-31T15:42:14Z
       
       0 likes, 1 repeats
       
        @grainloom @drwho The problem with trying to create a dependently typed systems programming language is that you'd need a model of the whole system's behavior, which is a massive undertaking. The system itself would also need to be well-behaved, which they're not. Program extraction lets you focus on modeling just the properties of the system that you care about. It would be good to have libraries to make this sort of thing easier.
       
 (DIR) Post #9hKqACnHoEJtGzVY5g by grainloom@cybre.space
       2019-03-31T16:11:33Z
       
       0 likes, 0 repeats
       
       @freakazoid @drwho idk, what's wrong with just giving the language a nice C FFI? Idris already has that, and Rust uses safe wrappers around unsafe FFI calls.
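The "safe wrappers around unsafe FFI calls" pattern mentioned here looks roughly like the following sketch, binding libc's `strlen` (the wrapper name is invented):

```rust
use std::ffi::{CStr, CString};
use std::os::raw::c_char;

extern "C" {
    // libc's strlen; libc is linked by default when using std.
    fn strlen(s: *const c_char) -> usize;
}

// The unsafe block is confined to one small function whose
// precondition (a valid, NUL-terminated string) is guaranteed
// by the CStr type, so every caller stays in safe Rust.
fn c_strlen(s: &CStr) -> usize {
    unsafe { strlen(s.as_ptr()) }
}

fn main() {
    let s = CString::new("hello").unwrap();
    println!("{}", c_strlen(&s)); // prints 5
}
```

This is the point about promises: the wrapper, not the compiler, is what upholds the invariant at the FFI boundary.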
       
 (DIR) Post #9hKyE2CFx0PnCKWYrY by freakazoid@retro.social
       2019-03-31T17:41:52Z
       
       0 likes, 0 repeats
       
       @grainloom @drwho The promises your unsafe code needs to make in Rust are significantly easier than they would be in a language with a more powerful typesystem. Rust mainly cares about memory safety. Calling out to C at all limits what the compiler can prove about a program. You can annotate the code with types, but that only gets you so far because of side effects. It all depends on what you want to be able to make guarantees about.
       
 (DIR) Post #9hKzLXcTcSimdSvqTI by grainloom@cybre.space
       2019-03-31T17:54:26Z
       
       0 likes, 0 repeats
       
       @freakazoid @drwho sure, but it can't prove more just because you're generating C, can it? wouldn't linear types help with the side effects?
       
 (DIR) Post #9hL02SmlT54Q5xD8Rk by freakazoid@retro.social
       2019-03-31T18:02:11Z
       
       0 likes, 0 repeats
       
       @grainloom @drwho Not just because you're generating C, but because the C is generated from a model of the system that covers everything the programmer wants to be able to prove.
       
 (DIR) Post #9hL0Zikx6spDBSDnto by grainloom@cybre.space
       2019-03-31T18:08:12Z
       
       0 likes, 0 repeats
       
       @freakazoid @drwho Well yeah, but does it have to be the main way of generating code?
       
 (DIR) Post #9hLFgDom2EtsZNZDWK by freakazoid@retro.social
       2019-03-31T20:57:27Z
       
       0 likes, 0 repeats
       
       @grainloom @drwho It depends on what you're trying to accomplish. If you just want more safety, then no. If you want full formal verification of security properties, then yes.
       
 (DIR) Post #9hLHLYviXuNjsH5jMG by grainloom@cybre.space
       2019-03-31T21:16:08Z
       
       0 likes, 0 repeats
       
       @freakazoid @drwho Idk, C with all its weird behaviours.... doesn't that just increase the amount of things one has to prove? I guess one could target some very strict subset and compile it with CompCert.... hmm.hm..
       
 (DIR) Post #9hMBQUHCrxtmK64Gpc by Shamar@mastodon.social
       2019-04-01T07:44:30Z
       
       0 likes, 0 repeats
       
        We need simple programming languages. I like #C (mainly because I don't code in #POSIX and my #libc doesn't give a shit about any standard) and I used to like programming languages like #Idris, #Haskell or #Rust. But now I've realized that they are trying to address a problem that shouldn't be there in the first place: software complexity. We need simple and readable programming languages that force people to keep programs simple and short. @grainloom @freakazoid @drwho
       
 (DIR) Post #9hMCZCc20tS86DWnwW by grainloom@cybre.space
       2019-04-01T07:57:16Z
       
       0 likes, 0 repeats
       
        @Shamar @freakazoid @drwho Idk, I'd agree that Rust is getting a bit too enterprise-y, but they are much more elegant to work with. I can keep my code complexity low in Rust while also not having to suffer from the issues of C. Plan9port/9front libc is nicer to work with, sure, but it still has quite a few issues. And there are problems where you just can't get around the complexity, e.g. verifying optimized cryptographic primitives: https://youtu.be/3F8PU7Y-lTw
       
 (DIR) Post #9hMCjswTUH9cHKBTiS by grainloom@cybre.space
       2019-04-01T07:59:13Z
       
       0 likes, 0 repeats
       
       @Shamar @freakazoid @drwho Idris also kind of forces you to really minimize your code because proving things gets harder the larger your definitions are.
       
 (DIR) Post #9hMCudxbWtNSzyqIa0 by grainloom@cybre.space
       2019-04-01T08:01:10Z
       
       0 likes, 0 repeats
       
       @Shamar @freakazoid @drwho Can't really speak for more experienced devs and other languages though. I think Agda is similar in this regard, it not having tactics forces people to make their proofs clean and modular and reusable.
       
 (DIR) Post #9hMDa0ZAO0uKyy8V6G by Shamar@mastodon.social
       2019-04-01T08:08:36Z
       
       0 likes, 0 repeats
       
        @grainloom I'm not arguing for #C. I'm saying that we need something new but simpler. To be honest I don't know what. #Oberon seems like a step in the right direction, yet I'm not sure if it's simple enough. I mean, you have both FOR and WHILE! I think that automatic verification and proofs look great (but I never learned to use them, actually 😞), yet I'm not sure if we should NEED them. Looks like they raise the bar for programming while we need to lower it... the right way. @freakazoid @drwho
       
 (DIR) Post #9hMEB56cTH69mBSfHk by grainloom@cybre.space
       2019-04-01T08:15:20Z
       
       0 likes, 0 repeats
       
       @Shamar @freakazoid @drwho You don't really need to prove everything though, it's just encouraged. But safety by default is preferable for me. IMHO C has too many invisible pitfalls for beginners. Haskell is not that hard to learn imho and it's quite fast. Idris is just a small step up from that, it's even simpler in some ways, because proper dependent types means less weird type magic. At least that's the promise.
       
 (DIR) Post #9hMI6pI9izc1O95GL2 by Shamar@mastodon.social
       2019-04-01T08:59:22Z
       
       0 likes, 0 repeats
       
        @grainloom I should try #Idris again, and yes, I liked dependent types and I was curious to see them used even more widely. Yet it's not a simple functional language. Consider for example how interfaces (similar to Haskell's typeclasses) could be replaced with ADTs and functions: http://www.haskellforall.com/2012/05/scrap-your-type-classes.html
        #Haskell is not a simple language either, but 2^99 different ones. #C has pitfalls for beginners. But it's simpler than both of these modern languages and forces you to #KISS. @freakazoid @drwho
       
 (DIR) Post #9hMIa0mW5Lm3QACYJU by grainloom@cybre.space
       2019-04-01T09:04:39Z
       
       0 likes, 0 repeats
       
        @Shamar @freakazoid @drwho the problem with passing interface implementations around is that you lose the guarantee that you always have the same implementation, which may or may not be worth it. A lot of Haskell's extensions are only there because its type system is too limiting
       
 (DIR) Post #9hMIgAPLCyDJ0uIRE0 by grainloom@cybre.space
       2019-04-01T09:05:46Z
       
       0 likes, 0 repeats
       
       @Shamar @freakazoid @drwho Idris on the other hand has a few optional features and that's it, we're gonna see how many of those will remain gated behind %language pragmas in Blodwen
       
 (DIR) Post #9hMJnimTuKq3bMHnwu by Shamar@mastodon.social
       2019-04-01T09:18:20Z
       
       0 likes, 0 repeats
       
        @grainloom @freakazoid @drwho In my attempt to design a simpler language (I don't have the skills, but the frustration at times is too high) I tried to base it on #Cantor's Set Theory. Don't worry, that was just another heresy from me: a very strong type system... but without types! 🤣 (If I remember correctly we already talked about this pipe dream of mine, and @abs suggested an interesting paper in that design space but for module system design.) Well guess what? 1/
       
 (DIR) Post #9hMKatwrAzngQTadvc by Shamar@mastodon.social
       2019-04-01T09:24:30Z
       
       0 likes, 0 repeats
       
        @grainloom @freakazoid @drwho @abs If a set is a singleton in your running application, there is no reason to explicitly pass it as an argument to functions. This has wide-reaching implications that AFAIK have never been explored in a programming language. However, I don't really know much about PL design; I just think we should explore radically different paths, as the one we are exploring right now doesn't work. Take Haskell/Idris/Cantor (my vaporware language), for example. 2/
       
 (DIR) Post #9hMKau64cjAmt3u0IK by grainloom@cybre.space
       2019-04-01T09:27:13Z
       
       0 likes, 0 repeats
       
       @Shamar @freakazoid @drwho @abs wait, what does singleton mean here? Idris and Agda (and probably others) already have implicit arguments
       
 (DIR) Post #9hMLFNMpDWNPFa2JF2 by Shamar@mastodon.social
       2019-04-01T09:34:33Z
       
       0 likes, 0 repeats
       
        @grainloom @freakazoid @drwho @abs A singleton is a set with only one element. Take I/O: if you have an I/O set/ADT (a sort of product type of all I/O-based function signatures) and you have just one value (a single variable of this type/set with all the functions' implementations), you don't need to pass it to functions that need IO. Yes, it's an implicit argument whose value is decided by the compiler without requiring you to explicitly pass it. PS I didn't know Idris has implicit args: cool!
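One mainstream approximation of this "the compiler supplies the one value" idea is typeclass/trait resolution: the dictionary is chosen at compile time and never appears as a runtime argument. A hedged Rust sketch (all names invented):

```rust
// The "IO singleton" as a trait with one implementation. greet()
// never receives a console value; the compiler resolves which
// implementation C stands for.
trait Console {
    fn put(msg: &str) -> String; // returns the written text, for testability
}

struct RealConsole;

impl Console for RealConsole {
    fn put(msg: &str) -> String {
        println!("{msg}");
        msg.to_string()
    }
}

// No console parameter in sight: C is filled in by the type system.
fn greet<C: Console>(name: &str) -> String {
    C::put(&format!("hello, {name}"))
}

fn main() {
    greet::<RealConsole>("world"); // prints "hello, world"
}
```

The difference from a true language-level singleton is that Rust still threads the choice through a type parameter; Haskell-style typeclasses or Idris/Agda implicit arguments hide even that.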
       
 (DIR) Post #9hN3bCJpN4YjL2qksK by Shamar@mastodon.social
       2019-04-01T09:12:23Z
       
       0 likes, 0 repeats
       
        @grainloom Is #Lua simple? Quite, but it still has issues (how can you start counting from 1! what were they thinking? 🤣)
        Is #Python simple? It tried to, but grew too much.
        #Rust? #Go? #Java? ..No.
        #JavaScript? 🤮
        #Lisp (or #Scheme)? In a way it is... in a way it's brilliant! But it errs on the other end of the spectrum: it's too hard to verify anything about it.
        All of these provide too many features, support large codebases and provide too many ways of doing each thing. @freakazoid @drwho
       
 (DIR) Post #9hN3bCVWfZutvKK66q by amz3@framapiaf.org
       2019-04-01T11:12:11Z
       
       0 likes, 0 repeats
       
        @Shamar @grainloom @freakazoid @drwho What does this mean?
        > Is Python simple? It tried to, but grew too much.
        It is not because it is big that it is bad. That's a counterproductive argument.
       
 (DIR) Post #9hN3bCizrUgyb6cr6e by Shamar@mastodon.social
       2019-04-01T12:41:03Z
       
       0 likes, 0 repeats
       
        @amz3 It's exactly because it's big that it's bad. Bad as a language (too many features, too many paradigms at once) and from a security perspective: too few people will read and understand the whole #CPython code base, not to mention the standard library. It's still one of the best and most productive scripting languages out there, but still too primitive for #Informatics to step forward. @grainloom @freakazoid @drwho
       
 (DIR) Post #9hN3bCtzCdTz9BldEe by amz3@framapiaf.org
       2019-04-01T12:53:51Z
       
       0 likes, 0 repeats
       
       @Shamar @grainloom @freakazoid @drwho I agree that #python is big and somewhat cumbersome. I hope it is not because it is #mainstream that it is bad :blobreach:
       
 (DIR) Post #9hN3bD4GaPhpf4ZqG8 by Shamar@mastodon.social
       2019-04-01T15:07:31Z
       
       0 likes, 0 repeats
       
        @amz3 @grainloom @freakazoid @drwho I hope so too. To be honest, I can't think of any #mainstream software that is not crap. Maybe #Notepad? 🤣 But is it mainstream? Some programs actually #SuckLess, but finding good software in the mainstream is very hard. Does this mean that mainstream software has to suck? If so, why? My bet is that since most people can't read the sources (whatever the license) and judge by the cover, we have the worst possible incentives at work from the ground up.
       
 (DIR) Post #9hN3bDJreQBORRsIZU by freakazoid@retro.social
       2019-04-01T15:27:21Z
       
       0 likes, 0 repeats
       
       @Shamar @drwho @grainloom @amz3 Consumers don't care about the guts of the technology they use, otherwise they wouldn't be consumers. As long as we enable and support the model of third parties providing opaque boxes, we're going to be stuck with this situation.
       
 (DIR) Post #9hN3bDT569YUu2BewC by Shamar@mastodon.social
       2019-04-01T15:31:02Z
       
       0 likes, 0 repeats
       
        @freakazoid That's why I want to teach everybody how to program. So that they can (and will want to) understand why their opaque boxes suck so much, and they will feel my pain and frustration and will throw that shit away. @amz3 @grainloom @drwho
       
 (DIR) Post #9hN3bDdiSc3vR1A9Vw by amz3@framapiaf.org
       2019-04-01T15:32:15Z
       
       0 likes, 0 repeats
       
        @Shamar @freakazoid @grainloom @drwho It seems to me that, like with cars/automobiles, there is no way all people will be knowledgeable about it.
       
 (DIR) Post #9hN3bDlryIaHqIyfDs by freakazoid@retro.social
       2019-04-01T15:44:38Z
       
       0 likes, 1 repeats
       
       @amz3 @drwho @grainloom @Shamar I do think some level of universal computer literacy is achievable. If people drove the way they currently use computers, the freeways would be covered with piles of wrecked cars and dead bodies. We can do better. I don't know whether this means everyone must learn to program, just that we should be able to get past the point where all software is the equivalent of bumper cars.
       
 (DIR) Post #9hN7FjtEH3Acu2QqUS by alcinnz@floss.social
       2019-04-01T18:17:04Z
       
       0 likes, 1 repeats
       
        @Shamar @freakazoid @amz3 @grainloom @drwho My two biggest gripes helping my family with their computers:
        1) printer drivers, why do we need them?
        2) software updates, it is absolutely painless on my system so why not theirs?
       
 (DIR) Post #9hN94L14c2QuTwnBQG by June@kitty.town
       2019-04-01T17:53:57Z
       
       0 likes, 0 repeats
       
       @freakazoid @Shamar@mastodon.social @grainloom @drwho@hackers.town @amz3 it's like we could choose as a society to have a publicly owned reference general purpose computer and OS and everything and just that alone would help push forward reasonable tech
       
 (DIR) Post #9hN94LDpqadp7WlNJY by freakazoid@retro.social
       2019-04-01T18:50:49Z
       
       0 likes, 1 repeats
       
        @June @amz3 @grainloom I still consider computing to be in its infancy, so I think if we wanted to do something like this, it should come alongside very active research into alternatives. That research would need to be publicly funded, since if everyone used the same architecture there'd be no private incentive to research anything else, except for proprietary internal use in case the stuff used by the public got too far behind.
       
 (DIR) Post #9hNB2Ut98YQkQfKnbM by June@kitty.town
       2019-04-01T18:51:45Z
       
       0 likes, 0 repeats
       
       @freakazoid @grainloom @amz3 totally agree on everything you said
       
 (DIR) Post #9hNB2VHxeIHPfcwcHQ by freakazoid@retro.social
       2019-04-01T19:05:44Z
       
       0 likes, 1 repeats
       
        @June @amz3 @grainloom I think your use of the term "reference" didn't entirely click with me at first. I guess this is what RISC-V could be on the CPU side. On the OS side, there's POSIX, L4, X11, XDG, OpenGL, and the various standardized languages. Unfortunately, very few applications are particularly portable, though, in part because the standards don't cover enough yet. And some of these are probably the wrong standards for bringing more platforms "under the tent".
       
 (DIR) Post #9hNB2Vb6V7amcztu7M by June@kitty.town
       2019-04-01T19:10:22Z
       
       0 likes, 1 repeats
       
       @freakazoid @grainloom @amz3 well by reference i mean like a "reference implementation" so like the government would have an actual investment in a publicly owned computing stack of whatever current vintage is. So as a member of the public you could buy a "public" computer or be provided with one... You know whatever the structure is in this fantasy land where this is feasible.
       
 (DIR) Post #9hNBiz739cd2KY65xo by 361.xj9@social.sunshinegardens.org
       2019-04-01T19:22:32.265601Z
       
       0 likes, 0 repeats
       
        @Shamar @drwho @freakazoid @grainloom
        > scheme [...] it's too hard to verify anything about it
        depends on how you approach it. the basic forms that make scheme work are not big at all. the rest of the language is just applications of the same features. if you start from the basic forms, the surface area for formal verification is fairly minimal. you'll need to verify the higher level abstractions, but since everything compiles down to basic forms it's just a matter of ensuring that some interface (r5rs for example) is correctly transformed. i could see scheme becoming the base platform for higher level languages (with any syntax) that can benefit from building on a safe and simple foundation. i have an idea for this, but it's way on the back burner for now.
       
 (DIR) Post #9hNDIkXhssZSkurqq0 by freakazoid@retro.social
       2019-04-01T19:18:27Z
       
       0 likes, 0 repeats
       
       @June @amz3 @grainloom So, like, Universal Basic Computing. There is precedent for this in the form of the BBC Micro.
       
 (DIR) Post #9hNDIklB4nLXQhAbpo by June@kitty.town
       2019-04-01T19:25:45Z
       
       0 likes, 1 repeats
       
       @freakazoid @grainloom @amz3 like it's kinda WEIRD that we got so far away from this when we started out so publicly-owned
       
 (DIR) Post #9hNGhY3pjCdl97PXqS by Shamar@mastodon.social
       2019-04-01T20:04:01Z
       
       0 likes, 0 repeats
       
        @xj9 Actually this makes a lot of sense. If I understand your insight, you're saying: if we have a minimal verifiable bytecode (expressed as #scheme standard forms), all languages that compile to it will be verifiable too. This seems reasonable, but I guess syntax often expresses what you want to be verified (e.g. through type declarations, pre- or post-conditions...), so I'm not sure how such verifications would be translated into these few forms. @drwho @freakazoid @grainloom
       
 (DIR) Post #9hNGhYT0Dcm0PBBe4m by 361.xj9@social.sunshinegardens.org
       2019-04-01T20:18:17.300704Z
       
       0 likes, 0 repeats
       
       @Shamar @grainloom @freakazoid @drwho oh yeah, there's definitely a lot more to it than making the scheme compiler formally correct. you'd need additional hooks for creating the macros that let you transform arbitrary syntax into scheme s-expression AST that is correctly annotated with types and whatnot.
       
 (DIR) Post #9hNGs1DACajCRzlezY by 361.xj9@social.sunshinegardens.org
       2019-04-01T20:20:11.710213Z
       
       1 likes, 0 repeats
       
        @Shamar @grainloom @freakazoid @drwho imo scheme folks got so caught up in how powerful working in the AST can be that they just forgot you can extend that to creating arbitrary syntax. (Racket, on the other hand, knows exactly how awesome this can be)
       
 (DIR) Post #9hNIpDQQXC1bN3lLs0 by freakazoid@retro.social
       2019-04-01T20:28:29Z
       
       0 likes, 1 repeats
       
       @June @amz3 @grainloom The government was involved with promoting computing when it was still mostly an academic thing. Once industry got involved, the government stepped back. I think this is not an uncommon path for new technologies/industries. But I think the desire with the BBC Micro was always to support British *industry* rather than any particular thought to the rights of individuals.
       
 (DIR) Post #9hNJaouICTaj1GUANM by freakazoid@retro.social
       2019-04-01T20:31:05Z
       
       0 likes, 0 repeats
       
       @grainloom @amz3 @June I think the fact that it came out of the BBC is rather telling: the BBC was more up on technology than most Brits. It'd be great to have a significant portion of any government for which that was the case. In the US, I guess that would be the NIST? Maybe they just need more funding and a larger role?
       
 (DIR) Post #9hNJap5dWIfJaRnE3c by June@kitty.town
       2019-04-01T20:45:50Z
       
       0 likes, 1 repeats
       
       @freakazoid @amz3 @grainloom i was thinking about this when i tried to pay for some crackers today at the gas station and they were like "oh our thing only works with Samsung phones" and like yea the credit card companies are terrible but at least there was some Attempt at standardization of the medium and protocol. Like. I dunno. It's a weird place we're in right now.
       
 (DIR) Post #9hNMeufb10s1uiFqm8 by Kensan@mastodon.social
       2019-04-01T21:25:05Z
       
       0 likes, 0 repeats
       
        @grainloom @drwho Maybe you want to look at SPARK 2014? https://www.adacore.com/about-spark
       
 (DIR) Post #9hNQ8uUIYb7ptlxvRQ by Shamar@mastodon.social
       2019-04-01T21:27:59Z
       
       0 likes, 0 repeats
       
        @freakazoid Very interesting idea. I don't think that a #UBI would help the poor: on the contrary, it would oppress them more, as without a corresponding Universal Maximum Income the prices would simply go up. OTOH, Universal Basic Computing might be useful to enable everyone to code their own software. There's just the risk of being locked in a dystopia where all of this is owned by #Google and you are locked into their #OSS #Surveillance software. @grainloom @amz3 @June
       
 (DIR) Post #9hNQ8ufHtjuqRr6hZQ by grainloom@cybre.space
       2019-04-01T22:04:05Z
       
       0 likes, 0 repeats
       
       @Shamar @freakazoid @amz3 @June What if we just decriminalized piracy? I mean, it's a small step, and it doesn't mean corporations have to share everything, but they also wouldn't be able to sue you for using a reverse engineered alternative to their proprietary crap.
       
 (DIR) Post #9hNQECjY98DVw3cjke by grainloom@cybre.space
       2019-04-01T22:05:04Z
       
       0 likes, 0 repeats
       
       @Shamar @freakazoid @amz3 @June (that's just for the computing stuff, idk about the deeper implications of UBI. I heard the pilot programs were promising, so, I have some hope.)
       
 (DIR) Post #9hNQd15iZNnn4Bp9pg by June@kitty.town
       2019-04-01T22:09:33Z
       
       0 likes, 0 repeats
       
       @grainloom @Shamar@mastodon.social @freakazoid @amz3 definitely part of the solution. Chip manufacture is so specialised though it almost requires a state level power to force it to be publicly accessible in all the ways it ought to be.
       
 (DIR) Post #9hNQziUrrww03dPbsG by grainloom@cybre.space
       2019-04-01T22:13:39Z
       
       0 likes, 0 repeats
       
       @Kensan @drwho ooh, the Ada book I have from the library talks about that, I'll skim through it to get an idea of what it's all about and what it can and cannot prove
       
 (DIR) Post #9hNROGoJwWqBQiDdVA by freakazoid@retro.social
       2019-04-01T22:18:04Z
       
       0 likes, 1 repeats
       
       @grainloom @June @amz3 @Shamar Reverse engineering of hardware is mostly legal in the US. Reverse engineering a complex microprocessor is incredibly difficult, so I don't think it can help the situation much, at least for advanced ICs. That's why I was thinking make them actually document it.
       
 (DIR) Post #9hNRdGoqC5qXndcBQe by kwarrtz@mathstodon.xyz
       2019-04-01T22:20:47Z
       
       0 likes, 0 repeats
       
       @grainloom Have you heard of ATS? I just ran into it recently but it sounds like you might find it interesting. ats-lang.org
       
 (DIR) Post #9hNRePiwZUJfhuOyau by Kensan@mastodon.social
       2019-04-01T22:21:00Z
       
       0 likes, 0 repeats
       
        @grainloom @drwho The scope of SPARK has significantly increased with 2014. Also, aside from Automated Provers like Z3, CVC4 and Alt-Ergo, you can use Interactive Theorem Provers like Isabelle and Coq for more involved proofs. Also, since you seem to be coming from Rust, this paper might be of interest: „Borrowing Safe Pointers from Rust in SPARK“ https://arxiv.org/abs/1805.05576 The plan is for this to be added to the language in the not too distant future.
       
 (DIR) Post #9hNS0kHWooKTpxE8eG by grainloom@cybre.space
       2019-04-01T22:25:02Z
       
       0 likes, 0 repeats
       
        @kwarrtz I have, but it looks super hard... but I wanna learn it!! I just haven't gotten around to it yet. :blobsad: My current plan is to get better at one of the more mainstream languages (Agda, Coq, hopefully Idris but its experimental status is not that great if you're a noob) and then move on to the harder stuff
       
 (DIR) Post #9hNfJLCbaOyIxd7bP6 by kwarrtz@mathstodon.xyz
       2019-04-02T00:54:00Z
       
       0 likes, 0 repeats
       
       @grainloom ATS probably would be jumping in the deep end if you've never done much dependent programming before 😄 And actually, of those three languages, I'd say Idris is the easiest to learn despite its experimental status. All of the existing literature on Agda and Coq (that I'm aware of) is kind of impenetrable. The Idris tutorial isn't as complete but is more accessible imo. The language itself is also arguably easier to learn, especially if you know Haskell.
       
 (DIR) Post #9hNi05nEx6w1WxlpmS by grainloom@cybre.space
       2019-04-02T01:24:13Z
       
       0 likes, 0 repeats
       
       @kwarrtz well, I'm reading the TDD in Idris book but idk how well that knowledge will scale to more involved proofs (and I should actually work through the assignments too...)
       
 (DIR) Post #9hNiAnU3Mo7lvjtt3o by grainloom@cybre.space
       2019-04-02T01:26:09Z
       
       0 likes, 0 repeats
       
       @kwarrtz I know some Agda but only the absolute basics, I don't even know how to use the stdlib. Right now I'm in this weird spot where I know a few things, but getting to the next level is a bigger step, and I'm running into Idris's limitations (mostly with memory).
       
 (DIR) Post #9hNo7IBDtdLjhRnxAW by kwarrtz@mathstodon.xyz
       2019-04-02T02:32:42Z
       
       0 likes, 0 repeats
       
       @grainloom That’s fair. Agda is certainly more mature. I am curious though, what kind of limitations are you running into? As an aside, if you haven’t already, I’d suggest looking into Lean. It’s also on the more “experimental” side compared to Coq, but I personally have found it much more natural to learn and use.
       
 (DIR) Post #9hNoyl6v40ccm4HpCK by grainloom@cybre.space
       2019-04-02T02:42:24Z
       
       0 likes, 0 repeats
       
       @kwarrtz Mostly just things taking too long or using too much memory. I tried to model a MOS 6502 CPU and gave it some RAM as a Data.Vect. Bam. The OOM killer struck. Turns out Vects take quadratic space. There is also an open bug report about even moderately big Nats taking forever to check. Blodwen is supposed to fix these, but it's pretty alpha right now.
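[Editorial aside: the contrast here can be sketched in Rust, where const generics put the length in the type without per-element runtime bookkeeping. `Ram` and the sizes are invented for illustration; Idris's Vect is a linked structure carrying its length index, not a flat array like this.]

```rust
// A compile-time-sized buffer, loosely analogous to a length-indexed Vect:
// the length N lives in the type, so size mismatches are compile errors,
// but the runtime representation is just a flat array.
struct Ram<const N: usize> {
    bytes: [u8; N],
}

impl<const N: usize> Ram<N> {
    fn zeroed() -> Self {
        Ram { bytes: [0; N] }
    }
    fn len(&self) -> usize {
        N // known statically, no runtime bookkeeping
    }
}

fn main() {
    // 64 KiB of 6502-style address space, sized in the type.
    let ram: Ram<65536> = Ram::zeroed();
    println!("{}", ram.len());
    println!("{}", ram.bytes[0]);
}
```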
       
 (DIR) Post #9hNpRtZkfamLQTfLsG by grainloom@cybre.space
       2019-04-02T02:47:39Z
       
       0 likes, 0 repeats
       
       @kwarrtz Lean seems.... idk. I don't really see many people using it for stuff, whereas with Agda I know people irl who could help me learn it, so I thought, eh, let's try to do some Agda. (But then I ran into stuff like library loading being a total mess (who the hell tries to open the stdlib for writing? especially when it's installed from a package and owned by root?) so I gave up and did other programming/uni stuff instead.)
       
 (DIR) Post #9hNqMfncYvOZCpE8Bs by kwarrtz@mathstodon.xyz
       2019-04-02T02:57:55Z
       
       0 likes, 0 repeats
       
       @grainloom If you’re trying to do programming with safety guarantees courtesy of dependent typing, then absolutely Idris or Agda are the way to go, but if you’re doing straight theorem proving then their capabilities are kind of limited compared to Coq or Lean.
       
 (DIR) Post #9hNrrWUBL7YRFfPOfA by grainloom@cybre.space
       2019-04-02T03:14:43Z
       
       0 likes, 0 repeats
       
       @kwarrtz I was sort of hoping to do both, by trying to formalize some of the things I'm learning in uni, but that quickly turned out to be much harder than it is on paper.
       
 (DIR) Post #9hNs6neBofFaowNGEq by kwarrtz@mathstodon.xyz
       2019-04-02T03:17:26Z
       
       0 likes, 0 repeats
       
       @grainloom Yeah, formalization is harddddddd, as I’ve had driven home to me recently too
       
 (DIR) Post #9hf6mbHUIHrfcpjvUG by grainloom@cybre.space
       2019-04-10T10:51:59Z
       
       0 likes, 0 repeats
       
       @Shamar @freakazoid @drwho I was :thounking: about this.....
       1. verbosity can be decreased using implicit and auto arguments
       2. the consistency provided by every type having only one implementation for a given typeclass is only a crutch in #Haskell; IMHO in #Idris you could, and indeed should, state all your assumptions about the parameters in the type
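[Editorial aside: a rough illustration of "stating your assumptions about the parameters in the type", using Rust trait bounds as a stand-in for the Idris idiom being described. `describe_max` is an invented example.]

```rust
use std::fmt::Display;

// Every assumption about T is stated in the signature itself:
// T must be orderable and printable, and nothing more is assumed.
// A caller passing a type without these capabilities fails to compile.
fn describe_max<T: Ord + Display>(items: &[T]) -> String {
    match items.iter().max() {
        Some(m) => format!("max is {}", m),
        None => "empty".to_string(),
    }
}

fn main() {
    println!("{}", describe_max(&[3, 1, 4]));
}
```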
       
 (DIR) Post #9hf8lu6V1ebHccObFg by Shamar@mastodon.social
       2019-04-10T11:14:16Z
       
       0 likes, 0 repeats
       
       @grainloom I'm not sure verbosity is a bad thing. Haskell can be as terse as it is cryptic, and I guess the lack of balance between readability and verbosity shows that we are missing something fundamental. @freakazoid @drwho
       
 (DIR) Post #9hf9gr5stBAOnEGT1E by grainloom@cybre.space
       2019-04-10T11:24:34Z
       
       0 likes, 0 repeats
       
       @Shamar @freakazoid @drwho It's bad if you have to repeat yourself. Among other things, you risk having to change the repeated thing later and forgetting to change it in one place.
       
 (DIR) Post #9hfMgZ05SWK255kCKu by Shamar@mastodon.social
       2019-04-10T13:50:09Z
       
       0 likes, 0 repeats
       
       @grainloom True. But this is independent of readability. In 20 years I've seen a lot of extreme DRY leading to code bloat. Yeah, you don't repeat yourself, but you have 10 times the code you would need. And guess what? When requirements change, you end up throwing it all away because you can't really touch it anymore! So you repeat yourself much, much more! @freakazoid @drwho
       
 (DIR) Post #9hfViVlFNBgYCWo3dI by hansbauer@fosstodon.org
       2019-04-10T15:31:19Z
       
       0 likes, 0 repeats
       
       @Shamar Have any of you guys heard of Zig? I was wondering if anyone has already tried it. http://ziglang.org/ @grainloom @freakazoid @drwho
       
 (DIR) Post #9hfWgWDumayZ0gpTma by Shamar@mastodon.social
       2019-04-10T15:42:13Z
       
       0 likes, 0 repeats
       
       @hansbauer I've seen it in the past, but never tried it. @grainloom @freakazoid @drwho
       
 (DIR) Post #9hfbUfW8ty1I8AWY3U by freakazoid@retro.social
       2019-04-10T16:36:01Z
       
       0 likes, 0 repeats
       
       @grainloom @drwho @Shamar I think we need to stop reading raw source code. It's not necessary to put the full burden of readability onto the writer when we have IDEs that can annotate code with types, either interactively or by spitting out annotated source code that could be printed. Of course, at that point there's not much difference, since such an IDE can also autocomplete for you. But if it's all the same, why not have the source code be a minimal description?
       
 (DIR) Post #9hfckOPMNixZedPg1o by Shamar@mastodon.social
       2019-04-10T16:50:07Z
       
       0 likes, 0 repeats
       
       @hansbauer At a quick read it has interesting features (e.g. running code at compile time) and seems readable to a programmer. And this is also why I think it's not readable enough. Is it possible to design a language that the layman and the #programmer can read equally well? Which trade-offs would we face? In 2019 would we end up with a new #python or a new #cobol? @grainloom @freakazoid @drwho
       
 (DIR) Post #9hfdk6urW3V7MHZrJQ by hansbauer@fosstodon.org
       2019-04-10T17:01:16Z
       
       1 likes, 0 repeats
       
       @Shamar I don't quite get what you mean by readability. For me a programming language, like mathematical language, is a set of symbols and forms of flow structures. Some are quite clear, like if-then-else, because they are a primary thought pattern, but other ones are almost impossible to make explicit the same way. @grainloom @freakazoid @drwho
       
 (DIR) Post #9hfe0aB8UEB8kQ2lvc by a_breakin_glass@cybre.space
       2019-04-10T17:04:17Z
       
       0 likes, 0 repeats
       
       @grainloom also the '''object orientation'''... isn't
       
 (DIR) Post #9hfeduOZFtXWk54vLM by freakazoid@retro.social
       2019-04-10T17:00:55Z
       
       0 likes, 0 repeats
       
       @Shamar @drwho @grainloom @hansbauer Zig looks like a better memory-unsafe language than C, but it's still memory-unsafe. Given how hard memory-safety issues can be to debug, can such a language ever really be considered readable? I think language constraints matter a lot to readability, because otherwise unless you fully understand the program (which is impossible for any non-trivial program), the reader and the programmer are both left to just trust the programmer.
       
 (DIR) Post #9hfedubKURkRNf37Ee by freakazoid@retro.social
       2019-04-10T17:03:07Z
       
       0 likes, 1 repeats
       
       @hansbauer @grainloom @drwho @Shamar IOW the one thing you can rely on when reading a program, assuming you've actually compiled it, is that the compiler accepts it. If the compiler accepts a larger set of incorrect programs, then its acceptance of a program tells you less than acceptance by a compiler that admits a smaller set of incorrect programs.
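[Editorial aside: a small Rust sketch of this point (names are invented). Absence is encoded in the type, so the compiler rejects callers that ignore the empty case, and its acceptance of the program tells you correspondingly more.]

```rust
// `first` cannot silently return a dangling "no value": the possibility
// of absence is in the return type. A caller that omits the `None` arm
// fails to compile rather than crashing at runtime.
fn first(xs: &[i32]) -> Option<i32> {
    xs.first().copied()
}

fn main() {
    match first(&[]) {
        Some(x) => println!("got {}", x),
        None => println!("empty"), // deleting this arm is a compile error
    }
}
```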
       
 (DIR) Post #9hff3MrsX0bcHwvc6y by grainloom@cybre.space
       2019-04-10T17:15:58Z
       
       0 likes, 0 repeats
       
       @a_breakin_glass It certainly seems to have a weird take on OOP, but I don't really understand it yet... It looks like it went the inheritance way first, when that was popular, but it also has traits?? I've only seen these mentioned, though; luckily we aren't doing OOP in class.
       
 (DIR) Post #9hffHw5EOkpjYr7WnA by a_breakin_glass@cybre.space
       2019-04-10T17:18:38Z
       
       0 likes, 0 repeats
       
       @grainloom also the requirement for different files is.... awkward, especially if you're using something like https://godbolt.org
       
 (DIR) Post #9hffaKf7kkud0IgI0e by hansbauer@fosstodon.org
       2019-04-10T17:21:57Z
       
       0 likes, 0 repeats
       
       @freakazoid I still don't get the 'readable' part. Is it 'readable', or do you mean fully or partially 'understandable'? @grainloom @drwho @Shamar
       
 (DIR) Post #9hffgvfz37cXXPEFdI by freakazoid@retro.social
       2019-04-10T17:23:07Z
       
       0 likes, 0 repeats
       
       @hansbauer @Shamar @drwho @grainloom I don't think there's any meaningful difference between the two.
       
 (DIR) Post #9hffr6GyVRXrIZNgky by hansbauer@fosstodon.org
       2019-04-10T17:24:59Z
       
       0 likes, 0 repeats
       
       @freakazoid There is an abysmal difference between the two. @grainloom @drwho @Shamar
       
 (DIR) Post #9hffyGQv81JkpzRttw by grainloom@cybre.space
       2019-04-10T17:26:16Z
       
       0 likes, 0 repeats
       
       @a_breakin_glass This is super interesting imho, because it's obvious why header files are necessary in C (simpler compiler implementation) and C++ (because it wanted to be a superset of C), and I think people who had to write those files tried to rationalize it later with stuff like "it documents your code". But like, no??? Documentation does??? Which you could be generating??? But back in the dark ages they didn't even have Doxygen.
       
 (DIR) Post #9hffyebjFUDvoSgs52 by freakazoid@retro.social
       2019-04-10T17:26:19Z
       
       0 likes, 0 repeats
       
       @hansbauer @Shamar @drwho @grainloom You'll have to explain it to me then before I can tell you if I'm talking about readability or understandability, since by the definition I've been using, at least as pertains to source code, they're the same.
       
 (DIR) Post #9hfg75jGIO0UFHElvs by grainloom@cybre.space
       2019-04-10T17:27:51Z
       
       0 likes, 0 repeats
       
       @a_breakin_glass Ada could have done away with that stuff, but no, instead they added So Fricking Much Syntax.And I guess now they're stuck with having header files. Except they don't call them that.
       
 (DIR) Post #9hfh6X3VJKRKf50Do0 by hansbauer@fosstodon.org
       2019-04-10T17:38:58Z
       
       0 likes, 0 repeats
       
       @freakazoid By reading a program, I can understand more or less the structures used to perform a task, as defined by the symbols and rules of the language and its basic inner workings. If I master this language, I can say I read it correctly. But can I really say that I understand how this program will behave? @grainloom @drwho @Shamar
       
 (DIR) Post #9hfhFHZUFY4PFE8PuC by a_breakin_glass@cybre.space
       2019-04-10T17:40:33Z
       
       0 likes, 0 repeats
       
       @grainloom I have no problem with the additional syntax, if they had a painless way of combining them in one file, but they don't. GNAT's one "compilation unit" per file rule is equally grating
       
 (DIR) Post #9hfhbYxgH2zNYB4jqa by freakazoid@retro.social
       2019-04-10T17:44:33Z
       
       0 likes, 0 repeats
       
       @hansbauer @Shamar @drwho @grainloom Ah. When I talk about readability I'm talking more about how easy it is to make readable programs in the language. Any program can be made readable through literate programming, of course. But some languages lend themselves better to obfuscated code, while others lend themselves better to readable code.
       
 (DIR) Post #9hfhqgmRUDLI8x20Ya by hansbauer@fosstodon.org
       2019-04-10T17:47:19Z
       
       0 likes, 0 repeats
       
       @freakazoid But you are assuming the reader masters the language and the analogies being made through it, right? @grainloom @drwho @Shamar
       
 (DIR) Post #9hfjW3lf1ZphGSxNdA by Shamar@mastodon.social
       2019-04-10T17:59:17Z
       
       0 likes, 0 repeats
       
       @freakazoid A programming language is "readable" if a human can understand what a program does by reading its source from the entry point. A programming language A is more readable than a programming language B if more people can understand programs written in A than in B, or, if the number of people is equal, if the average time required to understand a program in A is less than the time required for a program in B. @grainloom @drwho @hansbauer
       
 (DIR) Post #9hfjW4NwjESRBCrxJ2 by hansbauer@fosstodon.org
       2019-04-10T18:05:59Z
       
       0 likes, 0 repeats
       
       @Shamar My question is: how proficient in the languages and analogies being used is this reader? @freakazoid @grainloom @drwho
       
 (DIR) Post #9hfkkOH9fmhGsS5Lge by Shamar@mastodon.social
       2019-04-10T18:19:46Z
       
       0 likes, 0 repeats
       
       @hansbauer That's irrelevant. This is an objective and measurable definition exactly because it does not include information that, being inside the programmer's mind, cannot be easily measured. As of today, #C is probably the most readable programming language out there. 🤦‍♂️ Yes, we can do better. @freakazoid @grainloom @drwho
       
 (DIR) Post #9hfm61EG8mDVixbkX2 by hansbauer@fosstodon.org
       2019-04-10T18:34:54Z
       
       0 likes, 0 repeats
       
       @Shamar I would dispute these claims, but it might be interesting to explore their implications. It might be possible to reach some good 'readability', but it would boil down to a balance between expressivity and complexity. A readable language would be simple and explicit, but it would require more complexity and LOC to describe an operation. If an operation is too complex, it would be less readable by its extension and not exactly by its expressiveness. @freakazoid @grainloom @drwho
       
 (DIR) Post #9hfmvpaIgCMUfSchpA by Shamar@mastodon.social
       2019-04-10T18:44:13Z
       
       0 likes, 0 repeats
       
       @hansbauer
       > if an operation is too complex it would be less readable by its extension and not exactly by its expressiveness.
       Mind to elaborate? I have no idea what you mean. Maybe you can share an example, or an algorithm you think would be affected by using more readable languages? Also, I'm not sure what you mean by expressivity. Haskell is often said to be very expressive because of its operators, but it's not readable exactly because of them. @freakazoid @grainloom @drwho
       
 (DIR) Post #9hfoZPWV5tU3NTHbHc by hansbauer@fosstodon.org
       2019-04-10T19:02:37Z
       
       0 likes, 0 repeats
       
       @Shamar We can use the example of a Fibonacci algorithm in C and in Haskell. In Haskell, it is pretty short and not complex to state, but it might be deemed not so readable, if I understood well what readable means. In C, it is more readable, as you can see every single step of it, but it might be too long for someone to follow. @freakazoid @grainloom @drwho
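[Editorial aside: to make the comparison concrete in one language, here are two Rust versions of Fibonacci: one step-by-step in the C style, one terse in the combinator style. Both are illustrative sketches, not code from the thread.]

```rust
// Every step spelled out, the C-style way.
fn fib_explicit(n: u32) -> u64 {
    let (mut a, mut b) = (0u64, 1u64);
    for _ in 0..n {
        let next = a + b;
        a = b;
        b = next;
    }
    a
}

// Terser, closer in spirit to a one-line Haskell definition:
// fold the same recurrence over the range.
fn fib_terse(n: u32) -> u64 {
    (0..n).fold((0u64, 1u64), |(a, b), _| (b, a + b)).0
}

fn main() {
    let xs: Vec<u64> = (0..8).map(fib_explicit).collect();
    println!("{:?}", xs); // [0, 1, 1, 2, 3, 5, 8, 13]
    println!("{}", fib_terse(10)); // 55
}
```

Which of the two is "more readable" is exactly the trade-off being debated above.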
       
 (DIR) Post #9hfobwYjEpYKPqX63M by hansbauer@fosstodon.org
       2019-04-10T19:03:04Z
       
       0 likes, 0 repeats
       
       @Shamar @freakazoid @grainloom @drwho Maybe expressive is not the best word to use. What I mean is something like explicit, or less abstract, as in reducing operations to a set of basic ones instead of using higher abstractions of them.
       
 (DIR) Post #9hfriq5ryUtIFl0vXE by Shamar@mastodon.social
       2019-04-10T19:37:55Z
       
       0 likes, 0 repeats
       
       @hansbauer As for #C, if it's too long, split it into more readable functions... am I missing something obvious? @freakazoid @grainloom @drwho
       
 (DIR) Post #9hfs6JWQj7fkBTdBRI by hansbauer@fosstodon.org
       2019-04-10T19:42:11Z
       
       0 likes, 0 repeats
       
       @Shamar That is what I mean by making it more complex and more extensive, and maybe less readable for some. @freakazoid @grainloom @drwho
       
 (DIR) Post #9hfsGciEETnCb7482q by grainloom@cybre.space
       2019-04-10T19:44:02Z
       
       0 likes, 0 repeats
       
       @Shamar @hansbauer @freakazoid @drwho Having to explicitly state certain things that are true for the entire codebase (or at least the part that you care about) is just line noise. EDSLs are very nice IMHO and C is not good at that. Gradually making code more general/abstract is imho quite useful, C places an upper limit on how general you can be.
       
 (DIR) Post #9hg2zUub534qyft4N6 by Shamar@mastodon.social
       2019-04-10T21:15:08Z
       
       0 likes, 0 repeats
       
       @grainloom Such an upper limit is too high! 😉 You should NOT be able to write a program that I can't fully understand in a week. @hansbauer @freakazoid @drwho
       
 (DIR) Post #9hg3NGhnYEfFrpEf44 by grainloom@cybre.space
       2019-04-10T21:48:31Z
       
       0 likes, 0 repeats
       
       @Shamar @hansbauer @freakazoid @drwho Which is where EDSLs and powerful type systems shine. By restricting what you can do within a function or how an element of a type can be constructed, you can enforce simpler code.
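[Editorial aside: one common way to "restrict how an element of a type can be constructed" is a smart constructor behind a private field. A hypothetical Rust sketch; `Percent` is an invented example.]

```rust
mod percent {
    // The inner field is private: code outside this module can only
    // obtain a Percent through `new`, so every Percent in the rest of
    // the program is guaranteed to be in range.
    pub struct Percent(u8);

    impl Percent {
        pub fn new(n: u8) -> Option<Percent> {
            if n <= 100 { Some(Percent(n)) } else { None }
        }
        pub fn value(&self) -> u8 {
            self.0
        }
    }
}

use percent::Percent;

fn main() {
    println!("{:?}", Percent::new(42).map(|p| p.value()));
    println!("{:?}", Percent::new(250).map(|p| p.value()));
}
```

Functions downstream can then take a `Percent` and skip the validity check entirely, which is one way types enforce simpler code.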
       
 (DIR) Post #9hg4452Sns9g8TJdk8 by Shamar@mastodon.social
       2019-04-10T21:56:14Z
       
       0 likes, 0 repeats
       
       @grainloom If you NEED to learn category theory to FULLY understand an application written in a programming language H, that language is fun and educational, but NOT readable. @hansbauer @freakazoid @drwho
       
 (DIR) Post #9hg630UIRFgRdhwyga by aeveltstra@mastodon.social
       2019-04-10T22:18:27Z
       
       0 likes, 0 repeats
       
       @grainloom @Shamar @hansbauer @freakazoid @drwho Really? Simpler code? Like programs written in Java, which aimed to prevent coders from making mistakes? So it forced us all into an o.o. straitjacket so tight that the Gang of 4 had to divine architectural patterns just so we could be productive?
       
 (DIR) Post #9hg6EwDu7SGZt9yKsC by grainloom@cybre.space
       2019-04-10T22:20:37Z
       
       0 likes, 0 repeats
       
       @aeveltstra @Shamar @hansbauer @freakazoid @drwho You... do realize.... that Java does not have a strong type system?
       
 (DIR) Post #9hg6I1v5OdISQOe1CK by aeveltstra@mastodon.social
       2019-04-10T22:21:10Z
       
       0 likes, 0 repeats
       
       @grainloom @Shamar @hansbauer @freakazoid @drwho Explain?
       
 (DIR) Post #9hg6h72zTAZbZDImuG by alcinnz@floss.social
       2019-04-10T22:25:42Z
       
       0 likes, 0 repeats
       
       @grainloom @aeveltstra @Shamar @hansbauer @freakazoid @drwho Java does have an extremely verbose type system, but it lets many developer errors slide through. Languages like Haskell, Rust, and Whiley show you can do it right, and the verbosity of Java's type system is entirely unnecessary to this end.
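[Editorial aside: one concrete error that slides through in Java (before sealed classes) but not here is a forgotten case. A small Rust sketch; `State` and `label` are invented names.]

```rust
enum State { Idle, Running, Done }

// The match must cover every variant. Adding a new State later makes
// this function fail to compile until the new case is handled — the
// kind of omission a switch over Java constants only reveals at runtime.
fn label(s: &State) -> &'static str {
    match s {
        State::Idle => "idle",
        State::Running => "running",
        State::Done => "done",
    }
}

fn main() {
    println!("{}", label(&State::Running));
}
```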
       
 (DIR) Post #9hh6VsXVGOA8XoB3Q0 by aeveltstra@mastodon.social
       2019-04-11T09:58:23Z
       
       0 likes, 0 repeats
       
       @grainloom @Shamar @hansbauer @freakazoid @drwho So you're saying that a language like Rust is more restrictive and more rigid than Java, and that will lead to simpler code? Interesting point. I'm fairly familiar with Java, and I'm learning Rust. I greatly welcome simpler code, knowing the contraptions Java causes us to create.
       
 (DIR) Post #9hhUUZiCAyUGlJ5cgK by grainloom@cybre.space
       2019-04-11T14:27:04Z
       
       0 likes, 0 repeats
       
       @aeveltstra @Shamar @hansbauer @freakazoid @drwho Ye, basically. Although I think you should be able to embed a GC'd language inside Rust if you really needed to. My dream is a language that is powerful enough to bootstrap itself, but not so permissive that it becomes confusing in practice, and that generates native code that can compete with C.