Posts by asie@mk.asie.pl
(DIR) Post #B2lLS6huYkIW2iBU5g by asie@mk.asie.pl
2026-01-28T21:40:58.392Z
0 likes, 0 repeats
@eniko@mastodon.gamedev.place At what level of accuracy? That heavily affects the answer, but let's say... a ballpark of between 20 and 1000, eyeballing the order of magnitude.
(DIR) Post #B2lMlabthlIUF2E16u by asie@mk.asie.pl
2026-01-28T21:50:43.367Z
0 likes, 0 repeats
@eniko@mastodon.gamedev.place If you're willing to take some liberties, I'd say at least 15-20 per CPU core on a reasonably modern x86-64 machine, then?
(DIR) Post #B2lXho6cgZeeR0znMm by asie@mk.asie.pl
2026-01-28T19:33:28.776Z
0 likes, 0 repeats
PSA: Wonderful Toolchain does not currently work on Fedora 44+ - the package manager wf-pacman assumes the presence of /etc/ssl/certs/ca-certificates.crt, which no longer exists as of that version of that distribution.
https://fedoraproject.org/wiki/Changes/droppingOfCertPemFile
https://codeberg.org/WonderfulToolchain/wf-issues/issues/44
I'll try to get it fixed soon.
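Until a fix lands, here is a minimal workaround sketch. It assumes (per the linked Fedora change page) that the trust bundle itself still ships at /etc/pki/tls/certs/ca-bundle.crt, and that the affected TLS stack honors SSL_CERT_FILE; pick_ca_bundle is a hypothetical helper written for this post, not part of wf-pacman:

```shell
#!/bin/sh
# Hypothetical helper (illustrative only, not part of wf-pacman):
# print the first readable CA bundle among the paths given as arguments.
pick_ca_bundle() {
    for candidate in "$@"; do
        if [ -r "$candidate" ]; then
            printf '%s\n' "$candidate"
            return 0
        fi
    done
    return 1
}

# On Fedora 44+ the Debian-style path is gone, but the bundle still
# ships under /etc/pki/tls/certs/. OpenSSL-based tools honor
# SSL_CERT_FILE, so exporting it avoids recreating the dropped file:
#   export SSL_CERT_FILE="$(pick_ca_bundle \
#       /etc/ssl/certs/ca-certificates.crt \
#       /etc/pki/tls/certs/ca-bundle.crt)"
```

Whether wf-pacman itself reads SSL_CERT_FILE is an assumption here; if it does not, symlinking the old path to the Fedora bundle is the blunter alternative.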
(DIR) Post #B2lYpmXYUSlHqkxYbw by asie@mk.asie.pl
2026-01-28T19:36:57.466Z
0 likes, 0 repeats
@whitequark@social.treehouse.systems Ah, yeah. In those situations I just resigned myself to writing Makefiles by hand. Maybe it's the wrong tool for the job, but at least I can understand them when they break.
(DIR) Post #B2laiIwqJZMbx8Lky8 by asie@mk.asie.pl
2026-01-28T19:35:03.362Z
0 likes, 0 repeats
@whitequark@social.treehouse.systems Ain't that the fate of every build system user, summed up. Do you like Meson because you hate CMake and Autotools, or is there a secret third hate?
(DIR) Post #B2maauaSgTeVWtJeiW by asie@mk.asie.pl
2026-01-29T16:33:16.704Z
0 likes, 0 repeats
@ptrc@social.treehouse.systems online service transition tier list when
(DIR) Post #B2npKtvV48LdJkwLa4 by asie@mk.asie.pl
2026-01-30T06:53:11.039Z
0 likes, 0 repeats
I just learned that GNU gettext decided to release a 1.0 version after 30 years of development. Congratulations! The new feature which made this version deserving of that honor is adding automatic LLM translation support!
I think I am inches away from a psychotic break! I just let out what felt like a simultaneous hollow laugh and scream!
(DIR) Post #B2npiHjEjg6RncM1mC by asie@mk.asie.pl
2026-01-30T06:57:26.046Z
0 likes, 0 repeats
It does, however, tell us where at least some of the GNU folk stand on LLMs:
- They request that their users host LLMs locally, and not in a third-party cloud.
- They encourage people to use open-weight models, but provenance seems less relevant - Ministral 3 14B is explicitly named as an example in the documentation.
(DIR) Post #B2nq65LsP6QkMamDHk by asie@mk.asie.pl
2026-01-30T07:01:44.129Z
0 likes, 0 repeats
To be clear, yes, I agree with them that using open-weight models on a device you control, without relying on external data centers and proprietary third-party services, is the lesser evil compared to just dialing into ChatGPT.
But from the organization that felt it was a moral imperative to deny users CPU microcode updates in the name of software freedom, I think I expected a little more than "don't forget to use large matrices which are Apache 2.0 licensed".
(DIR) Post #B2nq7WrraNtnzZQW00 by asie@mk.asie.pl
2026-01-30T07:01:55.171Z
0 likes, 0 repeats
@natty@astolfo.social https://gitweb.git.savannah.gnu.org/gitweb/?p=gettext.git;a=blobdiff;f=gettext-tools/doc/gettext.texi;h=f2f09586180884ca966e1f3e4e841dc5cb62753d;hp=f8913e5c4d5f3c33847633a5d7ca2ce536f6c0df;hb=c979c72860cbfaba7d2e745b8bf9dcf27f7f030f;hpb=7c7ff10dcf979358e32b0c0da72fa4f00d45aab5
(DIR) Post #B2nqAQCGqPa8AhLq9w by asie@mk.asie.pl
2026-01-30T07:02:33.397Z
0 likes, 0 repeats
@natty@astolfo.social The link might not work if you don't copy-paste it, but here's the source:
https://gitweb.git.savannah.gnu.org/gitweb/?p=gettext.git;a=blobdiff;f=gettext-tools/doc/gettext.texi;h=f2f09586180884ca966e1f3e4e841dc5cb62753d;hp=f8913e5c4d5f3c33847633a5d7ca2ce536f6c0df;hb=c979c72860cbfaba7d2e745b8bf9dcf27f7f030f;hpb=7c7ff10dcf979358e32b0c0da72fa4f00d45aab5
(DIR) Post #B2nqQd2mqa9IeJdxPE by asie@mk.asie.pl
2026-01-30T07:05:26.851Z
0 likes, 0 repeats
To be clear, yes, I agree with them that using open-weight models on a device you control, without relying on external data centers and proprietary third-party services, is the lesser evil compared to just dialing into ChatGPT.
But from the organization that felt it was a moral imperative to deny users CPU microcode updates in the name of software freedom, I think I expected a little more than "don't forget to use large matrices which are allegedly Apache 2.0 licensed".
(DIR) Post #B2nqtxOyEi7MBa0LNw by asie@mk.asie.pl
2026-01-30T07:10:44.225Z
0 likes, 0 repeats
For a more principled example: Debian's Deep Learning Team (which does not represent the Debian Project) had been drafting an unofficial machine learning policy for some time - even before the LLM boom - and it's more considerate of these issues:https://salsa.debian.org/deeplearning-team/ml-policy/-/blob/master/ML-Policy.pdf?ref_type=heads
(DIR) Post #B2nrhYkaADW4D4lWUK by asie@mk.asie.pl
2026-01-30T07:19:40.366Z
0 likes, 0 repeats
@lua@vixen.zone I mean, sure, for things like highly derivative, expected phrases with adequate context, an LLM can probably do a good job. Most people aren't exactly using gettext of all things to translate creative masterpieces, and many open-source translation platforms like Weblate have had machine translation integration for a long time.
I'm more astonished that there was seemingly no consideration given to data provenance. However, I'd like to award them one (1) comedy point for naming the LLM translation tool "spit".
(DIR) Post #B2nrpgAsReQJYHw7o8 by asie@mk.asie.pl
2026-01-30T07:21:11.700Z
0 likes, 0 repeats
Oh, they named the LLM tool "spit", which is objectively funny. One (1) comedy point.
(DIR) Post #B2ofHstbdUfkrww9Zo by asie@mk.asie.pl
2026-01-30T16:35:19.437Z
0 likes, 1 repeats
https://49bitcat.com/news/2026-01-30-nileswan-relaunch/
Second time's the charm, hopefully. February 12.
(DIR) Post #B2ovw19pw8a9LIghBw by asie@mk.asie.pl
2026-01-30T19:41:48.695Z
0 likes, 0 repeats
By the way, if you want to see the menu program swanshell in your native language (or Toki Pona), feel free to contribute at https://weblate.asie.pl/projects/49bitcat/swanshell/
I have already received work-in-progress translation contributions in multiple languages. Some of those will be released in early February.
(DIR) Post #B2oxDhVbpTG9ZMrpyK by asie@mk.asie.pl
2026-01-30T19:56:08.729Z
0 likes, 0 repeats
@nina_kali_nina@tech.lgbt Wait, you're looking into the Pocket Viewer? I've been considering buying one and shipping a toolchain for it at some point too... given the V30MZ connection and all.
(DIR) Post #B2oxpyfwS6WHoax624 by asie@mk.asie.pl
2026-01-30T20:03:09.105Z
0 likes, 0 repeats
@nina_kali_nina@tech.lgbt I did look into the anime mascot at least. This person drew her: https://www.mlum-factory.com/
(DIR) Post #B2p4Zu2AdnACSvdDWq by asie@mk.asie.pl
2026-01-30T21:18:39.795Z
0 likes, 1 repeats
@mkljczk@pl.fediverse.pl