[HN Gopher] PYX: The next step in Python packaging
       ___________________________________________________________________
        
       PYX: The next step in Python packaging
        
       Author : the_mitsuhiko
       Score  : 309 points
       Date   : 2025-08-13 18:42 UTC (4 hours ago)
        
 (HTM) web link (astral.sh)
 (TXT) w3m dump (astral.sh)
        
       | notatallshaw wrote:
       | Probably the more useful blog post:
       | https://astral.sh/blog/introducing-pyx
        
         | 6thbit wrote:
          | Thanks, that's a bit less cryptic than the linked page.
         | 
         | Still don't get how they are solving what they claim to solve.
        
           | chao- wrote:
           | I suspect that, in order to succeed, they will need to build
           | something that is isomorphic to Nix.
        
       | PaulHoule wrote:
       | Been waiting for something like this to make it easier to manage
       | multi-package projects.
        
       | ddavis wrote:
        | Been waiting to see what Astral would do first (with regard to
        | product). Seems like a mix of Artifactory and conda?
        | Artifactory provides a package server, and conda tries to fix
        | the difficulty that comes from Python packages with compiled
        | components or dependencies -- mostly solved by wheels, but of
        | course PyTorch wheels requiring a specific CUDA version can
        | still be a mess that conda fixes.
        
         | notatallshaw wrote:
         | Given Astral's heavy involvement in the wheelnext project I
         | suspect this index is an early adopter of Wheel Variants which
         | are an attempt to solve the problems of CUDA (and that entire
         | class of problems not just CUDA specifically) in a more
         | automated way than even conda:
         | https://wheelnext.dev/proposals/pepxxx_wheel_variant_support...
        
           | zanie wrote:
           | It's actually not powered by Wheel Variants right now, though
           | we are generally early adopters of the initiative :)
        
             | notatallshaw wrote:
             | Well it was just a guess, "GPU-aware" is a bit mysterious
             | to those of us on the outside ;).
        
       | TheChaplain wrote:
        | Pyx is just a registry, just like PyPI, or did I misunderstand
        | it?
        
         | bitpush wrote:
         | Sounds like it. Also ..
         | 
         | > pyx is also an instantiation of our strategy: our tools
         | remain free, open source, and permissively licensed -- forever.
        
           | woodruffw wrote:
           | FWIW, I think the full paragraph around that snippet is
           | important context:
           | 
           | > Beyond the product itself, pyx is also an instantiation of
           | our strategy: our tools remain free, open source, and
           | permissively licensed -- forever. Nothing changes there.
           | Instead, we'll offer paid, hosted services that represent the
           | "natural next thing you need" when you're already using our
           | tools: the Astral platform.
           | 
           | pyx itself is not a tool, it's a service.
        
         | woodruffw wrote:
         | Not exactly -- part of pyx is a registry (and that part speaks
         | the same standards as PyPI), but the bigger picture is that pyx
          | is part of a larger effort to make Python packaging faster and
         | more cohesive for developers.
         | 
         | To be precise: pyx isn't intended to be a public registry or a
         | free service; it's something Astral will be selling. It'll
         | support private packages and corporate use cases that are
         | (reasonably IMO) beyond PyPI's scope.
         | 
         | (FD: I work on pyx.)
        
       | ctoth wrote:
       | As I said a couple weeks ago, they're gonna have to cash out at
        | some point. The move won't be around uv -- it'll be a protected
        | private PyPI or something.
       | 
       | https://news.ycombinator.com/item?id=44712558
       | 
       | Now what do we have here?
        
         | kinow wrote:
          | I haven't adopted uv yet; I'm watching to see what their move
          | will be. We recently had to review our use of Anaconda tools
          | due to their changes, then review Qt's license changes. Not
          | looking forward to another license ordeal.
        
           | zanie wrote:
           | We're hoping that building a commercial service makes it
           | clear that we have a sustainable business model and that our
           | tools (like uv) will remain free and permissively licensed.
           | 
           | (I work at Astral)
        
             | simonw wrote:
             | I think having a credible, proven business model is a
             | _feature_ of an open source project - without one there are
             | unanswered questions about ongoing maintenance.
             | 
             | I'm glad to see Astral taking steps towards that.
        
           | __mharrison__ wrote:
           | You know what they say: The best time to adopt uv was last
           | year...
           | 
            | In all seriousness, I'm all in on uv. Better than any
           | competition by a mile. Also makes my training and clients
           | much happier.
        
           | lenerdenator wrote:
           | Fortunately for a lot of what uv does, one can simply switch
           | to something else like Poetry. Not exactly a zero-code lift
           | but if you use pyproject.toml, there are other tools.
           | 
           | Of course if you are on one of the edge cases of something
           | only uv does, well... that's more of an issue.
        
         | snooniverse wrote:
         | Not sure what you're trying to get at here. Charlie Marsh has
         | literally said this himself; see e.g. this post he made last
         | September:
         | 
          | > _"An example of what this might look like (we may not do
         | this, but it's helpful to have a concrete example of the
         | strategy) would be something like an enterprise-focused private
         | package registry."_
         | 
         | https://hachyderm.io/@charliermarsh/113103605702842937
         | 
         | Astral has been very transparent about their business model.
        
           | MoreQARespect wrote:
           | Astral doesn't really have a business model yet, it has
           | potential business models.
           | 
           | The issue is that there isn't a clean business model that
           | will produce _the kind of profits that will satisfy their
            | VCs_ - not that there isn't _any_ business model that will
           | help support a business like theirs.
           | 
           | Private package management would probably work fine if they
           | hadn't taken VC money.
        
         | eldenring wrote:
         | Cash out is a bit of a negative word here. They've shown the
         | ability to build categorically better tooling, so I'm sure a
         | lot of companies would be happy to pay them to fix even more of
         | their problems.
        
           | klysm wrote:
           | It's not negative, it's accurate. The playbook is well known
           | and users should be informed.
        
       | Myrmornis wrote:
       | I wonder whether it will have a flat namespace that everyone
       | competes over or whether the top-level keys will be user/project
       | identifiers of some sort. I hope the latter.
        
         | tmvphil wrote:
         | Fundamentally we still have the flat namespace of top level
         | python imports, which is the same as the package name for ~95%
         | of projects, so I'm not sure how they could really change that.
        
           | notatallshaw wrote:
           | Package names and module names are not coupled to each other.
            | You could have a package name like "company-foo" and import it
           | as "foo" or "bar" or anything else.
           | 
            | But you can, if you want, have a non-flat namespace for imports
           | using PEP 420 - Implicit Namespace Packages, so all your
           | different packages "company-foo", "company-bar", etc. can be
           | installed into the "company" namespace and all just work.
           | 
           | Nothing stops an index from validating that wheels use the
           | same name or namespace as their package names. Sdists with
           | arbitrary backends would not be possible, but you could
           | enforce what backends were allowed for certain users.
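            | 
            | A self-contained sketch of that layout (all names made
            | up): two "installs" share the "company" namespace
            | because neither ships a company/__init__.py.
            | 
            |     import pathlib, sys, tempfile
            | 
            |     root = pathlib.Path(tempfile.mkdtemp())
            |     for dist, mod in ("dist_foo", "foo"), ("dist_bar", "bar"):
            |         pkg = root / dist / "company" / mod
            |         pkg.mkdir(parents=True)
            |         (pkg / "__init__.py").write_text(f"NAME = '{mod}'")
            |         sys.path.append(str(root / dist))
            | 
            |     # PEP 420 merges both directories into one namespace.
            |     from company import foo, bar
            |     print(foo.NAME, bar.NAME)  # -> foo bar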
        
         | cr125rider wrote:
         | Once we learn to namespace things it's gonna be so nice. Seems
         | we keep re-learning that lesson...
        
           | Y_Y wrote:
            |     import io
            |     from contextlib import redirect_stdout as r
            |     b=io.StringIO()
            |     with r(b):import this
            |     print(b.getvalue().splitlines()[-1])
           | 
           | > Namespaces are one honking great idea -- let's do more of
           | those!
        
         | actinium226 wrote:
         | I feel like github solved this problem pretty well.
        
       | monster_truck wrote:
       | I've been burned too many times by embracing open source products
       | like this.
       | 
       | We've been fed promises like these before. They will inevitably
       | get acquired. Years of documentation, issues, and pull requests
       | will be deleted with little-to-no notice. An exclusively
       | commercial replacement will materialize from the new company that
       | is inexplicably missing the features you relied on in the first
       | place.
        
         | woodruffw wrote:
         | For what it's worth, I understand this concern. However, I want
         | to emphasize that pyx is intentionally distinct from Astral's
         | tools. From the announcement post:
         | 
         | > Beyond the product itself, pyx is also an instantiation of
         | our strategy: our tools remain free, open source, and
         | permissively licensed -- forever. Nothing changes there.
         | Instead, we'll offer paid, hosted services that represent the
         | "natural next thing you need" when you're already using our
         | tools: the Astral platform.
         | 
         | Basically, we're hoping to address this concern by building a
         | _separate_ sustainable commercial product rather than
         | monetizing our open source tools.
        
           | o11c wrote:
           | The entire reason people choose "permissive licenses" is so
            | that it _won't_ last forever. At best, the community can
           | fork the old version without any future features.
           | 
           | Only viral licenses are forever.
        
             | woodruffw wrote:
             | I don't think this is true -- a license's virality doesn't
             | mean that its copyright holders can't switch a future
             | version to a proprietary license; past grants don't imply
             | grants to future work under _any_ open source license.
        
               | notpushkin wrote:
               | Correct; however, without a CLA and assuming there are
               | outside contributors, relicensing the existing code would
               | be mildly painful, if not downright impossible.
        
               | woodruffw wrote:
               | You're saying that would be more painful in a viral
               | license setting, right? If so I agree, although I think
               | there's a pretty long track record of financially
               | incentivized companies being willing to take that pain.
                | MongoDB's transition away from the AGPL comes to mind.
               | 
               | But, to refocus on the case at hand: Astral's tools don't
               | require contributors to sign a CLA. I understand (and am
               | sympathetic) to the suspicion here, but the bigger
               | picture here is that Astral wants to build _services_ as
               | a product, rather than compromising the open source
                | nature of its _tools_. That's why the announcement tries
               | to cleanly differentiate between the two.
        
             | krupan wrote:
             | I think you are making a good point, but please don't use
              | the old Steve Ballmer FUD term, "viral." Copyleft is a
              | better term.
        
               | Muromec wrote:
               | The word "left" is now very charged too, maybe even more
               | than "viral".
        
               | HiPhish wrote:
               | Every word is charged now, so you might as well use it.
               | "Copyleft" is a fine pun on "copyright".
        
               | neutronicus wrote:
               | Careful, or you'll get the copyleftists calling you a
               | neo-libre-al.
        
               | ameliaquining wrote:
                | I don't think the connotation these days is particularly
               | negative, in the sense it's being used here. See, e.g.,
               | "viral video".
        
             | actinium226 wrote:
             | By viral, do you mean licenses like GPL that force those
             | who have modified the code to release their changes (if
             | they're distributing binaries that include those changes)?
             | 
             | Because FWIW CPython is not GPL. They have their own
             | license but do not require modifications to be made public.
        
           | zemo wrote:
           | > Basically, we're hoping to address this concern by building
           | a separate sustainable commercial product rather than
           | monetizing our open source tools.
           | 
           | jfrog artifactory suddenly very scared for its safety
        
             | mcdow wrote:
             | Only a matter of time before someone makes something better
             | than Artifactory. It's a low bar to hit imho.
        
           | zahlman wrote:
           | Will pyx describe a server _protocol_ that could be
           | implemented by others, or otherwise provide software that
           | others can use to host their own servers? (Or maybe even that
           | PyPI can use to improve its own offering?) That is, when
           | using  "paid, hosted services like pyx", is one paying for
           | the ability to use the pyx software in and of itself, or is
           | one simply paying for access to Astral's particular server
           | that runs it?
        
             | woodruffw wrote:
             | I might not be following: what would that protocol entail?
             | pyx uses the same PEP 503/691 interfaces as every other
             | Python index, but those interfaces would likely not be
             | immediately useful to PyPI itself (since it already has
             | them).
             | 
             | > or is one simply paying for access to Astral's particular
             | server that runs it?
             | 
             | pyx is currently a service being offered by Astral. So it's
             | not something you can currently self-host, if that's what
             | you mean.
        
               | zahlman wrote:
               | > pyx uses the same PEP 503/691 interfaces as every other
               | Python index
               | 
               | ... Then how can it make decisions about how to serve the
               | package request that PyPI can't? Is there not some
               | extension to the protocol so that uv can tell it more
               | about the client system?
        
               | woodruffw wrote:
               | The repository API allows server-driven content
               | negotiation[1], so pyx can service specialized requests
               | while also honoring the normal 503/691 ones.
               | 
                | [1]: https://packaging.python.org/en/latest/specifications/simple...
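                | 
                | A quick sketch of that negotiation with the stdlib,
                | against PyPI's public Simple API (any compliant
                | index behaves the same way):
                | 
                |     import json, urllib.request
                | 
                |     accept = "application/vnd.pypi.simple.v1+json"
                |     req = urllib.request.Request(
                |         "https://pypi.org/simple/numpy/",
                |         headers={"Accept": accept},
                |     )
                |     with urllib.request.urlopen(req) as resp:
                |         # The server picks the representation.
                |         print(resp.headers["Content-Type"])
                |         data = json.load(resp)
                |         print(len(data["files"]), "files")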
        
               | happymellon wrote:
               | Ah.
               | 
               | > honoring the normal 503/691 ones.
               | 
               | Embrace
               | 
               | > pyx can service specialized requests
               | 
               | Extend
               | 
               | ... ;)
        
               | woodruffw wrote:
               | Snark aside, you're missing the part where pyx doesn't
               | compete with PyPI. It's a private service.
        
               | zahlman wrote:
               | Interesting. I'll have to look into this further. (I've
               | bookmarked the entire thread for reference, especially
               | since I'm a bit attached to some of my other comments
               | here ;) )
        
           | abdullahkhalids wrote:
           | I believe that you are sincere and truthful in what you say.
           | 
           | Unfortunately, the integrity of employees is no guard against
           | the greed of investors.
           | 
           | Maybe next year investors change the CEO and entire
           | management and they start monetizing the open source tools.
           | There is no way of knowing. But history tells us that there
           | is a non-trivial chance of this happening.
        
         | ActorNightly wrote:
         | I agree. If any of the stuff was worthwhile to pursue, it would
         | be merged into pip.
        
           | woodruffw wrote:
           | This doesn't generalize: you could have said the same thing
           | about pip versus easy_install, but pip clearly has worthwhile
           | improvements over easy_install that were never merged back
           | into the latter.
        
           | zahlman wrote:
           | Pyx represents the server side, not the client side. The
           | analogue in the pre-existing Python world is PyPI.
           | 
           | Many ideas are being added to recent versions of pip that are
           | at least inspired by what uv has done -- and many things are
           | possible in uv specifically because of community-wide
           | standards development that also benefits pip. However, pip
           | has some really gnarly internal infrastructure that prevents
           | it from taking advantage of a lot of uv's good ideas (which
           | in turn are not all original). That has a lot to do with why
           | I'm making PAPER.
           | 
           | For just one example: uv can quickly install previously
           | installed packages by hard-linking a bunch of files from the
           | cache. For pip to follow suit, it would have to completely
           | redo its caching strategy from the ground up, because right
           | now its cache is designed to save only _download_ effort and
           | not anything else about the installation process. It
           | remembers entire wheels, but finding them in that cache
           | requires knowing the _URL from which they were downloaded_.
           | Because PyPI organizes the packages in its own database with
           | its own custom URL scheme, pip would have to reach out to
           | PyPI across the Internet in order to figure out where it put
           | its own downloads!
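            | 
            | The hard-link trick itself is simple (this is just the
            | general idea with made-up paths, not uv's actual code):
            | 
            |     import os, shutil, pathlib
            | 
            |     cache = pathlib.Path("cache/requests/2.32.3")
            |     site = pathlib.Path("venv/lib/python3.12/site-packages")
            |     for src in cache.rglob("*"):
            |         if src.is_dir():
            |             continue
            |         dst = site / src.relative_to(cache)
            |         dst.parent.mkdir(parents=True, exist_ok=True)
            |         try:
            |             os.link(src, dst)       # instant, no copy
            |         except OSError:
            |             shutil.copy2(src, dst)  # e.g. across devices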
        
             | notatallshaw wrote:
             | > However, pip has some really gnarly internal
             | infrastructure that prevents it from taking advantage of a
             | lot of uv's good ideas (which in turn are not all
             | original).
             | 
             | FWIW, as a pip maintainer, I don't strongly agree with this
              | statement; I think if pip had the same full-time employee
              | resources that uv has enjoyed over the last year, a lot of
              | these issues could be solved.
             | 
             | I'm not saying here that pip doesn't have some gnarly
             | internal details, just that the bigger thing holding it
             | back is the lack of maintainer resources.
             | 
             | > For just one example: uv can quickly install previously
             | installed packages by hard-linking a bunch of files from
             | the cache. For pip to follow suit, it would have to
             | completely redo its caching strategy from the ground up,
             | because right now its cache is designed to save only
             | download effort and not anything else about the
             | installation process.
             | 
             | I actually think this isn't a great example, evidenced by
             | the lack of a download or wheel command from uv due to
             | those features not aligning with uv's caching strategy.
             | 
              | That said, I do think there are other good examples to your
              | point, like uv's ability to prefetch package metadata; I
              | don't think we're going to be able to implement that in pip
              | any time soon, as it would probably require a complete
              | overhaul of the resolver.
        
           | lijok wrote:
           | It's not that complex - just try it
        
           | nilamo wrote:
           | Pip is broken and has been for years, they're uninterested in
           | fixing the search. Or even removing the search or replacing
           | it with a message/link to the package index.
           | 
           | imo, if pip's preference is to ship broken functionality,
           | then what is/is not shipped with pip is not meaningful.
        
             | woodruffw wrote:
             | This is not a charitable interpretation. The more
             | charitable read is that fixing search is non-trivial and
             | has interlocking considerations that go beyond what pip's
             | volunteer maintainers reasonably want to or _can_ pick up.
             | 
             | (And for the record: it isn't their fault _at all_. `pip
             | search` doesn 't work because PyPI removed the search API.
             | PyPI removed that API for _very good_ reasons[1].)
             | 
             | [1]: https://github.com/pypa/pip/issues/5216
        
               | nilamo wrote:
               | That was 7 years ago. If it's not coming back, the CLI
               | should make that clear, instead of giving a temporary
               | "cannot connect" message that implies it could work, if
               | you wait a minute and try again.
        
               | woodruffw wrote:
               | It was three years ago; 2018 is when they considered
               | removing the command, not when the search API was
               | actually removed from PyPI.
               | 
               | And this is part of the interlocking considerations I
               | mentioned: there _are_ private indices that supply the
                | XML-RPC API, and breaking them doesn't seem
               | justifiable[1].
               | 
               | Edit: fixed the link.
               | 
                | [1]: https://github.com/pypa/pip/issues/5216#issuecomment-1235329...
        
               | nilamo wrote:
               | Does that seem like a real solution to you? That it's ok
               | to represent a never-functional operation as one that
                | might maybe work? ...because it could work if you jump
               | through a bunch of undocumented hoops?
               | 
               | It's so wild to me that so many people are apparently
               | against making a user-friendly update. The whole thing
                | seems very against PEP 8 (it's surprising, complicated,
                | non-specific, etc.)
        
               | woodruffw wrote:
               | I don't know what to tell you; I just gave an example of
               | it being functional for a subset of users, who don't
               | deserve to be broken just because it's non-functional on
               | PyPI.
               | 
               | Nobody wants anything to be user-unfriendly. You're
               | taking a very small view into Python packaging and
               | extending it to motives, when _resources_ (not even
               | financial ones) are the primary challenge.
        
         | mnazzaro wrote:
         | This is a valid concern, but astral just has an amazing track
         | record.
         | 
         | I was surprised to see the community here on HN responding so
          | cautiously. Been developing in Python for about a decade now --
          | whenever Astral does something I get excited!
        
       | simonw wrote:
       | This is effectively what Charlie said they were going to build
       | last September when quizzed about their intended business model
       | on Mastodon:
       | https://hachyderm.io/@charliermarsh/113103564055291456
        
       | metalliqaz wrote:
       | > Waitlist
       | 
       | > Private registry
       | 
       | ouch.
        
         | yoavm wrote:
         | I actually think this is great. If Astral can figure out a way
         | to make money using a private registry (something that is used
          | mainly by companies), then they'll have the resources to keep
         | building their amazing open-source projects -- Ruff and uv.
         | That's a huge win for Python.
        
           | erikgahner wrote:
           | 100% agree. I am more than happy to see Astral taking steps
           | in this direction. People can continue to use uv, ruff, and
           | ty without having to pay anything, but companies that benefit
           | tremendously from open source initiatives can pay for a
           | private package registry and directly support the continued
           | development of said tools.
        
           | orra wrote:
           | In particular I think it's nice for uv and ruff to remain
           | open source, not open core. And as you say, companies always
           | need paid private registries, for their internal software. A
           | true win-win.
        
       | _verandaguy wrote:
       | _Soon: there are 14 competing Python packaging standards._
       | 
       | This is a joke, obviously. We've had more than 14 for years.
        
         | woodruffw wrote:
         | Python packaging has a lot of standards, but I would say most
         | of them (especially in the last decade) don't really compete
         | with each other. They lean more towards the "slow accretion of
         | generally considered useful features" style.
         | 
         | This itself is IMO a product of Python having a relatively
         | healthy consensus-driven standardization process for packaging,
         | rather than an authoritative one. If Python had more of an
         | authoritative approach, I don't think the language would have
         | done as well as it has.
         | 
         | (Source: I've written at least 5 PEPs.)
        
       | cpeterso wrote:
       | How do you pronounce "pyx"? Pikes, picks, pie-ex?
        
         | woodruffw wrote:
         | We've been pronouncing it pea-why-ecks, like uv (you-vee) and
         | ty (tee-why). But I wouldn't say that's permanent yet.
        
           | aaronvg wrote:
           | definitely will just say picks...
        
           | xavdid wrote:
           | that's fascinating - I've definitely been saying "you've" and
           | "tie". I assumed this was "picks"
        
           | r_lee wrote:
           | py-x like pie-ex?
        
         | Y_Y wrote:
         | I demand it be pronounced like the existing English word "pyx"
         | (aka "pyxis"), meaning a weird religious box:
         | 
         | https://en.wikipedia.org/wiki/Pyx
        
       | m4r71n wrote:
       | What does GPU-aware mean in terms of a registry? Will `uv`
       | inspect my local GPU spec and decide what the best set of
       | packages would be to pull from Pyx?
       | 
       | Since this is a private, paid-for registry aimed at corporate
       | clients, will there be an option to expose those registries
       | externally as a public instance, but paid for by the company?
       | That is, can I as a vendor pay for a Pyx registry for my own set
       | of packages, and then provide that registry as an entrypoint for
       | my customers?
        
         | charliermarsh wrote:
         | > Will `uv` inspect my local GPU spec and decide what the best
         | set of packages would be to pull from Pyx?
         | 
         | We actually support this basic idea today, even without pyx.
         | You can run (e.g.) `uv pip install --torch-backend=auto torch`
         | to automatically install a version of PyTorch based on your
         | machine's GPU from the PyTorch index.
         | 
         | pyx takes that idea and pushes it further. Instead of "just"
         | supporting PyTorch, the registry has a curated index for each
         | supported hardware accelerator, and we populate that index with
         | pre-built artifacts across a wide range of packages, versions,
         | Python versions, PyTorch versions, etc., all with consistent
         | and coherent metadata.
         | 
         | So there are two parts to it: (1) when you point to pyx, it
         | becomes much easier to get the right, pre-built, mutually
         | compatible versions of these things (and faster to install
         | them); and (2) the uv client can point you to the "right" pyx
         | index automatically (that part works regardless of whether
         | you're using pyx, it's just more limited).
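          | 
          | As a rough sketch of the client-side idea (the index URLs
          | here are made up; this is not how uv or pyx actually
          | decide):
          | 
          |     import shutil
          | 
          |     CUDA_INDEX = "https://example.invalid/cu12/simple/"
          |     CPU_INDEX = "https://example.invalid/cpu/simple/"
          | 
          |     # Crude detection: is an NVIDIA driver visible at all?
          |     has_gpu = shutil.which("nvidia-smi") is not None
          |     print(CUDA_INDEX if has_gpu else CPU_INDEX)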
         | 
         | > Since this is a private, paid-for registry aimed at corporate
         | clients, will there be an option to expose those registries
         | externally as a public instance, but paid for by the company?
         | That is, can I as a vendor pay for a Pyx registry for my own
         | set of packages, and then provide that registry as an
         | entrypoint for my customers?
         | 
         | We don't support this yet but it's come up a few times with
         | users. If you're interested in it concretely feel free to email
         | me (charlie@).
        
           | scarlehoff wrote:
           | Hi Charlie
           | 
           | what happens in a situation in which I might have access to a
           | login node, from which I can install packages, but then the
           | computing nodes don't have internet access. Can I define in
           | some hardware.toml the target system and install there even
           | if my local system is different?
           | 
            | To be more specific, I'd like to do `uv --dump-system
            | hardware.toml` in the computing node and then in the login
            | node (or my laptop for that matter) just do
            | `uv install my-package --target-system hardware.toml` and
            | get an environment I can just copy over.
        
       | dakiol wrote:
        | I'm brushing up on Python for a new job, and boy, what a ride.
       | Not because of the language itself but the tooling around
       | packages. I'm coming from Go and TS/JS and while these two
       | ecosystems have their own pros and cons, at least they are more
       | or less straightforward to get onboarded (there are 1 or 2 tools
       | you need to know about). In Python there are dozens of
       | tools/concepts related to packaging: pip, easy_install,
       | setuptools, setup.py, pypy, poetry, uv, venv, virtualenv, pipenv,
       | wheels, ... There's even an entire website dedicated to this
       | topic: https://packaging.python.org
       | 
       | Don't understand how a private company like Astral is leading
       | here. Why is that hard for the Python community to come up with a
       | single tool to rule them all? (I know https://xkcd.com/927/).
       | Like, you could even copy what Go or Node are doing, and make it
        | Python-aware; no shame in that. Instead we have these
        | who-knows-how-long-they-will-last tools every now and then.
       | 
       | They should remove the "There should be one-- and preferably only
       | one --obvious way to do it." from the Python Zen.
        
         | formerly_proven wrote:
         | > Don't understand how a private company like Astral is leading
         | here. Why is that hard for the Python community to come up with
         | a single tool to rule them all? (I know https://xkcd.com/927/).
         | Like, you could even copy what Go or Node are doing, and make
         | it Python-aware; no shame on that. Instead we have these who-
         | knows-how-long-they-will-last tools every now and then.
         | 
         | Python packaging is (largely) solving problems that Go and Node
         | packaging are not even trying to address.
        
           | Spivak wrote:
            | Specifically: simultaneous distribution of precompiled
            | binaries for many different OS and hardware configurations,
            | plus built-on-demand source distribution of non-Python
            | software to be used as dependencies, with as little (ideally
            | no) host setup by the user, all installable under a single
            | name/version everywhere.
           | 
           | imagine a world without: failed to build native gem extension
        
           | asa400 wrote:
           | As someone outside those communities, could you elaborate?
        
             | actinium226 wrote:
             | Not the person you're replying to, so I don't know if this
             | is what he had in mind, but with Python packages you can
             | distribute more than just Python. Some packages contain
             | C/C++/Fortran/Rust/others? source code that pip will try to
             | automatically build upon install. Of course you can't
             | expect everyone to have a dev environment set up, so
              | packages can also contain pre-compiled binaries for any
             | combination of windows/mac/linux + amd/arm + glibc/musl +
             | CPython/pypy (did I miss any?).
             | 
             | I don't know much about go, and I've only scratched the
             | surface with node, but as far as node goes I think it just
             | distributes JS? So that would be one answer to what Python
             | packaging is trying to solve that node isn't trying to
             | address.
        
               | zahlman wrote:
               | > any combination of windows/mac/linux + amd/arm +
               | glibc/musl + CPython/pypy (did I miss any?).
               | 
               | From a standards perspective, it is a combination of a
               | Python version/implementation, a "platform" and an "ABI".
               | (After all, the glibc/musl distinction doesn't make sense
               | on Windows.)
               | 
               | Aside from CPython/pypy, the system recognizes IronPython
               | (a C# implementation) and Jython (a Java implementation)
               | under the version "tag"; of course these implementations
               | may have their own independent versioning with only a
               | rough correspondence to CPython releases.
               | 
               | The ABI tag largely corresponds to the implementation and
               | version tag, but for example for CPython builds it also
               | indicates whether Python was built in debug or release
               | mode, and from 3.13 onward whether the GIL is enabled.
               | 
               | The platform tag covers Mac, Windows, several generic
               | glibc Linux standards (called "manylinux" and designed to
               | smooth over minor differences between distros), and now
               | also some generic musl Linux standards (called
               | "musllinux"). Basic CPU information (arm vs intel, 32- vs
               | 64-bit etc.) is also jammed in here.
               | 
                | Details are available at
                | https://packaging.python.org/en/latest/specifications/platfo...
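                | 
                | If you want to poke at this yourself, the third-
                | party "packaging" library exposes the tag machinery
                | (the wheel filename below is just an example):
                | 
                |     from packaging.tags import sys_tags
                |     from packaging.utils import parse_wheel_filename
                | 
                |     name = "demo-1.0-cp312-cp312-manylinux_2_17_x86_64.whl"
                |     _, _, _, tags = parse_wheel_filename(name)
                |     print([str(t) for t in tags])
                |     # Would this wheel install on *this* interpreter?
                |     print(any(t in set(sys_tags()) for t in tags))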
        
         | notatallshaw wrote:
         | > easy_install
         | 
         | I don't know what guides you're reading but I haven't touched
          | easy_install in at least a decade. Its successor, pip, had
         | effectively replaced all use cases for it by around 2010.
        
           | Thrymr wrote:
           | > I don't know what guides you're reading but I haven't
           | touched easy_install in at least a decade.
           | 
           | It is mentioned in the "Explanations and Discussions" section
           | [0] of the linked Python Packaging guide.
           | 
           | Old indeed, but can still be found at the top level of the
           | current docs.
           | 
            | [0] https://packaging.python.org/en/latest/#explanations-and-dis...
        
         | lenerdenator wrote:
         | > In Python there are dozens of tools/concepts related to
         | packaging: pip, easy_install, setuptools, setup.py, pypy,
         | poetry, uv, venv, virtualenv, pipenv, wheels,
         | 
         | Some of those are package tools, some are dependency managers,
         | some are runtime environments, some are package formats...
         | 
         | Some are obsolete at this point, and others by necessity cover
         | different portions of programming language technologies.
         | 
         | I guess what I'm saying is, for the average software engineer,
         | there's not _too_ many more choices in Python for programming
         | facilities than in Javascript.
        
           | zahlman wrote:
           | I don't know why you were downvoted. You are absolutely
           | correct.
        
         | WesolyKubeczek wrote:
         | > Why is that hard for the Python community to come up with a
         | single tool to rule them all?
         | 
         | Whatever I would say at this point about PyPA would be so
         | uncharitable that dang would descend on me with the holy hammer
         | of banishment, but you can get my drift. I just don't trust
         | them to come out with good tooling. The plethora they have
         | produced so far is quite telling.
         | 
         | That said, pip covers 99% of _my_ needs when I need to do
         | anything with Python. There are ecosystems that have it way
         | worse, so I count my blessings. But apparently, since Poetry
          | and uv exist, my 99% are not many other people's 99%.
         | 
         | If I wanted to package my Python stuff, though, I'm getting
         | confused. Is it now setup.py or pyproject.toml? Or maybe both?
         | What if I need to support an older Python version as seen in
         | some old-but-still-supported Linux distributions?
         | 
         | > They should remove the "There should be one-- and preferably
         | only one --obvious way to do it." from the Python Zen.
         | 
         | Granted, tooling is different from the language itself.
          | Although PyPA could benefit from having a BDFL for a decade.
        
           | zahlman wrote:
           | > If I wanted to package my Python stuff, though, I'm getting
           | confused. Is it now setup.py or pyproject.toml? Or maybe
           | both? What if I need to support an older Python version as
           | seen in some old-but-still-supported Linux distributions?
           | 
           | Your Python version is irrelevant, as long as your tools and
           | code both run under that version. The current ecosystem
           | standard is to move in lock-step with the Python versions
           | that the core Python team supports. If you want to offer
           | extended support, you should expect to require more know-how,
           | regardless. (I'm happy to receive emails about this kind of
           | thing; I use this username, on the Proton email service.)
           | 
           | Nowadays, you should really always use _at least_
           | pyproject.toml.
           | 
           | If your distribution will include code in non-Python
           | languages _and_ you choose to use Setuptools to build your
           | package, you will also need a setup.py. But your use of
           | setup.py will be limited to _just_ the part that explains how
            | to compile your non-Python code; don't use it to describe
           | project metadata, or to orchestrate testing, or to implement
           | your own project management commands, or any of the other
           | advanced stuff people used to do when Setuptools was the only
           | game in town.
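            | 
            | For that limited case the whole file can be as small as
            | this (module and file names are invented):
            | 
            |     # setup.py: only the compiled-extension part lives here;
            |     # all other metadata stays in pyproject.toml.
            |     from setuptools import Extension, setup
            | 
            |     setup(
            |         ext_modules=[
            |             Extension("mypkg._speed", ["src/speed.c"]),
            |         ]
            |     )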
           | 
           | In general, create pyproject.toml first, and then figure out
           | if you need anything else in addition. Keeping your metadata
           | in pyproject.toml is the sane, modern way, and if we could
           | just get everyone on board, tools like pip could be
           | considerably simpler. Please read
            | https://blog.ganssle.io/articles/2021/10/setup-py-deprecated...
            | for details about modern use of Setuptools.
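            | 
            | For reference, here's roughly what a minimal
            | pyproject.toml looks like, embedded as a string so the
            | sketch checks itself with the stdlib tomllib (Python
            | 3.11+); the project name and backend are just examples:
            | 
            |     import tomllib
            | 
            |     PYPROJECT = """
            |     [build-system]
            |     requires = ["flit_core>=3.9"]
            |     build-backend = "flit_core.buildapi"
            | 
            |     [project]
            |     name = "mypkg"
            |     version = "0.1.0"
            |     dependencies = ["requests>=2.31"]
            |     """
            |     meta = tomllib.loads(PYPROJECT)
            |     print(meta["project"]["name"], meta["project"]["version"])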
           | 
           | Regardless of your project, I strongly recommend considering
           | alternatives to Setuptools. It was never designed for its
           | current role and has been stuck maintaining tons of legacy
           | cruft. If your project is pure Python, Flit is my current
           | recommendation as long as you can live with its opinionated
           | choices (in particular, you must have a single top-level
           | package name in your distribution). For projects that need to
           | access a C compiler for a little bit, consider Hatch. If
           | you're making the next Numpy, keep in mind that they switched
           | over to Meson. (I also have thrown my hat in this ring,
           | although I really need to get back to that project...)
           | 
           | If you use any of those alternatives, you may have some tool-
           | specific configuration that you do in pyproject.toml, but you
           | may also have to include arbitrary code analogous to setup.py
           | to orchestrate the build process. There's only so far you can
           | get with a config file; real-world project builds get
           | ferociously complex.
        
         | beshrkayali wrote:
         | It's not an easy task, and when there's already lots of
         | established practices, habits, and opinions, it becomes even
         | more difficult to get around the various pain points. There's
         | been many attempts: pip (the standard) is slow, lacks
         | dependency resolution, and struggles with reproducible builds.
         | Conda is heavy, slow to solve environments, and mixes Python
         | with non-Python dependencies, which makes understanding some
         | setups very complicated. Poetry improves dependency management
         | but is sluggish and adds unnecessary complexity for simple
         | scripts/projects. Pipenv makes things simpler, but also has the
         | same issue of slow resolution and inconsistent lock files.
         | Those are the ones I've used over the years at least.
         | 
         | uv addressed these flaws with speed, solid dependency
         | resolution, and a simple interface that builds on what people
         | are already used to. It unifies virtual environment and package
         | management, supports reproducible builds, and integrates easily
         | with modern workflows.
        
         | devjab wrote:
         | I work with Python, Node and Go and I don't think any of them
         | have great package systems. Go has an amazing module isolation
         | system and boy do I wish hiding functions within a
         | module/package was as easy in Python as it is in Go. What saves
         | Go is the standard library which makes it possible to write
         | almost everything without needing external dependencies. You've
         | worked with JavaScript and I really don't see how Python is
         | different. I'd argue that Deno and JSR is the only "sane"
         | approach to packages and security, but it's hardly leading and
         | NPM is owned by Microsoft so it's not like you have a great
         | "open source" platform there either. On top of that you have
         | the "fun" parts of ESM vs CommonJS.
         | 
         | Anyway, if you're familiar with Node then I think you can view
         | pip and venv as the npm of Python. Things like Poetry are Yarn,
         | made as replacements because pip sort of sucks. UV on the other
         | hand is a drop-in replacement for pip and venv (and other
         | things) similar to how pnpm is basically npm. I can't answer
         | your question on why there isn't a "good" tool to rule them
         | all, but the single tool has been pip since 2014, and since UV
         | is a drop-in, it's very easy to use UV in development and pip
         | in production.
         | 
         | I think it's reasonable to worry about what happens when Astral
         | needs to make money for their investors, but that's the beauty
         | of UV compared to a lot of other Python tools. It's extremely
         | easy to replace because it's essentially just smarter pip. I do
         | hope Astral succeeds with their pyx, private registries and so
         | on by the way.
        
         | AlphaSite wrote:
         | I appreciate everything they've done but the group which
         | maintains Pip and the package index is categorically incapable
         | of shipping anything at a good velocity.
         | 
         | It's entirely volunteer based so I don't blame them, but the
         | reality is that it's holding back the ecosystem.
         | 
         | I suspect it's also a misalignment of interests. No one there
         | really invests in improving UX.
        
           | mixmastamyk wrote:
           | Indeed, they broke a few features in the last few years and
           | made the excuse "we can't support them, we're volunteers."
           | Well, how about stop breaking things that worked for a
           | decade? That would take less effort.
           | 
           | They had time to force "--break-system-packages" on us
           | though, something no one asked for.
        
             | zahlman wrote:
             | > how about stop breaking things that worked for a decade?
             | 
             | They aren't doing this.
             | 
             | > They had time to force "--break-system-packages" on us
             | though, something no one asked for.
             | 
             | The maintainers of several Linux distros asked for it very
             | explicitly, and cooperated to design the feature. The
             | rationale is extensively documented in the proposal
             | (https://peps.python.org/pep-0668/). This is especially
             | important for distros where the system package manager is
             | itself implemented in Python, since corrupting the system
             | Python environment could produce a state that is
             | effectively unrecoverable (at least without detailed
             | Python-specific know-how).
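              | 
              | The mechanism is tiny: distros drop a marker file next
              | to the standard library, and installers like pip then
              | refuse to touch that environment unless you override.
              | You can check your own interpreter like this:
              | 
              |     import pathlib, sysconfig
              | 
              |     marker = (pathlib.Path(sysconfig.get_path("stdlib"))
              |               / "EXTERNALLY-MANAGED")
              |     print(marker, marker.exists())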
        
           | zahlman wrote:
           | > the group which maintains Pip and the package index is
           | categorically incapable of shipping anything at a good
           | velocity.
           | 
           | > It's entirely volunteer based so I don't blame them
           | 
           | It's not just that they're volunteers; it's the legacy
           | codebase they're stuck with, and the use cases that people
           | will expect them to continue supporting.
           | 
           | > I suspect it's also a misalignment of interests. No one
           | there really invests in improving UX.
           | 
           | "Invest" is the operative word here. When I read discussions
           | in the community around tools like pip, a common theme is
           | that the developers don't consider themselves competent to
           | redesign the UX, and there is no money from anywhere to hire
           | someone who would be. The PSF operates on an annual budget on
           | the order of $4 million, and a _big_ chunk of that is taken
           | up by PyCon, supporting programs like PyLadies, generic
           | marketing efforts, etc. Meanwhile, total bandwidth use at
           | PyPI has crossed into the exabyte range (it was ~600
           | petabytes in 2023 and growing rapidly). They would be
            | completely screwed without Fastly's incredible in-kind
           | donation.
        
         | mixmastamyk wrote:
         | You don't need to know most of those things. Until last year I
         | used setup.py and pip exclusively for twenty years, with a venv
         | for each job at work. Wheels are simply prebuilt .zips. That's
         | about an hour of learning more or less.
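          | 
          | You can see the "prebuilt .zip" part for yourself; any
          | wheel will do (the filename here is made up):
          | 
          |     import zipfile
          | 
          |     with zipfile.ZipFile("demo-1.0-py3-none-any.whl") as whl:
          |         for name in whl.namelist():
          |             print(name)  # modules plus *.dist-info metadata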
         | 
         | Now we have pyproject.toml and uv to learn. This is another
         | hour or so of learning, but well worth it.
         | 
         | Astral is stepping up because no one else did. Guido never
         | cared about packaging and that's why it has been the wild west
         | until now.
        
         | zahlman wrote:
         | > Why is that hard for the Python community to come up with a
         | single tool to rule them all?
         | 
         | 0. A lot of those "tools and concepts" are actually completely
         | irrelevant or redundant. easy_install has for all practical
         | purposes been dead for many years. virtualenv was the original
         | third party project that formed the basis for the standard
         | library venv, which has been separately maintained for people
         | who want particular additional features; it doesn't count as a
         | separate concept. The setup.py file is a configuration file for
         | Setuptools that also happens to be Python code. You only need
         | to understand it if you use Setuptools, and the entire point is
         | that you can use other things now (specifically because
         | configuring metadata with Python code is a terrible idea that
         | we tolerated for far too long). Wheels are just the
         | distribution format and you don't need to know anything about
         | how they're structured as an end user or as a developer of
         | ordinary Python code -- only as someone who makes packaging
         | tools. And "pypy" is an alternate implementation of Python --
         | maybe you meant PyPI? But that's just the place that hosts your
         | packages; no relevant "concept" there.
         | 
         | Imagine if I wanted to make the same argument about JavaScript
         | and I said that it's complicated because you have to understand
         | ".tar.gz (I think, from previous discussion here? I can't even
         | find documentation for how the package is actually stored as a
         | package on disk), Node.js, NPM, TypeScript, www.npmjs.com,
         | package.json..." That's basically what you're doing here.
         | 
          | But even besides that, _you don't have to know about all the
         | competing alternatives_. If you know how to use pip, and your
         | only goal is to install packages, you can completely ignore all
         | the other tools that install packages (including poetry and
         | uv). You only have any reason to care about pipenv if you want
         | to use pip _and_ care about the specific things that pipenv
          | does _and_ haven't chosen a different way to address the
         | problem. Many pip users won't have to care about it.
         | 
         | 1. A _lot_ of people _actively do not want_ it that way. The
         | Unix philosophy actually does have some upsides, and there are
         | tons of Python users out there who have zero interest in
         | participating in an  "ecosystem" where they share their code
         | publicly even on GitHub, never mind PyPI -- so no matter what
         | you say should be the files that give project metadata or what
         | they should contain or how they should be formatted, you aren't
         | going to get any buy-in. But beyond that, different people have
         | different _needs_ and a tool that tries to make everyone happy
         | is going to require tons of irrelevant cruft for almost
         | everyone.
         | 
         | 2. Reverse compatibility. The Python world -- both the
         | packaging system and the language itself -- has been trying to
         | get people to do things in better, saner ways for many years
         | now; but people will scream bloody murder if their ancient
         | stuff breaks in any way, even when they are advised years in
         | advance of future plans to drop support. Keep in mind here that
         | Python is more than twice as old as Go.
         | 
         | 3. Things are simple for Go/JS/TS users because they normally
         | only have to worry about that one programming language. Python
         | packages (especially the best-known, "serious" ones used for
         | heavyweight tasks) _very commonly_ must interface with code
         | written in _many_ other programming languages (C and C++ are
         | very common, but you can also find Rust, Fortran and many more;
         | and Numpy must work with both C _and_ Fortran), and there are
         | many different ways to interface (and nothing that Python could
         | possibly do at a language level to prevent that): by using an
         | explicit FFI, by dlopen() etc. hooks, by shelling out to a
          | subprocess, and more (a small ctypes sketch follows this
          | list). And users expect that they can just
         | install the Python package and have all of that stuff just
         | work. Often that means that compiled-language code has to be
         | rebuilt locally; and the install tools are expected to be able
         | to download and set up a build system, build code in an
         | isolated environment, etc. etc. All of this is way beyond the
         | expectations placed on something like NPM.
         | 
         | 4. The competition is deliberate. Standards -- the clearest
         | example being the PEPs 517/518/621 that define the
         | pyproject.toml schema -- were created specifically to enable
         | both competition and interoperation. Uv is gaining market share
         | because a lot of people like its paradigms. Imagine if, when
         | people in the Python community first started thinking about the
          | problems and limitations of tools from the early days, they
          | had decided to come up with all the paradigms themselves.
          | Imagine if
         | they got them wrong, and then set it in stone for everyone
         | else. When you imagine this, keep in mind that projects like
         | pip and setuptools date to the 2000s. People were simply not
         | thinking about open-source ecosystems in the same way in that
         | era.
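          | 
          | As promised above, the "explicit FFI" route from point 3,
          | sketched with ctypes and the C math library (the library
          | name is platform-dependent; this assumes a typical Linux
          | or macOS):
          | 
          |     import ctypes, ctypes.util
          | 
          |     libm = ctypes.CDLL(ctypes.util.find_library("m"))
          |     libm.cos.restype = ctypes.c_double
          |     libm.cos.argtypes = [ctypes.c_double]
          |     print(libm.cos(0.0))  # 1.0, computed by C, not Python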
         | 
         | > They should remove the "There should be one-- and preferably
         | only one --obvious way to do it." from the Python Zen.
         | 
         | First, define "it". The task is orders of magnitude greater
         | than you might naively imagine. I know, because I've been an
         | active participant in the surrounding discussion for a couple
         | of years, aside from working on developing my own tooling.
         | 
         | Second, see https://news.ycombinator.com/item?id=44763692 . It
         | doesn't mean what you appear to think it does.
        
         | blibble wrote:
         | > Why is that hard for the Python community to come up with a
         | single tool to rule them all? (I know https://xkcd.com/927/).
         | 
         | because they're obsessed with fixing non-issues (switching out
         | pgp signing for something you can only get from microsoft,
         | sorry, "trusted providers", arguing about mission statements,
         | etc.)
         | 
         | whilst ignoring the multiple elephants in the room
         | (namespacing, crap slow packaging tool that has to download
         | everything because the index sucks, mix of 4 badly documented
         | tools to build anything, index that operates on filenames,
         | etc.)
        
       | m_kos wrote:
       | > Why is it so hard to install PyTorch, or CUDA, or libraries
       | like FlashAttention or DeepSpeed that build against PyTorch and
       | CUDA?
       | 
       | This is so true! On Windows (and WSL) it is also exacerbated by
       | some packages requiring the use of compilers bundled with
       | outdated Visual Studio versions, some of which are only available
       | by manually crafting download paths. I can't wait for a better
       | dev experience.
        
         | giancarlostoro wrote:
         | Stuff like that drove me fully away from Ruby (due to Rails),
         | which is a shame. I see videos of people chugging along with
         | Ruby and loving it, and it looks like a fun language, but when
         | the only way I could get a dev environment set up for Rails
         | was using DigitalOcean droplets, I lost all interest. It would
         | always fail at compiling something for Rails. I would have
         | loved to partake in the Rails hype back in 2012, but over the
         | years the install/setup process was always a nightmare.
         | 
         | I went with Python because I never had this issue. Now with any
         | AI / CUDA stuff it's a bit of a nightmare, to the point where you
         | use someone's setup shell script instead of trying to use pip
         | at all.
        
           | nurettin wrote:
           | Have you tried JRuby? It might be a bit too large for your
           | droplet, but it has the java versions of most gems and you
           | can produce cross-platform jars using warbler.
        
             | nickserv wrote:
             | The speed of Ruby with the memory management of Java,
             | what's not to love?
             | 
             | Also, now you have two problems.
        
           | awesome_dude wrote:
           | Let's be honest here - whilst some experiences are
           | better/worse than others, there doesn't seem to be a
           | dependency management system that isn't (at least half)
           | broken.
           | 
           | I use Go a lot, the journey has been
           | 
           | - No dependency management
           | 
           | - Glide
           | 
           | - Dep
           | 
           | - I forget the name of the precursor - I just remembered, VGo
           | 
           | - Modules
           | 
           | We still have proxying, vendoring, versioning problems
           | 
           | Python: VirtualEnv
           | 
           | Rust: Cargo
           | 
           | Java: Maven and Gradle
           | 
           | Ruby: Gems
           | 
           | Even OS dependency management is painful - yum, apt (which
           | was a major positive when I switched to Debian based
           | systems), pkg (BSD people), homebrew (semi-official?)
           | 
           | Dependency management in the wild is a major headache. Go
           | (which I only mention because I am most familiar with it)
           | did away with some compilation dependency issues by shipping
           | binaries with no dependencies (meaning that it didn't matter
           | which version of Linux you built your binary on, it will run
           | on any Linux of the same arch - none of that "wrong libc"
           | 'fun'), but you still have issues when two different people
           | build the same binary and need extra dependency management
           | (vendoring brings with it caching problems - is the version
           | in the cache up to date, will updating one version of one
           | dependency break everything - what fun)
        
           | ilvez wrote:
           | Do I get it right that this issue is specific to Windows?
           | I've never heard of the issues you describe while working
           | with Linux. I've seen people struggle a bit on macOS due to
           | Homebrew having different versions of some library or other,
           | mostly when self-compiling Ruby.
        
             | threeducks wrote:
             | There certainly are issues on Linux as well. The Detectron2
             | library alone has several hundred issues related to
             | incorrect versions of something:
             | https://github.com/facebookresearch/detectron2/issues
             | 
             | The mmdetection library (https://github.com/open-
             | mmlab/mmdetection/issues) also has hundreds of version-
             | related issues. Admittedly, that library has not seen any
             | updates for over a year now, but it is sad that things just
             | break and become basically unusable on modern Linux
             | operating systems because NVIDIA can't stop breaking
             | backwards and forwards compatibility for what is
             | essentially just fancy matrix multiplication.
        
           | poly2it wrote:
           | Have you tried Nix?
           | 
           | https://nixos.org
        
           | viraptor wrote:
           | I would recommend learning a little bit of C compilation and
           | build systems. Ruby/Rails is about as polished as you could
           | get for a very popular project. Maybe libyaml will be a
           | problem once in a while if you're compiling Ruby from
           | scratch, but otherwise this normally works without a hassle.
           | And those skills will apply everywhere else. As long as we
           | have C libraries, this is about as good as it gets,
           | regardless of the language/runtime.
        
           | chao- wrote:
           | I'm surprised to hear that. Ruby was the first language in my
           | life/career where I felt good about the dependency management
           | and packaging solution. Even when I was a novice, I don't
           | remember running into any problems that weren't obviously my
           | fault (for example, installing the Ruby library for
           | PostgreSQL before I had installed the Postgres libraries on
           | the OS).
           | 
           | Meanwhile, I didn't feel like Python had reached the bare
           | minimum for package management until Pipenv came on the
           | scene. It wasn't until Poetry (in 2019? 2020?) that I felt
           | like the ecosystem had reached what Ruby had back in 2010 or
           | 2011 when bundler had become mostly stable.
        
             | thedanbob wrote:
             | Bundler has always been the best package manager of any
             | language that I've used, but dealing with gem extensions
             | can still be a pain. I've had lots of fun bugs where an
             | extension worked in dev but not prod because of differences
             | in library versions. I ended up creating a docker image for
             | development that matched our production environment and
             | that pretty much solved those problems.
        
               | awesome_dude wrote:
               | > I ended up creating a docker image for development that
               | matched our production environment and that pretty much
               | solved those problems.
               | 
               | docker has massively improved things - but it _still_ has
               | edge cases (you have to be really pushing it hard to find
               | them though)
        
           | selimnairb wrote:
           | Have you tried conda? Since the integration of mamba its
           | solver is fast and the breadth of packages is impressive.
           | Also, if you have to support Windows and Python with native
           | extensions, conda is a godsend.
        
         | morkalork wrote:
         | This was basically _the reason_ to use anaconda back in the
         | day.
        
           | setopt wrote:
           | In my experience, Anaconda (including Miniconda, Micromamba,
           | IntelPython, et al.) is still the default choice in
           | scientific computing and machine learning.
        
             | NeutralForest wrote:
             | It's useful because it also packages a lot of other deps
             | like CUDA drivers, DB drivers, git, openssl, etc. When you
             | don't have admin rights, it's really handy to be able to
             | install them and there's no other equivalent in the Python
             | world. That being said, the fact conda (and derivatives) do
             | _not_ follow any of the PEPs about package management is
             | driving me insane. The ergonomics are bad as well with
             | defaults like auto activation of the base env and bad
             | dependency solver for the longest time (fixed now), weird
             | linking of shared libs, etc.
        
           | IHLayman wrote:
           | Anaconda was a good idea until it would break apt on Ubuntu
           | and make my job that much harder. That became the reason
           | _not_ to use Anaconda in my book.
           | 
           | venv made these problems start to disappear, and now uv and
           | Nix have closed the loop for me.
        
             | StableAlkyne wrote:
             | How did it manage to do that?
             | 
             | Not saying it didn't, I've just never run into that after a
             | decade of using the thing on various Nixes
        
         | bytehumi wrote:
         | This is the right direction for Python packaging, especially
         | for GPU-heavy workflows. Two concrete things I'm excited about:
         | 1) curated, compatibility-tested indices per accelerator
         | (CUDA/ROCm/CPU) so teams stop bikeshedding over torch/cu*
         | matrixes, and 2) making metadata queryable so clients can
         | resolve up front and install in parallel. If pyx can reduce the
         | 'pip trial-and-error' loop for ML by shipping narrower,
         | hardware-targeted artifacts (e.g., SM/arch-specific builds) and
         | predictable hashes, that alone saves hours per environment.
         | Also +1 to keeping tools OSS and monetizing the hosted service
         | --clear separation builds trust. Curious: will pyx expose
         | dependency graph and reverse-dependency endpoints (e.g., "what
         | breaks if X-Y?") and SBOM/signing attestation for supply-chain
         | checks?
        
         | miohtama wrote:
         | In the past, part of the definition of an operating system was
         | that it ships with a compiler.
        
       | rob wrote:
       | Is there a big enough commercial market for private Python
       | package registries to support an entire company and its staff?
       | Looks like they're hiring for $250k engineers, starting a
       | $26k/year OSS fund, etc. Expenses seem a bit high if this is
       | their first project unless they plan on being acquired?
        
         | eldenring wrote:
         | It's interesting because the value is definitely there. Every
         | single Python developer you meet (many of whom are highly paid)
         | has a story about wasting a bunch of time on these things. The
         | question is how much of this value can Astral capture.
         | 
         | I think based on the quality of their work, there's also an
         | important component which is trust. I'd trust and pay for a
         | product from them much more readily than an open source
         | solution with flaky maintainers.
        
           | est31 wrote:
           | Yeah they certainly generate a lot of value by providing
           | excellent productivity tooling. The question is how they
           | capture some of that value, which is notoriously hard with an
           | OSS license. A non-OSS license, on the other hand, creates
           | the Adobe trap, where companies deploy more and more
           | aggressive monetization strategies, making life worse and
           | worse for users
           | of the software.
        
         | dimatura wrote:
         | Just one data point, but if it's as nice to use as their open
         | source tools and not outrageously expensive, I'd be a customer.
         | Current offerings for private python package registries are
         | kind of meh. Always wondered why github doesn't offer this.
        
       | fph wrote:
       | I lost track of how many different ways to install a Python
       | library there are at the moment.
        
         | nodesocket wrote:
          | Much better than the Node situation a handful of years back. Everybody
         | used NPM, everybody switched to Yarn, everybody switched back
         | to NPM.
        
           | no_wizard wrote:
            | That's because npm finally decided to adopt enough features as
           | time went on that it could be used in place of yarn and
           | eventually, if they adopt enough of the features of pnpm, it
           | will replace that too.
           | 
           | Though speaking as a long time developer in the ecosystem,
           | switching between npm, yarn, and pnpm is fairly trivial in my
           | experience. Especially after node_gyp went away
        
       | JackYoustra wrote:
       | The real thing that I hope someone is able to solve is
       | downloading such huge amounts of unnecessary code. As I
       | understand, the bulk of the torch binary is just a huge nvfatbin
       | compiled for every SM under the sun when you usually just want it
       | to run on whatever accelerators you have on hand. Even just
       | making narrow builds of like `pytorch-sm120a` (with stuff like
       | cuBLAS thin binaries paired with it too) as part of a handy uv
       | extra or something like that would make it much quicker and
       | easier.
        
         | mikepurvis wrote:
         | Another piece is that PyPI has no index -- it's just a giant
         | list of URLs [1] where any required metadata (e.g., the OS,
         | Python version, etc.) is encoded in the filename. That makes it
         | trivial to throw behind a CDN since it's all super static, but
         | it has some important limitations:
         | 
         | - there's no way to do an installation dry run without pre-
         | downloading all the packages (to get their dep info)
         | 
         | - there's no way to get hashes of the archives
         | 
         | - there's no way to do things like reverse-search (show me
         | everything that depends on x)
         | 
         | I'm assuming that a big part of pyx is introducing a
         | dynamically served (or maybe even queryable) endpoint that can
         | return package metadata and let uv plan ahead better, identify
         | problems and conflicts before they happen, install packages in
         | parallel, etc.
         | 
         | Astral has an _excellent_ track record on the engineering and
         | design side, so I expect that whatever they do in this space
         | will basically make sense, it will eventually be codified in a
         | PEP, and PyPI will implement the same endpoint so that other
         | tools like pip and poetry can adopt it.
         | 
         | [1]: Top-level: https://pypi.org/simple/
         | Individual package: https://pypi.org/simple/pyyaml/
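         | 
         | To make the "giant list of URLs" point concrete, here is a
         | rough standard-library-only sketch of my own (not pip, uv, or
         | pyx internals) that fetches the simple page for one project
         | (the PyYAML one linked in [1]) and prints the filenames its
         | links encode:
         | 
         |     # List the file links PyPI's "simple" page exposes for
         |     # one project; all the metadata lives in the filenames.
         |     import urllib.request
         |     from html.parser import HTMLParser
         | 
         |     class LinkCollector(HTMLParser):
         |         def __init__(self):
         |             super().__init__()
         |             self.hrefs = []
         |         def handle_starttag(self, tag, attrs):
         |             if tag == "a":
         |                 self.hrefs += [v for k, v in attrs if k == "href"]
         | 
         |     with urllib.request.urlopen("https://pypi.org/simple/pyyaml/") as resp:
         |         parser = LinkCollector()
         |         parser.feed(resp.read().decode("utf-8"))
         | 
         |     for href in parser.hrefs[:5]:
         |         # Drop any #sha256=... fragment; keep just the filename.
         |         print(href.split("#")[0].rsplit("/", 1)[-1])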
        
           | zahlman wrote:
           | Your information is out of date.
           | 
           | > there's no way to do an installation dry run without pre-
           | downloading all the packages (to get their dep info)
           | 
           | Not true for wheels; PyPI implements
           | https://peps.python.org/pep-0658/ here. You can pre-download
           | just the dependency info instead.
           | 
           | For sdists, this is impossible until we can drop support for
           | a bunch of older packages that don't follow modern standards
           | (which is to say, including the actual "built" metadata as a
           | PKG-INFO file, and having that file include static data for
           | at least name, version and dependencies). I'm told there are
           | real-world projects out there for which this is currently
           | impossible, because the dependencies... depend on things that
           | can't be known without inspecting the end user's environment.
           | At any rate, this isn't a PyPI problem.
           | 
           | > there's no way to get hashes of the archives
           | 
           | The hashes are provided as URL fragments on the file links,
           | as described in https://peps.python.org/pep-0503/. Per PEP
           | 658, the hash
           | for the corresponding metadata files is provided in the data-
           | dist-info-metadata (and data-core-metadata) attributes of the
           | links.
           | 
           | But yes, there is no reverse-search support.
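           | 
           | As a rough illustration of the PEP 658/691 flow (my own
           | sketch, standard library only; key names per the PEPs, with
           | PEP 714's renamed key as a fallback), this grabs just a
           | wheel's METADATA file from PyPI without downloading the
           | wheel itself:
           | 
           |     import json
           |     import urllib.request
           | 
           |     # PEP 691: ask the simple index for JSON instead of HTML.
           |     req = urllib.request.Request(
           |         "https://pypi.org/simple/requests/",
           |         headers={"Accept": "application/vnd.pypi.simple.v1+json"},
           |     )
           |     with urllib.request.urlopen(req) as resp:
           |         index = json.load(resp)
           | 
           |     # Find a wheel that advertises separately fetchable metadata
           |     # (PEP 714 "core-metadata", older PEP 691 "dist-info-metadata").
           |     wheel = next(
           |         f for f in index["files"]
           |         if f["filename"].endswith(".whl")
           |         and (f.get("core-metadata") or f.get("dist-info-metadata"))
           |     )
           | 
           |     # PEP 658: the metadata lives at the wheel URL + ".metadata".
           |     with urllib.request.urlopen(wheel["url"] + ".metadata") as resp:
           |         metadata = resp.read().decode("utf-8")
           | 
           |     for line in metadata.splitlines():
           |         if line.startswith(("Name:", "Version:", "Requires-Dist:")):
           |             print(line)
           | 
           | A resolver can do this for every candidate wheel and plan the
           | whole install before downloading a single archive.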
        
       | forrestthewoods wrote:
       | Neat. uv is spectacular.
       | 
       | But I don't get it. _How_ does it work? Why is it able to solve
       | the Python runtime dependency problem? I thought uv had kinda
       | already solved that? Why is a new thingy majig needed?
        
         | simonw wrote:
         | This is a server hosted SaaS product that acts as a private
         | registry for your company's Python packages.
         | 
         | uv becomes a client that can install those packages from your
         | private registry.
         | 
         | I imagine pip will be able to install them from a PYX registry
         | too.
        
         | zahlman wrote:
         | > Why is it able to solve the Python runtime dependency
         | problem? I thought uv had kinda already solved that?
         | 
         | The dependencies in question are compiled C code that Python
         | interfaces with. Handling dependencies for a graph of packages
         | that are all implemented in pure Python is trivial.
         | 
         | C never really solved all the ABI issues and especially for GPU
         | stuff you end up having to link against very specific details
         | of the local architecture. Not all of these can be adequately
         | expressed in the current Python package metadata system.
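         | 
         | As a small illustration of what the current metadata system
         | does express -- interpreter, ABI, and platform tags, but
         | nothing about CUDA versions or GPU architectures -- here is a
         | sketch of mine using the third-party "packaging" library:
         | 
         |     # Print the wheel tags the current interpreter would
         |     # accept, e.g. cp312-cp312-manylinux_2_28_x86_64.
         |     # Requires: pip install packaging
         |     from packaging.tags import sys_tags
         | 
         |     for tag in list(sys_tags())[:3]:
         |         print(tag)
         | 
         | Nothing in those tags says which CUDA toolkit or GPU
         | architecture a binary was built against, which is exactly the
         | gap projects like PyTorch run into.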
         | 
         | Aside from that, a lot of people would like to have packages
         | that use pre-installed dependencies that came with the system,
         | but the package metadata isn't designed to express those
         | dependencies, and you're also on your own for actually figuring
         | out where they are at runtime, even if you take it on blind
         | faith that the user separately installed them.
         | 
         | This is not really my wheelhouse, though; you'll find a much
         | better write-up at https://pypackaging-native.github.io/ .
        
           | forrestthewoods wrote:
           | > especially for GPU stuff you end up having to link against
           | very specific details of the local architecture.
           | 
           | Hrm. This doesn't sound right to me. Any project should
            | target a particular version of CUDA and then the runtime
           | machine simply needs to have that version available. Right?
           | 
           | > a lot of people would like to have packages that use pre-
           | installed dependencies that came with the system
           | 
           | Those people are wrong. Everything these days requires Docker
           | because it's the only way to deploy software that can
            | reliably not crash on startup. (This is almost entirely a
           | Linux self induced problem)
        
       | ossusermivami wrote:
       | I wonder if Nix has been considered
        
       | lardbgard wrote:
       | You should use bluedwarf.top, it's better than Python.
        
       | nilslindemann wrote:
       | I do not trust Astral.
       | 
       | Much ad language.
       | 
       | They do not explain what an installation of their software does
       | to my system.
       | 
       | They use the word "platform".
        
         | wtallis wrote:
         | It's hosted SaaS; you _don't_ install it on your system.
        
       | FergusArgyll wrote:
       | Astral is the coolest startup
        
       | tomwphillips wrote:
       | >Modern
       | 
       | I'll pass. I'd rather have the battle-tested old thing, thanks.
        
         | LtWorf wrote:
         | I've been limiting myself to whatever is available on Debian,
         | and it's been fine for me for several years.
         | 
         | I don't understand why people who don't do weird AI stuff would
         | use any of that instead of sticking to distribution packages
         | and having the occasional 1 or 2 external modules that aren't
         | packaged.
        
           | tomwphillips wrote:
           | Indeed, to expand on my remark: I wrote Python in academia
           | for ~6 years and then professionally for nearly a decade in
           | data science, data engineering and backend web apps.
           | Virtualenv was fine. Pipenv had a nicer CLI and
           | easier-to-use dependency pinning. But fundamentally all this
           | stuff worked
           | fine.
        
           | zahlman wrote:
           | Because making external modules cooperate with the system
           | environment is awkward at best (and explicitly safeguarded
           | against since 3.11, since it can cause serious problems
           | otherwise even with "user" installs), and installing the
           | distro's packages in a separate environment is not supported
           | as far as I can tell. And because the system environment is
           | often deliberately crippled; it may not even include the
           | entire standard library.
        
       | Animats wrote:
       | Groan. Another one.
       | 
       | The compiled languages now have better "packaging" than the
       | interpreted ones. "go build" and "cargo build" (for Rust), which
       | do real work, are easier to use than the packaging systems for
       | Python and Javascript.
       | 
       | We've come a long way in the compiled world since "make depend;
       | make".
        
         | TOGoS wrote:
         | I'm a fan of Deno's
         | 
         |     deno run http://uri.of.the/dang.program.ts
        
       | runningmike wrote:
       | All Python packaging challenges are solved. The lesson learned
       | is that there is not a single solution for all problems.
       | Getting more strings attached to VC-funded companies and
       | leaning on their infrastructure is a high risk for any FOSS
       | community.
        
         | tempest_ wrote:
         | I share your concern but I have saved so much time with uv
         | already that I figure I'll ride it till the VC enshittification
         | kills the host.
         | 
         | Hopefully at that point the community is centralized enough to
         | move in one direction.
        
       | rtpg wrote:
       | I really want to pay someone money to run package repo mirrors
       | for me, but my problems have been more with npm than with PyPI.
       | Astral, if you're listening.... maybe tackle JS packaging too?
        
       | kristoff_it wrote:
       | Is this going to solve the combinatorial explosion of pre-
       | building native dependencies for every possible target?
       | 
       | Python should get rid of its training wheels :^)
       | 
       | https://kristoff.it/blog/python-training-wheels/
        
         | zahlman wrote:
         | > Emphasis mine. It would indeed be hard to survive without
         | that kind of support from a corporation. A user on HN estimated
         | the yearly cost of this traffic at around 12 million USD/year
         | (according to AWS Cloudfront rates), more than four times the
         | full operating budget of the Python Software Foundation as of
         | 2024.
         | 
         | (As the user in question: three times.)
         | 
         | > Leverage progress in the systems programming ecosystem to
         | create repeatable builds. Turn prebuilt binaries from "sources"
         | into cacheable artifacts that can be deleted and reconstructed
         | at will. Institute a way of creating secondary caches that can
         | start shouldering some of the workload.
         | 
         | This doesn't avoid the need for the wheels to exist and be
         | publicly available. People running CI systems should figure out
         | local caching that actually works, sure. But if you delete that
         | cacheable artifact on the public PyPI website for something
         | like, I don't know, numpy-2.3.2-cp312-cp312-win_arm64.whl,
         | you're going to be re-creating it (and having it downloaded
         | again) constantly. Windows users are just not going to be able
         | to build that locally.
         | 
         | And you know, space usage isn't the problem here -- we're
         | talking about a few hard drives' worth of space. The number of
         | downloads is the problem. Outside of CI, I guess that's mostly
         | driven by end users defaulting to the latest version of
         | everything, every time they make a new virtual environment,
         | rather than using whatever's in their package installer's
         | cache. I do know that uv makes the latter a lot easier.
        
       | pshirshov wrote:
       | Again! ez_setup, setuptools, conda, poetry, uv, now this.
        
       | jacques_chester wrote:
       | Interesting watching this part of the landscape heating up. For
       | repos you've got stalwarts like Artifactory and Nexus, with
       | upstart Cloudsmith. For libraries you've got the OG ActiveState,
       | Chainguard Libraries and, until someone is distracted by a shiny
       | next week, Google Assured Open Source.
       | 
       | Sounds like Pyx is trying to do a bit of both.
       | 
       | Disclosure: I have interacted a bunch with folks from all of
       | these things. Never worked for or been paid by, though.
        
       | hommes-r wrote:
       | The only thing that is unclear to me is to what extent this
       | setup depends on the package publisher. PyPI might be terrible,
       | but at least it just works when you want to publish; the fact
       | that it leads to more complexity for the ones that are looking
       | to use this piece of free software is not the maintainer's
       | problem.
       | 
       | Maybe they are only targeting dev tooling companies as a way to
       | simplify how they distribute. Especially in the accelerated
       | compute era.
        
       | lvl155 wrote:
       | I don't know how I feel about one company dominating this space.
       | I love what they do but what happens 5 years down the road?
        
       | no_wizard wrote:
       | Good on you guys!
       | 
       | I wanted to start a business exactly like this years ago, when I
       | actually worked in Python. I ended up not doing so, because at
       | the time (circa 2014-2015) I was told it would never take off, no
       | way to get funding.
       | 
       | I'm glad you're able to do what ultimately I was not!
        
       | Aeolun wrote:
       | Yay, _another_, probably incompatible, Python package manager
       | has arrived.
        
       | NeutralForest wrote:
       | Cool idea! I think I could benefit from this at my job if they're
       | able to eat Anaconda's lunch and provide secure, self-hosted
       | artifacts.
        
       ___________________________________________________________________
       (page generated 2025-08-13 23:00 UTC)