
Uv: Python packaging in Rust

endgame
29 replies
19h41m

How does a language ecosystem that bakes "there should be one-- and preferably only one --obvious way to do it" into the interpreter* as a statement of values end up with such a convoluted packaging story?

* Try running `python -c 'import this' | sed -n 15p`
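For the record, line 15 of that output is exactly the aphorism quoted above:

    $ python -c 'import this' | sed -n 15p
    There should be one-- and preferably only one --obvious way to do it.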

yen223
18 replies
19h7m

Because packaging is a very complex problem, and it's rare that any one packaging solution can get everything right in the first try.

You will notice that every package management solution from all your favourite languages will have its own set of tradeoffs.

lucideer
7 replies
18h59m

This is absolutely true, but I haven't seen any language ecosystems that have gotten things wrong as often as Python.

And quite a few where the tradeoffs are minor enough to be well worth some sanity & consistency.

mixmastamyk
2 replies
18h57m

It's because no-one is in charge anymore, and Guido never cared about packaging anyway.

linsomniac
1 replies
17h37m

If you think Guido never cared about packaging, try calling it "The Cheese Shop" within earshot of him. :-)

mixmastamyk
0 replies
11h28m

Wasn’t any better back then. Not much of a cheese shop, is it?

yen223
1 replies
9h36m

You haven't used Swift before, I see.

lucideer
0 replies
4h34m

The recent CocoaPods -> SPM migration is almost as painful as py2->3, but it seems to be a clearly understood direction with an end in sight.

SPM's manifests being non-declarative is also super problematic for 3P tooling, but again - it's one well-defined target. Decisions are clear, if a little cumbersome. There's nowhere near the mess of indecision and duplication you see in Python projects.

rtpg
1 replies
16h39m

C++. I think Java is pretty close in nastiness but I haven't used it in a while.

lucideer
0 replies
4h37m

Java is Maven or Gradle mostly. Neither is perfectly ideal, but there are clear choices by subcommunities (e.g. Android => Gradle) & churn is low. Not at all an awful situation.

C++ is mostly unmanaged in practice, which is obviously bad (though Python is the only "modern" language where I regularly see published projects without so much as a requirements.txt, so it's very comparable). However, looking at the actual options for C++, Conan seems pretty clear-cut & not awful in itself.

So... not as bad as Python by any measure imo.

lolinder
7 replies
17h20m

Packaging is hard but it's not hard enough to wholly explain pip.

Lock files alone are a proven piece of technology that pretty much just works across all modern package managers, and yet the best that pip can recommend to date is `pip freeze > requirements.txt`, which is strictly inferior to what is available in other package managers because there's no hash verification and no distinction between transitive dependencies and explicit dependencies.
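For comparison, pip-tools (a third-party add-on, not pip itself) can already emit a hash-pinned file, which is roughly what other ecosystems' lock files provide out of the box; a sketch, with hashes elided:

    $ pip-compile --generate-hashes requirements.in
    # requirements.txt now contains entries like:
    requests==2.31.0 \
        --hash=sha256:... \
        --hash=sha256:...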

That pip still doesn't have an answer for lock files is a sign of a deep problem in the Python package management world, not the natural result of packaging being hard.

larme
3 replies
15h9m

Lock files don't work perfectly for me because they don't lock the Python version. Using a lock file with a different Python version, or even the same version but a different OS, may fail.

zadokshi
0 replies
13h49m

That’s literally his point. Other languages have solved this. It makes no sense that it’s not solved in Python. (If you're not sure how, look at a Golang go.mod file or a Rust Cargo.toml file.)
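For anyone unfamiliar: both manifests pin the toolchain version right next to the dependencies. A go.mod declares the Go release it targets:

    module example.com/hello

    go 1.21

and a Cargo.toml can declare a minimum supported Rust version:

    [package]
    name = "hello"
    version = "0.1.0"
    rust-version = "1.70"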

lolinder
0 replies
13h45m

There's no reason why the Python version couldn't be baked into the lock file or package manifest if pip had one. package.json does this [0]. Since all pip has is requirements.txt, it's a bit of a moot point.

[0] https://docs.npmjs.com/cli/v10/configuring-npm/package-json#...
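For reference, that package.json field looks like this (a minimal sketch):

    {
      "engines": {
        "node": ">=18.0.0"
      }
    }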

__MatrixMan__
0 replies
13h57m

Sounds like you want a nix flake.

Too
2 replies
12h9m

pip has had constraints.txt forever; it is equivalent to a lock file. You are not supposed to freeze into requirements.txt.

Hopefully uv can make this the default behavior. It seems like the majority of users are not aware of its existence because it’s an optional flag.
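For anyone who hasn't seen the mechanism being described, the intended split looks like this (file names are just the conventional ones):

    # requirements.txt: direct dependencies only, loosely pinned
    # constraints.txt:  exact versions for the whole dependency tree
    $ pip install -r requirements.txt -c constraints.txt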

lolinder
0 replies
10h50m

Somehow I've managed to go all this time without ever having heard of this feature. If this is the blessed path, can you explain why the pip docs recommend freezing to requirements.txt [0]? And why does the documentation for Constraints Files in the next section talk about them as though they're for something completely different?

Here's what they say about requirements:

Requirements files are used to hold the result from pip freeze for the purpose of achieving Repeatable Installs. In this case, your requirement file contains a pinned version of everything that was installed when pip freeze was run.

Here's what they say about constraints:

Constraints files are requirements files that only control which version of a requirement is installed, not whether it is installed or not. ... In terms of semantics, there is one key difference: Including a package in a constraints file does not trigger installation of the package.

... Write a single constraints file for your organisation and use that everywhere. If the thing being installed requires “helloworld” to be installed, your fixed version specified in your constraints file will be used.

Constraints file support was added in pip 7.1. In Changes to the pip dependency resolver in 20.3 (2020) we did a fairly comprehensive overhaul, removing several undocumented and unsupported quirks from the previous implementation, and stripped constraints files down to being purely a way to specify global (version) limits for packages.

This sounds like something vaguely similar to a lock file, but they seem to intend it to be used globally at the organization level, and they're certainly not pushing it as the answer to locking dependencies for a specific project (they specifically recommend requirements for that). Maybe you can use it that way—although this Stack Overflow answer says that the 2020 update broke it for this use case [1]—but even so it doesn't answer the fundamental need for lock files unless everyone is actually on board with using it for that, and even the pip maintainers don't seem to think you should.

[0] https://pip.pypa.io/en/latest/user_guide/#requirements-files

[1] https://stackoverflow.com/questions/34645821/pip-constraints...

lloeki
0 replies
10h39m

You are not supposed to freeze into requirements.txt

Ironic that `pip freeze` literally generates a requirements.txt with ==.

constraints.txt certainly did not exist back when I was doing python.

Conversely, Ruby packaging has been a solved problem for a decade, whereas the Python community was extremely resistant to conceptually similar solutions for the longest time on strange ideological grounds, and came around only recently.

pjc50
1 replies
7h38m

Python refuses to have packages (folders don't count) or accept build steps. Everything follows from there, making it much harder to have a good solution.

strbean
0 replies
6h2m

Can't even rely on a package to have statically defined dependencies. You can just make setup.py generate a random selection of packages!
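A contrived sketch of what that means in practice — because setup.py is arbitrary code, no tool can know this package's dependencies without executing it:

    # setup.py
    import random

    from setuptools import setup

    setup(
        name="chaos",
        version="0.1",
        # chosen at install time; invisible to any static resolver
        install_requires=random.sample(["requests", "numpy", "flask"], 2),
    )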

stared
6 replies
19h15m

Well, there are languages better at incorporating the Zen of Python than Python itself, vide (full disclosure: my blog post) https://p.migdal.pl/blog/2020/03/types-test-typescript/.

When it comes to package managers, as you noted, the situation is even more ironic, as nicely depicted in https://xkcd.com/1987/.

mixmastamyk
3 replies
18h58m

One way to do things becomes impossible after thirty years unless one wants to make large, breaking changes. After Python 3:

    >>> from __future__ import braces
      File "<stdin>", line 1
    SyntaxError: not a chance

endgame
1 replies
17h28m

That's been around since at least Python 2.6, and I think I remember seeing it before then:

    $ LC_ALL=C NIX_PATH=nixpkgs=https://github.com/NixOS/nixpkgs/archive/release-13.10.tar.gz nix-shell -p python26 --run 'python -c "from __future__ import braces"'
      File "<string>", line 1
    SyntaxError: not a chance

mixmastamyk
0 replies
11h26m

“After Python 3” doesn’t mean it arrived then. It was supposed to be read, “after Python three … not a chance.”

lijok
0 replies
18h51m

Haha, didn't know about this easter egg, amazing

surfingdino
1 replies
19h0m

What's your view on how Golang deals with this problem? Serious question.

zadokshi
0 replies
13h46m

Golang users don’t sit around arguing about package managers, nor do they have trouble with trying to make sure they have the right version of go installed.

Early versions of go didn’t solve things well, but go has had this fixed for a long time now. Any arguments about packaging issues are likely related to older, out-of-date problems people had.

The only real problem people have these days is around private non open source packages. You can do it, but you need to learn how to set environment variables.

PurpleRamen
2 replies
7h20m

Simple: Python is old. It predates most shiny languages by decades, and so does its packaging story. I mean, most modern packaging can shine today because it could learn from the imperfections of its forerunners. Additionally, Python has a very wide field to cover, far wider than most other languages, which makes things more complicated and thus makes people more open to experiments, which leads to so many different solutions.

lolinder
1 replies
3h58m

Python is old, but pip itself had its 1.0 release 8 months after Ruby's Bundler did and 6 years after Apache Maven, both of which are substantially better package managers than pip and both of which are also for 90s-era programming languages, not the new shiny ones.

Age is part of the problem, but it's not the whole problem. Ruby manages to have a package manager that everyone loves, and Java has settled on a duopoly. Meanwhile Python has someone making a serious attempt at a new package manager every few years.

PurpleRamen
0 replies
3h13m

pip wasn't Python's first package manager, and it obviously wasn't created from nothing at version 1.0. There were already years of history at that point. But yes, Ruby has a better history with packaging, similar to Perl. Python was kinda always copying Perl and then Ruby in what they did, while still aiming to preserve its own identity and requirements. Which is also why nobody wanted to copy the cargo-cult pain of the Java enterprise world.

Meanwhile Python has someone making a serious attempt at a new package manager every few years.

Many people have many attempts at many things in many communities. Doesn't mean they are important. Heck, vim and zsh also have dozens of different package managers, and they have fewer use cases and problems to solve than Python.

eigenvalue
22 replies
22h11m

Looks awesome. I find that pip is usually pretty fast for me, and when it's not, it is mostly because it has to download so much data or wait for native libraries to compile in the background (or anything involving cuda which always seems to take forever). What really needs some help with speed is conda, which is just so absurdly slow for literally anything, even on ridiculously powerful machines.

daniel_grady
12 replies
21h0m

What are some of the reasons that teams use conda (and related tools) today? As a machine learning scientist, I used conda exclusively in the mid-2010s because it was the only framework that could reliably manage Python libraries like NumPy, PyTorch, and so on, that have complex binary dependencies. Today, though, pip install works fine for those packages. What am I missing?

Ringz
6 replies
20h9m

Unfortunately, far too often: tradition.

Using only "Python's native tools" like pip and venv simply works so well nowadays that I wonder about the purpose of many tools like poetry etc. etc.

nateglims
2 replies
19h43m

The biggest advantage for poetry I found, working with a lot of non-traditional software people, is that it does a lot of things by default like pin versions and manage virtual envs. Unfortunately, it does complicate some things.

Ringz
1 replies
19h34m

I can understand that well. A few articles from BiteCode helped me to "follow my intuition" and do as much as possible with native Python tools.

https://www.bitecode.dev/p/back-to-basics-with-pip-and-venv

https://www.bitecode.dev/p/relieving-your-python-packaging-p...

daniel_grady
0 replies
18h7m

Those are interesting pointers; appreciate it! My own experience over the past three years has been similar. I tried using Pipenv, and then Poetry, for internal projects at my company; in both cases the tool seemed overly complicated for the problem, slow, and I had a hard time getting co-workers on board. About a year and a half ago, I saw [Boring Python: dependency management](https://www.b-list.org/weblog/2022/may/13/boring-python-depe...), which recommends using the third-party `pip-tools` library alongside the standard library’s `pip` and `venv`, and switched to that for the next project. It’s been working great.

The project has involved a small team of scientists (four or five, depending) who use a mix of macOS and Windows. We do analysis and development locally and write production-facing algorithms in Python packages tracked in our repository, and publish releases to Gitlab’s PyPI. For our team, the “get up and running” instructions are “clone, create a venv, and pip install -r requirements.txt”, and for the software team that manages the production systems, deploying an update just means pip installing a new version of the package.

Every team’s got different constraints, of course, but this has been working very smoothly for us for over a year now, and it’s been easy, no pushback, with everyone understanding what’s going on. Really impressed with the progress of the core Python packaging infrastructure over the past several years.
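For anyone wanting to try the workflow described above, the pip-tools loop is small (a sketch; `requirements.in` is the conventional input file listing only direct dependencies):

    $ python -m venv .venv && source .venv/bin/activate
    $ pip install pip-tools
    $ pip-compile requirements.in   # resolve and pin everything into requirements.txt
    $ pip-sync                      # make the venv match requirements.txt exactly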

pininja
0 replies
19h53m

Another reason I used to use conda was for easy native Windows installation. GPU-accelerated packages like OpenCV were especially difficult when I used it 6 years ago. Now there's the Windows Subsystem for Linux... has pip support dramatically improved on Windows?

gcarvalho
0 replies
17h13m

For me it's the easiest and fastest cross-platform way to consistently install a Python version.

pip and venv work fine, but you have to get them first; and that can be a struggle for unseasoned python devs, especially if you need a version that's not what your distro ships, and even more so on Windows and macOS.

I use micromamba [1] specifically, which is a single binary.

[1] https://mamba.readthedocs.io/en/latest/user_guide/micromamba...

RockRobotRock
0 replies
18h25m

Has anyone else been paying attention to how hilariously hard it is to package PyTorch in poetry?

https://github.com/python-poetry/poetry/issues/6409

tehnub
1 replies
13h50m

One reason to choose one over the other is the dependencies they’re bundled with. Take numpy. With PyPI, it’s bundled with OpenBLAS, and with conda, it’s bundled with Intel MKL, which can be faster. See https://numpy.org/install/#
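You can check which BLAS a given numpy build is linked against from numpy itself:

    >>> import numpy
    >>> numpy.show_config()   # reports the BLAS/LAPACK libraries numpy was built with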

daniel_grady
0 replies
10h32m

That’s a great point; I didn’t know about that!

dragonwriter
1 replies
19h12m

Today, though, pip install works fine for those packages.

pip install works, but pip's dependency management doesn't seem to (for Pytorch, specifically) which is why projects that have pip + requirements.txt as one of their installation methods will often have separate pytorch installation instructions when using that method, though if the same project supports conda installation it will be a one-stop-shop installation that way.

daniel_grady
0 replies
18h42m

pip's dependency management doesn't seem to (for Pytorch, specifically)

That’s interesting — I’ve also had difficulties with PyTorch and dependency resolution, but only on the most recent versions of Python, for some period of time after they’re released. Picking Python 3.9 as a baseline for a project, for example, has been very reliable for PyTorch and all the related tooling.

blactuary
0 replies
17h13m

For me personally, I prefer conda because it combines dependency resolution (mamba), virtual environments, and a package repository (conda-forge), all from one base miniconda installation. And for all of my use cases, all of those just work. Dependency solving used to be painfully slow; mamba solved that. Packages used to be way behind the latest; setting conda-forge as my default solved that.

After fiddling with different solutions for years and having to start fresh with a new Python install, I've been using nothing but miniconda for years and it just works.

paulddraper
6 replies
22h9m

Yeah, I'm curious how much uv is actually faster.

npm -> Yarn was life changing performance-wise.

I wonder what pip -> uv is.

laborcontract
1 replies
22h1m

Bun, ruff/uv, polars.. all have been major qol improvements.

I’m loving the pace of release with this crop of speed obsessed projects and I cannot wait for astral to tackle typing.

laborcontract
0 replies
18h53m

Ok, here are some benchmarks installing an application that has a couple build-heavy dependencies:

    Installation times (MM:SS):

    # No cache
    - uv:  01:05
    - pip: 01:56

    # Cache:
    - uv:  00:02
    - pip: 00:42

eigenvalue
1 replies
21h46m

npm seems to have gotten a lot faster lately. All that competition from yarn and now bun seems to have pushed them to focus on optimization.

paulddraper
0 replies
21h38m

Oh certainly.

Npm has improved greatly.

Npm features follow a two-step process: (1) Get implemented in Yarn (2) Get implemented in npm.

the_mitsuhiko
0 replies
22h2m

From my testing in Rye it's significantly faster in day-to-day use. There are numbers in the blog post obviously, but it's the feeling you get using it locally that makes it much more fun to use.

driverdan
0 replies
14h49m

I did some basic testing today and uv was around 2x faster than pip for a clean venv and cold cache on a real, decent sized project. With warm uv cache it was incredibly fast, under 10 sec.

rogue7
1 replies
21h49m

For conda there is mamba [0], a drop-in replacement that's really fast.

By the way, the creator of mamba started his own company at https://prefix.dev/

They want to essentially leverage the conda(-forge) infrastructure to build a new cross-platform, cross-language, cargo-like package manager: pixi

[0] https://github.com/mamba-org/mamba

maxnoe
0 replies
21h17m

Since last year, conda itself is actually using the mamba solver:

https://conda.org/blog/2023-11-06-conda-23-10-0-release/

captn3m0
19 replies
21h56m

So what's the plan for monetization? NPM getting acquired by GitHub was an anomaly, and I'm wary of letting VC-backed companies becoming ecosystem lynchpins.

sophacles
11 replies
21h41m

It's MIT and Apache dual licensed. If the company fails, you fork and move on. If the company never makes a dime, that's OK for me: some folks got paid to make open source tooling.

anze3db
10 replies
21h34m

What they are doing is already discouraging people from contributing to the projects their tools are replacing[0]. If they go out of business and stop supporting their tools, it might leave the Python community in a bad place.

[0] https://www.youtube.com/watch?v=XzW4-KEB664

woodruffw
7 replies
21h28m

I'm not sure this is a reasonable framing: LLVM stole much of GCC's thunder by being significantly easier to contribute to (and extend externally), but I don't think it would be accurate to say that LLVM is "discouraging" people from contributing to GCC.

ForHackernews
2 replies
21h9m

The slow death of GCC has been unquestionably bad for copyleft software.

woodruffw
1 replies
21h4m

Why is it LLVM's job to be good for copyleft software?

ForHackernews
0 replies
20h45m

It's not. But some suspicion of successor projects is probably warranted by the devs and users of existing tools.

marwis
1 replies
20h58m

LLVM was at least written in the same language it is compiling.

In this case they are replacing Python code with Rust which might exclude large part of Python community from being able to contribute.

woodruffw
0 replies
20h52m

By this token, we should be concerned that CPython is not written in Python. But that historically has not posed a significant risk to CPython's longevity, nor to the ability of motivated contributors to learn the C necessary to contribute to it.

(Or another framing: only the tiniest fraction of the Python community was contributing to CQA tooling in the first place. It's not clear that prioritizing the interests of the 99.9% of Python programmers who -- rightfully! -- will never modify these tools makes sense.)

anze3db
1 replies
21h8m

I'm not sure if you can compare a project that came out of a university and got adopted by Apple with a project developed by a VC backed company with no revenue. I'm sure Charlie has the best intentions with both ruff and uv, but we have no idea how this is going to play out.

woodruffw
0 replies
21h1m

My understanding of Ruff's history is that it predates Astral, and was originally a side project by Charlie. I have similar reservations about VC funding and sustainability, but I don't think Ruff was created as a trojan horse or with a secret takeover plan in mind.

I agree that we have no idea how it'll play out. But I've seen nothing but good faith from Charlie and others, and I see no reason to preemptively condemn their attempt to build a business.

eviks
0 replies
21h24m

Wouldn't their failure re-encourage people to contribute?

bdzr
0 replies
18h55m

That video is.. weird. It claims that astral isn't contributing back despite the entirety of their code being permissively licensed. It's also sort of baffling that making tooling more accessible doesn't seem to be considered contributing back.

I'm not sure what the maker of that video wants, does he want money to be poured back into the community, or for no one else to make money?

anze3db
6 replies
21h43m

I hope they answer this ASAP. People in the Python community are already concerned that Astral is following the Embrace, extend, and extinguish playbook.

aidos
4 replies
21h34m

If all they did right now was leave Ruff exactly as it was and walk away they would have given the Python community a great gift.

Sure, it’s rendered a bunch of projects obsolete, but it’s because it’s so much better than the predecessors.

anze3db
3 replies
21h23m

If they stop development on Ruff today, Ruff won't be useful anymore after a few new Python releases. If the maintainers of the tools that Ruff is replacing stop maintaining them, the whole Python community will be in a really bad place. So, there is reason to be cautious. Astral being transparent on how they plan to make money would be very helpful.

the_mitsuhiko
0 replies
20h25m

If they stop development on Ruff today, Ruff won't be useful anymore after a few new Python releases

That is true for every single development tool in the Python space.

theLiminator
0 replies
19h14m

People could just fork it if that happens. Since the foundation of the tooling is solid, I'm sure the community would rally around a fork that keeps it going.

aidos
0 replies
21h12m

I do get the caution, but I think that even in that case there’s enough momentum behind Ruff for the community to fork and carry on.

Obviously it would be pretty crummy to end up in a situation like pyright where MS really have leant in to their embrace, extend, extinguish strategy.

cqqxo4zV46cp
0 replies
21h21m

“People in the Python community”?

Yeah. Probably precisely the same people that call everything EEE.

You’ll always find people acting alarmist about anything.

lijok
16 replies
20h16m

Gonna put my pessimist hat on and say, great - another python package manager. If it was at least a drop-in replacement for pip, we could switch to it as we did to ruff just for the speed gains.

We need a permanent solve for python package management, governed by the python steering council.

keb_
7 replies
20h12m

Literally the first words in the article say that it's a drop-in replacement for pip.

joshSzep
5 replies
20h6m

And the following examples show it is not. While it is pretty straightforward to add `uv ` in front of any pip calls, it is not API compatible.

romantomjak
4 replies
19h32m

Granted they are still working on it, but in the meantime - add a shell alias for uv?

alias pip="/usr/bin/uv pip"

aragilar
2 replies
8h32m

How does that solve the problem that uv isn't a drop-in replacement? Are they going to implement the whole of pip, warts and all? Unlikely, because even though it's in Rust, they're getting a fair bit of speedup by making assumptions (see their ruff benchmarks; most of pylint isn't implemented), and as we've seen with both poetry and pipenv, those assumptions break down. pixi may get somewhat closer (given their experience with conda, and so familiarity with workflows and needs outside webdev), but I suspect uv will only further add issues and confusion to Python installs (especially if people decide to try aliasing and things break).

romantomjak
1 replies
5h42m

I jumped to conclusions in my previous comment, but that was based on a personal experience. Realistically, I've only ever needed 2 commands: `pip freeze > requirements.txt` and `pip install -r requirements.txt`. They don't need 100% API compatibility to deliver value, just "good enough". 80/20 rule and all that.

spapas82
0 replies
1h38m

I can't use it unless it implements `list -o` and `install -U`.

liveoneggs
0 replies
1h32m

take it all the way and force integrate it into venv

lijok
0 replies
20h7m

And you just took a VC-backed company at its word on a marketing article? I'm sure that's what they're aiming for, but at this time, it's not

raccoonDivider
4 replies
20h13m

Isn't it a replacement for pip? I've seen people on Twitter saying that they migrated very quickly. TFA:

TL;DR: uv is an extremely fast Python package installer and resolver, written in Rust, and designed as a drop-in replacement for pip and pip-tools workflows.

lijok
3 replies
20h8m

I tested it on my repo, multiple errors, it's not a drop-in replacement.

  (.venv) lijok@mbp icebreaker % python -m uv pip install --upgrade uv setuptools wheel

  error: unexpected argument '--upgrade' found

  tip: to pass '--upgrade' as a value, use '-- --upgrade'

  Usage: uv pip install [OPTIONS] <PACKAGE|--requirement <REQUIREMENT>|--editable <EDITABLE>>

  For more information, try '--help'.

  (.venv) lijok@mbp icebreaker % python -m uv pip install -- --upgrade uv setuptools wheel

  error: Failed to parse `--upgrade`
  Caused by: Expected package name starting with an alphanumeric character, found '-'
  --upgrade
  ^

I'm sure it can be made to work - but it's definitely not drop-in.

drcongo
2 replies
8h41m

What happens if you don't prefix with `python -m`? There's no mention of doing that in the blog post and it certainly feels wrong to call a rust package with python -m

aragilar
1 replies
6h27m

That's specifying which Python to use. The error that's printed shows there's no support for `--upgrade`, so it's not a drop-in replacement for pip.

spapas82
0 replies
1h37m

And there's no support for `pip list -o`.

throw12344612
2 replies
20h8m

The PSF has been asleep at the wheel for the past 10-15 years. The python packaging ecosystem has only gotten more fragmented under their stewardship.

lijok
1 replies
20h7m

Fully agree - they need to get it together

kzrdude
0 replies
20h2m

A lot is happening these last few years, it's literally exploding. I think it will start to consolidate.

hprotagonist
13 replies
20h13m

A VC-backed pip-and-more doesn't make sense to me. It's 2024: what's the revenue model when the free money printer's on the fritz?

dj_gitmo
6 replies
20h6m

That was one of my first questions, but Anaconda exists https://www.anaconda.com/download/

Python is used by loads of scientists, academics, and other non software engineers. Those people need an easy way to use python.

nateglims
5 replies
19h47m

I knew they sold something, but I am amazed to learn they have 300 employees.

BiteCode_dev
4 replies
19h44m

Anaconda is very big in corporate envs.

If you want a platform that allows you to manage Python on all machines, including allowed packages and versions, integrated with LDAP, with auditing capabilities, they are pretty much the only game in town.

And big companies want that.

throwaway2037
2 replies
14h28m

I agree. Real question: Why didn't ActiveState (ActivePython) win this battle? In my corporate experience, they are invisible.

pjmlp
0 replies
11h49m

They were quite big in the 2000s; in my bubble they faded away as .NET and Java with their IDEs came into the picture, alongside Tcl, Perl and Python improving their Windows support story.

They were never big outside Windows shops.

BiteCode_dev
0 replies
11h33m

Because Continuum had a loss leader with Anaconda that, in 2010, was solving a ton of packaging problems.

Today I would say anaconda brings more problems than it solves (https://www.bitecode.dev/p/why-not-tell-people-to-simply-use), but at the time, we didn't have wheels for everything, and it was a life saver for anybody that wanted to use c extensions.

So Anaconda first became popular because it solved a real end-user problem, then it moved on to being the corporate provider because it was already well known.

It was a very good strategy.

ivirshup
0 replies
4h6m

Even bigger now that their distribution is integrated in Excel[1]

[1]: https://www.anaconda.com/blog/announcing-python-in-excel-nex...

pininja
3 replies
20h3m

NPM comes to mind. I’m imagining private package management and team support https://www.npmjs.com/products

plorkyeran
0 replies
15h21m

npm, Inc. was an unsuccessful business that got acquired for a pittance because MS can afford to throw away a few million dollars in case it turns out that having control over npm is useful. The team working on it isn't very large but I'd still be surprised if it's actually operating at a profit.

jitl
0 replies
16h20m

NPM did not go well, and selling to Microsoft happened after the team there fell apart. In my view some of that is leadership issues, and some of that is pressure from a struggling business.

Kwpolska
0 replies
19h55m

NPM’s got Microsoft/GitHub behind it. I doubt those features bring in any serious money, given the abundance of free alternatives.

lacker
0 replies
16h40m

Sell build-related services to companies. Imagine GitHub Actions, but it's cleanly built into your Python tooling in some reasonable way, so it's just the natural thing to use. I think it's pretty straightforward, although we'll see whether it works out for them.

LtWorf
0 replies
20h3m

I'm curious to see what the pydantic startup will do.

SushiHippie
8 replies
22h20m

Isn't this basically what pixi wants to be? Wouldn't it be better to work together?

https://github.com/prefix-dev/pixi/

theLiminator
6 replies
22h11m

Pixi is even more ambitious but with a different philosophy (I think? Specifically thinking about how pixi went with their own lock and manifest format as well as first-class conda support). I'd definitely prefer if they worked together, or instead dedicated their time to type checking Python, which imo still doesn't have a great solution.

fydorm
2 replies
13h48m

Have you tried Pyright? It made me actually enjoy Python development.

RMPR
0 replies
10h33m

For type checking it is very good, however, the error messages are sometimes not very human-readable. For example:

    def _deep_merge(updates: dict[str, str]) -> None:
        for key, update_info in updates.items():
            if isinstance(update_info, dict) and "values" in update_info:
                value = update_info["values"]
 
Errors out with Pyright:

    - error: Argument of type "Literal['values']" cannot be assigned to parameter "__key" of type "SupportsIndex | slice" in function "__getitem__"
        Type "Literal['values']" cannot be assigned to type "SupportsIndex | slice"
      "Literal['values']" is incompatible with protocol "SupportsIndex"
        "__index__" is not present
      "Literal['values']" is incompatible with "slice" (reportGeneralTypeIssues)
1 error, 0 warnings, 0 informations

It took me a great amount of staring to figure out that changing the signature of updates to dict[str, dict] was what it was complaining about.

Cannabat
0 replies
10h14m

I don't know if I'd use the word "enjoy", but it certainly makes python tolerable. With all due respect, it blows mypy out of the water.

SushiHippie
2 replies
22h2m

Especially the conda support is IMO a cool thing.

As conda/pip interop is just not great. And even micromamba (C++ implementation of conda) is relatively slow to resolve compared to pip.

But agreed either work together or create a type checker. I use mypy currently but it definitely slows down my editor.

theLiminator
0 replies
21h53m

Yeah, pixi has decent mixed support for conda/pypi, it currently solves and installs conda (then locks it) and then solves and installs pypi. I think it's on their roadmap to solve them together, which would be a killer feature.

davidhalter
0 replies
19h56m

I have been working on a faster type checker for 3.5 years now. It's coming, I promise :)

BiteCode_dev
0 replies
7h11m

Yes, but pixi is setting itself up for failure because it attempts to mix Anaconda and PyPI packages, which are fundamentally incompatible.

Hence they will always trigger user errors, and their image will be stained by it.

notatallshaw
7 replies
22h12m

There are a couple of promising tools written in Rust looking to replace Pip for most users.

Rip (https://github.com/prefix-dev/rip/issues), which is more of a library for other Rust tools to be built on top of, like Pixi (which is looking to replace both Pip and Conda).

And now uv, which seems to be looking to replace Pip, Pip-Tools, and eventually Poetry and PDM.

A lot of the explosion in tools in the Python world is coming from the desire for better workflows. But it has been enabled by the fact that build configuration and invocation have been standardized, and tool makers are able to follow standards instead of reverse engineering easy_install or setuptools.

I know a lot of people are put off by there being so many tools, but I think in a few years the dust will settle and there will emerge a best practice work flow that most users can follow.

As a primarily Python developer and someone who occasionally contributes to Pip to solve complex dependency resolution issues it does make me wonder if I should hang my hat on that and learn enough rust to contribute to one of these projects eventually.

ShamelessC
6 replies
22h0m

My experience with Rust developers who dwell in Python land is that they fundamentally disagree with most of the language that is Python and think they know better than incumbents what belongs and what doesn't.

klardotsh
1 replies
21h38m

The Rust ecosystem gets so much right that honestly, even as a career-long Python developer myself (and Rust for many years, but that's less of my point), they honestly probably do know how to build a good devxp better than much of the Python ecosystem.

Put other ways: the Python ecosystem has had 30+ years to figure out how to make packaging not suck. It has continually failed - failed less and less over time, sure, but the story is still generally speaking a nightmare ("throw it all in an OCI container" is an extremely reasonable solution to Python packaging, still, in 2024). I welcome advances, especially those inspired by tooling from languages that focused heavily on developer experience.

theLiminator
0 replies
18h24m

Tbf, being able to start from scratch makes it much easier to get those things right.

Being compatible with the mess that exists is where the difficulty comes from.

daxfohl
1 replies
21h1m

I'm surprised anyone who has gone from python to rust still finds contributing back to the python ecosystem worthwhile.

__MatrixMan__
0 replies
13h45m

There's a lot of important work that happens in python. Most of it isn't being done by software engineers. I think the idea of improving things for that group is plenty meaningful.

sophacles
0 replies
21h35m

I have written a couple million lines of python professionally. Some of that is still in prod.

The last 5 or so years I've mostly done work in rust... so I guess I'm a rust developer now.

Do the last 5 years invalidate my opinions, thoughts, or skills w.r.t. python somehow?

drawnwren
0 replies
21h48m

To be fair, python package management is in such a poor state that I would expect any outside opinion to not be worse.

epage
7 replies
22h13m

Congrats!

Similarly, uv does not yet generate a platform-agnostic lockfile. This matches pip-tools, but differs from Poetry and PDM, making uv a better fit for projects built around the pip and pip-tools workflows.

Do you expect to make the higher level workflow independent of requirements.txt / support a platform-agnostic lockfile? Being attached to Rye makes me think "no".

Without being platform agnostic, to me this is dead-on-arrival and unable to meet the "Cargo for Python" aim.

uv supports alternate resolution strategies. By default, uv follows the standard Python dependency resolution strategy of preferring the latest compatible version of each package. But by passing --resolution=lowest, library authors can test their packages against the lowest-compatible version of their dependencies. (This is similar to Go's Minimal version selection.)

uv allows for resolutions against arbitrary target Python versions. While pip and pip-tools always resolve against the currently-installed Python version (generating, e.g., a Python 3.12-compatible resolution when running under Python 3.12), uv accepts a --python-version parameter, enabling you to generate, e.g., Python 3.7-compatible resolutions even when running under newer versions.

This is great to see though!

I can understand it being a flag on these lower level, directly invoked dependency resolution operations.
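Concretely, the two resolver flags quoted above combine with `uv pip compile` like so (`requirements.in` as the assumed input file):

    $ uv pip compile requirements.in --resolution=lowest
    $ uv pip compile requirements.in --python-version=3.7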

While you aren't onto the higher level operations yet, I think it'd be useful to see if there is any cross-ecosystem learning we can do for my MSRV RFC: https://github.com/rust-lang/rfcs/pull/3537

How are you handling pre-releases in your resolution? Unsure how much of that is specified in PEPs. It's something that Cargo is weak in today but we're slowly improving.

charliermarsh
2 replies
21h10m

Thanks Ed! Your work as always is a big source of inspiration.

Do you expect to make the higher level workflow independent of requirements.txt / support a platform-agnostic lockfile? Being attached to Rye makes me think "no".

Yes, we absolutely do. We don't do this today, the initial scope is intentionally limited. But in the next phase of the project, we want to extend to multi-platform and multi-version resolution.

How are you handling pre-releases in you resolution? Unsure how much of that is specified in PEPs. Its something that Cargo is weak in today but we're slowly improving.

This is something we talked with Jacob about quite a bit. Turns out (as you know) it's a very hard problem. For the initial release, we added a constraint: our default behavior is that if you want to use a pre-release, you _have_ to specify the package as a first-party dependency and use a pre-release marker in the version specifier. (We also support globally enabling and disabling pre-releases.) So, we basically don't support "transitive" pre-releases right now -- but we give you a dedicated error message if your resolution fails for that reason.
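So, per the behavior described above, opting into a pre-release means naming it in your own requirements; a sketch (package and version purely illustrative):

    # requirements.in
    # the pre-release marker in the specifier is what opts this package in
    flask>=3.0.0rc1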

thaliaarchi
1 replies
17h33m

Any plans to tackle the Python version installation side of things and make it as seamless as rustup has? I've previously used `pyenv install` for this, but it would be nice to fold it into one tool.

charliermarsh
0 replies
15h52m

Yeah, this is very much on our roadmap and probably one of the first things we'll tackle next.

techwizrd
1 replies
20h51m

PyTorch doesn't work well with platform-agnostic lockfiles. It's a constant source of issues for me when using Poetry.

jvolkman
0 replies
19h1m

The vast majority of pypi packages are not PyTorch, however.

woodruffw
0 replies
21h9m

How are you handling pre-releases in your resolution? Unsure how much of that is specified in PEPs.

The living version of PEP 440 has a bit on how pre-releases are handled[1]. The basic version is that the installer shouldn't select them at all, unless the user explicitly indicates that they want a pre-release. Once opted into, they're ordered by their phase (alpha, beta, rc) and pre-release increment (e.g. `.beta.1` > `.alpha.2`).

[1]: https://packaging.python.org/en/latest/specifications/versio...
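That ordering is easy to confirm with the `packaging` library, which implements the spec:

    from packaging.version import Version

    # alpha < beta < release candidate < final release, per PEP 440
    assert Version("1.0a2") < Version("1.0b1") < Version("1.0rc1") < Version("1.0")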

OJFord
0 replies
21h11m

That is great. Fuzzing would be cool too - just completely randomise the versions within the claimed compatibility constraints.

amelius
6 replies
21h37m

Iirc, some pip packages require compilation, which depends on entire toolchains with e.g. gcc and g++, and on dependencies like GTK, Qt, etc. How do they intend to make that less error-prone and thus more user-friendly?

mixmastamyk
2 replies
18h47m

Wheels have been a thing for a decade+. If they're not available from the developer, the complexity doesn't reduce much.

wiredfool
1 replies
8h12m

That’s the really pointy end of packaging. Pillow (Python + C + external DLLs) produces something like 50 wheels per release, and we’ve still got platforms where you’ve got to compile your own. Since they tend to be lower volume, they also tend to have more interesting compiler issues.

amelius
0 replies
6h19m

Yes, I'm using Python on an nVidia Jetson system with PyTorch, which depends on CUDA, and all my other dependencies are quite challenging to get working in concert. Every time pip3 starts compiling under the hood, I'm just praying that it will work.

cavallo71
2 replies
20h51m

That's exactly what conda/mamba/micromamba is covering: the whole toolchain.

amelius
0 replies
20h42m

I've used conda/mamba, but often ended up with broken installs (random coredumps) that were mysteriously fixed by just using pip3 instead.

ZeroCool2u
0 replies
20h28m

That's what they attempt to cover, but they're not super successful at it.

0cf8612b2e1e
6 replies
22h26m

Coupling with Rye is very interesting, as that or the PyBi idea seems inevitable to take over. Managing different interpreters is just too much effort, which each tool is handling differently.

Of course, the PSF will remain silent on the issue and let yet another tool exist in the ecosystem. I really do not care who wins, but we need a standard to be decided.

theyinwhy
2 replies
19h31m

Why do people here claim that the PSF has any leverage to stop tools being developed?

0cf8612b2e1e
1 replies
18h26m

I don’t want the PSF to declare tools cannot be made. I want the PSF to pick a winner so we can finally consolidate on a workflow.

Just today, I was helping a beginner setup Python on their laptop. A nightmare collection of tooling complexity (so you have Python, but you need to worry about virtual environments, to do that, I use this combination of pipx+virtualenvwrapper, but to do actual projects, you need Poetry, but…) all the while noting that this is an opinionated workflow that I happen to utilize because of the tradeoffs that matter to me. Ask someone else, and they will definitely have a different configuration. Entire first lesson was just to get Python installed with tons of elided complexity.

I should be able to point to a document on Python.org that says, “This is the way.” Yet for decades, they have refused to take a stand.

theyinwhy
0 replies
12h15m

Isn't it your job as an engineer to pick the right toolchain for the job?

OJFord
2 replies
20h57m

I hadn't heard of Rye before, I don't think, but it sounds like a strange arrangement to me - basically the same project & idea, so the team starting the new one 'took over' the old one, switched it to use the new one 'under the hood', and will replace it 'when the time is right'? Why.. any of that? Why not just buy it and let it be uv instead of starting uv, or just leave it alone?

the_mitsuhiko
1 replies
20h49m

I wrote down a small little FAQ at the bottom, hope it answers some questions: https://lucumr.pocoo.org/2024/2/15/rye-grows-with-uv/

OJFord
0 replies
20h44m

That is helpful, and fair enough, thanks!

woodruffw
5 replies
21h34m

This is very exciting! Congratulations to the Astral team on their work here.

I have historically expressed trepidation towards external attempts to "fix" Python packaging, largely because each attempt has historically accreted another layer of incompatibilities and hacks onto the giant pile of hacks that are already needed to keep the ecosystem running. As such, it makes me happy to see that compatibility is a priority here: the Astral team has gone out of their way to emphasize both formal (PEP) and informal (pip CLI) compatibility with existing tooling and standards.

(Separately, I share concerns about VC funding and sustainability. But everything I've seen so far indicates that Charlie and the other folks at Astral are not in it to screw or otherwise leverage the Python community. So I choose to be optimistic here.)

eviks
4 replies
21h30m

Prioritizing compatibility with a giant incompatible pile of historical hacks is how you grow the pile

fwip
2 replies
20h56m

Well, they do explicitly list certain historical features that they don't intend to support, like installation from eggs or editable installs. So they're doing some work to trim the pile while they're there.

SonOfLilit
1 replies
19h56m

They do support editable installs, but only from local directories. So you have to perform git clone yourself and then install.

charliermarsh
0 replies
15h36m

We do actually support direct installation of Git and direct URL dependencies. Like, you can run `uv pip install` with:

    black @ git+https://github.com/psf/black

We also support editable installs for local directories, like you mentioned:

    black @ ../black

The thing we don't support is using _editable_ installs for Git and direct URL dependencies, like:

    -e git+https://github.com/psf/black

In my experience, these are really rare, and we didn't see a reason to support them.

woodruffw
0 replies
21h25m

I don't think that's true, at least in the case of Python packaging. The pile has historically grown because of informal or non-existent standards, along with multiple independent implementations attempting to align their observed (but not specified) behaviors.

bagels
5 replies
17h50m

What is wrong with pip that people thought we needed conda, and all these other managers?

__MatrixMan__
1 replies
13h41m

Have you ever seen a pip failure that required you to go use a different package manager and then come back and try pip again? That's what.

bagels
0 replies
9h4m

I haven't. The only packaging problem I saw deploying python applications was with one particular package that was only available through conda.

I was able to build useful things with pip + virtualenv

okanat
0 replies
17h40m

Imperative dependency (mis)management. Pip resolves what needs to be installed while it installs things, depending on the setup script of the package. It requires a full execution and is neither repeatable nor reproducible, so it is impossible to know what needs to be pulled from PyPI ahead of time.

Pip is also strongly tied to the specific Python version. If you upgrade the Python interpreter on a system, congrats: all virtual environments and all PyPI packages are broken now. Hopefully you saved the minimum set of requirements and they don't have some catastrophic dependency breakage. You run pip install and pray.

What? Native binaries? Building them on the fly while installing a pip package? Oh you're brave.

lolinder
0 replies
17h40m

A long list of terrible defaults with workarounds that get you halfway to a decent package manager:

* Defaults to installing packages globally instead of per project.

* requirements.txt feels very much like a second class citizen when you contrast it with package.json, Cargo.toml, build.gradle, pom.xml, or most other dependency systems in other package managers.

* No native support for lockfiles short of pinning every version of every project in requirements.txt. This solution is inferior to what is available in ~every other package manager, because a pip freeze doesn't distinguish between explicit dependencies and transitive dependencies.

I'm sure there are others, but they're all along the same lines—pip is strictly inferior to basically every other package manager out there for other languages. Things that people expect to be part of a package manager have ugly, hacky workarounds instead that aren't uniformly applied across projects.

duped
0 replies
17h42m

If you want to learn every lesson a package manager author has learned in the last twenty years by way of counterexample, use pip.

Global dependencies by default (fixed by venvs). Shipping precompiled binaries that are not portable across platforms or Python versions (kinda fixed by wheels, I forget the PEP number). No lock files with checksum verification by default (kinda fixed by requirements.txt with `--require-hashes`, but not really). Also, there's a bunch of weird shit that can happen if the pip version and Python version disagree, due to setuptools.

It's got a great interface for installing stuff. But the likelihood that what it installs breaks something on your machine is pretty high the longer you use it.

Edit: the real, fundamental problem with pip is that it can't be saved. Too many dockerfiles and server provisioning scripts are relying on it - it's interface, semantics, CLI output, etc. It's not 100% broken, which is why it's still useful. But any true fix has to be done by creating a separate tool.

drcongo
3 replies
22h25m

I've been dying to see what they came up with next given how in love I am with Ruff. In my wildest dreams I didn't expect it to be a package resolver. I'm very excited about this.

edit: As a side note, if anyone from Astral happens by, any chance of an RSS feed on your blog?

singhrac
2 replies
22h10m

I was kind of hoping it was a better type checker, something like mypy + pyright. To be fair that's an incredible amount of work, so maybe we'll see that later. Still very excited.

fydorm
0 replies
13h29m

Why include mypy at all? Pyright is amazing and all you need.

drcongo
0 replies
22h7m

That was actually my first guess, and while I certainly would love to see that, packaging is such a headache for so many people that this could be amazing. Even with poetry I still occasionally get mysterious errors and wildly wrong resolver claims about versions not existing.

singhrac
2 replies
20h43m

Something that is super super important to me is editable installs, and local references to those editable installs. Often I have several editable packages set up in a virtualenv which I think of as like my "base" virtualenv.

One reason I struggled with Rye earlier is that it kind of felt like this wasn't entirely supported, possibly because I couldn't parse the information on the workspaces or virtual projects page. Maybe someone else figured this out? It does seem common enough that I would be surprised if it wasn't supported.

charliermarsh
0 replies
15h39m

I think having a good story for local workspaces is gonna be a key piece of this next phase of the project, where we grow uv from a "pip replacement" to a "Poetry replacement" -- or, phrased differently, to something that looks more like Cargo. (Cargo does a bunch of nice things here that I want to learn from.)

Zizizizz
0 replies
11h56m

This does editable installs, I tried it myself yesterday

marclave
2 replies
19h39m

Like pip-compile, uv generates a platform-specific requirements.txt file (unlike, e.g., poetry and pdm, which generate platform-agnostic poetry.lock and pdm.lock files). As such, uv's requirements.txt files may not be portable across platforms and Python versions.

really curious on the reasoning behind this decision :)

anentropic
0 replies
6h24m

https://github.com/astral-sh/uv?tab=readme-ov-file#multi-ver...

says:

uv does not yet produce a machine-agnostic lockfile.

so maybe the non-portable requirements.txt is just a first milestone

Too
0 replies
11h41m

Hard to give a concrete example, but you can end up in dependency deadlocks, with a combination of packages that require new features vs packages that don't work on newer versions.

Mostly it’s useful right after a new release where prebuilt wheels aren’t available for all packages yet and you have users who may care or not about this.

If you only have a single requirements file, you're forced to choose which platform to support. With multiple, you can break out if needed. These breaking changes and deadlocks are rare. It's still good to have an escape hatch.

gary_0
2 replies
20h56m

Something I've been waiting to see from language package managers is more attention paid to security. I believe both cargo and pip packages can run arbitrary code as the local user the instant they are installed, and malicious packages have existed in the wild for years. I also recall a blog post where someone was scanning PyPI for malicious Python packages, only to realize that `pip download` also executed code.

Just downloading a library to a local source code directory should not cause arbitrary code to run on your system, and there should be safeguards in place so developers are not one typo away from a malicious package pwning their entire system. Supply chain attacks remain a huge area of concern.
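One safeguard that does exist today, though only as an opt-in flag: telling pip to refuse source distributions entirely, so no setup.py ever runs (`somepackage` is a placeholder):

    $ pip install --only-binary :all: somepackage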

Instead, when I "ctrl+f security" the homepages of any of these new packaging systems, I get 0 results. Not good.

woodruffw
1 replies
20h49m

I also recall a blog post where someone was scanning PyPI for malicious Python packages, only to realize that `pip download` also executed code.

I think you're thinking of this post[1]. The code being searched for wasn't malicious, just buggy :-)

[1]: https://moyix.blogspot.com/2022/09/someones-been-messing-wit...

gary_0
0 replies
20h27m

Thanks, that's the one! I was actually just trying to find it.

Heh, I forgot about the anime catgirl part.

12_throw_away
2 replies
21h6m

This article says they don't support "platform-agnostic lockfiles" YET. And platform-agnostic lockfiles have been mentioned in several comments here, too. I haven't encountered these before (with Python, anyway) - how do they work with platform-specific dependencies, e.g., binary wheels?

For instance, there are >30 dists for numpy 1.26.4 [1], with different artifacts for different platforms and Python interpreters. Which of those hashes ends up in a platform-agnostic lockfile? Do you have to stick with sdists only?

[1] https://pypi.org/project/numpy/#files

orf
0 replies
21h0m

All of them - poetry adds them as an array, and the right one is installed for the platform.
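A heavily abridged sketch of what that array looks like in a poetry.lock entry (hashes elided):

    [[package]]
    name = "numpy"
    version = "1.26.4"
    files = [
        {file = "numpy-1.26.4-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:..."},
        {file = "numpy-1.26.4-cp312-cp312-win_amd64.whl", hash = "sha256:..."},
        {file = "numpy-1.26.4.tar.gz", hash = "sha256:..."},
    ]

At install time, the artifact matching the host is picked and verified against its hash.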

jvolkman
0 replies
20h54m

PDM and Poetry would include hashes for all possible wheels, and would also follow transitive paths for platforms other than the current host. E.g. if my host is Linux and I depend on foo, and foo depends on bar only on windows, and bar depends on baz, my lock file will still contain baz.

posix_monad
1 replies
1h50m

Why is this written in Rust and not Python?

Does this show deficiencies of Python as a general purpose language?

liveoneggs
0 replies
1h31m

I don't know if the world needs extra examples to prove that python is very slow but this is certainly one more

maxvt
1 replies
22h7m

I'd be curious to see how "packaging for production" is addressed. As described, this is a tool for development purposes:

Think: a single binary that bootstraps your Python installation and gives you everything you need to be productive with Python, bundling not only pip, pip-tools, and virtualenv, but also pipx, tox, poetry, pyenv, ruff, and more.

A lot of that dev tooling is not needed in prod, in fact it is a liability (features available for misuse, code/image size, higher resource requirements). Would there be a "uv-prod" to only deal with the minimum subset of environment setup / container building that enables production use? Would uv build packages that are runnable in prod but themselves don't contain uv? It'd be interesting to hear the plans.

radus
0 replies
16h0m

My approach for this is to use multi-stage dockerfiles. Build tools live and operate in a build stage, and you copy the /venv into your runtime stage. I think this same approach should work with uv so long as it doesn't bundle itself into the venv.
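A minimal sketch of that pattern, assuming the venv lives at /venv and `myapp` is a placeholder module:

    FROM python:3.12-slim AS build
    RUN python -m venv /venv
    COPY requirements.txt .
    RUN /venv/bin/pip install -r requirements.txt

    FROM python:3.12-slim
    COPY --from=build /venv /venv
    CMD ["/venv/bin/python", "-m", "myapp"]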

duped
1 replies
22h4m

Do you plan to support platform-agnostic lockfiles? From the wording it sounds like "yes" but also "this is a feature not a bug." Which is it?

Checksum verification and portable lockfiles are kind of table stakes for me, which is why I use poetry everywhere possible. I can't give up correctness for speed.

Sidenote: this would be super useful as a library so I could plug it into build systems. Managing Python dependencies in standards-compliant ways is a pain, so shelling out to tools like poetry and pip is what gets used, even though it would be nicer to write that code e.g. in Rust and just use the parts you care about.

charliermarsh
0 replies
15h42m

Yes, we absolutely want to support platform-agnostic lockfiles. Very important thing.

The "feature not a bug" tone is probably because we intentionally limited the scope of this release to _not_ support platform-agnostic resolution, since it adds a lot of complexity (and we were still able to ship something that's immediately useful for those that rely on pip and pip-tools today).

the__alchemist
0 replies
20h46m

Very cool! Of note, I made something along these lines a few years ago, although with a slightly broader scope to also include managing and installing python versions. I abandoned it due to lack of free time, and edge cases breaking things. The major challenge is that Python packages that aren't wheels can do surprising things due to setup.py running arbitrary code. (https://github.com/David-OConnor/pyflow)

For some more context, at the time, poetry and pipenv were both buggy, and had some limitations their teams listed as "wont-fix". The sort of thing where you hit a breaking bug immediately on certain systems. These have since been fixed.

rengler33
0 replies
22h8m

This is great. Just adopted the pip-tools approach on a project and that has been great. Excited to give this a try.

ofek
0 replies
21h37m

Exciting stuff! I view Hatch [1] as becoming the Cargo for Python because it's already close and has an existing (and growing) user base but I can definitely see depending on this for resolution and potentially not even using pip after it becomes more stable.

[1]: https://hatch.pypa.io/latest/

nmca
0 replies
15h15m

Initially I was confused by the business model here, but I eventually figured it out --- improved python tooling is going to accelerate AI so much that they can just rely on the future gratitude of the machine god.

nindalf
0 replies
21h12m

Feel like I called this 11 days ago - https://news.ycombinator.com/item?id=39251014

If I had to guess, that's the path that the Astral team would take as well - expand ruff's capabilities so it can do everything a Python developer needs. So the vision that Armin is describing here might be achieved by ruff eventually. They'd have the advantage that they're not a single-person maintenance team, but the disadvantage of needing to show a return to their investors.
murkt
0 replies
21h54m

An ability to override dependencies, finally!

marmaduke
0 replies
22h24m

For envs which are pets, who cares, but this will be great for those cattle situations.

iwebdevfromhome
0 replies
21h54m

Very nice! Not long ago we got rid of a bunch of tools at work in favor of ruff. I'm looking forward to this. Especially when the time comes to be able to replace poetry with it, hoping it is a good implementation.

ericra
0 replies
22h25m

While I don't see the immediate need to switch from using pip, I am such a fan of ruff that I'm always happy to check out what you guys come out with.

Please keep doing what you're doing!

ddanieltan
0 replies
16h14m

Any plans to adopt Rye's approach of using standard Python builds to cover the installing of different Python versions?

I feel uv should provide a way to install different Python versions to truly be an end-to-end tool. The current approach of searching for existing virtualenvs or conda envs helps, but I was hoping to completely remove the need for another package/dependency manager.

(Taken from the docs)

  If a --python-version is provided to pip compile (e.g., --python-version=3.7), uv will search for a Python interpreter matching that version in the following order:

  - An activated virtual environment based on the VIRTUAL_ENV environment variable.

  - An activated Conda environment based on the CONDA_PREFIX environment variable.

  - A virtual environment at .venv in the current directory, or in the nearest parent directory.

  - The Python interpreter available as, e.g., python3.7 on macOS and Linux. On Windows, uv will use the same mechanism as py --list-paths to discover all available Python interpreters, and will select the first interpreter matching the requested version.

  - The Python interpreter available as python3 on macOS and Linux, or python.exe on Windows.

davidthewatson
0 replies
9h34m

I've complained about the resemblance to the Ruby community's packaging conflict a decade ago almost two years ago now. I may have been in the minority, but was apparently not alone:

https://news.ycombinator.com/item?id=32116649

Python packaging has gotten asymptotically worse in the last 90 days in my experience, having used the language for 20 years now and working on a multi-platform triumvirate of Lin-Mac-Win that has not changed in twenty years.

I can honestly say that, given daily failures in the packaging ecosystem such as pip failing to build eggs, pkg-config problems, and the like, I'm happy to see a solution from Rust, because I sent an intern into the polars underworld 2 years ago to great success and have only grown my own reliance on Rust infrastructure more broadly since.

This may very well be the only packaging innovation in python that I actually look forward to trying tomorrow. I'd already be working in rust full-time if I could only get past the syntax that reminds me too much of perl after bathing in the semantic whitespace light for this long.

chrisshroba
0 replies
21h29m

Obligatory two xkcd's, but I'm super excited about this project! Definitely going to try it out.

https://xkcd.com/1987/ https://xkcd.com/927/

capital_guy
0 replies
14h31m

A lot of uncalled-for pessimism here in the comments. If Python folks reading haven't used ruff yet, I can highly recommend it. The Astral folks have proven themselves to me already. Looking forward to more and more of their Rust-built Python tooling.

alfalfasprout
0 replies
21h55m

While we don't particularly care about pip lockfiles etc., as we're moving to the conda/mamba ecosystem, a faster pip is most welcome. We invoke it a lot programmatically and it's very, very slow.

ZeroCool2u
0 replies
20h1m

So uv uses pubgrub-rs, a rust implementation of the pubgrub version solving algorithm originally written for the dart language. I suppose I shouldn't be surprised, but I always loved Pub the Dart/Flutter packaging tool. It's the only thing out there that comes close to cargo that I know of. Fun to see these chains of inspiration reach across languages.

SamEdwardes
0 replies
21h31m

Sign me up! A single binary to manage Python is something I have been hoping for.