I'm a C and C++ dev, using some python tooling here and there to make my life easier. I've been using -m since forever. I kinda settled into it naturally. What issues can it cause?
-m has very few drawbacks. One I encountered is when a command-line tool doesn't provide a __main__.py file, in which case -m just won't work. It's rare, but it does happen. Another one is that there is no shell completion on dotted paths, so calling nested scripts means you have to type the whole thing yourself.
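To make that concrete, here is a quick way to check whether a tool can run with -m (a sketch; "mytool" is a hypothetical package name):

    # Is the package importable by this interpreter at all?
    python -c "import mytool; print(mytool.__file__)"
    # This only works if the package ships a mytool/__main__.py:
    python -m mytool --help
    # Nested dotted paths run fine, but the shell won't complete them:
    python -m mytool.scripts.cleanup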
I have to tell you: at first, you made me mad! I did not agree. pyenv works, packaging works, installing system-wide with pip works... Then I realized you are kind of right...
I'm the maintainer of a small Windows desktop app (probably used by fewer than 100 people). It's available as a Python package on PyPI and can be installed with pip. I did what most developers would do: release it as a wheel that can be installed and updated with pip, simple stuff.
But it turns out most of the users are non-developers... I realized it when I got really simple questions like: what is a terminal, what is cmd, where do I put the command (pip install dcspy), I got weird output, is it correct or not... Once, a user reinstalled Windows entirely because he got some strange message/question (perfectly normal behavior from a developer's perspective).
So, yes... we developers solve dozens of simple problems daily without realizing it... we love tinkering...
So now I just deliver a one-file executable (built with PyInstaller/Nuitka) and keep the wheel for backward compatibility.
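For reference, the one-file build boils down to something like this (a simplified sketch; "main.py" stands in for the real entry script, and a real app usually needs more options):

    # PyInstaller: bundle the app and its dependencies into one executable
    pyinstaller --onefile --name dcspy main.py
    # The result lands in dist/ (dist/dcspy.exe on Windows)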
Great read...
I wrote a comment here last night which didn't seem to appear, but it did appear on Notes, if anyone is interested:
https://substack.com/profile/62817-bjkeefe/note/c-17558557
Thanks for the fun read; however, I don't understand how you are suggesting we manage different Python versions without pyenv (or similar). E.g., new projects that have been written to use tomllib for configuration require Python 3.11 or newer, but the user is on Debian 11, which ships with Python 3.9.
You say to use -m, but that won't help me get Python 3.11 or above. You say we should be using binaries. From where? python.org provides only source code for anything but Mac and Windows. What you are proposing seems to create an entirely new problem that should not exist.
You have seemingly worked with about a hundred paying clients, so I suspect part of the problem is that your experience is somewhat skewed. It doesn't seem to match up with the needs of the Python and free software communities that arguably care about Python the most.
Feel free to correct me if I have misunderstood something.
As the article states:
"The people that will understand it’s not for them will self-exclude from the recommendations, as they should, since they know what to do anyway."
As I understand it, you seem to be making a huge assumption that normal people cannot use Debian? Debian and other distributions are just operating systems, and very user-friendly these days. They should be much easier to use than a programming language!
As the article also states:
"But with a single tutorial, it will help the most people, on the biggest number of configurations."
There are more desktop GNU/Linux users than Mac users these days, based on web stats, e.g. https://www.w3counter.com/globalstats.php?year=2023&month=5
Also, "help the most people" and " the biggest number of configurations" are contradictory goals. If the article just wants to help the most people, it doesn't need to care about anyone that doesn't run Windows.
I don't have a a definition for "normal people" other that the Gauss law, and in this context, I'm confident the center of the bell curve matches those expectations.
As a Linux user myself, I am well aware the majority of Python users are using Windows or Mac, not linux.
Also, as explained before, a huge chunk of the community are not professional Python devs.
Now take the subset of professional Python devs using Linux who are in the particular situation where they have to deliver software not for themselves but for a third-party user, where it happens that there is no choice but to deploy on Debian in particular, in a version that doesn't have access to a Python compatible with their software.
Then yes, we are talking about a smaller group of people (that I am part of), which is mostly composed of people who are supposed to have the engineering capabilities to deal with the problem, and who should therefore self-exclude from the article. Or write it.
Regarding the claim "the majority of Python users are using Windows or Mac, not Linux", I question how you could truly claim to know this. Logically, this seems very unlikely.
If there are more people using a GNU/Linux desktop than a Mac desktop, combined with the knowledge that Python is also used on many more devices such as the Raspberry Pi that can't even run macOS or Windows, combined with the knowledge that large parts of many GNU/Linux distros are written in Python, I don't know how you could prove this.
Even if it were possible to prove that more people "developed" on macOS (for example), the chances are that, with the popularity of servers and embedded/SOC systems running GNU/Linux (which have a very low barrier of entry to get into these days), more people will be attempting to "run" Python on a GNU/Linux box at some point.
That a huge portion of the Python community are not professional Python devs is my point, and this problem is certainly not exclusive to the Debian distribution. For example, look at the gamers running a Steam Deck, and just booting to "Desktop mode" to run a script from GitHub. That's a lot of people who are not experts, but should be able to run modern Python code with minimal fuss. The Steam Deck currently includes Python 3.10.8 FWIW. Even Arch was stuck on 3.10 until April, so all the more user-friendly derivatives would also likely have been affected for quite a while (for the specific example issue I highlighted).
I appreciate you sharing your views, but implying that since somebody is on GNU/Linux, that they're supposed to be an expert, is quite the cop-out. It is not a stance that I am prepared to support.
I can see from your comment that you are part of the "If you disagree, it’s likely that you are too good at handling complexity to realize how many problems you solve every day." crowd.
I won't deny that I have a good amount of technical knowledge, but my concern remains with those that do not.
I would rather have a procedure with a few more steps and a higher risk of failure that potentially supports everyone, as opposed to a procedure that is easier but leaves a lot of people completely screwed.
This is especially true when it's the free software operating systems that are primarily impacted by the decision. As fans of Python, it's in our interest to demonstrate the importance of free software and promote user freedom. That means making things easier for people there. What was proposed seems to do the opposite of that in some ways.
To put it another way, if somebody does not know that pyenv exists because we consciously didn't tell them, we are sacrificing the freedom of some users to use a program for the convenience of other people.
So the way I see it, there are technical concerns, there are concerns purely from a free software activist perspective, and there are even ethical concerns.
Perhaps at some point, someone will make a program in something like Go that is compiled for all operating systems and architectures, and provides an isolated runtime environment for Python to execute in safely and easily? That way, you could effectively have pyenv functionality without the Bash dependency, without having to compile anything, and without having to lose anyone along the way.
Thanks for that article. Though I've only skimmed over it, I've found a lot of great points.
I've always used pipenv for my projects, and the "only" problem that I sometimes have is with "pipenv lock". But your article made me aware that this is actually not good, because when one day I have to port all of my projects to a new computer, I might deeply regret not having working Pipfile.lock files for all of my projects. I must admit that I haven't taken this seriously enough. But I've already figured out a workaround.
I'll probably use venv from now on. No hassle with Pipfile.lock not working.
But one thing leaves me clueless: how do you build venvs via "python -m venv name_of_venv" with a specific Python version when not using pyenv? Do you install the desired version globally and then select that Python binary when creating the venv? I'm not sure if it's even possible or advisable to install multiple Python versions globally. I'm on macOS and always use the official installer from the Python website to install Python globally. But I've never tried installing multiple of them. Is that good or even possible?
EDIT: Ah ok, reading parts of your article again, on Mac you should do something like "python3.8 -m venv .venv". But does that mean I can install several Python versions safely using the official installer?
I believe it's worth reading your article completely. I will do so soon.
Yes, you can safely install several Python versions on the same machine using the official installer. I have an article about installing Python for this very purpose: https://www.bitecode.dev/p/installing-python-the-bare-minimum
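Concretely, once several versions are installed side by side, each one gets its own versioned command, and you pick the interpreter explicitly when creating the venv (the version numbers here are just examples):

    # Each python.org installer adds a versioned command
    python3.9 --version
    python3.11 --version
    # Create the venv with the exact interpreter you want
    python3.11 -m venv .venv
    # The venv keeps pointing at 3.11 from then on
    .venv/bin/python --version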
Thanks for your answer!
I must say I really like your blog a lot! I have already found some very interesting articles with in-depth clarifications.
I've been coding in Python since 2019, and it has become my favorite language. I always try to learn new things in small chunks, but as your blog title already suggests, there is always limited time for that.
I knew that using venv is still the standard way to do virtual env things. But I've been fine using pipenv, or at least I believed that so far. I think I'll switch now completely to venv. Will also solve some other issues for me. And actually installing pyenv and other tools is also always a bit of work, e.g. when setting up a new Linux VPS. So now I can save that time.
I also like your tldr at the beginning of your posts. That way it's possible to grab the essential infos fast, and read the details later.
Thanks again! Great work!
P.S. I'm wondering what your opinion is about pipx. I use it almost exclusively when installing global packages that are CLI tools. But I recently seemed to have some strange issues with it, though it was not too clear whether they were caused by pipx or by something else.
I should have included pipx in the article; you are not the first person to ask about this.
Basically, I think it was a good concept, but it brought as many issues as it solved:
- you need to install it first, and it's not a standalone executable. So many ways for people to mess that up.
- the way it patches the PATH is imperfect, so sometimes you just don't get the intended effect. That is, when people read the doc and actually patch the PATH in the first place.
- when you install pipx, there is a subtle trap: you install it with a certain version of Python. This version conditions all the packages you are going to install with it, which means you may install mypy or black for Python 3.7, then run them on a 3.11 codebase and get an error because of the new syntax. But most people don't know this (see the sketch below).
- as you said, it has random issues, so I eventually stopped using it because I need something more reliable.
All in all, it's the same problem as with other tools: it solves some issues, but brings more modes of failure.
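Here is that version trap from the list above, sketched (assuming pipx itself was installed under an old Python; recent pipx versions have a --python flag to escape it):

    # pipx was installed with some Python, say 3.7
    pipx install black          # black will run under that 3.7
    black my_311_code.py        # may choke on 3.11-only syntax
    # A way out: pin the interpreter explicitly
    pipx install black --python python3.11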
I can deal with modes of failure, it's my job.
But most people don't want to.
Thanks again for your reply! You seem to be a true Python expert. I've read in one of your blog posts that you have started out with Python 2.4 (I hope I remember this right), so you must have more than a decade of experience, and a lot of teaching experience.
It was really difficult for me to find answers for my pipx problem a few months ago. A user on Reddit gave me a somewhat plausible explanation, but yours here is far better.
Maybe you could write a dedicated article about pipx? Just a suggestion; I believe a lot of people would be interested.
Or you could just add a hint in your article pointing to this comment concerning pipx, because I think you have already explained it pretty well here.
So are you just using a standard global pip installation for CLI tools (like "pip install <package> --user")?
I'm really a person who always tries to collect the best news and knowledge resources, and I must say that your blog is by far the best I've come across in the Python realm in a long time. Thanks again!
You should install every tool in the virtualenv of each project. It will take more disk space and seem like a waste, but eventually it will save you a lot of trouble.
If you need something at the system level (which, again, should be very, very rare: black, mypy or pylint should not be, and apps should be provided in another format than wheels), you can fall back on a big venv for your whole system's tools and little scripts.
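A sketch of that fallback "big venv" setup (the paths are just examples):

    # One venv holding all your system-level tools and little scripts
    python3 -m venv ~/.tooling
    ~/.tooling/bin/pip install black mypy pylint
    # Expose the entry points without ever activating the venv,
    # e.g. by symlinking them somewhere already on your PATH:
    ln -s ~/.tooling/bin/black ~/.local/bin/black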
Hi again,
I've played with your methods a bit, but unfortunately I believe I've discovered a weakness.
I took the opportunity and read a bit about venv in the Python docs (though not strictly in depth). It looks like venv basically adds the virtual environment's Python interpreter to the beginning of the PATH variable (speaking Linux here, by the way; I've tested this on a Linux VPS). And "source venv_name/bin/activate" is just a bash script which makes that change to the PATH variable (along with some other stuff that I don't fully understand, because my bash skills are mediocre).
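You can watch that PATH mechanism in action, and even skip activation entirely:

    which python               # the system interpreter
    source venv_name/bin/activate
    which python               # now venv_name/bin/python
    echo "$VIRTUAL_ENV"        # set by the activate script
    deactivate
    # Activation is optional sugar; calling the venv's own
    # binaries directly has the same effect:
    venv_name/bin/python my_script.py
    venv_name/bin/pip list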
I'm into web3 hacking, where e.g. I use a Python tool called "solc-select", which is meant to be installed globally via "pip install solc-select --user". That tool has a CLI command: for example you can do "solc-select install 0.8.15", which will install version 0.8.15 of the Solidity compiler and add it to a folder in your home directory, "~/.solc-select".
It's now useful to do something like "solc-select install all", which will install all existing Solidity compilers. That is better for your daily work. All these compilers together are around 750 MB (getting bigger with each newly released Solidity version). I've found out that the "solc-select" CLI command is just another script, stored in "~/.local/bin/solc-select", which is in your PATH.
When I now follow your suggestion and set up everything fresh in a venv for each project, the solc-select script is stored in "venv_name/bin". You can then call it normally via the "solc-select" command as long as the virtual environment is activated.
But when you now do "solc-select install all", the Solidity compilers won't be downloaded to "~/.solc-select", but instead to "venv_name/.solc-select".
All Solidity compilers together are currently 775 MB, and the size is continuously increasing, because development is fast-paced in that space.
That means that I would have to download 775 MB for each new project, and keep that stored. That's a bit overkill.
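One possible workaround I can think of, untested, and just an idea based on where I saw the files landing: keep a single shared compiler cache and symlink each venv's expected location to it.

    # One real copy of the compilers, in the home directory
    mkdir -p ~/.solc-select
    # In each project, point the venv's location at the shared one
    ln -s ~/.solc-select venv_name/.solc-select
    # "solc-select install all" should then reuse the same cache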
I conclude now: while your method makes sense, the problem is that the Python community is already focused on methods like global installation via pip (or pipx; I know projects where pipx is explicitly recommended for installation). Global tools especially are designed as global tools. The devs of these tools don't expect that people will install them in a virtual environment.
Maybe I could install global tools into a global venv as you have suggested. But you can activate only one venv at a given time. While you can add the path of that global venv to your PATH permanently to be able to call scripts like "solc-select", the next problem is that I'm also using other global Python tools like "slither", which rely on a globally installed solc-select and will check whether it is installed.
At this point, it is getting too complicated for me. Going against the flow of the Python community sometimes seems to come at a price that is too high.
Nevertheless, I've learnt a lot today. I might try to find some compromises.
Ok cool. There are some further questions popping up in my mind, but I believe I know enough now to figure out the rest by myself.
You have really interesting takes on all this. I have never come across anything similar on the web. I also totally agree with the "keep it simple" and "keep it to the basics" mindset. Nothing is worse and more time-consuming than surprising errors you have to sink tons of time into to find out what is going on.
Python is anyway a lot about "keep it simple", so that mindset suits the language as well.
I reduced my complexity by retiring and now just program my own projects.
The complexity I once ranted about was compiled (and linked!) programs. Keeping track of all the parts was a pain (where's the code for that exe?). Having everything in one place with a script (yea, Python) was great. I love git, but a self-contained script and normal incremental backups worked fine too.
I couldn't figure out what venv was trying to solve, until I started to install new versions of python to try out. Boy did that mess things up. Never figured out a good recipe for deploying a python program with a venv. I just pay better attention to which python version I'm using (tell vscode which to use) and what is installed at the deployment site. Never use the latest and greatest python. Never.
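The closest I got to a recipe was something like this (a sketch; "myapp" and the paths are placeholders):

    # On the deployment machine, with a known Python version:
    python3 -m venv /opt/myapp/venv
    /opt/myapp/venv/bin/pip install -r /opt/myapp/requirements.txt
    # Run it without activating anything:
    /opt/myapp/venv/bin/python -m myapp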
I am tempted to get back to using Python type hints, but a Python strength was that it could figure out types just fine. In my first foray with type hints, I spent more time tracing type issues that simply went away when I finally ripped out all the type hints. I do want to hover over a variable and see the full type. <sigh>
Very verbose (as I often am) but still interesting to read.
So true, but I'd love to just use pip; there is no way I can get TensorFlow or scikit-learn to work on my machine without conda.
Give this a chance: https://www.bitecode.dev/p/relieving-your-python-packaging-pain
And attempt the installation once again using this procedure.
Maybe this time you will have better luck.
Only empathic programmers understand this misery of watching knowledge being acquired the wrong way while everybody has a strong opinion on a direction to solve it. You described it so clearly; I wish I could. A really wonderful article about a fundamental problem in IT, and for the masses of people touching programming interfaces and languages that we now are.
Switched to C# a couple of years ago, don't miss this nonsense. Should have done it WAY sooner, like around the PY3K disaster.
I really like this article and the "Relieving your ...pain" article before that. Can't wait for you to publish all those other ones you mentioned ...
I was reading your article while dealing with a Streamlit Cloud deployment ordered at the last minute that had to be ready the next day. The installation was a pain, but your article was a relief. Thanks for your advice.
Big fan of pyenv-virtualenv when showing other developers who are less familiar with python how to get set up (appreciate a different audience to normies)
In the previous article on this Substack the author covers 'python -m' nicely.