#Python package management question: Is there some way to have pip (and tools using it, e.g. Nox/Tox) use local sources and ideally distro installs, and only download from PyPI after explicit confirmation? I'm well aware of venv etc.; my point is about the pip calls used to set those up.
Ideally, what I'd want pip to do to resolve a package, in order (there's a rough sketch of this after the list):
1. Check a list of locally configured source directories. As in, if I have the repository for package A, I add that directory to a list/mapping, and then A always gets installed from the sources present there (or from a specific tag).
2. Use the package as installed by my distro. This is similar to pipx install --system-site-packages, but I'd only want the explicitly installed packages available in the venv (or pipx install, etc.), not all of them, so tests can confirm that the dependencies listed in pyproject.toml are actually complete.
3. Use a local cache of PyPI (and optionally other sources) packages.
4. Fetch to/update cache only after explicit approval.
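To make the idea concrete, here's roughly what I mean as a thin wrapper around pip. This is only a sketch: the SOURCE_DIRS mapping, the wheelhouse path and the confirmation prompt are made-up illustrations of steps 1, 3 and 4, and step 2 (selectively reusing distro packages) is left out entirely because I don't know of a pip knob for it.

```python
#!/usr/bin/env python3
"""Sketch of the resolution order above as a thin wrapper around pip.

Hypothetical throughout: the SOURCE_DIRS entries and the wheelhouse path are
placeholders, only bare project names (no version specifiers) are handled,
and dependencies of a local checkout are not themselves restricted.
"""
import subprocess
import sys
from pathlib import Path

# Step 1: locally configured source checkouts, project name -> directory.
SOURCE_DIRS = {
    "packagea": Path.home() / "src" / "packagea",  # hypothetical example
}

# Step 3: local cache of already-downloaded sdists/wheels.
WHEELHOUSE = Path.home() / ".cache" / "my-wheelhouse"


def install(package: str) -> None:
    pip = [sys.executable, "-m", "pip"]

    # Step 1: prefer a local source checkout if one is configured.
    src = SOURCE_DIRS.get(package.lower())
    if src and src.is_dir():
        subprocess.run(pip + ["install", str(src)], check=True)
        return

    # Step 3: try the local cache without touching any index at all.
    offline = subprocess.run(
        pip + ["install", "--no-index", "--find-links", str(WHEELHOUSE), package]
    )
    if offline.returncode == 0:
        return

    # Step 4: only hit PyPI after explicit confirmation, and keep what we fetch
    # in the wheelhouse so the next run stays offline.
    answer = input(f"{package} not available locally; fetch from PyPI? [y/N] ")
    if answer.strip().lower() != "y":
        raise SystemExit(f"refusing to download {package}")
    WHEELHOUSE.mkdir(parents=True, exist_ok=True)
    subprocess.run(pip + ["download", "--dest", str(WHEELHOUSE), package], check=True)
    subprocess.run(
        pip + ["install", "--no-index", "--find-links", str(WHEELHOUSE), package],
        check=True,
    )


if __name__ == "__main__":
    for name in sys.argv[1:]:
        install(name)
```

The annoying part is that something like this only covers my own top-level pip calls; Nox/Tox and build frontends still invoke pip themselves, which is why I'd want the behaviour inside pip (or behind an index), not in front of it.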
index-url or find-links don't cover this, because they require pre-built sdist/wheel packages. I want to be able to work offline as far as possible, and keep track of the exact sources I'm using. This is partially inspired by #Buildroot, which, as part of its concept, gives me an archive of all the sources that went into my firmware build.
As far as I can see right now, if I really want this without heavily patching pip, my only option would be to implement all of this as something that provides a web interface I can point index-url at, and I'm not excited about that idea. :neocat_laugh_sweat: