@brodriguesco can #Rix be replaced with #Conda? 🤔
Licensing issues make using #conda legally risky at the University (following a change to the terms of use in March 2024 that makes the solution paid for companies with more than 200 FTEs: https://mamot.fr/@rupdecat@fediscience.org/114476678505453616). Prefer free alternatives (#Miniconda for small projects, or otherwise go through the conda-forge channel so that conda only pulls open-source packages) #Python. https://www.cdotrends.com/story/4173/anaconda-threatens-legal-action-over-licensing-terms
@brunopostle I thought we could start a discussion here on the development of a Python package for IFC-to-gbXML-conversion, with the aim of making it available on PyPI and Conda. By having the discussion here, we might attract the interest of other contributors.
You've already done great work on your fork of MSVisschers' original repo. For reference, I'll link to your repo here: https://github.com/brunopostle/IFC-to-gbXML-converter
Ah yes, yet another 🐍 #Python library, because clearly we didn't have enough already, right? #Wetlands promises to streamline managing #Conda environments, but honestly, it feels like they wrote a whole novel about installing a lightbulb. 😏🔧 Good luck navigating that "minimal" example buried under a mountain of fluff! 📚
https://arthursw.github.io/wetlands/0.2.0/ #Libraries #Development #Programming #Humor #HackerNews #ngated
Wetlands – a lightweight Python library for managing Conda environments
https://arthursw.github.io/wetlands/0.2.0/
#HackerNews #Wetlands #Python #Conda #Library #Environment #Management #Lightweight
How best to create, maintain and archive custom environments from within Jupyter? I just updated the documentation for Carto-Lab Docker with examples for Python [1] and R [2].
The tricky part is linking kernels from custom envs with a Jupyter kernelspec (specifically if the Jupyter server and the kernel are in two different environments). However, most of this can be stored in Jupyter notebook cells, for reproducibility.
There's also a section on archival of package versions with Conda's `env export` (yml approach) and `conda list --explicit` (full archival).
[1]: https://cartolab.theplink.org/use-cases/#create-your-own-environment-in-a-bind-mount-and-install-the-ipkernel
[2]: https://cartolab.theplink.org/use-cases/#example-create-an-environment-with-a-specific-r-version
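A minimal sketch of that workflow, for reference (the prefix path and environment name below are placeholders, not the ones from the Carto-Lab docs; in a notebook cell, prefix each command with `!`):

```bash
# Create a custom env (e.g. in a bind mount) and register its kernel with a
# Jupyter server that lives in a different environment.
conda create --prefix /home/jovyan/work/envs/myenv -y python ipykernel
conda run --prefix /home/jovyan/work/envs/myenv \
    python -m ipykernel install --user --name myenv --display-name "Python (myenv)"

# Archive the environment for reproducibility.
conda env export --prefix /home/jovyan/work/envs/myenv > environment.yml     # yml approach
conda list --prefix /home/jovyan/work/envs/myenv --explicit > spec-file.txt  # full archival
```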
It doesn't matter how long you play the open source game, the first bug report from a downstream redistributor for a new project is a somewhat surreal experience. For all its many challenges, the development of free and open source software is an utterly astonishing feat of global collaboration. #python #venvstacks #conda #foss
Pixi v0.47.0 is out!
🔒 Making solving even more reproducible by setting the date with exclude-newer!
💪 More powerful task dependencies by sharing arguments.
@guenther
Disclaimer: not a Rust pro by any means, but I've been following the plot for some time and have 'Learn Rust' in my plans.
The post reads like:
- Your language and ecosystem promote #bloat and breaking compatibility between lib versions.
- But what about other languages and their cross-platform glue and sticks??!!!
Static linking (and its substitutes like Flathub etc.) is evil - see the Windows ecosystem. Yes, it's cheaper for the developer, but the cost is paid by users (and multiplied by the number of users).
Communities where breaking compatibility has become the norm (yes, Python, I'm looking at you) are trapped forever in the BS of "a separate environment for each tiny piece of code". And it creates horrible overhead. #Conda environments with 100k files for each tool, which every user creates separately, are a nightmare on #HPC systems. Containers with a few GB of overhead for each tiny tool are better, but still a massive waste.
Distros, package managers and stable APIs (or even ABIs) are there for a reason, and Rust reinvents/abandons all of this just because atrocities like npm became normal and it was easier to cut that corner.
#Conda-forge work be like:
You wait 15 hours for a build to finish. It times out after 15 hours, just barely before it finishes running the test suites.
You increase the timeout in conda-forge.yml. You wait 15 hours again. The build times out again.
You remember to rerender this time. Let's hope it will finish successfully this time…
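In case it saves someone a 15-hour round trip: a quick sketch of the rerender step (edits to conda-forge.yml only take effect in the generated CI configs after a rerender):

```bash
# Rerender locally with conda-smithy after editing conda-forge.yml...
conda install -c conda-forge conda-smithy
conda smithy rerender --commit auto
# ...or ask the bot to do it by commenting on the feedstock PR:
#   @conda-forge-admin, please rerender
```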
#HDF5 is doing great. So basically:
1. Originally, upstream used autotools. The build system installed an h5cc wrapper which — besides being a compiler wrapper — had a few config-tool style options.
2. Then, upstream added a #CMake build system as an alternative. It installed a different h5cc wrapper that no longer had the config-tool style options.
3. Downstreams that tried CMake quickly discovered that the new wrapper broke a lot of packages, so they reverted to autotools and reported a bug.
4. Upstream closed the bug, handwaving it as "CMake h5cc changes have been noted in the Release.txt at the time of change - archived copy should exist in the history files."
5. Upstream announced the plans to remove autotools support.
So, to summarize the current situation:
1. Pretty much everyone (at least #Arch, #Conda-forge, #Debian, #Fedora, #Gentoo) is building using autotools, because CMake builds cause too much breakage.
2. Downstreams originally judged this to be a HDF5 issue, so they didn't report bugs to affected packages. Not sure if they're even aware that HDF5 upstream rejected the report.
3. All packages remain "broken", and I'm guessing their authors may not even be aware of the problem, because, well, as I pointed out, everyone is still using autotools, and nobody reported the issues during initial CMake testing.
4. I'm not even sure if there is a good "fix" here. I honestly don't know the package, but it really sounds like the config-tool was removed with no replacement, so the only way forward might be for people to switch over to CMake (sigh) — which would of course break the packages almost everywhere, unless people also add fallbacks for compatibility with autotools builds.
5. The upstream's attitude suggests that HDF5 is pretty much a project unto itself, and doesn't care about its actual users.
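For anyone wondering what "config-tool style options" means in practice, this is roughly the kind of query downstream build scripts run against the autotools-installed h5cc (option names as I recall them from the autotools build; `h5cc -help` lists the full set):

```bash
# Config-tool style queries exposed by the autotools-built h5cc wrapper:
h5cc -show        # print the compile/link command line h5cc would use
h5cc -showconfig  # dump the HDF5 configuration summary
# The CMake-installed h5cc is just a plain compiler wrapper, so scripts relying
# on queries like these are what breaks when a distro switches to CMake.
```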
What's better: VS Code or JupyterLab?
#python #conda #FungiFriday #fungi #mycology2025 #Bioinformatics
🥳 @qgis 3.42 has landed in #condaforge
https://anaconda.org/conda-forge/qgis
My favorite way to install #QGIS
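If you want to go the same route, a minimal sketch (the environment name is arbitrary; mamba users can substitute their tool of choice):

```bash
# Install QGIS from conda-forge into its own environment and launch it.
conda create -n qgis -c conda-forge qgis
conda activate qgis
qgis
```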
NVIDIA's RAPIDS has turbocharged its @NVIDIA CUDA-enabled package building by moving from conda-build to rattler-build, unlocking incredible speedups. 🚀
Source: https://github.com/rapidsai/build-planning/issues/47#issuecomment-2695583003
#nvidia #rapids #rapidsai #packagemanagement #conda #rattler