#HPC

US Research Software Engineers (us_rse@fosstodon.org)
2025-12-11

The #NAIRR pilot has evolved from concept to a real shared #AI research platform that connects researchers, educators, industry partners, and national labs. Sandra Gesing, executive director of US-RSE, highlights how this 2-year-old collaborative infrastructure is lowering barriers for teams to fuel creativity, discovery, and possibility.

👉 Read more: govtech.com/education/higher-e

#RSE #RSEng #ResearchSoftwareEngineering #Innovation #Collaboration #HPC

Snakemake Release Robot (snakemake@fediscience.org)
2025-12-11

Beep, Beep - I am your friendly #Snakemake release announcement bot.

There is a new release of the Snakemake executor for #SLURM on #HPC systems. It is now at version 2.0.3!

Give us some time, and you will automatically find the plugin on #Bioconda and #Pypi.

If you want to discuss the release, you will find the maintainers here on Mastodon!
@rupdecat and @johanneskoester

If you discover any issues, please report them on github.com/snakemake/snakemake.

See github.com/snakemake/snakemake for details. Here is the header of the changelog:
Release Notes (possibly abridged):
Bug Fixes

* ci slurm check: github.com/snakemake/snakemake

Snakemake HPC logo for Mastodon
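For anyone picking up the new executor version, here is a minimal sketch of a Snakemake profile that selects it. The file location, partition name, and resource values are illustrative assumptions; consult the plugin's documentation for the authoritative option names.

```yaml
# config.yaml of a Snakemake profile (path and values are illustrative)
executor: slurm            # use the SLURM executor plugin announced above
jobs: 10                   # cap on concurrently submitted jobs
default-resources:
  slurm_partition: "batch" # hypothetical partition name, site-specific
  runtime: 30              # minutes per job
```

With a profile like this in place, a plain `snakemake` invocation would submit rules as SLURM jobs instead of running them locally.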
2025-12-11

Now we are checking some fans, again with GraceHopper. #HPC #Linux

2025-12-11

#HPC Child labor… My son is working with GraceHopper.

Austrian Scientific Computing (ASC) (asc.ac.at@bsky.brid.gy)
2025-12-11

CALL FOR ABSTRACTS for #ASHPC26 is open! Share your work related to #supercomputing with the community:
• applications of #HPC
• research done on HPC clusters
• systems engineering
• other topics related to HPC
More info & submission: ashpc.eu/event/27/abs...

2025-12-11

The feeling when noticing that really expensive #gracehopper requires #swap for reclaiming memory. Flushing nearly 200GB of memory via swap must be really efficient...
docs.nvidia.com/grace-perf-tun
#HPC

2025-12-10

A new Admin Update landed in my inbox. It included the interestingly titled article "Fishing with Remora" - admin-magazine.com/Articles/Fi

This immediately brought a Gary Larson cartoon to mind… and, in a way, it is entirely appropriate. Image source in alt text.

#Linux #Sysadmin #HPC #PerformanceTesting

A scan of a book or newsprint edition of Gary Larson's "The heartbreak of remoras" cartoon. A great white shark stands in front of a mirror-topped chest of drawers. In the reflection the shark has a despairing look, with three remoras attached to its 'chest'. The implication is that the parasitic, vampire-like remoras are akin to debilitating or fatal human conditions such as multiple sclerosis, stage 4 cancer, or type 1 diabetes.

Image sourced from ifunny - https://ifunny.co/picture/pond-the-heartbreak-of-remoras-bJ1EoxMsA
2025-12-10

🌟 Comparing the A100 and H100 on local-storage performance for multi-GPU model loading: PCIe Gen4 is a serious bottleneck at cold start!
- **A100**: PCIe Gen4, throughput collapses (0.2 GiB/s) when scaling to 4 GPUs.
- **H100**: PCIe Gen5, sustains 2.2 GiB/s, roughly 10x faster than the A100.
⚠️ Takeaway: when building or renting a system, don't look only at FLOPS; check the PCIe generation and disk I/O to optimize cold start!

#AI #HPC #GPU #NhàTưVấnCôngNghệ #Benchmark #PCIe #LậpTrìnhML

reddit.com/r
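The gap described above can be made concrete with back-of-envelope arithmetic. The two throughputs are the figures quoted in the post; the 40 GiB checkpoint size is a made-up illustration, not a number from the benchmark.

```python
# Cold-start load-time estimate from effective storage-to-GPU throughput.
# 0.2 and 2.2 GiB/s are the figures quoted in the post; the 40 GiB
# checkpoint is a hypothetical example size.
def load_time_s(checkpoint_gib: float, throughput_gib_s: float) -> float:
    """Seconds to stream a checkpoint at a given sustained throughput."""
    return checkpoint_gib / throughput_gib_s

for name, tput in [("A100 / PCIe Gen4", 0.2), ("H100 / PCIe Gen5", 2.2)]:
    print(f"{name}: {load_time_s(40.0, tput):.0f} s for a 40 GiB checkpoint")
```

At the quoted rates, the same checkpoint takes roughly 200 s on the Gen4 path versus about 18 s on Gen5, which is the order-of-magnitude difference the post is pointing at.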

Orhun Parmaksız 👾 (orhun@fosstodon.org)
2025-12-10

Managing cluster jobs... from the terminal 💯

🌀 **turm** — A TUI for the Slurm Workload Manager.

🔥 Supports all squeue flags, auto-updates logs & more!

🦀 Written in Rust & built with @ratatui_rs

⭐ GitHub: github.com/kabouzeid/turm

#rustlang #ratatui #tui #hpc #slurm #observability #devops #terminal

SEANERGYS project (seanergys_project)
2025-12-10

Meet our consortium: Today, we are pleased to present Cineca, Italy's largest supercomputing centre, which is globally recognized for its leadership in .

In SEANERGYS, Cineca will play a leading role in:
– the release and evaluation of the CMI (Comprehensive Monitoring Infrastructure) implementation,
– the full implementation of all components of the DSMR (Dynamic Resource Management System),
– the deployment of the SEANERGYS software stack on production systems and its final evaluation.

2025-12-10

A question about deploying and scaling LLMs on current hardware (Tesla V100 32GB) plus a GPU upgrade (V100 → RTX 5090?). Looking for advice on the maximum model size (7B, 13B, 30B?), the best framework (vLLM, TensorRT-LLM...), getting the most out of older hardware, and whether to upgrade to an RTX 5090 or a data-center GPU (A100, H100). #AI #LLM #GPU #KỹThuậtHàngĐầu #Tech #MachineLearning #DeepLearning #HPC

reddit.com/r/LocalLLaMA/commen
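A rough way to reason about the model-size part of that question: the weights alone occupy roughly params × bytes-per-param of VRAM, and the KV cache and activations add more on top. A sketch, where the byte counts for FP16 and 4-bit quantization are standard figures, not numbers from the post:

```python
# Weights-only VRAM estimate; real usage adds KV cache, activations, overhead.
def weights_gib(params_billion: float, bytes_per_param: float) -> float:
    """Approximate GiB needed to hold the model weights alone."""
    return params_billion * 1e9 * bytes_per_param / 2**30

for p in (7, 13, 30):
    fp16 = weights_gib(p, 2.0)  # FP16: 2 bytes per parameter
    q4 = weights_gib(p, 0.5)    # 4-bit quantization: ~0.5 bytes per parameter
    print(f"{p}B: FP16 ~{fp16:.0f} GiB, 4-bit ~{q4:.0f} GiB")
```

By this estimate, 7B and 13B models fit a 32 GiB V100 in FP16 (about 13 and 24 GiB of weights), while a 30B model (about 56 GiB in FP16) needs quantization or multiple GPUs before considering runtime overhead.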

2025-12-09

big chip feels smol when you take off all the air cooling.

#gpu #hpc #building

RTX 5090 GPU with air cooling removed, being prepped for a water block
Christian Meesters (rupdecat@fediscience.org)
2025-12-09

@Dutch_Reproducibility_Network

In fact, I am a #Snakemake co-maintainer and teacher. I was not aware of WorkflowHub - and that was an omission on my part. We actually support and favour this kind of registration: snakemake.readthedocs.io/en/st

In my original post, I also neglected to mention the integration of WorkflowHub with #RO-Crate and, in turn, the integration of RO-Crates with nanopubs. I am actively working on better support for #nanopub and RO-Crates with @fbartusch. The question of how I teach that still stands: the #HPC world (at least my bubble) is not really supportive of #ReproducibleComputing. All #OpenScience shenanigans are frowned upon. And PIs in my vicinity are still at this level: phdcomics.com/comics/archive.p - so, how do we educate the educators?

2025-12-08

Day off from #code to get into some #hardware fun. Fixing a leak on my 4-GPU machine and building a new server to sit in my office for some model building. That one will have a 5090 and a 32-core threadrippa’

Trying a new German company for the liquid cooling components since EKWB seems to be belly up. My interactions with them have been stellar, and I’m excited to check out the quality of their gear!

#GPU #StructuralBiology #cryoEM #HPC

Snakemake Release Robot (snakemake@fediscience.org)
2025-12-08

Beep, Beep - I am your friendly #Snakemake release announcement bot.

There is a new release of the Snakemake executor for #SLURM on #HPC systems. It is now at version 2.0.2!

Give us some time, and you will automatically find the plugin on #Bioconda and #Pypi.

If you want to discuss the release, you will find the maintainers here on Mastodon!
@rupdecat and @johanneskoester

If you discover any issues, please report them on github.com/snakemake/snakemake.

See github.com/snakemake/snakemake for details. Here is the header of the changelog:
Release Notes (possibly abridged):
Bug Fixes

* partition cluster selection: github.com/snakemake/snakemake

Snakemake HPC logo for Mastodon
ilhan özgen xian (hanbruder@ecoevo.social)
2025-12-08

Daniel's talk on the current state of SERGHEI, our #HPC hydrodynamic code that leverages the #kokkos framework:
youtu.be/scI8jB2e5UQ
