#MOEmodels

2025-10-31

If you have two MI50s and a 7900 XT, can you use the 7900 XT for prompt processing in MoE models while leveraging the faster VRAM of the MI50s? Or combine them for better performance? Thrilling tech experimentation! #TechTalk #VRAM #MoEmodels #GamingPC #Multigpu

reddit.com/r/LocalLLaMA/commen
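One common way to attempt this split in llama.cpp is to pin the MoE expert tensors to specific devices while the faster card keeps the rest. A minimal sketch, assuming a llama.cpp build with the ROCm backend, that the devices enumerate as ROCm0 (7900 XT), ROCm1/ROCm2 (MI50s), and a hypothetical model filename; the tensor-name pattern and split ratios are assumptions to adapt:

```shell
# Sketch: route MoE expert weights (ffn_*_exps tensors) to the MI50s via
# --override-tensor, leaving attention/shared tensors and prompt processing
# on the 7900 XT (main GPU). Model name and regex are illustrative.
llama-server \
  -m Qwen3-30B-A3B-Q4_K_M.gguf \
  -ngl 99 \
  --main-gpu 0 \
  -ot 'blk\.([0-9]|1[0-9]|2[0-3])\.ffn_.*_exps\.=ROCm1' \
  -ot 'blk\.(2[4-9]|3[0-9]|4[0-7])\.ffn_.*_exps\.=ROCm2' \
  --tensor-split 2,1,1
```

Whether this beats a plain layer split depends on PCIe bandwidth between the cards, since activations cross the bus every layer.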

2025-10-15

Update: tested several MoE (mixture of experts) models in the 10-35B range on a miniPC with an iGPU. Specs: Kubuntu 25.10, Ryzen 6800H CPU, 64 GB DDR5 RAM, AMD Radeon iGPU. Results: Ling-Coder-lite.i1-Q4_K_M (t/s: 399-487), Qwen3-30B (t/s: 171-292).

Tags: #MoEmodels #AI #Tech #Vietnamese #Gaming #LLM

reddit.com/r/LocalLLaMA/commen
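Numbers like the t/s figures above are typically produced with llama-bench, which reports prompt-processing (pp) and token-generation (tg) throughput separately. A minimal sketch, assuming llama.cpp built with a GPU backend and the model file present locally; the prompt/generation lengths are illustrative defaults:

```shell
# Sketch: benchmark prompt processing (512 tokens) and generation
# (128 tokens) with all layers offloaded to the iGPU.
llama-bench -m Ling-Coder-lite.i1-Q4_K_M.gguf -ngl 99 -p 512 -n 128
```

The wide t/s spreads quoted in the post likely reflect pp versus tg rates, which differ by an order of magnitude on memory-bound iGPU setups.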
