#ServerBuild

Server build incoming! 🛠️

Joining forces with LogArtithmic for some collaborative chaos. Expect cybersecurity deep dives, Linux love, and general chill vibes.

Let's build something awesome together! 🔴

#Cybersecurity #Linux #ServerBuild

kick.com/chiefgyk3d

🔴 LIVE • 1 viewer • Just Chatting
RandomGondolatokrandomdolgok
2026-02-04

Almost there.
The server is basically ready — it just needs a little more tweaking.
One silent fan in the power supply, and everything will finally run as it should.
Patience, small upgrades, and a lot of love for quiet hardware.

2026-01-28

Detailed update on a ~$17,000 USD AI server: 512GB DDR4, 256GB VRAM (8x 3090 + 2x 5090), and a 64-core Threadripper Pro 3995WX CPU, in a Core W200 case. The video walks through building and testing the system, highlighting the creativity of using consumer parts instead of expensive server hardware. Watch the video on YouTube!

#AI #XâyDựngServer #CôngNghệ #Vietnam #ServerBuild #TechInnovation #SángTạoCôngNghệ #AIHosting

v.redd.it/trvmg2cpp5gg1

2025-11-08

Building a Lenovo P920 server to run LLaMA models with very large context handling. The configuration includes dual Xeon 6134 CPUs, 512GB RAM, and two 32GB MI50 GPUs. The goal is a context window of 256k to 512k. #LLaMA #AI #MachineLearning #ServerBuild #LenovoP920 #DeepLearning

reddit.com/r/LocalLLaMA/commen
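A 256k-512k context goal is mostly a KV-cache memory problem, not a weights problem. A minimal back-of-envelope sketch, assuming a hypothetical LLaMA-70B-like config with grouped-query attention (80 layers, 8 KV heads, head dim 128, fp16 cache); real model configs vary, so treat these numbers as illustrative only:

```python
def kv_cache_bytes(context_len, n_layers=80, n_kv_heads=8,
                   head_dim=128, dtype_bytes=2):
    """Rough KV-cache size for a dense transformer with GQA.

    K and V each store n_layers * n_kv_heads * head_dim values
    per token, hence the leading factor of 2.
    """
    return 2 * n_layers * n_kv_heads * head_dim * dtype_bytes * context_len

gib = kv_cache_bytes(256 * 1024) / 2**30
print(f"KV cache at 256k context: {gib:.0f} GiB")  # prints 80 GiB
```

Under these assumed numbers, 256k of fp16 cache alone is 80 GiB, which is more than two 32GB MI50s hold, so a build like this would likely lean on cache quantization or CPU offload to hit that target.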

2025-10-08

A user is looking for Mini-ITX motherboard suggestions with ECC support for a personal home server. They plan to use a Ryzen 7 Pro 2700 CPU and DDR4 ECC RAM, housed in an old mini-PC case. The server will initially run Minecraft, with more functions added later. Any suggestions?
#homeserver #serverbuild #MiniITX #ECC #Ryzen #tech #mayserver #laprapserver #bo_mach_chu #congnghe

reddit.com/r/selfhosted/commen

2025-09-26

New blog post!

In which I write about how I built a rack-mounted 2U NAS from easily available and sometimes second hand parts, and what obstacles I witnessed along the way.

Did you know that even if something *technically* fits, it does not mean it fits in a way that allows it to work? Now I know that.

https://stfn.pl/blog/82-2u-rack-mounted-server/

#homelab #selfhosted #serverbuild

A closeup of computer hardware. There's a motherboard with two sticks of RAM, and a small tower cooler with the text COOLSERVER 2 BALL BEARING on the fan hub.
Debby ‬⁂📎🐧:disability_flag:debby@hear-me.social
2025-09-13

Hey everyone 👋

I’m diving deeper into running AI models locally—because, let’s be real, the cloud is just someone else’s computer, and I’d rather have full control over my setup. Renting server space is cheap and easy, but it doesn’t give me the hands-on freedom I’m craving.

So, I’m thinking about building my own AI server/workstation! I’ve been eyeing some used ThinkStations (like the P620) or even a server rack, depending on cost and value. But I’d love your advice!

My Goal:
Run larger LLMs locally on a budget-friendly but powerful setup. Since I don’t need gaming features (ray tracing, DLSS, etc.), I’m leaning toward used server GPUs that offer great performance for AI workloads.

Questions for the Community:
1. Does anyone have experience with these GPUs? Which one would you recommend for running larger LLMs locally?
2. Are there other budget-friendly server GPUs I might have missed that are great for AI workloads?
3. Any tips for building a cost-effective AI workstation? (Cooling, power supply, compatibility, etc.)
4. What’s your go-to setup for local AI inference? I’d love to hear about your experiences!

I’m all about balancing cost and performance, so any insights or recommendations are hugely appreciated.

Thanks in advance! 🙌

@selfhosted@a.gup.pe #AIServer #LocalAI #BudgetBuild #LLM #GPUAdvice #Homelab #AIHardware #DIYAI #ServerGPU #ThinkStation #UsedTech #AICommunity #OpenSourceAI #SelfHostedAI #TechAdvice #AIWorkstation #LocalAI #LLM #MachineLearning #AIResearch #FediverseAI #LinuxAI #AIBuild #DeepLearning #OpenSourceAI #ServerBuild #ThinkStation #BudgetAI #AIEdgeComputing #Questions #CommunityQuestions #HomeLab #HomeServer #Ailab #llmlab

What is the best used GPU pick for AI researchers?
GPUs I'm considering:

| GPU Model | VRAM | Pros | Cons/Notes |
|---|---|---|---|
| Nvidia Tesla M40 | 24GB GDDR5 | Reliable, less costly than V100 | Older architecture, but solid for budget builds |
| Nvidia Tesla M10 | 32GB (4x 8GB) | High total VRAM, budget-friendly on used market | Split VRAM might limit some workloads |
| AMD Radeon Instinct MI50 | 32GB HBM2 | High bandwidth, strong FP16/FP32, ROCm support | ROCm ecosystem is improving but not as mature as CUDA |
| Nvidia Tesla V100 | 32GB HBM2 | Mature AI hardware, strong Linux/CUDA support | Pricier than M40/M10 but excellent performance |
| Nvidia A40 | 48GB GDDR6 | Huge VRAM, server-grade GPU | Expensive, but future-proof for larger models |
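One way to sanity-check the table against "larger LLMs": estimate VRAM for the model weights alone. This is a rough sketch, not a sizing tool; the ~10% overhead margin for buffers and activations is my own assumption, and real usage depends on runtime and context length:

```python
def weight_vram_gib(n_params_billion, bits_per_param, overhead=1.10):
    """Back-of-envelope VRAM for model weights.

    parameters * bits-per-parameter / 8, plus an assumed ~10%
    margin for runtime buffers (hypothetical, not measured).
    """
    weight_bytes = n_params_billion * 1e9 * bits_per_param / 8
    return weight_bytes * overhead / 2**30

# A 70B model at 4-bit quantization: roughly 36 GiB of weights,
# so it fits in a single 48GB A40 or split across two 24GB cards.
print(f"{weight_vram_gib(70, 4):.1f} GiB")
```

By the same estimate, the same model at fp16 needs ~143 GiB, which is why used high-VRAM server cards are attractive for this kind of build.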
2025-08-16

I managed an almost complete setup; the only hardware I couldn't buy non-American is the processor. I went with a Ryzen 5; I looked for a Loongson 3A5000 but couldn't find one 😂

Updated setup:
ASRock B450M-HDV
AMD Ryzen 5 4500
Fractal Design Node 304 cube
Cooler Master HTK-002
2 x Kingston DDR4 SODIMM FURY Impact 2x8GB 3200
be quiet! Pure Power 12 550W
1 x 240GB SSD for the OS
2 x 12TB HDD for NAS video/audio
2 x 4TB SSD for data

Ivan Todorovivantodorov
2025-07-25

Specs for now:

• Gigabyte Z390 UD
• Intel Pentium Gold G5420
• 16GB DDR4
• 447GB NVMe

Still figuring out how to mount extra 3.5" drives up front (the AirMount system is long gone) — might reuse 5.25" brackets or print custom cages, but we'll see. Any advice is welcome!

Motherboard in the PC case.
:nfld_tri: 🇨🇦 CowMan 🇪🇺 🇺🇦 🇲🇽cowman@nfld.me
2025-04-15

There we go, addition of a #Bhutan flag emblem, a couple of stickers on the PSU and ... slide in a 6-drive SSD bay ~~~ this machine is pretty much done, aside from a little cleanup (next time it is offline).

Interestingly, if I don't have a pair of SSDs in the top, the case pressure blows out the front.

( I posted pics to #CowServer hashtag as I went along )

[ #HomeServer #HomeLab #ServerBuild #PCbuild ]

View inside the server, messy wires. 'Crystal' RGB strip and motherboard lights in pink.
Drive bay opened up. The top was dented and required some straightening with a chisel. Drive bay fans here replaced by Noctua 40x20 guys.
Two servers in the rack, the new one on top.
Debby ‬⁂📎🐧:disability_flag:debby@hear-me.social
2025-02-12

@hendrik @pandasiusfilet

Hey there! 😊 Sorry to bother you, but I'm considering upgrading my homelab with some P40 or M100 GPUs. I mainly need more VRAM for my projects, and I was really inspired by the channel "Homelab AI" and their video on building a DIY 4x Nvidia P40 homeserver with 96GB of VRAM! If you haven't seen it yet, you can check it out here: DIY 4x Nvidia P40 Homeserver for A youtu.be/dHTvpUlWFbk . I'd love to hear your thoughts or any tips you might have. Thanks a lot! 🙌 #ai #machinelearning #artificialintelligence #Homelab #AI #Nvidia #GPUs #P40 #M100 #VRAM #DIY #TechProjects #MachineLearning #DataScience #ServerBuild #TechInspiration #HomeServer #CloudComputing

:csgoskull:​ k͕̽/\̲̻͍ͬ̾̽tn͑̿͐ǰi̘͕̒̿å :dizzy2:​katnjia@monsterpit.net
2019-12-23

There's something extremely odd and very satisfying about running hardware with a wide generational gap on one system. :geblobeyes:​

My server build now is filled up with:

  • A 19-year-old sound card
  • A 13-year-old graphics card
  • An 11-year-old CPU and motherboard
  • A 10-year-old serial controller
  • An 8-year-old SAS controller
  • A 3-year-old PSU

On top of that, I added a ~30-year-old floppy drive, for good measure.

Phew~ :geblobsweats:​

I'll remove some of the hardware tho. I doubt I'll have much use for the audio card or the serial controller. But it's still fun to see all of them work. :blobcathappy:​

On that note, the audio card works phenomenally well? It's a SoundBlaster Live! 5.1 card. It has support for old Win 9x/DOS, but also works on modern Linux without a hiccup! :ablobcathappypaws:​
