#HBM3

🔘 G◍M◍◍T 🔘 (gomoot@mastodon.uno)
2025-03-19
Computer Engineering JMU (ce@mastodon.acm.org)
2024-12-17

🚀 Excited to present our paper on DRAMPower 5 at the HiPEAC conference this January! It introduces a revamped DRAM power simulator supporting DDR5, LPDDR5, and HBM3 with improved speed and accuracy. Check out the open-source tool on GitHub!

github.com/tukl-msd/DRAMPower

We thank @bmbf_bund for funding within the DI-DRAMSys project.

#HiPEAC2024 #DRAM #DDR5 #LPDDR5 #HBM3 #opensource

[Image: the paper and some DRAM DIMMs]
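The kind of model a DRAM power simulator implements can be illustrated with a simplified current-based estimate: weight the device's state currents (IDD values) by the fraction of time spent in each state. This is only a sketch of the general approach; the real tool models many more states and transitions, and the current and voltage values below are illustrative placeholders, not datasheet figures.

```python
# Simplified DRAM average-power sketch in the spirit of IDD-based
# power models. All current/voltage values are placeholders.

VDD = 1.1  # supply voltage in volts (placeholder)

# Placeholder IDD currents in amps for a single device
IDD3N = 0.055   # active standby current
IDD0  = 0.065   # current during an ACT-PRE cycle
IDD4R = 0.150   # burst read current

def avg_power(act_fraction: float, read_fraction: float) -> float:
    """Average power: state currents weighted by time spent in each state."""
    idle_fraction = 1.0 - act_fraction - read_fraction
    i_avg = (IDD3N * idle_fraction
             + IDD0 * act_fraction
             + IDD4R * read_fraction)
    return VDD * i_avg  # watts

# Example: 20% of time activating/precharging, 30% reading
print(avg_power(0.2, 0.3))  # roughly 0.094 W
```

The point of a dedicated simulator is precisely that this naive weighting is not enough: command timing, bank states, and transition energies all matter, which is what motivates cycle-accurate tools.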
Benjamin Carr, Ph.D. 👨🏻‍💻🧬 (BenjaminHCCarr@hachyderm.io)
2024-11-21

#AMD crafts custom #EPYC #CPU with #HBM3 memory for #Microsoft #Azure: a CPU with 88 #Zen4 cores and 450GB of HBM3 that may be a repurposed #MI300C; four chips hit 7TB/s.
The #HBv series of Azure #VMs is focused on delivering high memory bandwidth, which Microsoft calls the "biggest #HPC bottleneck." Previously, Microsoft used #MilanX and #GenoaX with AMD 3D #VCache to provide extra bandwidth, but for the latest #HBv5 VMs Microsoft clearly wanted something even more performant.
tomshardware.com/pc-components
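The aggregate figure quoted above implies a straightforward per-socket number, assuming the 7 TB/s is split evenly across the four chips:

```python
# Back-of-the-envelope check on the quoted HBv5 figures:
# four HBM3-equipped EPYC chips delivering ~7 TB/s aggregate.
total_bw_tbs = 7.0   # aggregate bandwidth quoted for the node, TB/s
chips = 4
per_chip = total_bw_tbs / chips
print(per_chip)  # 1.75 TB/s per chip
```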

The Japan Times (thejapantimes)
2024-07-24

Nvidia has cleared Samsung Electronics' fourth-generation high bandwidth memory for use in its processors for the first time, three people briefed on the matter said. japantimes.co.jp/business/2024

🔘 G◍M◍◍T 🔘 (gomoot@mastodon.uno)
2024-05-15

💻 Samsung and SK Hynix halt DDR3 memory production to meet demand for HBM3 memory

gomoot.com/samsung-e-sk-hynix-

#blog #ddr3 #DRAM #HBM #HBM2E #HBM3 #HBM4 #ia #news #picks #tech #tecnologia

2024-02-27

SK Hynix, Micron, and Samsung are three giants of the memory manufacturing industry. They are currently taking significant steps to meet the growing demand for high-performance memory (HBM3E) for artificial intelligence (AI) applications. These developments are crucial for technological progress and innovation across many fields. […]

https://gomoot.com/la-corsa-alloro-sulle-memorie-hbm3e-di-sk-hynix-micron-e-samsung/

[Image: Nvidia H200, HBM3E, Micron]
ComputerBase (ComputerBase)
2024-02-26
🔘 G◍M◍◍T 🔘 (gomoot@mastodon.uno)
2023-11-04

💡 SK Hynix and Samsung race for dominance in HBM3 memory
The promising future of HBM3 memory for AI and the key role of SK Hynix and Samsung

gomoot.com/sk-hynix-e-samsung-

#amd #datacenter #server #HBM3 #HBM3e #ia #ai #intelligenzaartificiale #nvidia #Samsung #skhynix

HPC Guru (HPC_Guru)
2023-08-03

Samsung is currently working with Nvidia on technical verification of its HBM3 and on advanced packaging services.

Currently, TSMC packages SK Hynix's HBM3 with Nvidia GPUs.

businesskorea.co.kr/news/artic

WinFuture.de (WinFuture)
2023-07-27

Micron can report a new speed record among available memory modules: its second-generation chips reach 1.2 TByte/s of bandwidth: winfuture.de/news,137608.html?

ComputerBase (ComputerBase)
2023-07-26
HPC Guru (HPC_Guru)
2023-07-26

Micron's new HBM3 Gen2 is the world's fastest at 1.2 TB/s, and Micron teases a next generation at 2 TB/s+

o Highest-capacity 8-high stack at 24GB (36GB is coming)

o Most power efficient, with a 2.5x performance-per-watt improvement compared to HBM2E

tomshardware.com/news/microns-

Benjamin Carr, Ph.D. 👨🏻‍💻🧬 (BenjaminHCCarr@hachyderm.io)
2023-06-15

#AMD Has a #GPU to Rival #Nvidia’s #H100
#MI300X is a GPU-only version of the previously announced #MI300A supercomputing chip, which includes a #CPU and #GPU. The MI300A will be in El Capitan, a supercomputer coming next year to #LawrenceLivermore #NationalLaboratory. El Capitan is expected to surpass 2 exaflops of performance. The MI300X has 192GB of #HBM3, which Su said is 2.4 times the memory density of Nvidia's H100. The SXM and PCIe versions of the H100 have 80GB of HBM3.
hpcwire.com/2023/06/13/amd-has
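The "2.4 times" figure checks out directly from the two quoted capacities:

```python
# Verify the quoted memory-capacity ratio: AMD MI300X vs. Nvidia H100.
mi300x_gb = 192   # HBM3 capacity on the MI300X
h100_gb = 80      # HBM3 capacity on the H100 (SXM and PCIe)
print(mi300x_gb / h100_gb)  # 2.4
```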

Benjamin Carr, Ph.D. 👨🏻‍💻🧬 (BenjaminHCCarr@hachyderm.io)
2023-06-14

#AMD's Instinct #MI300 is THE chance to chip into #NVIDIA's #AI share
NVIDIA is facing very long lead times for its #H100 and #A100; if you want NVIDIA for AI and have not yet ordered, don't expect delivery before 2024. As a traditional #GPU, the MI300X is a GPU-only part: all four center tiles are GPU. With 192GB of #HBM, AMD can simply fit more onto a single GPU than NVIDIA. The #MI300A has 24 #Zen4 cores, #CDNA3 GPU cores, and 128GB of #HBM3; this is the CPU deployed in the 2+ exaflop El Capitan #supercomputer.
servethehome.com/amd-instinct-

Yes!Online (yesonline)
2023-04-21

RT @HPC_Guru: SK hynix Develops Industry's First 12-Layer HBM3, Provides Samples to Customers

t.co/YUvfANp35u

ComputerBase (ComputerBase)
2023-04-20
heise online (inoffiziell) · heiseonline@squeet.me
2022-06-09
SK Hynix has started volume production of HBM3 RAM; Nvidia is the first major customer.
HBM3: SK Hynix supplies stacked memory for Nvidia's Hopper GPUs
