16GB is a practical limit of the current RAM bus.
There is no larger capacity with a 32-bit-wide bus, so there must be either a 64-bit bus or multiple RAM channels. Both solutions need a lot of changes, so we may not see them for a long time.
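The scaling argument above can be sketched with some rough arithmetic. The figures below are assumptions for illustration (16GB per 32-bit channel with common LPDDR die densities), not official Raspberry Pi specs:

```python
# Rough sketch: how RAM capacity scales with bus width, assuming
# each 32-bit channel tops out at 16 GB with current LPDDR densities.

def max_capacity_gb(bus_width_bits, gb_per_32bit_channel=16):
    # A 64-bit bus behaves like two 32-bit channels side by side,
    # doubling both capacity and peak bandwidth.
    channels = bus_width_bits // 32
    return channels * gb_per_32bit_channel

print(max_capacity_gb(32))  # 16 -- today's practical ceiling
print(max_capacity_gb(64))  # 32 -- a 64-bit bus or dual-channel design
```

Either wider-bus option means a new SoC memory controller and board layout, which is why it is unlikely to arrive soon.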
But how do you get your AI model into RAM? So you want more PCIe 5 lanes... and better cooling solutions, so we can run your 8 x 128GB HBM Pi 6 cluster at full speed for a few minutes.
Great education device with a local LLM teacher. I can wait six months to buy my Pi X for 10X$ at X-mas! ;)
Statistics: Posted by crumble — Thu Jun 26, 2025 6:30 am