After looking at Micron Server DRAM and seeing how massively expensive it is, I went to look for the differences.
This 11-year-old post states that there are differences in reliability, ECC support, and "the ability to have them replaced when they start warning of failure rather than after failing" between server RAM and consumer RAM.
This blog from 2020 lists the differences as ECC support, dual-channel support, and the claim that most consumer PCs only use a 32-bit architecture.
And then there are all the other posts here asking whether server RAM works in their PC, with or without ECC.
So, it seems like the data is either outdated or wrong:
- Consumer motherboards have started supporting ECC (like my X570).
- I doubt the differences in reliability (lifetime) since it seems they're all made mostly the same.
- I don't know why anyone would run 32-bit for ANYTHING other than to support some random legacy software, which you could run anyway since 64-bit is backwards-compatible.
- I don't know of any motherboard made this decade without dual-channel support.
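Since ECC keeps coming up as the main differentiator, here's a toy sketch of what an ECC module's extra chip buys you. This uses a Hamming(7,4) code for illustration only; real ECC DIMMs use a SECDED code over 64 data bits with 8 check bits, not this exact scheme, and the correction happens in the memory controller, not in software.

```python
# Toy single-error-correcting Hamming(7,4) code, to illustrate what ECC RAM
# does in hardware. Real DIMMs protect 64-bit words with 8 check bits (SECDED);
# this is just the 4-bit version of the same idea.

def encode(nibble):
    """Encode 4 data bits (int 0-15) into a 7-bit Hamming(7,4) codeword."""
    d = [(nibble >> i) & 1 for i in range(4)]  # d0..d3
    p1 = d[0] ^ d[1] ^ d[3]                    # covers codeword bits 1,3,5,7
    p2 = d[0] ^ d[2] ^ d[3]                    # covers codeword bits 2,3,6,7
    p4 = d[1] ^ d[2] ^ d[3]                    # covers codeword bits 4,5,6,7
    # Codeword layout, positions 1..7: p1 p2 d0 p4 d1 d2 d3
    return [p1, p2, d[0], p4, d[1], d[2], d[3]]

def decode(bits):
    """Correct up to one flipped bit, then return the 4 data bits."""
    b = bits[:]
    s1 = b[0] ^ b[2] ^ b[4] ^ b[6]
    s2 = b[1] ^ b[2] ^ b[5] ^ b[6]
    s3 = b[3] ^ b[4] ^ b[5] ^ b[6]
    syndrome = s1 | (s2 << 1) | (s3 << 2)  # 1-based position of the flipped bit
    if syndrome:
        b[syndrome - 1] ^= 1               # correct the single-bit error
    return b[2] | (b[4] << 1) | (b[5] << 2) | (b[6] << 3)

word = 0b1011
code = encode(word)
code[4] ^= 1                 # simulate a cosmic-ray bit flip in storage
assert decode(code) == word  # the error is silently corrected on read
```

Non-ECC RAM would just hand that flipped bit back to the CPU; the ECC variant detects and fixes it transparently, which is the whole pitch for servers that can't tolerate silent corruption.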
So now I'm thinking the differences are only: capacity per stick, the supposed 'failure warning' feature alleged in that 11-year-old post, and maybe reliability. And cost. Mainly cost.
Have there been new advancements I don't know about? Why is Micron Server DRAM so expensive? Why not just use consumer DRAM and a consumer motherboard with ECC and DIMM support (other than needing more DIMM slots)?
Edit: This was suggested, saying Intel phased out ECC in consumer products. That could explain why server DRAM is expensive (since ECC is now much more of a premium spec). Now, does this mean that RAM with ECC is just 'server RAM'? Looking at one of Micron's distributors shows that the only notable differences in upgrading to this kind of RAM are the warranty, the price (somehow cheaper?), and an extended lifetime.
I'm going to call this solved. Server RAM basically just gives you peace of mind.