It is astonishing how much RAM modern systems can have. You could store many databases entirely in memory.
https://www.phoronix.com/review/intel-xeon-6980p-performance
It's obviously important to push the cutting edge, and it's imaginable that someday a chip like this could be in a personal computer (over 100 cores and over 4 TB of memory!).
The only use cases I can think of (I don't have experience with any of them, so I don't know how reasonable this is) would be high-performance instances at data centers, supercomputing, and high-performance mainframes (is that still a thing?).
I imagine cloud service providers would buy it.
On a price/performance chart, you'd think you'd be better off buying consumer-grade Core i9 CPUs, or even i7, but that ignores everything else that must be bought, like a motherboard and RAM, and doesn't even account for the added rack space.
Put 256 GB of RAM next to that CPU and AWS can host 64 c7i.large (2 vCPU, 4 GB) instances at $64/month each, or $4,096/month total. They'd recoup the cost of the CPU in under 5 months.
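The break-even math above can be sketched in a few lines. The CPU's list price is my own assumption (roughly the figure reported at launch, not stated in the thread), and, like the comment, this ignores RAM, board, power, and rack costs:

```python
# Back-of-envelope RoI for hosting c7i.large-sized instances on one chip.
cpu_list_price = 17_800            # assumed list price in USD (not from the thread)
instances = 256 // 4               # 64 instances of 4 GB each fit in 256 GB
monthly_revenue = instances * 64   # $64/month per instance

months_to_break_even = cpu_list_price / monthly_revenue
print(instances, monthly_revenue, round(months_to_break_even, 1))
# 64 4096 4.3  -> the CPU alone pays for itself in about 4.3 months
```

At 2 vCPUs per instance, the 64 instances also consume 128 vCPUs, which lines up with the core count of a chip in this class.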
But really, this CPU is great for any embarrassingly parallel task that doesn't run well on a GPU.
make -j 256
I'll bet folks like Linus Torvalds would see reasonable time savings on kernel compiles.

I would imagine any of the big cloud providers, at least the ones offering x86, would love to buy something like this to improve the density of what they can offer...
Of course, that's not even their biggest instance. For a whopping $109/hr, you can get a u-12tb1.112xlarge with 448 vCPUs and 12 TB of RAM. I really have no idea what you would use that for. Machine learning with neural networks that would put OpenAI to shame?