I recently learned that Britain is spending £36 million to upgrade a supercomputer:

https://www.bbc.com/news/articles/c79rjg3yqn3o

Can’t you buy a very powerful gaming computer for only $6000?

CPU: AMD R9 9950X3D

Graphics: Nvidia RTX 5090 32GB

RAM: 64GB DDR5 6000MHz RGB

https://skytechgaming.com/product/legacy-4-amd-r9-9950x3d-nvidia-rtx-5090-32gb-64gb-ram-3

This is how this CPU is described by hardware reviewers:

AMD has reinforced its dominance in the CPU market with the 9950X3D, and it appears that no competitor will be able to challenge that position in the near future.

https://www.techpowerup.com/review/amd-ryzen-9-9950x3d/29.html

If you want to add some brutal CPU horsepower to your PC, then this 16-core behemoth will certainly get the job done. It is an excellent processor on all fronts, and it has been a while since we have been able to say that in a processor review.

https://www.guru3d.com/review/ryzen-9-9950x3d-review-a-new-level-of-zen-for-gaming-pcs/page-29/

This is the best high-end CPU on the market.

Why would you spend millions on a supercomputer? Have you guys ever used a supercomputer? What for?

  • blackbelt352@lemmy.world · 1 day ago

    A supercomputer isn’t just a single computer; it’s a lot of them networked together so the calculations scale far beyond what any one machine can do. If you can imagine a huge data center with thousands of racks of hardware, CPUs, GPUs and RAM chips all dedicated to managing network traffic for major websites, it’s very similar to that, except instead of being built to handle all the ins and outs and complexities of network traffic, it’s dedicated purely to doing as many calculations as possible for one specific task, such as protein folding as someone else mentioned, or something like Pixar’s render farm, which is hundreds of GPUs all networked together solely to render frames.
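
    Rough sketch of the idea (illustrative only, nobody’s actual setup): each node just grinds through its own share of independent work units, so a scheduler can fan one job out across however many machines you have. Here local processes stand in for the nodes:

    ```python
    from concurrent.futures import ProcessPoolExecutor

    def render_frame(frame_number):
        # Stand-in for one expensive, independent calculation
        # (the light bounces for one frame, one protein-folding run, etc.)
        return sum(i * i for i in range(100_000)) + frame_number

    if __name__ == "__main__":
        frames = range(1, 101)  # 100 independent work units
        with ProcessPoolExecutor(max_workers=8) as pool:  # 8 "nodes"
            results = list(pool.map(render_frame, frames))
        print(f"finished {len(results)} frames")
    ```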

    Given how big and complex any given 3D scene is in a Pixar film, one single GPU might take 10 hours to calculate the light bounces needed to render a single frame. Assuming a 90-minute run time, that’s ~130,000 frames, which is potentially 1,300,000 hours (or about 150 years) to complete just one full movie render on a single GPU. If you have 2 GPUs working on rendering frames, you’ve cut that time down to 650,000 hours. Throw 100 GPUs at the render and you’ve cut it to 13,000 hours, or about a year and a half. Pixar is pretty quiet about their numbers, but according to The Science Behind Pixar traveling exhibit, around the time of Monsters University in 2013 their render farm had about 2,000 machines with 24,000 processing cores, and it still took 2 years’ worth of rendering time to render that movie out. I can only imagine how much bigger their render farm has gotten since then.

    Source: https://sciencebehindpixar.org/pipeline/rendering
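
    Here’s the same back-of-the-envelope math as a quick sketch (the 10-hours-per-frame figure is just the assumption from the paragraph above, not a measured number):

    ```python
    HOURS_PER_FRAME = 10    # assumed single-GPU render time per frame
    FRAMES = 90 * 60 * 24   # 90-minute film at 24 fps ≈ 129,600 frames
    HOURS_PER_YEAR = 24 * 365

    for gpus in (1, 2, 100, 2000):
        hours = HOURS_PER_FRAME * FRAMES / gpus
        print(f"{gpus:>5} GPUs: {hours:>12,.0f} hours (~{hours / HOURS_PER_YEAR:.1f} years)")
    ```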

    You’re not building a supercomputer to be able to play Crysis; you’re building a supercomputer to do lots and lots and lots of math that would take centuries of calculation on a single 16-core machine.