

These systems probably still need a ton of customization and tuning at the driver level and beyond, which open source allows for.
I am sure there is plenty of existing “supercomputer”-grade software in the wild already, but a majority of it probably needs quite a bit of hacking to run smoothly on newer hardware configurations.
As a matter of speculation, the engineers and scientists who build these things are probably hyper-picky about how certain processes execute and need extreme flexibility.
So, I would say it’s a combination of factors that make Linux a good choice.

The media (Blu-ray, DVD, whatever…) didn’t matter so much. Adding depth fields to existing footage works, but it isn’t exactly perfect. The tech should be much better now, but back then it took a fuck ton of manual labor to make a film 3D-compatible. When 3D TVs were being pushed, studios also had to shoot movies natively in 3D, which took more time and more equipment.
Here is an old pic I took during the conversion of Titanic into 3D, since it wasn’t filmed in 3D from the start. Each frame needed to have its depth fields mapped by hand, in a room filled with junior-level staff. The work was split across multiple studios.
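For context on what those hand-drawn depth maps feed into: below is a minimal, hypothetical sketch of depth-image-based rendering, where each pixel is shifted horizontally in proportion to its depth value to synthesize a second eye’s view. It’s a toy in Python/numpy with made-up parallax numbers and crude hole filling, not the actual studio pipeline.

```python
# Toy depth-image-based rendering (DIBR): shift pixels horizontally by a
# disparity derived from a per-pixel depth map to fake a second camera view.
# Purely illustrative -- real conversion pipelines also inpaint the gaps that
# open up behind foreground objects, handle occlusion edges carefully, etc.
import numpy as np

def render_stereo_view(frame, depth, max_disparity_px=12):
    """frame: (H, W, 3) uint8 image; depth: (H, W) float in [0, 1],
    with 1.0 = nearest to camera. Returns a synthesized right-eye view."""
    h, w, _ = frame.shape
    # Nearer pixels get a larger horizontal shift (more parallax).
    disparity = (depth * max_disparity_px).astype(int)

    right = np.zeros_like(frame)
    filled = np.zeros((h, w), dtype=bool)
    cols = np.arange(w)
    for y in range(h):
        new_x = np.clip(cols - disparity[y], 0, w - 1)
        # Paint far-to-near so foreground pixels overwrite background ones
        # wherever they land on the same target column.
        order = np.argsort(depth[y])
        right[y, new_x[order]] = frame[y, order]
        filled[y, new_x[order]] = True
    # Crude hole filling: copy from the original frame where nothing landed.
    right[~filled] = frame[~filled]
    return right

if __name__ == "__main__":
    # Synthetic example: a flat background with a bright "foreground" box.
    frame = np.full((120, 160, 3), 60, dtype=np.uint8)
    frame[40:80, 60:100] = 220
    depth = np.zeros((120, 160))
    depth[40:80, 60:100] = 1.0   # the box is close, everything else is far
    right_eye = render_stereo_view(frame, depth)
    print(right_eye.shape)       # (120, 160, 3)
```

The gaps that open up behind foreground objects are a big part of why the real process needed so much hand labor: someone has to supply image data the original camera never captured, on top of drawing the depth maps themselves.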