When I see comparisons like this, the first thought I have is not the benchmarks, but rather what the most “heroic” real-world calculation of the day would have been on something like the Cray-1, and how to replicate those calculations today on something like an RPi. Weather/climate models? Rad-hydro?
The fidelity would almost certainly be super low compared to modern FEA software, but it would be a fun exercise to try.
One of the early customers was the European Centre for Medium-Range Weather Forecasts, so, wild guess, they probably used it for medium-range weather forecasts.
in europe
I thought the ECMWF models were (and always have been) global?
FWIW, Australia used a CDC Cyber 205 for occasional weather modelling and other mathematical work in the early 1980s.
(There was a separate dedicated weather computer; this one was used for 'other' jobs like speculative weather modelling, monster group algebraic fun, et al.)
https://en.wikipedia.org/wiki/CDC_Cyber
The UK was the first customer:
Numerically, I’m curious what this would have looked like: the governing equation set, discretization methods, data, etc. It would be a fun project to try to implement a toy model like that.
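For flavor, here is a minimal sketch of the discretization idea. This is an assumption on my part: real NWP codes of that era solved the primitive equations (and earlier, barotropic vorticity models), whereas this toy just advects a blob with 1D linear advection, u_t + c·u_x = 0, using a first-order upwind finite-difference scheme on a periodic grid.

```python
import numpy as np

# Toy stand-in for a weather model: 1D linear advection on a periodic
# domain, discretized with the first-order upwind scheme. This is NOT
# an actual NWP equation set, just the simplest discretization exercise.

def upwind_advect(u0, c=1.0, dx=0.01, dt=0.005, steps=100):
    """Advance u0 by `steps` upwind steps; stable only for CFL <= 1."""
    u = u0.copy()
    cfl = c * dt / dx
    assert cfl <= 1.0, "upwind scheme is unstable for CFL > 1"
    for _ in range(steps):
        # periodic boundary via np.roll; one-sided difference for c > 0
        u = u - cfl * (u - np.roll(u, 1))
    return u

x = np.linspace(0.0, 1.0, 100, endpoint=False)
u0 = np.exp(-200.0 * (x - 0.5) ** 2)   # Gaussian "pressure blob"
u = upwind_advect(u0, steps=100)        # blob drifts right, smeared by
                                        # the scheme's numerical diffusion
print(u.sum(), u0.sum())                # total "mass" is conserved
```

The upwind scheme conserves the grid sum exactly with periodic boundaries, which makes a nice sanity check; the visible smearing of the blob is the scheme's well-known first-order numerical diffusion.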
If you really want a challenge, do it using pen, paper and a slide rule, like in the old days[1]. Just make sure to apply appropriate smoothing of the input data first[2].
[1]: https://www.smithsonianmag.com/history/how-world-war-i-chang...
[2]: https://arxiv.org/abs/2210.01674
nuclear weapons simulations
the first machine went to Los Alamos
https://www.theatlantic.com/technology/archive/2014/01/these...
These are incredibly expensive even on today’s hardware. If you look through some of the unclassified ASCI reports from the early 2000s, 3D calculations of this equation set were implied to be leadership-class computations. At the time of the Cray, it must’ve been coarse-grid 1D as the standard, with 2D as the dream.
The demand for the huge calculations for the design of nuclear weapons started in WW II already:
https://ahf.nuclearmuseum.org/ahf/history/human-computers-lo...
"The staff in the T-5 group included recruited women who had degrees in mathematics or physics, as well as, wives of scientists and other workers at Los Alamos. According to Their Day in the Sun: Women of the Manhattan Project, some of the human computers were Mary Frankel, Josephine Elliot, Beatrice “Bea” Langer, Augusta “Mici” Teller, Jean Bacher, and Kay Manley. While some of the computers worked full time, others, especially those who had young children, only worked part time.
General Leslie R. Groves, the Director of the Manhattan Project, pressured the wives of Los Alamos to work because he felt that it was a waste of resources to accommodate civilians. As told by Kay Manley, the wife of Los Alamos physicist John Manley, the recruitment of wives can also be traced to a desire to limit the housing of “any more people than was absolutely necessary.” This reason makes sense given the secretive nature of Los Alamos and the Manhattan Project. SEDs, a group of drafted men who were to serve domestically using their scientific and engineering backgrounds, also worked in the T division."
A Cray-1 could execute an infinite loop in 7.5 seconds!
Quite impressive, but I can't avoid noticing you did not go near a higher challenge, like compiling a C++ program in under 4 weeks... \s
Much like how Chuck Norris counted to infinity... twice?
You could always start with loading up Spec ‘06, which contains micro kernels of such “heroic” workloads.
I toured an NCAR (National Center for Atmospheric Research) facility in Boulder around 1979; got to sit on a seat on their Cray-1. So yes, weather and climate calculations.
3-D rendering? We had a supercomputing club in early-90s high school. I remember creating wireframe images, uploading them to a Cray X-MP at Lawrence Livermore for the computation, and then downloading the finished results.
You could get some vintage matrices from SuiteSparse (formerly the university of Florida sparse matrix collection).
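SuiteSparse distributes its matrices in Matrix Market (.mtx) format, which SciPy reads directly. A minimal sketch, with the assumption that you'd download a collection matrix yourself; here a tiny tridiagonal matrix written locally stands in for one, since the I/O and solve path are the same:

```python
import numpy as np
from scipy.io import mmread, mmwrite
from scipy.sparse import csr_matrix
from scipy.sparse.linalg import spsolve

# Write a small tridiagonal system as a stand-in for a SuiteSparse
# download; mmread works identically on a real collection .mtx file.
A = csr_matrix(np.diag([2.0] * 5)
               + np.diag([-1.0] * 4, 1)
               + np.diag([-1.0] * 4, -1))
mmwrite("toy.mtx", A)

M = mmread("toy.mtx").tocsr()   # mmread returns COO; convert for solving
b = np.ones(M.shape[0])
x = spsolve(M, b)               # sparse direct solve, the kind of kernel
                                # you'd time on vintage vs. modern hardware
print(np.linalg.norm(M @ x - b))
```

Timing `spsolve` (or an SpMV loop) over a range of collection matrices would give a rough, fun point of comparison against published Cray-era solve times.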