Had a heterogeneous programming class, where we had to implement various things on the PS3, like a basic MPEG-ish encoder and such.
Was an interesting experience, though all I remember now is how it required careful exploitation of the vector units in the SPEs to get any decent performance out of it, and how annoying it was to synchronize between the SPEs and the PPE.
For each assignment the prof would include a benchmark, so we could compare the performance in class. There was a huge difference between the students who spent time optimizing and those who did a basic implementation.
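I don't have the assignment code anymore, so purely as an illustrative sketch (the function name and block size are my guess at what such an encoder looks like): the hot spot in an MPEG-ish encoder is typically the motion-estimation kernel, which computes a sum of absolute differences over pixel blocks an enormous number of times per frame. Scalar C like the below is roughly what the naive submissions did; the speedups came from rewriting loops like this with the SPU's 16-byte vector registers and keeping the data flowing with double-buffered DMA.

    #include <stdint.h>

    /* Sum of absolute differences between an 8x8 block of the current
       frame and a candidate block in the reference frame. Motion
       estimation evaluates this for every candidate position, so it
       dominates the runtime and is the obvious thing to vectorize. */
    static int sad_8x8(const uint8_t *cur, const uint8_t *ref, int stride)
    {
        int sad = 0;
        for (int y = 0; y < 8; y++) {
            for (int x = 0; x < 8; x++) {
                int d = (int)cur[x] - (int)ref[x];
                sad += d < 0 ? -d : d;
            }
            cur += stride;
            ref += stride;
        }
        return sad;
    }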
I'm super curious what program/course was set up with developer/jailbroken PS3s to use as lab material. Was this specifically about game/console dev?
I believe the PS3 had an option to install Linux on it at some point. No? That would've made it a neat option for classes like this.
Oh right... I think somewhere around 1st gen they were cool with that. But didn't they end up getting killed in the market because they had to sell it for less than Five Hundred and Ninety Nine U.S. Dollars, effectively at a loss, and people were making compute centres out of them?
I think a later gen locked it right down.
For the most part the PS3-cluster supercomputer was a myth used to hype up how powerful the PS3 was. The amount of RAM per node was so low that it limited the types of workloads it was good at. Then server CPUs became more powerful while the PS3's processor stayed the same.
However, PS3 clusters did find niche success as supercomputers. There are a handful of notable examples. The fastest one ever built was by the US Air Force: it consisted of 1,760 PS3s, was the 33rd fastest supercomputer at the time, and was used for satellite image processing.
I always thought it was a bit odd that we went from super-compute clusters built out of PS3 to the PS4/Xbone generation consoles being meh-grade laptop chips in a fancy case.
It's the performance vs programmability tradeoff.
The article references it.
The further back you go in console history, the more that achieving visually stunning (for the time) results required exotic architectures + hand-tuned code.
Over time, state of the art PC + GPU hardware improved, to the point there wasn't a functional difference vs exotic architectures. Additionally, the cost of developing leading-edge custom architectures exploded.
Microsoft really pushed this with the original "PC in a box" Xbox, but you saw the console OEMs outsource increasing parts of their chip solutions after that.
Why keep / buy entire chip design teams, when AMD will do it for you?
Especially if the performance difference is negligible.
The article also references the cost of developing a AAA "HD" title is so high that it pretty much needs to be released on multiple platforms to be profitable.
When I was paying attention during the 90s and 2000s I remember this being the hype in magazines / on the web for pretty much EVERY upcoming console. Namely that it was so outrageously powerful that it might be considered a supercomputer in its own right. Needless to say, I was hyped up and obsessing about them myself.
Let's not forget the Japanese government applying export restrictions on the PS2 because it might be powerful enough to be used for military purposes [1]. It even went so far as claims that Iraq ordered a bunch of them for that purpose [2]. That's some serious hype marketing haha.
[1] http://news.bbc.co.uk/2/hi/asia-pacific/716237.stm
[2] https://www.theregister.com/2000/12/19/iraq_buys_4000_playst...
There was a time when supercomputers had no GPUs. Nowadays the top of the list is all GPU accelerators.
OtherOS was a way for Sony to work around higher tariffs. General computing devices (enabled by OtherOS) fell under a lower tariff schedule than consoles.
It was retroactively removed when people started getting close to enabling full GPU access, which had been deliberately crippled under OtherOS to prevent games from being developed without the PS3 SDK.
As a PS2Linux owner, I think that was a reaction to the way the Linux community handled PS2Linux.
Instead of following in Yaroze's footsteps and being a way for a new generation of indies to learn how to create console games, the majority used PS2Linux for classical UNIX stuff and playing games on emulators.
Note that PS2Linux had GPU access, via PSGL and an easier high level API for the Emotion Engine.
So, no GPU access, no emulators.
It was retroactively removed when Sony started worrying about "security" concerns - aka people running unlicensed commercial games, which is usually the killer app for any firmware that supports customization and programming.
It was also advertised on the original PS3 box.
Ultimately there was a lawsuit and some PS3 owners got $10 apiece.
Ah interesting. I do recall we never utilized the GPUs on them, just the Cell.
Reportedly the US government had no problem paying $599 per PS3 to build ML/AI supercomputers out of them.
Any source on that? It sounds unlikely to me. There was a supercomputer that IBM built for Los Alamos that used the Cell processor: https://en.m.wikipedia.org/wiki/Roadrunner_(supercomputer)
Probably referring to:
https://phys.org/news/2010-12-air-playstation-3s-supercomput...
Each PS3 was cheaper than the per-CPU cost of what you'd get from IBM. I'm sure you'd make it up in power usage pretty quick though.
Rumor has it that Sony axed OtherOS in response to Geohot using it to exploit & compromise the PS3 hypervisor.
If I recall correctly that was a stated fact. OtherOS was removed due to "security vulnerabilities".
Yes: https://blog.playstation.com/archive/2010/03/29/ps3-firmware...
(From Wikipedia references)
Yes. My research lab in grad school had a cluster of 24 PS3s. The back of that rack was hot.
I built a cluster of 20,000 PS5 APUs. The problem was that they didn't have ECC memory, so it was pretty hard to repurpose them for any real work. I tried really hard to find some use cases, but unfortunately getting funding to run them was going to take longer than we had time for. It all got shut down.
They did work great for mining ethereum though.
Please tell me you guys called it "SuperCELL"
I was out of school for quite a bit by the time the PS3 came out but I remember installing linux on mine and being pretty excited to program on it.
I did a little bit, but it ended up being a super PITA because I didn't want to put my PS3 on my desk and give up playing games on it on my TV.
Because I wanted to keep it with my TV I had tried to use it with Bluetooth KB/Mouse so I could use it from the couch.
And, well, typical Linux: you always end up spending time on the most mundane thing. Bluetooth on the PS3 under Linux at that time didn't seem to work right.
That to me is like peak Linux right there. At some point I said to myself "I wanted to play with the Cell and instead I'm working on Bluetooth" and I lost interest, because of course I had real work too. Then eventually Sony disabled the whole thing.
It's really not 'peak Linux', it's just that many manufacturers refuse to even provide specifications for using their product (basically its API) without charging money and/or requiring an NDA, etc.
This makes any third party trying to support them face nightmare after nightmare of unknowns, and a complex device like a computer of any sort is nearly always a complex web of interlinked mini-computers, each doing their own thing. Like the damned Bluetooth chip, which, worse, probably has government-regulated transmission compliance stuff baked into its software rather than its hardware. That adds a whole other nightmare of firmware blobs and driver support, most often seen with other WiFi-adjacent projects such as OpenWRT (and its supported hardware).
It would be very nice if some law enforced free 'edge technical specifications' for all products, available to anyone who buys them or a product containing them. That would help create a level and fair market for competitors, as well as the right and ability to repair for consumers / device owners. That sounds a lot like the effects desired by most (many?) Libre software licenses, like the GPL (and LGPL), and Creative Commons.
Yes, it was called OtherOS.
The first-generation devkit PS3s cost something outrageous like $10,000 and were the size of large pizza boxes IIRC (shrunk down from the original three pizza boxes stacked on top of each other - eventually they shrank the devkit even further), and they had some pretty stringent requirements to acquire. On the team I worked on, only the graphics developer got a PS3 devkit; the rest of us lowly programmers got PS3 test kits, which allowed us to deploy/debug but without the same level of support tools. I think it was the same situation with the Xbox 360, but the Microsoft developer tools were excellent even with just the test kits. I vaguely remember the cost of the test kits being double or triple the retail price of a regular console, and we had to warranty our test Xbox 360s quite regularly since they would red-ring-of-death.
Not OP, but we had PS3s at school as well. Early units ran linux. It was sandboxed to a degree in order to prevent it from being used for software piracy. OP's project was cool, ours was basically implementing PCA on one.
I got the impression that Sony encouraged this use of the console early on. They were probably aiming to establish themselves in the scientific computing niche. We also had nvidia-sponsored labs where they taught CUDA (or at least tried to - it was pretty difficult to do without support and most people took the class for access to computer labs equipped with amazing gaming machines). We all know nvidia won that war.
They had Linux on them. Don't think I ever saw them, just had network access.
I really wanted to take that course. I had taken the OS course where you build an OS from scratch, so this seemed like the next thing to do. The OS course was worth 20 points, but the heterogeneous programming course was only 10 points. People said it was way more work than 10 points though, and I ended up dropping it because I couldn't justify spending that much time on it when I had other courses to take and a thesis to write.
Here is the OS course: https://www.uio.no/studier/emner/matnat/ifi/IN4000/index-eng...
Hah yea I quickly discovered that matnat[1] study points weren't like other study points. Felt like we had to do 2-5x the work compared to the students I knew going to other faculties. But in general the courses were very rewarding I feel.
I heard about the OS one; it sounded super cool and also like a lot of work from what I gathered. Heard about it too late, and it required a buddy. Oh well, can't have it all.
[1]: faculty of math and natural sciences
Sounds like the UK too. All my school friends doing non-STEM courses would ask me how many "contact hours" we had per week.
I can't remember exactly how many, but once you included lectures, scheduled labs and tutorials, it was not that far from 9-5 (minus the standard university-wide Wednesday afternoon sport slot).
The friends would go quiet and say "oh, we have 12".
And that didn't even include the "non-contact" hours in the labs grinding the coursework until security shooed us out late at night so they could lock the building. Good times (in retrospect, especially).
Yeah, I did Physics in the UK and it was the same.
Plus, in one term there was some miscommunication between the professors and they set us one experiment every week instead of every two weeks. I remember literally working 12-hour days on the weekends to get all the data analysis and reports done.
We at NTNU used to complain/joke about how you at UiO even got 10 points for your classes, when we had a flat 7,5 sp for each class in Trondheim...
I say used to, but still do. Even 10 years later me and my girlfriend (who studied at UiO) still tease each other about who is the smartest or had the hardest courses, or she mocks my ring, heh.
Just out of curiosity, what kind of hard projects did you do at the bachelor level? I had to build a Unix shell, implement some CPU scheduling algorithms (though we didn't actually integrate them into a proper OS), do some B-tree stuff, and implement JPEG-like algorithms. Probably the one that took the most time was designing a CPU, with its own instruction set and assembler, that ran on a real FPGA (it was such a pain in the ass to troubleshoot; my crappy laptop took 30 min to compile the project and run simulations).
When I talk to people outside my home country, it feels like most don't do these kinds of things at the bachelor level. But, like you, people who go on to a master's might then do one of these things and go much more in-depth than I did.
I designed a cut-down ARM core that ran on an FPGA. Then wrote a compiler to run my own code on it.
That's "full stack" development :)
Also wrote a BBC Micro emulator, which was great fun.
Got any blog or write-up on that?
That sounds fun. I designed the instruction set for my CPU myself, and I didn't get around to writing a compiler for it (I only wrote an assembler).
Since I designed the instruction set, I made some very quirky design decisions to make the test programs I was writing easier: instructions were 16-bit but only 6 bits were used for the opcode, so I had a lot of single-word instructions that did a lot of heavy lifting for my programs.
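For illustration only (I don't remember the real field layout, so this one is made up): a 6-bit opcode in a 16-bit word leaves just 10 bits for operands, decoded with a couple of shifts and masks.

    #include <stdint.h>

    /* Hypothetical layout, not the actual ISA: bits 15..10 hold the
       6-bit opcode, bits 9..5 and 4..0 hold two 5-bit register fields. */
    static inline uint16_t opcode(uint16_t insn) { return insn >> 10; }
    static inline uint16_t reg_a(uint16_t insn)  { return (insn >> 5) & 0x1F; }
    static inline uint16_t reg_b(uint16_t insn)  { return insn & 0x1F; }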
Did you really do that at the bachelor level? I feel like my uni was a bit abnormal in how many hardware classes and projects we had at the bachelor level for computer science.
that sounds like a super fun class. wonder if it’s still offered?
Seems like they still do[1], though I doubt the PS3 is still in the mix.
Though from the published material for 2024 it seems not to have changed too much in principle, e.g. codec63, which I recall implementing parts of.
edit: oh and yes, one of the highlights of my degree.
[1]: https://www.uio.no/studier/emner/matnat/ifi/IN5050/index-eng...
The Program Optimization course I took did something similar with regular CPUs / GPUs. It started out as "don't do stupid things", but it quickly moved on to SoA vs AoS, then hand-written AVX intrinsics, and eventually running OpenCL code on the GPU.
Part of your grade depended on the performance of your code, with a bonus for the fastest submission. It led to a very nice semi-competitive semi-collaborative atmosphere.
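For anyone who hasn't bumped into the SoA vs AoS distinction: the point is that when each field lives in its own contiguous array, one SIMD load pulls in eight useful values at once instead of an interleaved mix that needs shuffling first. A minimal sketch (a toy example of mine, not course material):

    #include <immintrin.h>
    #include <stddef.h>

    /* AoS: fields interleaved, so one load grabs x,y,z,w of a single particle. */
    struct particle_aos { float x, y, z, w; };

    /* SoA: each field contiguous, so one AVX load grabs 8 x-values. */
    struct particles_soa { float *x, *y, *z, *w; size_t n; };

    /* Scale every x by s; assumes n is a multiple of 8. */
    static void scale_x_soa(struct particles_soa *p, float s)
    {
        __m256 vs = _mm256_set1_ps(s);
        for (size_t i = 0; i < p->n; i += 8) {
            __m256 vx = _mm256_loadu_ps(&p->x[i]);
            _mm256_storeu_ps(&p->x[i], _mm256_mul_ps(vx, vs));
        }
    }

With the AoS layout the same operation needs gather/shuffle work before the multiply, which is exactly the kind of difference that course made you measure.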