It's incredible how influential Ben Eater's breadboard computer series has been in hobby electronics. I've been similarly inspired to try to design my own "retro" CPU.
I desperately want something as easy to plug into things as the 6502, but with jussst a little more capability - a few more registers, hardware division, that sort of thing. It's a really daunting task.
I always end up coming back to just using an MCU and being done with it, and then I hit the How To Generate Graphics problem.
I was about to recommend the Parallax Propeller (the first one that's available in DIP format), but arguably, that one is way more complex to program for (and also significantly more powerful, and at that point you might as well look into an ESP32 and that is "just use an MCU" :))
And yeah, video output is a significant issue because of the bandwidth required for digital outputs (unless you're okay with composite or VGA output - I guess those can still be done with readily available chips?). The recent Commander X16 settled for an FPGA for this.
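Just to put a rough number on it (a back-of-the-envelope sketch with generic 640x480@60Hz figures and an assumed 2 bytes per pixel - nothing specific to the X16):

    /* Rough bandwidth estimate for 640x480 @ 60 Hz - illustrative numbers only. */
    #include <stdio.h>

    int main(void) {
        const long visible_pixels  = 640L * 480L;  /* pixels shown per frame */
        const long frames_per_sec  = 60;
        const long bytes_per_pixel = 2;            /* e.g. RGB565 */

        long bytes_per_sec = visible_pixels * frames_per_sec * bytes_per_pixel;
        printf("~%.1f MB/s just for the visible pixels\n", bytes_per_sec / 1e6);
        return 0;
    }

That works out to roughly 37 MB/s before you even count blanking, which is why a 1-2 MHz 8-bit bus has no chance of pushing pixels on its own.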
I feel like the CX16 lost its way about a week after the project started and it suddenly became an expensive FPGA-based blob. But at the same time, I'm not sure what other option there is for a project like that.
I always got the impression that David sort of got railroaded by the other members of the team who wanted to keep adding features and MOAR POWAH, and didn't have a huge amount of choice because those features quickly grew beyond his own areas of knowledge.
The first choice was the Gameduino, also an FPGA-based solution. I have misplaced my bookmark for the documentation covering the previous hardware revision, but the current version 3X is MOAR POWAH just on its own; this seems to be a natural tendency: https://excamera.com/sphinx/gameduino3/index.html#about-game...
Edit: found it! https://excamera.com/sphinx/gameduino/index.html
Modern retro computer designs run into the problem of generating a video signal. Ideally you'd have tile- and sprite-based rendering, and you'd like to support HDMI or at least VGA. But there are no modern parts that offer this, and building the functionality out of discrete components is impractical and unwieldy.
An FPGA is really just the right tool for solving the video problem. Or some projects do it with a microcontroller. But it's sort of too bad, as it kind of undercuts the spirit of the whole design. If your video processor is orders of magnitude more powerful than the rest of the computer, then one starts to ask why not just implement the entire computer inside the video processor?
It's one of the funny things about the Raspberry Pi Pico W: the Infineon CYW43439 has an integrated ARM Cortex-M3 CPU, so the WiFi/BT chip is technically more advanced than the actual RP2040 (a dual Cortex-M0+) and also has more built-in ROM/RAM than what's on the Pico board for the RP2040 to use.
And yeah, you can't really buy sprite-based video chips anymore, and you don't even have to worry about stuff like "Sprites per Scanline" because you can get a proper framebuffer for essentially free - but now you might as well go further and use one microprocessor to be the CPU, GPU, and FM Synthesizer Sound Chip and "just" add the logic to generate the actual video/audio signals.
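To make the "framebuffer for essentially free" point concrete (my own rough sketch - the buffer size and the blit_sprite helper are made up for illustration): a 320x240 8-bit framebuffer is only about 75 KB, which fits easily in a modern MCU's RAM, and a "sprite" is just a memory copy into it:

    /* Sketch: software "sprites" composited into a plain framebuffer. */
    #include <stdint.h>

    #define FB_W 320
    #define FB_H 240

    static uint8_t framebuffer[FB_W * FB_H];   /* 76,800 bytes total */

    /* Copy a w*h sprite to (x, y); palette index 0 is treated as transparent. */
    void blit_sprite(const uint8_t *sprite, int w, int h, int x, int y) {
        for (int row = 0; row < h; row++) {
            for (int col = 0; col < w; col++) {
                uint8_t px = sprite[row * w + col];
                int fx = x + col, fy = y + row;
                if (px != 0 && fx >= 0 && fx < FB_W && fy >= 0 && fy < FB_H)
                    framebuffer[fy * FB_W + fx] = px;
            }
        }
    }

Everything gets composed in memory before scanout, so the only real limit is CPU time per frame, not how many sprites a scanline can carry.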
I think so too - it must have been a great learning experience for him, but for me, the idea of "The best C64-like computer that ever existed" died pretty quickly.
He also ran into a problem that I hit when I tried something like that myself: Sound Chips. Building a system around a Yamaha FM synthesizer sounds perfect, but I also found that most of the chips out there are broken, fake, or both, and that no one makes them anymore. Which makes sense, because if you want a sound chip these days you use an AC97 or HD Audio codec and call it a day - but that goes against that spirit.
I think that the spirit of hobby electronics is really found in FPGAs these days instead of rarer and rarer DIP parts. Which is a bit sad, but I guess that's just the passage of time. I wonder if that's how some people felt in the 70s when single-chip CPUs replaced boards full of discrete logic, or if they rejoiced and embraced it instead.
I've given up trying to build a system on a breadboard and think that MiSTer is the modern equivalent of that.
Microcontrollers have taken over. When 8kB-SRAM, 20MHz microcontrollers exist for under 50 cents, at minuscule 25mm^2 chip sizes, drawing only 500uA of current... there's very little reason to use a collection of 30 chips to do the equivalent functionality.
Except performance. If you need performance then bam, FPGA land comes in and Zynq just has too much performance at too low a cost (though not quite as low as the microcontroller gang).
----------
Hobby Electronics is great now. You have so many usable parts at very low costs. A lot of problems are "solved" yes, but that's a good thing. That means you can focus on solving your hobby problem rather than trying to invent a new display driver or something.
Another advantage of hobby anything is that you can just do and reinvent whatever you want. Sure, fast CPUs/MCUs exist now and can do whatever you want. But if you feel like reinventing the wheel just for the sake of it, no one will stop you![1]
I do think some people who fondly remember the user experience of those old machines might be better served by using modern machines (like a Raspberry Pi or even a standard PC) in a different way instead of trying to use old hardware. That follows from good old Turing machine universality (you can simulate practically any machine you like on newer hardware, if what you're interested in is the software). You can even add artificial limitations like PICO-8 or TIC-80 do.
See also uxn:
https://100r.co/site/uxn.html
and (WIP) picotron:
https://www.lexaloffle.com/picotron.php
I think there's a general concept here of making 'Operating environments' that are pleasant to work within (or have fun limitations), which I think are more practical than a dedicated Operating System, optionally paired with a dedicated machine. Plus (unless you particularly want to!) you don't need to worry about all the complex parts of operating systems like network stacks, drivers and such.
[1] Maybe we should call that Hobby universality (or immortality?) :P If it's already been made/discovered, you can always make it again just for fun.
I've been looking into graphics on MCUs and was disappointed to learn that the little "NeoChrom" GPU they're putting on newer STM32 parts is completely undocumented. Historically they have been good about not putting black boxes in their chips, but I guess it's probably an IP block they've licensed from a third party.
The RP2040 is a great MCU for playing with graphics as it can bit bang VGA and DVI/HDMI. There's some info on the DVI here: https://github.com/Wren6991/PicoDVI
I wrote a couple of articles on how to do bit banged VGA on the RP2040 from scratch: https://gregchadwick.co.uk/blog/playing-with-the-pico-pt5/ and https://gregchadwick.co.uk/blog/playing-with-the-pico-pt6/ plus an intro to PIO https://gregchadwick.co.uk/blog/playing-with-the-pico-pt4/
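If it helps anyone skimming, the numbers the hardware has to hit for standard 640x480@60Hz look roughly like this (textbook VGA timing figures, just the arithmetic - the articles cover the actual PIO programs):

    /* Standard-ish VGA 640x480 @ 60 Hz line/frame structure. */
    #include <stdio.h>

    int main(void) {
        /* Horizontal, in pixel clocks: visible + front porch + sync + back porch */
        const int h_visible = 640, h_front = 16, h_sync = 96, h_back = 48;
        /* Vertical, in lines */
        const int v_visible = 480, v_front = 10, v_sync = 2, v_back = 33;

        int clocks_per_line = h_visible + h_front + h_sync + h_back;   /* 800 */
        int lines_per_frame = v_visible + v_front + v_sync + v_back;   /* 525 */
        double pixel_clock  = (double)clocks_per_line * lines_per_frame * 60;

        printf("clocks/line=%d lines/frame=%d pixel clock=%.2f MHz\n",
               clocks_per_line, lines_per_frame, pixel_clock / 1e6);
        return 0;
    }

So the state machines need to toggle the sync pins and shift pixels out at roughly 25 MHz, which the RP2040's PIO handles comfortably.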
You can do something similar on STM32 parts that have an LCD controller, which can be abused to drive a VGA DAC or a DVI encoder chip. The LCD controller at least is fully documented, but many of their parts pair that with a small GPU, which would be an advantage over the GPU-less RP2040... if there were any public documentation at all for the GPU :(
I used "composite" (actually monochrome) video output software someone wrote on the RP2040 for an optional feature on the PhobGCC custom gamecube controller motherboard to allow easy calibration, configuration, and high-frequency input recording and graphing.
Pictures of the output here: https://github.com/PhobGCC/PhobGCC-doc/blob/main/For_Users/P...
Agreed. It is so, so, so very disappointing. I was deeply surprised (in a non-pleasant way) when I first opened up a Reference Manual for one of those chips and saw that the GPU chapter was, like, four pages. :(
On the ST forum the company clearly said that they will only release the documentation to selected partners. That's sad.
That sucks. There are other MCUs with 2D graphics peripherals, eg the NXP i.MX line.
True, can't think of much else this popular.
He started posting videos again recently with some regularity after a lull. Audience is in the low hundreds of thousands. I assume fewer than 100k actually finish videos and fewer still do anything with it.
Hobby electronics seems surprisingly small in this era.
I wonder if there’s much overlap between people that watch YouTube to get deep technical content (instead of reading), and people that care about hobby electronics.
I’m having trouble wrapping my head around how / why you’d use YouTube to present analog electrical engineering formulas and pinout diagrams instead of using LaTeX or a diagram.
I consider YouTube (or rather, video in general) a fantastic platform for showcasing something cool, demonstrating what it can do, and even demonstrating how to drive a piece of software - but for actual technical learning I loathe the video format - it's so hard to skim, re-read, pause, recap and digest at your own speed.
The best compromise seems to be webpages with readable technical info and animated video illustrations - such as the one posted here yesterday about how radio works.
For some things there is a lot of nuance lost in just writing. The unknown unknowns.
There have been a lot of times where I'm showing someone new to my field something, and they stop me before I get to what I thought was the "educational" point to ask about what I just did.
Video can portray that pretty well because the information is there for you to see; with a schematic or write-up, if the author didn't put it there, the information isn't there.
Even if you're not much of a tinkerer, Ben Eater's videos are massively helpful if you want to truly understand how computers work. As long as you come in knowing the rudiments of digital electronics, just watching his stuff is a whole education in 8-bit computer design. You won't quite learn how modern computers work with their fancy caches and pipelines and such, but it's a really strong foundation to build on.
I've built stuff with microcontrollers (partially aided by techniques learned here), but that was very purpose-driven and I'm not super interested in just messing around for fun.
Registers can be worked around by using the stack and/or memory. Division could always be implemented as a simple function. It's part of the fun of working at that level.
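For example, here's the classic shift-and-subtract loop for unsigned division in C (a rough sketch - the udiv16 name and 16-bit width are just for illustration); it's essentially what you'd hand-translate into assembly on a CPU with no divide instruction:

    #include <stdint.h>

    /* Restoring division: one quotient bit per iteration, MSB first. */
    uint16_t udiv16(uint16_t dividend, uint16_t divisor, uint16_t *remainder) {
        uint16_t quotient = 0, rem = 0;
        for (int i = 15; i >= 0; i--) {
            rem = (uint16_t)((rem << 1) | ((dividend >> i) & 1)); /* shift next bit in */
            if (rem >= divisor) {          /* does the divisor fit? */
                rem -= divisor;
                quotient |= (uint16_t)(1u << i);
            }
        }
        if (remainder) *remainder = rem;
        return quotient;                   /* divisor == 0 is left as the caller's problem */
    }

Sixteen iterations of shifts and compares - slow compared to a hardware divider, but perfectly serviceable, and on a 6502-class machine it maps almost one-to-one onto ROL/CMP/SBC.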
Regarding graphics, initially just output over serial. Abstract the problem away until you are ready to deal with it. If you sneak up on an Arduino and make it scream, you can make it into a very basic VGA graphics card [1]. Even easier is an ESP32-to-VGA board (which also gives you keyboard and mouse) [2].
[1] https://www.instructables.com/Arduino-Basic-PC-With-VGA-Outp...
[2] https://www.aliexpress.us/item/1005006222846299.html
Funny enough, that's exactly where this project started. After I built his 8 bit breadboard computer, I started looking into what might be involved in making something a bit more interesting. Can't do a whole lot of high-speed anything with discrete logic gates, so I figured learning what I could do with an FPGA would be far more interesting.