
Let's compile like it's 1992 (2014)

ecshafer
49 replies
4d4h

I learned to program in Borland Turbo C++. One thing from back then that I really miss is how easy it was to do some complex things. I was able to draw to the screen by just calling geometric shape functions, and it made pictures! I found out that if I drew a shape, then called the xor function on it and drew a new but slightly different shape, I could make animations. So making little sprites that looked like they were running, out of only 1000 lines of C++ code, was awesome. Some friends and I got together and made a Final Fantasy-like game using these tricks: hand-crafted sprites and a game world that you could walk across. Every map was a whole screen, you would go to an adjacent map when you hit the sides, and every step, if you had a natural 1, you would go into a fight with some enemies.

This was all pretty tedious, but it was a lot of fun for high schoolers. And if we had known more computer science and software engineering, we would probably have done more, and it would have worked better. But unlike today, we didn't have to learn SFML or ActiveX or OpenGL to just start playing and get stuff working; we could just call circle.

xyzelement
27 replies
4d3h

I suspect that if we wanted to find an environment for kids to do simple shape programming, we could find something like that.

The difference seems to be that there's been a greater divergence between "professional" tools and "kids/intro" tools, whereas Turbo C++ (or Turbo Pascal in my day) was kinda both.

mikepurvis
12 replies
4d3h

I've been struggling with getting my ten-year-old across this gap. He's outgrown Scratch/Roblox and I don't think PICO-8 is quite the right set of abstractions (no built in entity system, seriously?); we're working with Godot right now and he's making progress, but it's definitely a lot more of a learning curve figuring out how to do stuff in a tool that has everything.

xyzelement
8 replies
4d3h

I have a 3 and a 1 year old, but I'm starting to think about this.

If we really think BASIC / Turbo Pascal had this right, why don't we just teach our kids those things? We can get the environments running, I am sure.

ecshafer
7 replies
4d3h

I think something like BASIC or Pascal is right. The "hard" part, which I think Borland did so well, was making it so easy to just get things right. A Neo Turbo Pascal where you could type draw(x,y,z,r,"red") and see a red circle on the screen is the ideal, without having to mess with a very complicated workflow like in Unity or Unreal.

fuzztester
3 replies
4d

Python's turtle graphics or Logo is good for this.

Free Logos are available even today.

zozbot234
2 replies
3d8h

LOGO is a great take on LISP-like languages overall, unfortunately it uses dynamic scope. Of course this only comes up in larger programs but I do wonder if anyone has made a lexically-scoped LOGO variant and what it might be like to teach coding in it.

fuzztester
1 replies
2d21h

What is the difference between dynamic and lexical scope? And what are they? I googled it, but could not figure it out on a quick look, at least.

#lazyweb

trealira
0 replies
2d21h

Dynamic scope means that whoever last bound the variable at runtime determines its value, not whichever outer scope surrounds it.

I guess it would be something like this in pseudo code:

  var dynamic = 5

  function foo() {
    print(dynamic)
  }
  
  function bar() {
    var dynamic = 6
    // foo() prints 6 due to the binding above
    foo()
    // After bar() exits, "dynamic" will go back to being 5
  }
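
Python is lexically scoped (so a real foo() here would print 5), but the dynamic-scope lookup rule can be simulated with an explicit stack of binding frames. A rough sketch; the helper names are made up for illustration:

```python
# Simulating dynamic scope in (lexically scoped) Python with an
# explicit stack of binding frames: lookup() walks the stack from the
# most recent frame outward, so whoever bound the variable last at
# runtime wins, matching the pseudocode above.

bindings = [{"dynamic": 5}]            # the global frame

def lookup(name):
    for frame in reversed(bindings):   # newest binding first
        if name in frame:
            return frame[name]
    raise NameError(name)

def foo():
    return lookup("dynamic")

def bar():
    bindings.append({"dynamic": 6})    # bar's binding of "dynamic"
    try:
        return foo()                   # foo sees 6: bar bound it last
    finally:
        bindings.pop()                 # binding dies when bar exits

assert foo() == 5   # global binding
assert bar() == 6   # foo, called through bar, sees bar's binding
assert foo() == 5   # back to 5 after bar exits
```

Under lexical scope the stack walk would be replaced by looking only at the scopes textually surrounding foo's definition.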

mikepurvis
2 replies
4d3h

I mean, PICO-8 is basically that:

https://pico-8.fandom.com/wiki/Circ

But I'm just not sure if pixel-by-pixel drawing is the right abstraction layer at this point in time.

xyzelement
1 replies
4d1h

That looks cool to me, thanks for sharing. I think something that works well for kids is instant feedback on their code/tweaks; looks like this has it.

mikepurvis
0 replies
4d1h

PICO-8 sits in an interesting spot: it's not exactly a "toy" language/environment the way Scratch is, nor is it especially geared at learning, but on the other side of things it is based on a real programming language (Lua), and there's an interesting scene for it where people flex by pushing it to its limits making demos, demakes, etc., many of which are far beyond the NES-level capabilities it's meant to have.

philiplu
0 replies
3d23h

I’ve got a 15 yo who decided to start programming in Python using Pythonista on his iPhone. He refuses to take any input from me; just wants to learn on his own. Pythonista comes with some nice game-programming modules. So far he’s shown me a Pong game, 2048 clone, air hockey, and more.

khrbrt
0 replies
4d2h

Would Processing[0] be a good fit? It's designed to be easy to use and learn, but powerful enough for professional use. Very quick to get cool stuff moving on a screen, and the syntax is Java with a streamlined editing environment.

[0] https://processing.org/

bombcar
0 replies
3d23h

Throw him off the deep end into Minecraft modding ;)

If you search around there are some pretty decent "download IntelliJ and go" things.

mschaef
5 replies
3d23h

The other divergence is in expectations.

A kid in the early to mid-80's could get enough working that they could imagine it possible to match a store bought game.

Once hardware capabilities went up, so did the need for more specialized skills and longer development cycles. Even though it might have still been possible to draw a box on the screen with a single command, the relationship of that box to a valuable outcome was a lot less obvious.

ncruces
4 replies
3d20h

Only if people lose the ability to appreciate simple things. 2048 is a lot of fun. So is Wordle. Or the xor game that made HN a week ago.

ambigious7777
1 replies
3d17h

I didn't see that "xor game" (probably because I went on vacation); would you mind linking it for me?

mschaef
0 replies
3d19h

Flappy Bird too, for that matter.

jasonfarnon
0 replies
3d17h

tetris, 1984

josephg
3 replies
3d21h

When I was teaching programming my go-to was the JavaScript canvas api. It’s 2d only, and very simple. And being on the web once a student has made something we can host it for them and they can show their friends.

I have a ~20 line html harness which sets up a page with a full screen canvas element and gives you global window width & height variables. That’s all you need to get started. And it’s real JavaScript - so students learn a useful programming language as a result, and the advanced students can go nuts and add sprites, sound and networking if they really want to.
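
A harness of that kind might look something like this (a hypothetical reconstruction, not the author's actual template; the draw.js file name and the exposed width/height globals are illustrative assumptions):

```html
<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <style>html, body { margin: 0; overflow: hidden; }</style>
</head>
<body>
  <canvas id="canvas"></canvas>
  <script>
    // Full-screen canvas plus global width/height, so student code
    // in draw.js can start drawing immediately.
    const canvas = document.getElementById('canvas');
    const ctx = canvas.getContext('2d');
    window.width = canvas.width = window.innerWidth;
    window.height = canvas.height = window.innerHeight;
  </script>
  <script src="draw.js"></script>
</body>
</html>
```

With a setup like this, a first draw.js can be as small as `ctx.fillStyle = 'red'; ctx.fillRect(width/2 - 20, height/2 - 20, 40, 40);`.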

xyzelement
2 replies
3d16h

Any chance you have an example of what you are talking about?

mct
0 replies
3d12h

Another option is to use the p5.js library. They also have a nice online editor, at https://editor.p5js.org/, which makes it easy for students to get up and running quickly.

josephg
0 replies
3d15h

Yeah, this is the template I use. As I say, it's like ~20 lines of boilerplate:

http://josephg.com/canvas_f.zip

I tell people to leave the html file alone and edit the javascript, using the canvas API to make it draw whatever they want. Canvas supports text, shapes and images and translations / transformations. If they want, I show them how to animate their work as well.

People make fun cursed things like this:

https://home.seph.codes/public/niamh/

phendrenad2
0 replies
4d1h

At the very least, you can always run Borland Turbo C++ on Windows 2000 in a VM.

ecshafer
0 replies
4d3h

There has been a lot of effort put into making kids' versions of technology, which I think is probably the right move for elementary school aged kids. But once you get into middle and high school, I think as a kid there is a "coolness" factor to using the same stuff that the pros use. Since at that point you are trying to be an adult, you are trying to do stuff the right way, and hey, you might be getting a real job in the field in just a couple of years anyway. So there is some value, I think, in making a more friendly environment and onboarding process for a tool.

anta40
0 replies
3d16h

I'll most likely pick FreeBASIC or Python for teaching kids simple graphics programming. FreeBASIC has a QB-compatibility mode, so using old QB tutorials shouldn't be a problem.

Or Python for 3D stuff.

Solvency
0 replies
4d

It's called Processing.

zozbot234
7 replies
4d3h

Modern 2D graphics are not based on plotting pixels to a framebuffer, they have "textures" or "surfaces" as a native part of the system and any compositing is done as a separate step. So if anything making "simple sprites" has become a bit easier since you can just think of any composited surface as a more generic version of a hardware sprite.
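
As a toy illustration of that model (a pure-Python sketch, nothing like a real GPU compositor): each "surface" is just a pixel grid, and drawing a sprite is a separate compositing pass over the destination.

```python
def blank(w, h, fill=0):
    """A 'surface': an h-by-w grid of color indices."""
    return [[fill] * w for _ in range(h)]

def composite(dst, src, x, y, transparent=0):
    """Composite src onto dst at (x, y), skipping transparent pixels.

    This is the separate compositing step: the sprite lives on its own
    surface and is combined with the destination afterwards."""
    for row, line in enumerate(src):
        for col, px in enumerate(line):
            if px != transparent:
                dst[y + row][x + col] = px
    return dst

screen = blank(8, 4)
sprite = [[0, 7, 0],
          [7, 7, 7]]             # a tiny 3x2 sprite in color 7
composite(screen, sprite, 2, 1)  # the sprite now appears at (2, 1)
```

Moving the sprite is then just re-compositing at a new offset, rather than undoing pixel writes in a shared framebuffer.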

hnthrowaway0328
4 replies
3d17h

I'm curious: how does the texturing work in the SDL backend? I have not read the code but am curious now.

Back in the day I heard of a friend who claimed to have written some 2D framework that was faster than Direct2D -- that was a pretty early version of Direct2D, I think.

krapp
3 replies
3d17h

SDL is a cross platform library, so it works by just using whatever native rendering context is available, abstracted to a common API. In more general terms AFAIK it uses the geometry API under the hood and just pushes triangles/quads to the GPU whenever possible.

hnthrowaway0328
2 replies
3d15h

Thanks a lot. Is it the same for 2d?

krapp
1 replies
3d8h

I was referring to 2D. For 3D, SDL2 has OpenGL and Vulkan APIs.

SDL3 is going to have a general purpose 3D API with its own shader language AFAIK.

In my experience, if what you want is to just get a window open, render some sprites, have some basic low-level stuff handled and not have your hand held, SDL2 is ridiculously easy. There's also Raylib[0] and SFML[1], neither of which I've used, but I hear good things about them.

[0]https://www.raylib.com/

[1]https://www.sfml-dev.org/

hnthrowaway0328
0 replies
2d18h

Thanks, I'm not familiar with the graphics internals, so sorry for the confusion.

Yeah, I made half of an Ultima spinoff with SDL2 and it's pretty easy. Basically I load a texture which is a spritesheet, and it's trivial to present a tilemap.
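
The "trivial" part is mostly index arithmetic: each tile index picks a source rectangle out of the spritesheet, and the map becomes one blit per cell. A hedged sketch (TILE and SHEET_COLS are made-up values; in SDL2 the rectangle pairs would feed something like SDL_RenderCopy):

```python
TILE = 16        # tile size in pixels (illustrative assumption)
SHEET_COLS = 8   # tiles per spritesheet row (illustrative assumption)

def src_rect(tile_index):
    """Source rectangle (x, y, w, h) of a tile within the spritesheet."""
    sx = (tile_index % SHEET_COLS) * TILE
    sy = (tile_index // SHEET_COLS) * TILE
    return (sx, sy, TILE, TILE)

def draw_calls(tilemap):
    """Yield one (src_rect, dst_rect) blit per map cell."""
    for row, line in enumerate(tilemap):
        for col, idx in enumerate(line):
            yield src_rect(idx), (col * TILE, row * TILE, TILE, TILE)

# Tile 9 sits at column 1, row 1 of an 8-column sheet:
assert src_rect(9) == (16, 16, 16, 16)
```

One texture upload for the sheet, then the whole map is a loop of copy calls with computed rectangles.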

ecshafer
1 replies
4d3h

I think this is simpler from the POV of making a game engine from scratch, or a game with complex effects and graphics. But is it simpler from the POV of a high schooler who just wants to get some flat colored shapes on the screen?

zozbot234
0 replies
4d3h

Even rendering "flat colored shapes" efficiently can be a bit non-trivial if you expect pixel-perfect results, like you'd get by plotting to an ordinary framebuffer - the GPU's fixed rendering pipeline is not generally built for that. The emerging approach is to use compute shaders, and these are not yet fully integrated with existing programming languages - you can't just edit ordinary C++/Rust code and have it seamlessly compile for CPU and GPU rendering. But we're getting closer to that.

JohnFen
3 replies
4d1h

I learned to program in Borland Turbo C++.

That still reigns as my favorite IDE of all time by a country mile.

pjmlp
1 replies
3d23h

Same here. Going into UNIX back in the early 1990's, after using the Borland IDEs across MS-DOS and Windows 3.x (and being aware of their OS/2 versions), felt like time travel back to the genesis of programming, CP/M style.

Thankfully a professor pointed us to XEmacs, with which I managed to get my Borland experience back; it became my UNIX companion until KDevelop, Eclipse and NetBeans came to the rescue.

anthk
0 replies
3d2h

Early 90's, but later you had WPE and XWPE.

qingcharles
0 replies
4d

My first real IDE and still my favorite IDE.

zackmorris
2 replies
4d

One thing from back then that I really miss is how easy it was to do some complex things.

This might be my biggest disappointment with "modern" programming. I want direct access to the hardware with stuff like $100 1GHz 100+ core CPUs with local memories and true multithreaded languages that use immutability and copy-on-write to implement higher-order methods and scatter-gather arrays. Instead we got proprietary DSP/SIMD GPUs with esoteric types like tensors that require the use of display lists and shaders to achieve high performance.

It comes down to the easy vs simple debate.

Most paradigms today go the "easy" route, providing syntactic sugar and similar shortcuts to work within artificial constraints created by market inefficiencies like monopoly. So we're told that the latency between CPU and GPU is too long for old-fashioned C-style programming. Then we have to manage pixel buffers ourselves. We're limited in the number of layers we can draw or the number of memory locations we can read/write simultaneously (like how old arcade boxes only had so many sprites). The graphics driver we're using may not provide such basic types as GL_LINES. Etc etc etc. This path inevitably leads to cookie cutter programming and copypasta, causing software to have a canned feel like the old CGI-BIN and Flash Player days.

Whereas the "simple" route would solve actual problems within the runtime so that we can work at a level of abstraction of our choosing. For example, intrinsics and manual management of memory layout under SSE/Altivec would be substituted for generalized (size-independent) vector operations on any type with the offsets of variables within classes/structs decided internally. GPUs, FPUs and even hyperthreading would go away in favor of microcode-defined types and operations on arbitrary bitfields, more akin to something like VHDL/Verilog running on reprogrammable hardware.

The idea being that computers should do whatever it takes to execute users' instructions, rather than forcing users to adapt their mental models to the hardware/software. Cross-platform compilation, emulation, forced hardware upgrades that ignore Turing completeness, vendor/platform lock-in and planned obsolescence are all symptoms of today's "easy" status quo. Whereas we could have the "simple" MIMD transputer I've discussed endlessly in previous comments that just reconfigures itself to run anything we want at the maximum possible speed. More like how a Star Trek computer might run.

In practice that would mean that a naive for-loop on individual bytes written in C would run the same speed as a highly accelerated shader, because the compiler would optimize the intermediate code (i-code) into its dependent operations and distribute computation across a potentially unlimited number of cores, integrating the results to exactly match a single-threaded runtime.

The hoops we have to jump through between conception and implementation represent how far we've diverged from what computing could be. Modern web development, enterprise software, a la carte microservice hordes like AWS that eventually require nearly every service just to work, etc etc etc, often create workloads which are 90% friction and 10% results.

Just give me the good old days where the runtime gave us everything, no include paths or even compiler flags to worry about, and the compiler stripped out everything we didn't use. Think C for the Macintosh mostly worked that way, and even Metrowerks CodeWarrior tried to have sane defaults. Before that, the first fast language I used, called Visual Interactive Programming (VIP), gave the programmer everything and the kitchen sink. And HyperCard practically made it its mission in life to free the user of as much programming jargon as possible.

I feel like I got more done between the ages of 12 and 18 than in all the years since. And it's not a fleeting feeling... it's every single day. And forgetting how good things were in order to focus on the task at hand now takes up so much of my psyche that I'm probably less than 10% as productive as I once was.

zozbot234
0 replies
3d23h

Microcode? I don't think that's how modern μarch works. You can definitely make modern compute accelerators more like a plain CPU and less bespoke, and this is what folks like Tenstorrent and Esperanto Technologies are working on (building on RISC-V, an outstanding example of "simple" yet effective tech), but a lot of the distinctive feature sets of existing CPUs, GPUs, FPUs, NPUs etc. are directly wired into the hardware, in a way that can't really be changed.

a-dub
0 replies
3d17h

sounds like you want transmeta and the jvm to have a post-oop baby that natively understands matrix and vector math, simd and massive parallelism.

m000
2 replies
4d3h

I kind of envy you for having access to Borland Turbo C++ and learning resources for it as a kid. The closest I could get to it was reading a review on my local computing press. Even assuming I could magically get a copy, I still wouldn't know what to do with it without reading material. And even if I had the reading material, I'm not sure how much I would make out of it with my fledgling knowledge of English at the time.

jerry_tk
0 replies
4d1h

Borland Turbo C++ (and Pascal as well) had great help documentation. Every function was thoroughly explained and there were a lot of examples. I learned C just by reading the Turbo C++ help docs. I miss that time.

elzbardico
0 replies
4d1h

I wouldn't have had access either, if it were not for the local pirate scene. Even owners of local software houses doing professional accounting systems didn't mind copying a few disks of software they had paid for to a kid who knew how to ask. If you got something new, you'd immediately share it with your colleagues.

And everybody kept in mind that if we ever started making money from our hobby, one of our first investments would be into buying properly licensed copies of the tools we used.

Keyframe
1 replies
3d22h

good old conio.h and Borland's BGI graphics.h :) SDL is kinda there for that today, albeit not as simple.

cmpxchg8b
0 replies
3d19h

xlib > bgi. That thing was so "powerful" it destroyed my monitor using a mode with suspect timings :D

Arnavion
0 replies
3d20h

Computer Science classes in Indian schools (fifth to eighth grades in my case) taught programming using Borland Turbo C++. That was back in the 2000s, but I wouldn't be surprised if they still use it today.

Borg3
14 replies
4d6h

Yeah, that's great in retro systems. If you gather the tools, you can rebuild them. I recently recompiled the good old Uplink game for fun. All I needed was Visual Studio 6.0. The devel archive contained all necessary deps to build the game. It was a bit tricky to do it right, but after an hour of fixing deps (I got rusty) I made it, and the game works fine :) I even fixed some minor glitches and added little improvements here and there.

Now imagine touching todays stuff.. Madness :)

saidinesh5
6 replies
4d

I think that's only with Linux's package management mess?

Most commercial embedded/windows stuff I have seen usually vendored their dependencies in a thirdparty/ folder AND fixed the toolchains they use. So the setup is always reproducible.

Borg3
2 replies
3d23h

Dunno, but for example I tried to compile Stalker SoC and it was a very messy adventure. It took me a whole day to build it (deps problems, compile errors), and it was such a disappointment because the game just crashed. Maybe I will try again one day.

Levitating
1 replies
3d8h

Well that's the X-ray engine for you

Borg3
0 replies
3d5h

Hehe, yeah ;) I ended up disassembling it, then, looking at the disasm + source code, I removed dynamic shadows in the DX8 renderer (I have a weak GFX card, and in DX8 the shadows don't look good anyway) and patched the binary :)

dormento
0 replies
3d23h

I do it that way with toolchains for exotic architectures. It's easier.

(I'm from the "Hey as long as it works" school of thought).

bombcar
0 replies
3d23h

A ton of work has to be done to build old minecraft mods from source.

Doable, but often tricky.

anthk
0 replies
3d2h

On Linux, most projects will just compile straight with few dependencies, or have the dependencies as submodules.

I compiled a 1994 game straight with just a few header changes.

With Windows you need the huge VB6 IDE. Or a DOS emulator.

Also, DirectDraw stuff under Windows > 7 will run at atrocious speeds, slower than literally emulating Windows 95 on qemu with no hardware acceleration. And lots of DX 7-8 games will be glitchy as hell.

sillywalk
3 replies
3d23h

I recently recompiled good old Uplink game for fun.

Cool. Cool game. I didn't know the source code was available.

gattilorenz
2 replies
3d23h

It’s available on the Internet Archive, and iirc there’s also a source-available multiplatform port/new engine on github

davikr
1 replies
3d18h

source-available multiplatform port/new engine on github

Can you link it, please?

ido
2 replies
4d4h

These days a lot of what you need is simply the right version of Unity or Unreal installed. You’re definitely correct for pre-unity times tho.

TeMPOraL
1 replies
3d7h

Given that both seem to be continuously self-updating subscriptionware, can you even install a specific version that's older than a year or two?

ido
0 replies
2d19h

Unity is not self-updating; you have to update it yourself. And they have versions from a long time ago available for install.

pjc50
8 replies
4d5h

Borland C++ was extremely good: a C++ compiler, standard library, and IDE with debugger that fit in about five megabytes. With the cozy yellow-on-blue Borland colour scheme.

jahnu
5 replies
4d4h

OWL was pretty good too.

pjmlp
4 replies
3d23h

I was quite surprised how bad MFC was by comparison. Then OWL got replaced by VCL, still quite good, while MFC stayed as bad as always.

While Borland had an approach that we could get nice high level frameworks in C++ (a sentiment that Qt also shares), Microsoft C++ folks seemed keen in keeping their beloved MFC as low level as possible.

I read somewhere that MFC originally was similar in concept, but it faced too much resistance internally and was rewritten, hence the outcome.

And to this day Microsoft hasn't been able to deliver a C++ framework that is as nice to use as those from Borland.

mschaef
1 replies
3d19h

I read somewhere that MFC originally was similar in concept,

This is what AFX was. They reused some of the core pieces, so the AFX name persisted into MFC.

but faced too much resistance internally, thus the outcome and was rewriten.

The concern was that developers who had just ascended the Win16 API learning curve would now have to ascend another totally different learning curve to understand the framework. MFC developed from a (supposedly) nice object oriented framework into a way to avoid explicitly passing handles to API calls. (It also replaced message cracking and a few other things.)

By the time Visual C++ rolled around, Microsoft started adding higher level abstractions to MFC and building it back out a bit, but the underlying damage was done.

And to this day Microsoft hasn't been able to deliver a C++ framework that is as nice to use as those from Borland.

ATL was supposedly quite nice, as was the mostly unsupported WTL derivative (that supported complete app development).

pjmlp
0 replies
3d11h

Yeah, that was the thing with Afx, kind of forgotten it, thanks.

ATL was, and is, anything but nice, unless one is into a deep love of COM without tooling. To this day Visual Studio still doesn't offer anything to alleviate the pain of dealing with COM, IDL and generating C++ stubs, because, just like with MFC and Afx, it seems there are many internal feuds against having nice tools.

The only time Microsoft finally created something nice to use COM from C++ (C++/CX), those internal feuds managed to kill it, replace it with a developer experience just as bad as ATL (C++/WinRT), and then when bored left the project to play with Rust/WinRT.

datavirtue
1 replies
3d19h

MFC completely turned me off of Windows programming until WPF was released. By then it was too late and I was neck deep in Swing.

Gibbon1
0 replies
3d12h

MFC made me run away and do embedded hardware and firmware full time. Because I would have had to go all in. And that would drive me insane.

rurban
0 replies
4d3h

Zortech was much better

Findecanor
0 replies
3d19h

It also came with printed manuals that included a C++ tutorial. I used it to learn C++ back in the day.

Isamu
8 replies
4d4h

Installing to the C drive! Luxury! We swapped those floppies back and forth out of the A and B drives because we had no hard drive and it kept prompting for the part of the compiler swapped out! And we enjoyed it! Kids these days don’t believe you when you rant about it!

devoutsalsa
6 replies
4d4h

In 1992, I had a 5.25 inch, double height hard drive with 8.9 MB of storage. I also had two 5.25 inch, 360 KB floppy drives.

I was jealous of my friend's Mac with a 720 KB disk drive and 30 MB hard drive (although his system probably cost 6x what my PC cost).

toast0
2 replies
4d3h

Likely just full height, not double height. What has become the normal height for a 5.25" drive is half height. I've seen plenty of full-height 5.25" hard drives, but I don't think I've seen double height (it would take 4 bays in a modern computer case, if modern computer cases had 4 bays).

o11c
0 replies
3d15h

I was going to say "my current desktop has 4 CD-capable bays", but I checked and it's actually only 3 bays followed by unused space (and then 2.5" and 3.5" bays below that).

devoutsalsa
0 replies
3d23h

It's been a while. It was twice the size of two 5.25 floppy drives. Whatever size that is.

xorcist
0 replies
4d2h

The full height drives were the 8" floppies. They hardly existed anymore in the 90s, not on PC systems.

Wikipedia has a nice article with pictures: https://en.wikipedia.org/wiki/Floppy_disk

Even if you didn't have that 3.5" drive for your computer, perhaps you can take solace in the fact that you understand intuitively why the disks were called "floppy"!

michaelcampbell
0 replies
4d3h

although his system cost probably cost 6x what my PC cost

How times have (not) changed, eh?

datavirtue
0 replies
3d19h

Mmmm... back when your computer had a smell. You mentioned that hard drive and the memories of computer aroma flash in my snoot. I had a friend who would seal up his Nintendo to trap the aroma.

mixmastamyk
0 replies
4d2h

‘92, not ’82.

yaky
7 replies
4d5h

I highly recommend his book on Wolfenstein 3D (mentioned at the end of the article). Even though it's technical, it is not dry or boring. And there are lots of old-school tricks and optimizations, like 64 unwrapped functions for scaling wall textures, storing the sprites "sideways", managing the wacky graphics card, and hacking the graphics modes to even be able to display something like a game in the first place.

bombcar
0 replies
3d23h

https://fabiensanglard.net/about/ has all of them at the bottom - there is one for Wolf3D and one for Doom - highly recommended.

The CP-System is also interesting, even for someone like me who never used one. But not quite as interesting (to me) as the other two.

raid2000
0 replies
3d22h

Thanks for the recommendation. I just downloaded the 431-page PDF on the game engine. It's even got a foreword by John Carmack.

endgame
0 replies
3d11h

His follow-up book on Doom is also a fantastic read.

busfahrer
0 replies
2d6h

Wholeheartedly agree. I did not know books like this even existed, and I enjoyed both of them (Doom as well) thoroughly. Too bad he's never going to write the Quake/i586 book.

sfc32
5 replies
4d5h

Had some problems opening the archive:

  Archive:  wolfsrc.zip
  End-of-central-directory signature not found.  Either this file is not
  a zipfile, or it constitutes one disk of a multi-part archive.  In the
  latter case the central directory and zipfile comment will be found on
  the last disk(s) of this archive.
  unzip:  cannot find zipfile directory in one of wolfsrc.zip or
          wolfsrc.zip.zip, and cannot find wolfsrc.zip.ZIP, period.

giantrobot
3 replies
4d4h

With curl you'll likely need to give the '-L' flag to follow a redirect automatically.

Yup. The link gives a 301 redirect to the HTTPS site. With wget it follows redirects automatically.

ozymandias1337
2 replies
4d4h

wget FTW.

giantrobot
1 replies
4d3h

I don't dislike wget at all but I appreciate the explicit behavior of curl.

ozymandias1337
0 replies
3d20h

Fair 'nuff.

pjmlp
4 replies
4d5h

Sweet memories of Borland's MS-DOS IDEs.

datavirtue
1 replies
3d19h

I keep a box with the Turbo C manuals in my very precious bookshelf space.

pjmlp
0 replies
3d11h

I still have all the manuals from all Borland products; they were great.

On rainy days, I still browse through them.

For those that never saw them, they are available at the Internet Archive.

xyzelement
0 replies
4d3h

If you have nostalgia for this, check out Free Pascal. It's an open-source Pascal/Delphi clone, and includes a clone of the blue DOS Turbo Pascal editor that will give you the vibe you crave.

I used it for a few of the Advent of Code days this year, but man the nostalgia wore off. Both the limitations of the IDE and the verbosity of Pascal weren't "fun" to operate in daily coming from the modern world. But definitely sweet memories.

Dwedit
0 replies
3d11h

I think RHIDE still has source code available, even if it's 23 years since the last update?

Real problem with implementing such a terminal-mode GUI is that Unix/Linux terminals clung to a terminal standard which didn't provide all the features of an MS-DOS text mode program. In particular, it can't recognize the Alt key by itself, or Alt+Letter key combinations. Or Ctrl+Shift+Left/Right keys.

sebastianconcpt
2 replies
4d

Oh the memories..

One thing that we need to rescue/reinvent from that era are TUIs, which would look amazing on higher-resolution screens.

bombcar
1 replies
3d23h

The speed you could get on those "box character TUIs with light-up Alt-key shortcuts" was insane. I can remember going to Fry's Electronics, where everything was managed by some (probably mainframe) computer that was interacted with via terminals (later a terminal emulator on Windows), and the employees who knew it could get you a printed-out quote for the cage before the thing had even finished drawing the first screen.

sebastianconcpt
0 replies
3d22h

dot matrix printer background noises

Sorry I didn't get the last part of what you said, come again?

jpm_sd
1 replies
3d22h

Gosh, I really miss that DOS text-based UI. Reminds me of my old reliable PS/2, which I kept using up until about 1997!

einpoklum
0 replies
3d21h

You can use midnight commander on your terminal, at least...

aidos
1 replies
4d6h

After seeing it mentioned on here a bunch of times over the years, I finally read Masters of Doom a couple of months back. Great book!

The games they were producing were incredibly exciting to play at the time, but it’s even more inspiring looking back at the history to see what a handful of scrappy kids could create.

busfahrer
0 replies
2d6h

Also, Romero recently released his memoir “Doom Guy”, has anybody here read it? Wondering if it’s any good.

visysl
0 replies
3d17h

It is an excellent book that can take you back through game development in the old days. I like the book very much; I finished reading it while waiting in line in the cafeteria years ago.