Turbo Pascal is what got me into programming. I remember spending hundreds of German marks on a license for Borland Pascal 7.0 and later Delphi 1.0 and 2.0. I ended up developing my first “commercial” software that I sold for money.
In the DOS era, Turbo Pascal was probably the easiest way to get into programming outside of BASIC. And on Windows 3.1/95, Delphi was eye-opening in showing how easy GUI programming could be.
In many ways, I feel like we have gone backwards from there. How was it possible for the Turbo Pascal / Delphi compiler to produce a small binary for a fully-featured GUI program when today similarly powerful software is orders of magnitude larger in size?
There was at least one elephant missing from that executable: Unicode support. It probably would not have fit into PC RAM back then.
Unicode support is cheap memory-wise. The expensive thing is having the fonts. The fonts can live on the hard drive until actually needed.
I suspect the issue is more the runtime overhead of supporting it across the entire stack. That's a bigger cost that people would only want to pay if they were actually going to use it. So then you're in the situation of having to support Unicode and non-Unicode versions of everything.
Most programs written in Pascal were UI programs (even in the client-server model the client was typically a specialized program, not a browser), so rendering a string would indeed require a font file. You could pre-render and cache in RAM some frequently used glyphs (the locale-specific alphabet, digits etc.), but hitting the HDD every time you need to render an emoji wouldn't be fast enough. Modern Unicode was simply not feasible.
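(To make the caching idea concrete: what I mean is basically a lazy glyph cache, where the font stays on disk and a glyph is read in only the first time it is drawn. A rough Free Pascal sketch, with made-up names and no real font-file format behind it, just to show the load-on-first-use structure:)

    program GlyphCacheSketch;
    {$mode objfpc}
    { Lazy glyph cache sketch: the full font stays on disk, and a glyph bitmap
      is read into RAM only the first time it is rendered.  Everything here is
      made up for illustration; there is no real font file behind it. }
    type
      TGlyphBitmap = array of Byte;        // placeholder for real pixel data
      TCachedGlyph = record
        CodePoint: Cardinal;
        Bitmap: TGlyphBitmap;
      end;
    var
      Cache: array of TCachedGlyph;        // tiny linear cache, fine for a sketch

    function LoadGlyphFromDisk(CodePoint: Cardinal): TGlyphBitmap;
    begin
      // Hypothetical: seek into the font file and read this glyph's bitmap.
      SetLength(Result, 16);               // pretend it is an 8x16 bitmap
      FillChar(Result[0], Length(Result), 0);
      WriteLn('disk read for code point ', CodePoint);
    end;

    function GetGlyph(CodePoint: Cardinal): TGlyphBitmap;
    var
      I: Integer;
    begin
      for I := 0 to High(Cache) do
        if Cache[I].CodePoint = CodePoint then
          Exit(Cache[I].Bitmap);           // hot path: glyph is already in RAM
      // Cold path: one disk hit, then keep the glyph around for next time.
      Result := LoadGlyphFromDisk(CodePoint);
      SetLength(Cache, Length(Cache) + 1);
      Cache[High(Cache)].CodePoint := CodePoint;
      Cache[High(Cache)].Bitmap := Result;
    end;

    var
      Bmp: TGlyphBitmap;
    begin
      Bmp := GetGlyph(Ord('A'));           // common glyph: read once, then cached
      Bmp := GetGlyph(Ord('A'));           // second lookup is served from RAM
      Bmp := GetGlyph($1F600);             // rare emoji: loaded only when needed
    end.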
HDD and memory sizes were growing very fast back then, so it would've been feasible even on fairly low-end hardware starting from the mid-1990s or so. If you could run "multimedia" or "DTP" software on such PCs, modern fonts ought to have been possible. The flip side is that old computers became obsolete very quickly back then; a few years were enough for very real generational changes.
Low-end hardware in 1995 was something like an i386 with 1 MB of RAM, meaning that just a modern Unicode font would consume most of the memory available to a program, probably leaving not enough space for the rendering code (which was not small by the standards of that time). In the second half of the 1990s I still maintained a classroom with 15 IBM PC XTs, which were still doing their job (our most modern hardware was Pentiums with 16 MB of RAM, IIRC).
That isn't Unicode though. That's a font. You can have bitmap Unicode fonts if you want.
Let's put it this way: say you have a Unicode-aware library and only ever use the ASCII-compatible code points. You aren't using any more space for fonts (see the sketch below).
If you want to read a Chinese document, yes, you would then need to install Chinese fonts. That would take space, yes. But it's possible, and if you only speak Chinese that's something you have to deal with.
Could you have a font for every Unicode code point loaded at the same time? Probably not, but most people don't need to read most code points most of the time.
If you use an emoji, load it into RAM then. You aren't (generally) going to be using Arabic, Chinese, Japanese, Sanskrit, ancient Egyptian and emoji all in one document on one screen. If you're using Japanese, load the Japanese font; if Chinese, use the Chinese font.
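To make the "ASCII costs nothing extra" point concrete, here's a minimal Free Pascal sketch (program name made up; it only demonstrates the encoding itself, not any particular text stack): ASCII text stored as UTF-8 is byte-for-byte the same size, and only non-ASCII code points cost extra bytes.

    program AsciiVsUtf8;
    {$mode objfpc}{$H+}
    var
      Ascii: AnsiString;
      Wide: UnicodeString;
      Utf8: UTF8String;
    begin
      // Plain ASCII text: its UTF-8 encoding is byte-for-byte identical.
      Ascii := 'Hello, Turbo Pascal!';
      Wide := Ascii;                              // widen to UTF-16
      Utf8 := UTF8Encode(Wide);                   // and encode as UTF-8
      WriteLn('ASCII bytes: ', Length(Ascii));    // 20
      WriteLn('UTF-8 bytes: ', Length(Utf8));     // also 20
      // Only non-ASCII code points cost extra, e.g. U+00FC (u-umlaut):
      Wide := WideChar($00FC);
      WriteLn('U+00FC as UTF-8: ', Length(UTF8Encode(Wide)), ' bytes');  // 2
    end.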
Extended ASCII was the way forward then.
Yes, because Unicode did not exist at the time.
Today you can just use Lazarus and recompile (see the sketch below). The lowest of the low-end machines today are Atom netbooks, Core Duos and Raspberry Pis. I'd consider using an Atom or a Raspberry Pi B+ the same as using a 486 in 1999.
On fitting in RAM, it depends. From 1993 to 1998 the changes were huge.
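For example, old Turbo Pascal-dialect code like this still builds unchanged with Free Pascal's TP compatibility mode (a sketch; the file name is made up, and the compiler switches assume a current FPC install):

    program HelloTP;
    { Old-style Turbo Pascal code that modern Free Pascal still builds, e.g.:
        fpc -Mtp -O2 -XX -Xs hellotp.pas
      -Mtp selects Turbo Pascal compatibility mode; -XX smart-links and -Xs
      strips symbols to keep the resulting binary small. }
    uses
      Crt;               { the classic screen unit, still shipped with FPC }
    var
      Ch: Char;
    begin
      ClrScr;
      WriteLn('Hello from 1992-style Pascal, built with a 2020s compiler.');
      Write('Press any key to exit...');
      Ch := ReadKey;
    end.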
That's because the "fully featured" of the 90s would be barely usable today. Or to rephrase: the frameworks and programs of today are not "similarly powerful" to the ones from the mid-90s. Even if you just recompiled the app from 25 years ago with the current version of your framework (theoretically ;)), it would gain support for Unicode (code pages were just...), internationalization, accessibility, support for networks (who still knows Novell?), and so on. Except for SAP; they somehow succeeded at combining the user-hostility of 90s UIs with the resource consumption of contemporary programs ;).
Turbo Pascal was the language I moved to after BASIC, which is what I started with.
Depends on what you mean about SAP. Reports at least are super easy to write and have better usability than other in-house stuff I've seen. UI5 now has components that you can use with React, which gives you the web experience. SAP GUI is kind of okay, and fast. What do you mean?
Probably core ABAP development in some module.
Joe Armstrong once said, "You wanted a banana but what you got was a gorilla holding the banana and the entire jungle." That's a very catchy metaphor, although he was talking about C++-style OOP with bloated classes. Still, it feels like it could be generalized to software bloat, right?
What you're arguing seems to be "you think you just wanted a banana, but it turns out the customers needed various banana sizes, levels of ripeness, even special alternative hypoallergenic banana breeds, so instead you get a gorilla to fetch you the right banana from the jungle for each situation."
I do think that there is truth to that, but to be honest I think that in an ideal world that would account for some of these orders of magnitude, but not the majority of them. It is really, really easy to underestimate just how much faster and bigger computers have gotten in the last thirty years.
What I find a more compelling explanation for the majority of this increase in software size and CPU usage is that letting software bloat and slow down to the level that customers will tolerate is a way for developers to externalize costs. A lot of developer convenience comes at the cost of the end user, imo. And even if the developers care, the companies that employ them don't mind saving money that way.
It's about 2 marks to the euro, apparently.
For anyone not up to date with ~30-year-old exchange rates.
~20. The Euro (or "Teuro", as it was often called in Germany at the time) was introduced in... Uh, 2000 or 2001, thereabouts.
Ok. I googled and it said ~1.9.
I seem to remember it being the late 90s so that's my bad memory.
Funny, we also call it t'euro in Yorkshire too!
Ah, the mark. It's been a while. Maybe this was just in Britain, but the name of the mark was never translated into English when spoken or written; it was always just the Deutschmark.
And not only top-down procedural programming, as was the norm for Pascal in those days; the Breakout demo was a lesson in OOP.
With the first versions of Turbo Pascal, changing the code and seeing the results was practically immediate even on the machines of that time, as long as the program and the source fit in RAM. And the interaction was designed to be immediate too. If I remember correctly, a compilation error didn't produce endless screens of reports; it just repositioned me on the line where the error was. If the cause was a typo, I was able to edit, recompile and run in a second. Most more recent tools have other ideas, requiring more unnecessary activities from the developer.