
Interviewing my mother, a mainframe COBOL programmer (2016)

ptmcc
30 replies
22h42m

I can only imagine the fat paycheck a 20-year-old mainframe programmer would get, though, because your age in this case would be invaluable.

It's funny, people frequently assume this would be true but reality doesn't really bear this out. It's typically pretty average to even below average which contributes to the talent pipeline problem.

The other side of it is that it's not the technical stuff like "knows COBOL" that is so immensely valuable. Any average dev can "learn COBOL", but that's not actually the valuable thing. The anecdotes of COBOL programmers coming out of retirement for 500k/yr contracts have little to do with COBOL itself, and everything to do with their accumulated institutional knowledge of the giant ball of business logic encoded in that COBOL.

If these banks and other institutions actually did write fat paychecks to young mainframe programmers the demographic problem they're facing might not be so bad.

Deprecate9151
8 replies
21h57m

I used to work at a large insurance company with a COBOL core system. I completely agree with this. They paid for people who knew all the undocumented idiosyncrasies and foot-guns in the code. Not just "Knowing COBOL". A decent programmer could read a COBOL program and follow it.

They wouldn't know that the reason the key export fell over when extracting data for your 10-K filing was that someone had decided/assumed/whatever that a certain record count would never go over 3000, so they hard-coded the program to just error out if it went above that value.
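The kind of booby trap described here can be sketched in a few lines (hypothetical names, and Python standing in for the original COBOL):

```python
MAX_RECORDS = 3000  # someone's undocumented assumption, hard-coded long ago

def export_records(records):
    """Hypothetical key-export step: works fine for years, then dies the
    first time the record count crosses the forgotten cap."""
    if len(records) > MAX_RECORDS:
        raise RuntimeError("record count exceeded internal limit")
    return [r.upper() for r in records]

print(export_records(["a", "b"]))   # routine nightly run: fine
try:
    export_records(["x"] * 3001)    # the 10-K extract that falls over
except RuntimeError as e:
    print("export failed:", e)
```

The failure only surfaces when the data finally outgrows the assumption, which is exactly why the person who remembers the assumption is the one who gets paid.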

jSully24
2 replies
16h36m

I just shared this story earlier today with my team. This was a COBOL insurance claims processing system, shortly before the Y2K issues.

The claims processing system I worked on used a numbering system that only went to 4 digits; it would roll over to 0 if we ever processed more than 9999 claims overnight.

Our VP would not listen to us when we said we needed to change that. He said "Claims process fine every night! There is no problem." I had left before they got to that 10,000th claim (thankfully).
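The counter behaves like a COBOL PIC 9(4) field: once a value no longer fits in four digits, the high-order digit is silently dropped. A minimal Python sketch (the real system's details are unknown):

```python
def next_claim_number(current):
    # A 4-digit field wraps silently: after 9999, the next value stores as 0.
    return (current + 1) % 10000

n = 9998
for _ in range(3):
    n = next_claim_number(n)
    print(n)  # prints 9999, then 0, then 1: the 10,000th claim restarts the sequence
```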

That said, anyone need an old COBOL developer?

Edit: now I’m the VP of Engineering. We do work on tech debt regularly.

julian_t
1 replies
9h0m

Fortran, not COBOL, but I came across a similar thing when helping to update code for Y2K. An insurance company used a 2-digit field for year of birth, assuming that it would always be prefixed with '19'. I sort of forgave them because the code had originally been written in FORTRAN II sometime in the 60s, and who would ever assume that their code might still be in use 40 years later?

trealira
0 replies
3h31m

In the 1960s, there were still 70 year olds born in the 1890s, and 80 year olds born in the 1880s, so it still doesn't seem like a good decision even for something written back then.
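The underlying problem is that two digits simply cannot encode the century. The standard Y2K-era patch was a sliding "pivot window," which only trades one assumption for another. A hedged sketch (the pivot value is illustrative):

```python
def year_from_two_digits(yy, pivot=30):
    """Classic windowing fix: two-digit years below the pivot are read as
    20xx, the rest as 19xx. It defers the ambiguity, never removes it."""
    return 2000 + yy if yy < pivot else 1900 + yy

print(year_from_two_digits(85))  # 1985
print(year_from_two_digits(7))   # 2007
# A customer born in 1885 still decodes as 1985: with two digits there is
# no way to tell an 80-year-old from a newborn a century later.
```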

CapsAdmin
2 replies
17h54m

I feel like you and the parent are saying a company pays well if you master COBOL as opposed to just "learning" COBOL.

xboxnolifes
1 replies
17h31m

It has nothing to do with mastering COBOL. It has everything to do with mastering the banking systems.

mastazi
0 replies
13h10m

Yeah, that's true of anything related to the finance sector IMHO regardless if it's COBOL or Java or Python.

The language you happen to be using in your system is almost inconsequential in the grand scheme of things, because it's like 1% of everything that's going on and that you need to know.

quickthrower2
0 replies
17h48m

Training a GPT on all that COBOL could be invaluable, at least as a "footgun catcher" acting as a code reviewer for less experienced devs.

ecshafer
0 replies
21h31m

When I was working at a finance company that used COBOL, sure, I could read COBOL. I could probably write some COBOL. But the extra crap around COBOL (how mainframes work, DB2, JCL, etc.) is where it gets complicated.

beretguy
6 replies
21h0m

Last time I checked DMV of SC pays $50K salary with no remote work option for COBOL position. And requires 2 years of experience. Granted, it’s a state job, I get it, but still.

uudecoded
4 replies
20h48m

It's probably because they technically have to make that job posting and let it sit before handing it off to a contract firm for $300k.

bdw5204
2 replies
13h5m

What is stopping somebody who'd be inclined to take a COBOL job from just starting a contract firm and taking the $300k from the government? Does the government check if they actually have employees who have 30 years of COBOL experience before handing out those contracts?

kawhah
0 replies
4h11m

Nothing is stopping them; this is exactly what people do.

Spooky23
0 replies
6h45m

There’s a lot of compliance work involved in getting government contracts.

Also, there’s usually 2-3 layers of pimps (prime and subcontractors) who take a vig at each layer. Total comp for the contractor is usually rate divided by 3 or 4.

beretguy
0 replies
17h0m

I think so too.

chiph
0 replies
16h23m

It's been years since I looked, but mainframe jobs at the big banks in Charlotte were a little less than that.

Whenever I hear about a skills shortage (in any field), I automatically append the phrase "at the salary we want to pay" and that completes the equation.

ChrisMarshallNY
3 replies
15h50m

There’s the apocryphal tale of The Retired Engineer:

An engineer retires from his company, after a 40-year career. He left at a senior technical level.

Some time later, he’s contacted by his old company, begging him to help them fix a problem that has the current staff absolutely stymied. They have been at a standstill, for weeks.

He agrees, and shows up. He sits down at a workstation, examines the behavior, looks at some code, then says, after about five minutes: “Here’s your problem. If you just do this, it will fix it.”

He hands in an invoice for $10,000.

The beancounters flip their wig. “We can’t pay $10,000 for five minutes’ work! Itemize it, and tell us exactly why you think it’s worth it.”

He takes the invoice, turns it over, and writes on the back:

    1) Fixing the bug in five minutes ........................... $20
    2) Six years of college, and 40 years of experience,
       so I can fix the bug in five minutes ................. $9,980

neilv
0 replies
14h6m

A great tale. I usually hear a version about a halted production line, with the punchline invoice something like:

    1. Hit machine with hammer ....... $1

    2. Knowing where to hit it ... $9,999
The production line versions have the element of the customer losing money by the minute, which suggests that the customer received the value and is only confused/petty because it seemed so easy for the engineer to provide it.

marttt
0 replies
4h11m

Another version, IIRC true story, from a small European country. A legendary artist and book illustrator (real person) meets a client in a cafeteria. The client says, I want a drawing with this, this and this, and it should carry a feeling of something like this.

The artist takes out a drawing pad and says to the client, would you excuse me for 15 minutes, I need to draw in silence and solitude.

The client leaves and sits at a different table.

After 15 minutes, the artist hands over the drawing and names a price.

So much? cries the client. This is quite expensive. You only drew for 15 minutes.

Well, says the artist. In order to draw this in 15 minutes, I have been practicing drawing for at least 5 hours every day for the last 50 years.

Absolutely love this story (and the mentioned artist as well, he was a huge inspiration for many).

D-Coder
0 replies
11h58m

I had heard of this re Charles Steinmetz, but apparently it goes way back:

"The earliest instance located by QI ((Quote Investigator)) appeared in “The Journal of the Society of Estate Clerks of Works” of Winchester, England in 1908.":

https://quoteinvestigator.com/2017/03/06/tap/

tristor
1 replies
21h2m

If these banks and other institutions actually did write fat paychecks to young mainframe programmers the demographic problem they're facing might not be so bad.

When I first started work I was involved in banking/insurance mainframe stuff, and it was honestly a pretty terrible working environment, for reasons that spanned from the mundane (like dress codes) to the esoteric (like the horrible, crufty legacy codebase). I would have put up with it if it paid well, but it didn't. In fact, it paid significantly worse than the job I'd had before, which was installing physical cable plant (fiber optics and copper Ethernet). The reason I took it was that at least the office had air conditioning, but it certainly wasn't something I wanted to turn into a career.

As a mid-level Windows sysadmin doing customer-facing phone support at a cloud provider, I made nearly 30% more than I had as a junior mainframe guy at a bank/insurance company. The difference in the commonality of the skillsets was massive, yet the pay was significantly worse for the mainframe work despite it being a rare skillset in a high-value industry. I remember that at that job, the guys who worked on trading-desk backend code were far better compensated while working in easier development environments on less esoteric codebases; the mainframe folks were the lowest paid of the developers at that company, at every level of seniority. The only person we worked with who was well compensated was an independent contractor who'd worked there for 30 years before retiring.

owlstuffing
0 replies
19h13m

I had similar beginnings. Late 80s, transforming literal reams of 360/70 assembly to "modern" COBOL 85. I was still earning my CS degree at the time.

As you know, the pay wasn’t so hot for stuff like that back then. But it was an improvement over manually unloading flatbed trucks stacked with steel conduit, which was my prior line of work!

Today I think it’s much different. Some banks and insurance companies and the like still at least partially run on software as old as the Apollo missions. Maybe I’m wrong, but finding people still alive with that knowledge intact must be difficult and I imagine their compensation reflects that simple supply/demand formula. Shrug.

wdb
0 replies
18h46m

You don't want to know how much you can ask when you are one of the few that can help a multinational migrate away from an old database system ;)

thaumaturgy
0 replies
20h21m

Yep. I have COBOL experience (on a mainframe even!), and I'd even be willing to do it again -- and I'm young by the standards of COBOL programmers, so somebody could even get 20 years out of me still.

But the wages being offered for those roles are abysmal relative to what's available for other skills, and modern life is applying a great deal of pressure to chase larger paychecks, sadly.

lowken
0 replies
14h24m

100% true. The value is not in knowing COBOL; the value is hard-core banking or insurance knowledge.

jongjong
0 replies
20h53m

In tech, it's only true if you know some dirty secrets. The more dirt you are exposed to, the more you keep your mouth shut, the more it pays. The thing is that teams are so large and compartmentalized that nobody knows the true horror of what's going on.

jmclnx
0 replies
18h53m

It's funny, people frequently assume this would be true but reality doesn't really bear this out. It's typically pretty average to even below average which contributes to the talent pipeline problem.

This. I went into programming over 40 years ago. At the time I worked in a warehouse, and I had to take a 5% pay cut to switch (IIRC). But in the long run it was worth it. Most people there stayed in warehouse/manufacturing work due to the pay.

importantbrian
0 replies
21h18m

I went to grad school with a guy who was in that situation. He worked at a bank and got laid off during the financial crisis. The plan was for the bank to port the old system over to Java or something like that, and they were "close" to the end of the project and they thought at that point the rewrite team was comfortable enough with COBOL that they could do the rest of the rewrite without him. Turns out that yes they understood COBOL just fine, but they desperately needed his institutional knowledge. He ended up agreeing to come back on a part-time basis at some obscene consulting rate for however long the transition took and in the meantime, he did grad school part-time to skill up.

aitchnyu
0 replies
13h13m

Indian companies are still training tons of new graduates in mainframe.

DaveSchmindel
0 replies
21h5m

I can vouch that this is true in some cases.

I worked for a multinational bank headquartered in the U.S., whose corporate team came up with a sexy program to snatch up recent computer science graduates. The program offered them a "fair" market rate for an entry-level job in the industry, and let the graduates try two or three different teams and/or departments in their first year. At the end, if the company still liked them, the candidate got to pick their permanent team for a full-time position. This position, however, did not come with a bump in pay, and by that time the program had effectively filtered out candidates who weren't performing at an accelerated level.

The results were great for the business: young programmers that were often more proficient in COBOL than their new peers were costing them $60k-$70k a year. The senior, or "tenured," peers were in the $200k-$400k range in some cases.

dgadj38998
9 replies
21h45m

ISPF is directly connected to the mainframe, and there’s no such thing as a local development environment here.

That's pretty crazy

Sounds like having the whole team SSH into one server and doing all the work through the terminal

I'm imagining editing my co-workers files and just removing a random semicolon to mess with them

flyinghamster
5 replies
20h46m

That was the 1960s-80s way of working with computers. Multiple terminals connected to a single timesharing mainframe (or mini, for that matter). You might have a network like DECNET, but mostly it would be hardwired terminals, or if you were less lucky, a dialup modem and an acoustic coupler at a blazing 300 bps. You were sharing one machine with potentially dozens of other developers.

Legacies of this abound in Linux (even the ability to SSH in is a descendant of this world). Commands like "who" are there to show who's logged in, and there's even an entire process accounting system that can be switched on to bill your users for CPU time.

dgadj38998
4 replies
20h30m

Was there a way to message other people who are logged in? Like a slack equivalent?

theodpHN
0 replies
12h43m

Long before Slack, there was TSO SEND. :-)

https://www.ibm.com/docs/en/explorer-for-zos/3.1.1?topic=mes...

ianmcgowan
0 replies
19h57m

talk, or later ntalk, on Unix. Or wall to send to everyone. I used a system in the 80's (Pick DB) that had a send-message command. Also Tandem, which allowed you to take over someone's terminal, like screen-sharing now. If you were the super-user you could do it without asking for permission, and start typing messages to the unlucky user.

flyinghamster
0 replies
20h24m

On Unix, there was the "talk" command for split-screen chat, and also "write" to just send a message to someone else's terminal, subject to permission via the "mesg" command. Not all systems had such provisions (or, in some cases, they had to be written by users).

debo_
0 replies
14h18m

We used `talk` all the time in our university Solaris labs when I was there from 2000-2005. Mostly insults, but it was fun.

bennysaurus
1 replies
20h9m

There is source code management these days, even on mainframes, but many shops still have you log into the same "box" to do your development. It can get pretty fun when trying to coordinate testing and patches.

skissane
0 replies
11h26m

There is source code management these days, even on mainframes

Source code management on mainframes has been around for decades now. Pansophic started selling Panvalet in 1970, and Broadcom still sells CA-Panvalet.

Closer to RCS than to Git in feature set.

but many shops still have you log into the same "box" to do your development

It is very common to have separate LPARs for production, test and development, even if all three are running on the same physical hardware - the isolation is strong enough that it is rare for something running in one LPAR to cause a problem in another.

The largest shops will have physically separate mainframes for production and non-production.

dajt
0 replies
9h24m

That was it. You had to organise amongst yourselves who was editing which program, and there was no version control other than whatever you could come up with using manual copies. It was primitive.

I was using an Amiga at home and had been using Xenix and honest-to-god AT&T UNIX in my previous job, so I had some idea of what I was missing. I also used IBM S/36s and weird Burroughs things, so I got a pretty wide education in different types of systems in a short time.

pavel_lishin
7 replies
23h1m

I wish I had talked to my grandmother more before she developed dementia. I knew she was a mathematician and programmer, but briefly speaking to her a few years back, she mentioned that one of her jobs was calculating orbits for satellites in the Soviet Union.

At least, I hope they were satellites!

ca_tech
3 replies
22h45m

I would highly encourage anyone who has even thought about doing this: do it. If you are wondering how to get started, I recommend the StoryCorps app. It is easy to use, and they have a bunch of prebuilt questions, or you can create your own. You record directly to your phone, but if you are so inclined, you can upload your interview to the Library of Congress. https://storycorps.org/participate/storycorps-app/

pavel_lishin
1 replies
18h20m

I actually interviewed my mom about ... 15 years ago? About some topics. But the fucking recordings are locked up in a proprietary recorder that requires software running on Windows 95 to unlock it, and I don't even know if the software exists anymore.

I should make that my December project, to find a way to unlock those files.

wfvr
0 replies
17h21m

Check if they aren't just XOR'ed; I've seen that used before as cheap "encryption".
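For the curious, single-byte XOR is trivially reversible: try all 256 keys and keep the one whose output looks like text. A toy sketch (the scoring heuristic is crude; real tools use proper letter-frequency statistics, and the sample plaintext here is invented):

```python
def xor_bytes(data, key):
    """XOR every byte with a one-byte key; the same operation both
    encrypts and decrypts."""
    return bytes(b ^ key for b in data)

def crack_single_byte_xor(blob):
    """Brute-force all 256 keys, keeping the one whose output is printable
    and richest in common English characters."""
    common = b"etaoinshrdlu ETAOINSHRDLU"
    def score(bs):
        if any(not (32 <= b < 127 or b in (9, 10, 13)) for b in bs):
            return -1          # unprintable output: almost certainly not text
        return sum(b in common for b in bs)
    return max(range(256), key=lambda k: score(xor_bytes(blob, k)))

# Hypothetical sample; the actual recorder's format is unknown.
secret = xor_bytes(b"Interview with my mother, recorded long ago", 0x5A)
key = crack_single_byte_xor(secret)
print(key, xor_bytes(secret, key))
```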

artemavv
0 replies
22h35m

There is also a project dedicated to saving the wisdom of earlier generations: https://savewisdom.org/the-1000-word-save-wisdom-questions/

CapricornNoble
1 replies
22h36m

I'm in a similar boat. My mother worked with software simulations for the AEGIS system. Not very common work for black women in the 1980s. She earned her mathematics degree in 3 years, graduating at 20, and used to program computers with punch cards in the 70s.

Now she can't tie her shoes consistently.

To everyone reading: capture the stories of your loved ones' life accomplishments before it's too late!

Tor3
0 replies
21h49m

This. Not related to the main story here (I'm the first programmer in my family), but my great-grandfather was born in 1865 and lived a long life; he was blind for his last 30 or 40 years but had a phenomenal memory. A neighbor had the insight, just after WW2, to use one of the very first tape recorders, together with my father, who was a boy at the time, to record my great-grandfather's stories about what he did in life. When my father retired he cobbled together a working tape recorder from two or three old ones and copied the stories over to cassette, and later to CD. I listened to one of those recordings the other day: my great-grandfather told how, as a young boy around 1872-1873, he met a very old man who had learned the origin of the name of the place where they both lived from another man when he himself was a boy. That origin turned out to be very different from what everyone these days is guessing, but immensely more plausible. Brought to me from the 18th century, via my great-grandfather.

rsynnott
0 replies
21h36m

At least, I hope they were satellites!

Could be both: https://en.wikipedia.org/wiki/Fractional_Orbital_Bombardment...

danielodievich
7 replies
22h29m

My grandmother programmed with punch cards; I have no idea on what hardware, and it's too late to ask. My father did a bunch of Fortran and COBOL on USSR mainframes and then did a bunch of Y2K work here in the USA. One of the neatest things I have from this is a printout of one of his Fortran programs, from (I think) a Minsk-32 mainframe, which he ripped into 3 pieces to wrap a developed large-format 64mm film of some mountains he shot a long time ago. The program seems to be called MATR1; it does matrix manipulations and refers to the topography of the land in its comments. I have it framed on my wall near my workstation. I have coded in a variety of languages most of my life, and now my teenager looks to be interested in coding and is doing Java and Python in high school. Here's to a 4th-generation programmer!

999900000999
6 replies
22h26m

Sounds like a neat movie idea.

The same code base passed down though generations.

Legacy code.

jacobyoder
5 replies
21h44m

The code will only run on some specific soviet-era hardware, and the last hardware was destroyed 20 years ago. There's some new regime that is rumored to have rebuilt the original hardware, and is now on the hunt for the missing source code that is framed above the great grandchild of the original developer, who was executed for treason. The new bad guys are gunning for the source code, and it has to be protected at all costs, to prevent a takeover of the world.

I'm already seeing the inheritance jokes write themselves.

Someone have ChatGPT write the screenplay now.

Stratoscope
2 replies
17h42m

Done!

CODE REDUX: Legacy Preserved

"In a world where the past and present collide, a family's legacy becomes the key to saving the future."

https://chat.openai.com/share/2f0c9eec-c3b8-4145-86d7-2c617a...

testplzignore
1 replies
17h28m

They have the code, then need to steal the code, then need to upload the code???

elzbardico
0 replies
4h17m

Now think about the fact that there are startups right now promising to use LLMs to filter job applicants.

micah94
0 replies
21h17m

I think you just wrote the prompt right there. Step 2. ???? Step 3. PROFIT!

ikesau
0 replies
20h49m

Class Warfare, out 1717243201.

shortsightedsid
6 replies
22h52m

What’s your working environment like? - We’ve recently moved to a more “hip” location. We used to have personal desks, but now we have this “pick whatever spot is available” open area. I dislike it a lot.

Somehow this resonates a lot with me, even though I've never worked on mainframes or anything like that.

artemavv
4 replies
22h36m

I, too, strongly prefer having a personal desk. It is completely natural for humans to set up their environment according to their tastes and preferences, and I'm baffled that some office designers do not account for that.

maximinus_thrax
1 replies
19h22m

Depressing to see how low we've gotten. Never mind people wanting an office (with a door) to minimize interruptions; you now need to advocate for your own fucking desk?

tomcam
0 replies
11h21m

That jumped right out at me too. When I started at Microsoft a quarter century ago, they did their best to get everyone in an office.

psunavy03
0 replies
55m

Because there's a subset of designers and architects who feel that they are uniquely enlightened and brilliant, and their job is to dictate to poor benighted proles how they should live their lives, rather than serve and empower human beings who have their own dignity and agency.

Witness modernist architecture: "you shall all live and work in soulless concrete boxes, because I the brilliant architect have decreed that ornament is superfluous, and I can shape you into Properly Thinking People by manipulating your environment."

citrin_ru
0 replies
9h34m

I think everyone prefers to have a personal desk, but for hybrid schedules (work from home with office days) it can be wasteful, and for a few days a month I could put up with a random desk. I see no point in forcing people to use a random desk when everyone is in the office most days.

altacc
0 replies
8h21m

I wonder if most offices are like ours, where we have "flexible seating" but everyone has their spot and sits at the same desk whenever they're in, and most people respect that. So your chair, monitor position and brightness, desk height, etc. are all as you like them. It makes a mockery of the hot-desking concept, and we'd all much rather they officially abandoned the idea.

vanderZwan
5 replies
23h9m

This position is the most important one in the bank, at least from a technical standpoint. If, let’s say, my mother and everyone on her team would quit their job, the bank would go under within a matter of weeks if they’re lucky.

And given how big the market share of Nordea is in Sweden (and other Nordic countries, for that matter) that would probably bring down the Swedish, and possibly Nordic economy. Which would then impact a lot of the EU as well, I guess.

Ever since reading this article I've wondered whether the COBOL programmers who keep banks like this running are an enormously underestimated "bus factor" for many of the world's economies, and what kind of back-up plans exist for such a scenario.

[0] https://en.wikipedia.org/wiki/Nordea

bear8642
1 replies
17h29m

programmers that keep banks like this running are an enormously underestimated "bus factor" for many of the world's economies, and what kind of back-up plans they have for such a scenario.

Hmm, makes me wonder about potential similar situation with APL programmers - a language I know quite a few large companies use.

glaucon
0 replies
13h45m

Wow, I'd forgotten APL existed. My finger-in-the-air sense is that while APL occupied/occupies some important places in infrastructure, the actual number of lines of code (and so, roughly, the maintenance effort required) is significantly smaller. I'm not doubting that what it does is important, just thinking that the size of the installed base that will, in time, need to be replaced is quite a bit smaller than COBOL's.

TomK32
1 replies
22h39m

As a programmer who has been hit by a bus (after riding that same bus for five hours to its end stop in western Greece), I'd like to point out that getting hit and killed by a car is a much higher risk. Add the rise of SUV sales, which are more dangerous to pedestrians and cyclists, and you might be as surprised as I am that banks don't go down due to dead programmers, but due to bad accounting and fraud...

vanderZwan
0 replies
4h39m

If only "SUV factor" rolled off the tongue better, I'd switch in a heartbeat.

I hope you came out of it with nothing worse than a scare?

rsynnott
0 replies
21h40m

and what kind of back-up plans they have for such a scenario.

I mean, notoriously, stop working, for weeks: https://en.wikipedia.org/wiki/TSB_Bank_(United_Kingdom)#Migr...

(I've got to assume that loss of institutional knowledge was a big factor in that fiasco.)

hyggetrold
4 replies
22h54m

I'm very curious about IMS - a mentor described it as a graph database of sorts back in its day. I was told it was a great system (at least for its time). Does anyone have a perspective?

epc
2 replies
22h7m

Hierarchical, not relational, not what I’d call a graph database. Pre-SQL.

https://en.wikipedia.org/wiki/IBM_Information_Management_Sys...

hyggetrold
1 replies
21h31m

Yeah I think what they actually called it was "network database" which I thought maybe was like a graph.

skissane
0 replies
9h27m

IBM IMS isn't a network model database, it is a hierarchical model database.

The heyday of network model databases was after hierarchical, but before relational took over - basically the 1970s.

Hierarchical databases basically allow you to have parent-child relationships between tables, to represent many-to-one relations - something people commonly model nowadays using RDBMS foreign keys, but RDBMS foreign keys are much more flexible and can model other relationship types too, whereas hierarchical only allows parent-child.

Network model databases generalised hierarchical databases to allow more complex relationships than just parent-child. The biggest difference between them and the relational model is that they didn't have a query language like SQL; they offered a procedural API for processing one record at a time and manually navigating the links from table to table. The network model was standardised by CODASYL, the same committee which invented COBOL, with the result that network databases became particularly popular with COBOL applications.

The most famous network model database is probably Broadcom's IDMS (previously known as CA IDMS), which nowadays only runs on IBM and Fujitsu mainframes, but in past decades was supported on other mainframe and minicomputer platforms as well. Oracle also offers a CODASYL network model database for OpenVMS, which it got from Digital when it bought Rdb.

Graph databases have something in common with the historical network model, but some significant differences (1) graph databases have much more flexible schemas (defining arbitrary properties on a node as opposed to a rigid structure of fixed format record types), (2) graph databases generally have a query language with support for graph traversal operations, as opposed to the procedural API the network data model offered.
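The access-style difference described above can be caricatured in a few lines: hierarchical and network databases gave you a procedural, record-at-a-time API, and the program itself walked the links (hypothetical data; real IMS DL/I calls look nothing like Python):

```python
# Hierarchical shape: children hang off exactly one parent.
customer = {
    "name": "ACME",
    "orders": [
        {"id": 1, "items": [{"sku": "A"}, {"sku": "B"}]},
        {"id": 2, "items": [{"sku": "A"}]},
    ],
}

def skus_ordered(cust):
    """Record-at-a-time navigation, the pre-SQL way: the application,
    not a query planner, decides how to traverse the links."""
    found = []
    for order in cust["orders"]:      # roughly: FIND NEXT order WITHIN customer
        for item in order["items"]:   # roughly: FIND NEXT item WITHIN order
            found.append(item["sku"])
    return found

print(skus_ordered(customer))  # ['A', 'B', 'A']
```

In SQL the same question is one declarative join; here the traversal strategy is baked into the program, which is part of why migrating off these systems is so painful.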

one_buggy_boi
0 replies
16h17m

One of my first projects was to interface with an IMS system using Python, of all things. There was only one guy who really knew the system, and I had to sit with him for hours to even begin wrapping my head around how a hierarchical DBMS worked, how space is managed, etc. I remember the first comment he made was that the database was created for use as an inventory system for the Apollo missions. The DB I had to work with was created in the 80's. This project was in 2021, so it's still out there, supporting critical infrastructure.

jll29
3 replies
21h41m

Thanks for sharing the story of your developer mom - a very cool mother to have!

COBOL is not a "cool" language, but mainframes have been around long enough to be "retro cool" now, and most run Linux at least as an optional OS under some virtualization (IBM Z).

As a doctoral candidate in the noughties, I purchased a book about FORTRAN due to its "retro-coolness" and read it, and eventually took on a short university gig, tutoring architects/engineers in FORTRAN 95 for a bit, to earn back what the book had cost, which was fun. To date, though, I could not bring myself to do the same with COBOL. Or not yet.

Because it is so verbose, if I had to use it I would probably write in another language and transpile to it.
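That "write in another language and emit COBOL" idea in miniature: even a trivial generator absorbs much of the boilerplate. A toy sketch, not a real transpiler:

```python
def cobol_hello(program_id, message):
    """Emit a minimal COBOL program; the verbosity is exactly what makes
    generating it mechanically attractive."""
    return "\n".join([
        "       IDENTIFICATION DIVISION.",
        f"       PROGRAM-ID. {program_id}.",
        "       PROCEDURE DIVISION.",
        f'           DISPLAY "{message}".',
        "           STOP RUN.",
    ])

print(cobol_hello("GREETER", "HELLO FROM THE TRANSPILER"))
```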

trealira
0 replies
3h16m

My university still has books from that era: one from 1975 on teaching yourself FORTRAN IV, one from 1980 on learning IBM 370 assembly, a book from the 60s called LISP 1.5 Primer. They're old enough to be "retro cool," and likely expensive to buy now (though most of the collection is from the late 80s and 90s, not as old as these). It seems cool to me, although a greater selection of newer books would be nice.

flyinghamster
0 replies
20h38m

I've probably touched on this before, but back in the 1980s, absolutely none of the computer courses I took ever even so much as touched an IBM mainframe. I almost got the impression that there was an effort not to teach them.

On the other hand, you could expect to learn a new operating system every year. Oh, we don't use DEC gear at this school, check out our spiffy Prime running PRIMOS. Lather, rinse, repeat.

dajt
0 replies
9h46m

My first jobs in the late 80s/early 90s were COBOL programming. At my second one I finally convinced the management to put me onto the tools team who wrote the code generators for the '4GL'.

One of the things I did was write a COBOL pre-processor (in C) to allow re-use of code and variables so the variables looked like locals rather than globals.
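The trick of faking locals in COBOL's single flat namespace can be approximated with a rename pass: give every "local" a per-module prefix so two inclusions of the same code can't collide. This is a toy regex rewrite in Python, not the original C preprocessor, and the LOCAL- naming convention is invented for illustration:

```python
import re

def localize(source, prefix):
    """Rewrite every LOCAL-* identifier to carry a module prefix, so the
    flat WORKING-STORAGE namespace behaves a bit like scoped locals."""
    return re.sub(r"\bLOCAL-([A-Z0-9-]+)", rf"{prefix}-\1", source)

snippet = "MOVE ZERO TO LOCAL-TOTAL.\nADD LOCAL-AMT TO LOCAL-TOTAL."
print(localize(snippet, "PAYROLL"))
```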

hnthrowaway0315
3 replies
22h28m

1 TB of transactions per year seems like pretty small data. I guess the IBM stuff is there for the stability then?

awestroke
1 replies
22h22m

The IBM stuff is pure legacy. Of course this would all perform much better with modern tools, like Postgres.

accra4rx
0 replies
16h59m

Just because the IBM stuff is legacy doesn't mean it is slow. Almost nobody is moving from a commercial database (Db2 for z/OS) to PostgreSQL because of performance; it is all to do with the huge licensing cost. Talk to any veteran: nothing matches the speed and stability of the mainframe. Search for the z16 (tell me anything else that guarantees nine nines of availability).

dan-robertson
0 replies
19h59m

Using DB/2 surely makes a lot of sense for integration with the rest of their IBM tech but also something the new system could presumably talk to too. I would guess that it performs ok at the kind of ‘single big OLTP database server’ tasks it was meant for. Notably IBM allows you to publish benchmarks for their databases (so long as you provide detailed methodology and take steps to tune it correctly) whereas Oracle and SQLServer do not allow publishing benchmarks without permission. Though one could argue this is just IBM feeling like benchmark performance wouldn’t matter much to their customers.

derefr
3 replies
21h0m

Banking systems are also extremely advanced. A personal bank account differs a lot from a business bank account, and there are at least 50 different types of bank accounts for each of them.

I wouldn't necessarily call that "advanced"... maybe more like lacking, at requirements-analysis time, insight into ways to factor the business domain into HAS-A component relationships, such that individual components and their ADTs can be shared across parent types [which are really just template/factory objects for a smaller set of actual types] and initialized with simple parameters that get used as formula variables with no piecewise logic.

To be clear, I write this as someone who works for a company that maintains a unified representation of data across the blockchain ecosystem — where each blockchain has its own peculiarities about what an "account" is, what a "transaction" can do, etc. Our data model only has one toplevel account type, one toplevel ledger-transaction type, etc. To handle the peculiarities, our data model instead has a large hierarchy of smaller data-objects hanging off of those toplevel ones; where any given data-object is sort of an "optional extension" that may or may not be there depending on how the toplevel object was created.

This approach allows us to just have one unified code-path that treats every account like every other account, every tx like every other tx, etc. We don't have duplicate code or a hierarchy of subclasses that all do things slightly differently; but instead, for any ledger-transaction, there may or may not be e.g. a strategy-pattern object hanging off that tx — and if there is, it gets used instead of the default static one. It's great for maintainability, testability, predictability, cacheability, and hundreds of other things.
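The shape being described might look something like this in Python (a minimal sketch with entirely hypothetical names; the actual data model isn't shown in the comment):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class InterestPolicy:
    """Optional 'extension' object: present only on accounts that accrue interest."""
    rate: float

    def accrue(self, balance: float) -> float:
        return balance * self.rate

@dataclass
class Account:
    """The single top-level account type; all variation hangs off it."""
    owner: str
    balance: float = 0.0
    interest: Optional[InterestPolicy] = None  # absent => no interest logic

def run_interest(acct: Account) -> float:
    # One code path for every account: use the hanging strategy object
    # if present, otherwise fall back to the default (no accrual).
    return acct.interest.accrue(acct.balance) if acct.interest else 0.0

checking = Account("alice", 100.0)
savings = Account("bob", 100.0, interest=InterestPolicy(0.02))
print(run_interest(checking), run_interest(savings))  # 0.0 2.0
```

Instead of 50 subclasses each reimplementing interest slightly differently, there's one `run_interest` path and 50 different configurations of attached objects.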

I'd love to know if there's any good reason that a bank would actually want "50 different types of bank account" on an implementation level, rather than these all boiling down to one type with varying values of certain state variables + presence/absence of certain foreign-key relationships.

Other than maybe "some of these account types are actually a part of completely different data models living in third-party systems, that we acquired, and then never merged into our own systems." ;)

boricj
2 replies
20h36m

I'd love to know if there's any good reason that a bank would actually want "50 different types of bank account" on an implementation level.

Not having the benefit of hindsight?

Some banks are hundreds of years old. Most of them were computerized seventy years ago if not earlier, back in the stone age of computers. These kinds of banks won't bet the house on a newfangled system every couple of years because some bright-eyed engineer told them it's the trend nowadays.

Not messing up their bookkeeping is their number one priority. People would riot if their bank told them "sorry, we no longer know how much money you had deposited with us".

fl7305
0 replies
18h40m

Some banks are hundreds of years old.

Since the post is about a Swedish bank, it might be interesting to note that the central bank in Sweden was founded in 1668.

https://en.wikipedia.org/wiki/Sveriges_Riksbank

Kinda makes you wonder if anyone opened an account then and deposited a dollar ("daler" in Swedish), and let it sit and accrue interest for the family for 350 years.

derefr
0 replies
1h31m

These kinds of banks won't bet the house on a newfangled system every couple of years because some bright-eyed engineer told them it's the trend nowadays.

The thing about banking is that they keep records of everything — not just "state now", but all previous states, and all the state deltas, and all the commands that produced those deltas, and an audit log of who/where/when the requests were made that triggered those commands. Financial-ledger databases are the original CQRS event-streaming reducer systems.

And this actually means that it's very easy to produce a new system that is provably "at parity with" an existing system. No faith required. You take your complete historical CQRS event stream from your existing system, stream it through the new system, and see that it produces the same state that's in the existing system. If it does — and if you're operating on years of real data — then that's more evidence for exact parity than a test suite would ever be.

(You may also want to produce hypothetical CQRS command streams and run them through both the existing and new systems. This would mostly be useful for regression-testing of edge-case logic that is required for e.g. compliance, but so rare that it has yet to ever actually come up in practice.)
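The parity check is simple enough to sketch in a few lines of Python (toy reducers and invented command shapes, obviously, not a real core-banking system):

```python
# Replay one historical command stream through both the old and the new
# reducer and compare every intermediate state, not just the final one.

def old_reducer(state, cmd):
    op, amount = cmd
    return state + amount if op == "deposit" else state - amount

def new_reducer(state, cmd):
    # The rewritten implementation: at parity only if it yields
    # identical states over the full historical stream.
    op, amount = cmd
    sign = 1 if op == "deposit" else -1
    return state + sign * amount

def replay(reducer, commands, initial=0):
    states = []
    state = initial
    for cmd in commands:
        state = reducer(state, cmd)
        states.append(state)
    return states

history = [("deposit", 100), ("withdraw", 30), ("deposit", 5)]
assert replay(old_reducer, history) == replay(new_reducer, history)
print(replay(new_reducer, history))  # [100, 70, 75]
```

With years of real commands instead of three toy ones, a passing comparison is very strong evidence of behavioral parity.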

Most of them were computerized seventy years ago if not earlier, back in the stone age of computers.

You can do everything I mentioned — factoring out your business rules into simpler rules that apply to a collection of orthogonal sub-ledgers — on paper. These ideas are not "newfangled"; while they were introduced into computing through the formalism of relational algebra (as database normalization), the ideas existed well before the mathematical formalism for them existed. Clever people have been simplifying their paper records into orthogonal sub-ledgers since the invention of double-entry book-keeping in 1494.
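The invariant that makes this factoring safe is easy to demonstrate — a toy double-entry sketch in Python (account names invented):

```python
from collections import defaultdict

# Every posting is a pair of equal and opposite entries, so the whole
# book always sums to zero no matter how it is later split into
# orthogonal sub-ledgers.
ledger = defaultdict(list)

def post(debit_acct, credit_acct, amount):
    ledger[debit_acct].append(amount)
    ledger[credit_acct].append(-amount)

post("cash", "equity", 1000)    # owner funds the business
post("inventory", "cash", 400)  # buy stock with cash

total = sum(sum(entries) for entries in ledger.values())
print(total)  # 0
```

That zero-sum property holds on paper exactly as it does in code, which is the point: none of this required a computer, let alone a newfangled one.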

brightball
3 replies
23h8m

.

rajamaka
1 replies
22h49m

Very creepy. I was talking about programming and, alas, I come to HN and topics related to programming are on the front page!

brightball
0 replies
22h47m

And yet, you don't see much about mainframes and COBOL on here ever...nor do I often have lunch conversations about it.

The timing was just interesting.

debo_
0 replies
14h11m

You have a point.

ape4
3 replies
20h48m

I'm surprised banks don't have a domain-specific language (DSL) to describe their businesses. Instead each bank has to code "savings account" in COBOL.

skissane
0 replies
9h46m

Instead each bank has to code "savings account" in COBOL

At most banks nowadays, basic banking functions such as saving accounts are provided by off-the-shelf software, not written from scratch.

That's not to say that nothing gets written from scratch, but it is generally adding code to support the more exotic product offerings, integrations with other systems, institution-specific business rules and processes, etc - any off the shelf banking system is going to have basic bread-and-butter stuff like support for savings accounts already included.

This is true even for COBOL-based banking systems. Historically, many banks used CSC Hogan (now DXC Hogan)-and while many have moved away from it, some are still on it. Hogan runs on IBM mainframes (z/OS), and is written in COBOL and CICS. Basic stuff like savings accounts is supported by the vendor-provided COBOL code, but Hogan sites often end up writing their own custom COBOL code to support their own unique requirements.

I'm surprised banks don't have a domain-specific language (DSL) to describe their businesses.

There is a long history of 4GL's being used in banking, going back decades. In prior decades, many of these 4GLs worked by generating COBOL code. Some of them were general-purpose, and used across many industries; others were exclusively used in banking. But, even those exclusively used in banking, rarely (to my knowledge) contained domain-specific banking features, and as such I'm not sure they really count as DSLs.

To give a specific example, FIS Global's core banking platform, Profile, is written in MUMPS (actually the open-source variant GT.M). Due to how horrid MUMPS is as a language, they created their own higher-level object-oriented language which compiles to MUMPS, called PSL (Profile Scripting Language), and gradually rewrote their banking platform in it. An old version of PSL was open-sourced over 10 years ago on Sourceforge, if anyone wants to look at it. [0] I don't think PSL itself has anything really banking-specific in it per se, but as is inevitable with a language developed to support a single application, the boundary between the language and the application is a little blurry, and you'll find a lot of banking-specific stuff in the open source release (possibly included by accident), especially inside the binary GT.M database dump it ships with.

I think the areas in which truly domain-specific languages are strongest in finance - such as modelling financial contracts - are generally the furthest away from the COBOL legacy.

[0] https://sourceforge.net/projects/pip/files/PIP/V0.2/ see also more easily digestible (somewhat improved) Git mirror at https://gitlab.com/YottaDB/DBMS/YDBPIP/

ravenstine
0 replies
16h43m

It's an interesting idea, but I'm not sure how that's the least bit surprising.

dan-robertson
0 replies
20h14m

What domain do you think COBOL was made for?

Basically the only things computers were worth it for back then were scientific calculations and bookkeeping. Banking is obviously on the more complicated end of bookkeeping. Insurance did get a special COBOL variant because I guess they had some tasks shaped more like scientific calculations too.

zengid
2 replies
22h33m

<Anecdote> My first programming job was at a transportation enterprise with many programmers that had been working there 15-20+ years, many of whom mainly maintained the COBOL that ran the business. Many of those senior programmers were women; it felt like over 50% in the senior cohort. What really made me sad was that the younger programmers there were predominantly male (although still quite a few female!). That company mainly hired out of the local engineering colleges, so it was an interesting case study on how the number of women entering software engineering programs went down over the years. </Anecdote>

duderific
1 replies
19h30m

I recall hearing (can't remember where) that in the olden days, computer programming was considered akin to typing or secretarial work, so at first it attracted mostly women.

demondemidi
0 replies
8h53m

Women code breakers were literally called “computers”:

https://www.smithsonianmag.com/science-nature/history-human-...

syngrog66
2 replies
21h32m

the 1st professional programmer I knew was a woman who did COBOL for a bank

it's struck me as interesting ever since, because over the course of my life and career it's seemed that 99%+ of the programmers I knew of were male.

I don't think there is any one right/ideal breakdown by gender; I just like to have the best sense of the facts on the ground.

dan-robertson
1 replies
19h41m

I think one thing is that the programmer job somewhat evolved from data entry and clerical roles that employed plenty of women already. So in some sense it didn't start as a 'man's job'. For slightly later generations of programmers, one can observe that computer science degrees peaked near 40% female in the mid '80s in the US[1]. It's closer to 20% now, and there are many more graduates today, so you should probably expect a proportion somewhere around 20% of working CS grads. That number doesn't seem crazily unrealistic to me for working programmers, though obviously it varies between companies. <1% does seem unrealistic to me. I'm slightly surprised that that's been your experience (I assume you aren't exaggerating, since you want a sense for the facts on the ground).

[1] eg https://www.gcu.edu/blog/gcu-experience/analysis-women-compu...

pflanze
0 replies
2h49m

that the programmer job somewhat evolved from data entry and clerical roles that employed plenty of women already

This reminds me of a presentation[1] that showed pictures on how early computers were advertised and what roles programmers (women) took in the advertisements, which echoes your statement.

(For context, see response to a comment of mine here[2]; more[3] info[4])

[1] https://youtu.be/5zN83hvn68U [2] https://news.ycombinator.com/item?id=37411221 [3] https://www.bbc.co.uk/programmes/b08wmk5l#playt=0h10m07s [4] https://web.archive.org/web/20190713175933/https://datasocie...

suslik
2 replies
9h29m

My grandma has turned 95 this year. She still works in a research institute (ex-USSR) every day, writing numerical simulations in Fortran and Maple. Her colleagues still don't want her to retire, and she postpones that every year. Her knowledge of numerical methods and mathematics is exceptional, and I am extremely proud of her.

lufte
0 replies
3h39m

You should interview her!

jgilias
0 replies
8h43m

Oh man! This is the coolest thing I’ve read in a week at least. I want to be like your Grandma when I’m 95.

Thank you for sharing!

spidermonkey23
2 replies
20h57m

My company, which uses mainframes and COBOL, is undergoing a migration to Linux. This involves porting all the batch command scripts to bash and also fixing quirks from COBOL compiler differences. At the end of it, though, the system is at least 4x faster and much more maintainable. This was the solution chosen instead of the complete rewrite to Java that they backed out of some years before.

dajt
0 replies
9h30m

In the early 90s we ported our COBOL software from the Wang VS to various types of UNIX by purchasing and extending a Wang VS runtime emulator.

Then I wrote a terminal emulator for Windows 3.1 that would parse the forms sent down by the runtime and display them as something that looked like a Windows app. This wasn't too difficult because mini & mainframe systems present data entry pages a whole screen at a time, like a windows form.

bennysaurus
0 replies
20h14m

Do you know what they're using? Are they going to Micro Focus COBOL or doing something different?

spelunker
1 replies
21h46m

My mother-in-law worked her entire career in an IT department of an insurance company. She never really did much programming, but became a source of domain knowledge over time.

She has all kinds of amusing stories of ye olden days. Before a proper computer system, everything was stored in physical documents of course, and her first project out of college was traveling to satellite offices to re-organize the file systems there to match the new strategy that HQ had come up with. The business of a satellite office would grind to a halt while a team of people literally took every single document out, re-labeled it, and put it back. The whole project took over a year.

That gave some perspective about my job, lol.

posix86
0 replies
18h48m

Given how banks were pushing the limits of paper back then, I can't imagine the complexity of a modern bank. I just don't know enough about finance to even have an idea of where the complexity lies.

scrlk
1 replies
23h42m

The original HN submission from 2016: https://news.ycombinator.com/item?id=12096250

"The banking programming world is a completely different world than what most of us are used to"

If you're after another read on this topic, "An oral history of Bank Python" is good: https://calpaterson.com/bank-python.html (also previously on HN: https://news.ycombinator.com/item?id=29104047)

SilasX
0 replies
22h54m

Oh wow, I can’t believe that was seven years ago! I never got around to interviewing my mom about programming mainframes with Michigan Algorithm Decoder:

https://news.ycombinator.com/item?id=12097032

gcanyon
1 replies
15h39m

Funny, my mother was a mainframe COBOL programmer, from about 1968 until about 1975. Then she picked up FORTRAN, then she stopped programming around 1985. By 2000, she could barely handle the web, email, and a desktop.

Keep your skills up, people!

tomcam
0 replies
11h18m

I’m retired and I still keep my skills up. Just a tiny bit paranoid!

vkoskiv
0 replies
6h17m

Pretty good explanation of why the app I use for sauna and laundry room bookings in my apartment building is more reliable than Nordea's online banking. Not even a month ago, the Nordea app was showing all my bank accounts as having a balance of 0€. Yikes!

Considering what they are working with, I'm surprised they don't have even more downtime.

otteromkram
0 replies
23h14m

This is great! I wonder how the update(s) went since the article was written in 2016 and they mentioned 4 years per country, or 16 years total. COVID-19 probably threw a wrench in that plan, but how big of an impact is another interesting question.

orsenthil
0 replies
16h34m

Unrelated: I saw a link for a medical book at the start of the article. Did the author include it, or is Substack inserting it as an advertisement without labeling it as one?

morbicer
0 replies
10h31m

Great read.

Stories like this put in perspective people that are moaning about tech debt... by which they mean some functions written a year ago in a style they dislike.

guyzero
0 replies
22h55m

Programming a mainframe while working from a hotdesk. Who would have imagined.

euroderf
0 replies
5h6m

They are trying to migrate to DB2 [..] It’s not as simple as just moving the data from IMS into DB2, they also have to update their modules to load & save data from DB2 instead of IMS and they have thousands of modules, many of which were developed by programmers that have either passed away or have retired.

I would think that this is a task tailor-made for A.I./ML

dfee
0 replies
22h27m

(meta) As a platform, substack content must follow a power law, right? That is, a few authors are most productive and there's a long tail of single- (or zero-) article substacks.

This article was migrated from Medium, but now the author gets to collect email addresses. But why subscribe? There are two articles. Is Substack an RSS replacement (potentially with a paywall)? Maybe I should just consider Substack a low-barrier-to-entry blogging platform that can be pointed at your personal domain for $50?

Definitely not trying to beat up on the author, just curious about the real value prop of Substack for "most" producers, and whether that's aligned with my goals as a reader who sees an email-collection modal on every visit.

daly
0 replies
13h54m

I taught COBOL while at grad school at UCONN. Methinks I'm getting old.

boilerupnc
0 replies
20h25m

Recently announced, there’s also now a generative AI tool to assist developers with their COBOL modernization journey to Java [0]. I’ve heard pretty decent reviews on its effectiveness.

[0]. https://www.ibm.com/products/watsonx-code-assistant-z

PeterStuer
0 replies
11h25m

During my time working as a systems integration consultant in the financial services sector, I had to do a lot of integration with these core banking systems.

Most of the time you try to reuse existing integration points from previous projects, as negotiating completely new interfaces had a lot of friction, both technical and business/compliance related, and could easily set your project back by a year or two.

Integrations usually mean delivering structured documents before a certain time in the evening for overnight batch processing. The existing documents, with data laid out at fixed offsets, have by design regions of not-yet-assigned space in them to accommodate future updates and uses. You negotiate over which of those bytes can be assigned to your needs if new info is required.
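As a rough illustration of what such offset-based documents look like, here's a toy fixed-width record parser in Python (the field names, offsets, and FILLER region are all invented, not taken from any real bank format):

```python
# Byte-offset layout for one record type; the trailing FILLER region is
# the "not yet assigned" space future projects negotiate over.
RECORD_LAYOUT = {
    "account":  (0, 10),   # bytes 0-9
    "amount":   (10, 22),  # bytes 10-21, zero-padded minor units
    "currency": (22, 25),  # bytes 22-24, ISO code
    # bytes 25-39: FILLER, reserved for future use
}

def parse(record: str) -> dict:
    return {name: record[a:b].strip() for name, (a, b) in RECORD_LAYOUT.items()}

line = "SE12345678000000012550EUR" + " " * 15
rec = parse(line)
print(rec)
# {'account': 'SE12345678', 'amount': '000000012550', 'currency': 'EUR'}
```

Claiming bytes 25-30 for a new field just means updating the layout table on both sides — which is exactly why those negotiations over individual bytes happen.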

For data extraction you will sometimes find more 'modern' APIs, but do not expect too much fine-grained REST stuff. You'll often find yourself in meetings negotiating with regulation and compliance, looking for ways to avoid costly new development.

On a side note: I often found compliance people a lot more pragmatic and solution-oriented than IT. Convincing my team that 'the letter' of a regulation did not mean you have to interpret what is written literally, in the most restrictive way possible, was often more of a challenge than getting a deal with compliance.

Now imagine hundreds of projects over many decades of these strata of integrations, and you will start to get a first glimpse of why replacing these core systems is such a challenge.