The gift of Grace: COBOL's odyssey from Vietnam to the Square Mile
Add 50, 1964 GIVING Anniversary
Cobol is the language most associated with mainframes, especially the IBM System/360, whose 50th anniversary is being celebrated, or at least commemorated, this week. But when COBOL was first spawned in the late 1950s, it wasn’t intended for programmers.
It was aimed instead at “accountants and business managers” – basically a Stone Age Excel that turned into something quite different. In fact, IBM spent serious effort fighting against COBOL becoming a standard, for both good and bad reasons.
The ironies run deep here: Grace Hopper is on the wall of every school as a “role model” for women in programming, despite being a prime mover behind what is widely regarded as the worst mistake in computing history – a language lacking almost every feature that good programming languages have, as well as being absurdly verbose, as in:
ADD First, Second GIVING Sum
It was also instrumental in the US armed forces’ work in Vietnam, which also is not universally regarded as a success. 1950s IBM (or 1990s IBM, for that matter) didn’t like any standard it did not control and could not use to leverage its vice-like grip on the whole IT market, which was centred on mainframes.
But once the Codasyl bandwagon started rolling, IBM got on board with alacrity, and the irony turned full circle: with IBM now being a big player in open source.
From the '50s until the '80s, computer firms made the bulk of their money from hardware and developed software like Cobol, CICS, DB2 and VMS as ways of making the hardware usable, and therefore creating more demand for more horsepower. This worked to the extent that much of the commercial justification for IBM PCs was that each “smart terminal” increased the load on the "host" (IBM-ese for mainframe) by about half a MIP in profitable hardware terms. Today, the two remaining Cobol compiler teams actually use much the same techniques for optimising Cobol as for C++.
Real Programmers, of course, used Assembler and Fortran, because compilers were pitifully inefficient even though instruction sets were simpler. Processors did not require you to re-order instructions for best performance and RISC wasn’t even sci-fi yet.
If you can remember COBOL in the '60s, you probably were there, as it quickly became the language of choice for online data processing: handling payments, stock control and real-time air defence systems. No, I’m not making that up. We scared the Russians so much with our sophisticated COBOL-based defences that they never dared attack, and it may or may not be coincidence that President Putin started throwing his weight around just as we retired them. They held on for decades because, despite the sneering of the cool kids (like me) who pushed C and C++, and the Quiche-Eating Pascal/Java fanbois, the mainframe Cobol stuff focused on being rock solid rather than interesting; ask yourself whether you want a more reliable air traffic control system or one that’s more exciting.
Pay and status reflected the different levels of complexity, with Assembler programmers earning more than COBOLers and mainframe people earning more than those on minis. That’s on average of course, but for most of the period 1955 to 1995, the bigger the machine you worked on, the more you got paid: Mainframe > Mini > PC. Even when most of us moved to Client/Server during the '80s and '90s, running big database systems was one of the best gigs in town... if you could stand the boredom.
The S/360 had made programming less of an experimental science and more of something that at least called itself a form of engineering (something that still hasn’t fully happened). This was because the S/360 was designed, rather than built on the basis of “we need a thing to do that, so we’ll stick it in this part”, and it gave rise to the most important book ever written on software engineering, The Mythical Man-Month, a book of essays so good that not only have four of my copies been stolen by lesser programmers, but I’ve spent my own money buying replacements.
A Portable Computer: circa 1977
Cobol had no problem embracing, and then crushing, the dreams of the '70s and minicomputer makers like PDP-11-maker DEC, Prime Computer, DG, IBM and even Wang Labs. Each of these players had a Cobol of its own, each different enough that code was easy to write but hard to port. This created a lock-in that enriched the manufacturers, but in the long run the only one left standing in the same sector is IBM.
That’s created a substantial niche that Micro Focus has done really rather well out of, providing a path to viable hardware and operating system platforms for many corporates. So why not just chuck it away and move to a language younger than the people writing it? You’d think that would happen, because as an IT pro most of your work is in changing things, but those under the delusion that the firm exists to make money are more than happy to leave well enough alone.
Cobol developers did rather well out of the big expansion of financial services in the 1980s. Although PCs were popping up on desks, they lacked muscle, the right applications and reliability, so VAXes were hot – so hot that one over-excited headhunter decided that because my mates and I did VAX C, this was much the same thing, referred to me as a “shit-hot Cobol ace” and somehow persuaded Chase Manhattan Bank to fly us out to New York, sight unseen. The rates offered were so good that for a good long minute I was tempted to spend the flight with the manuals on my knees, but a rare outbreak of honesty – and being part of the “Cobol will die soon” consensus – kept me out of harm’s way.
During her time in the Navy, Hopper was one of the first people to program the Harvard Mark I and its successors
Post-peak Java play
As the shoulderpads of the '80s gave way to scarily sharp haircuts of the '90s and COBOL became Cobol, the old order started to break down. It was not so much because of any change in supply and demand, but rather that a critical component of what you were paid was the fear in the heart of your bosses at the thought of you leaving – which is not exactly the same thing. Cobol was seen as “yesterday” and so employers felt with some justification that they didn’t have to pay so much to keep their Cobolers sweet.
This was in some small part my fault: I started writing about this cool new Visual Basic thing. It allowed you to knock up pretty GUIs, putting lipstick on the server and mainframe pigs – meaning bosses saw lots of “visible productivity” – while a lot of Cobol (+Fortran, REXX, etc) developers saw their future in PCs. Since BASIC started off life as a dumbed-down Fortran, they found it easy enough.
A standard trick in writing CVs is to “compress” older skills so as not to “confuse the message” that you’re fully into whatever is fashionable this season, which meant the visible number of programmers who knew Cobol dropped like a stone just as we were all gearing up for the Millennium Bug beanfeast, making it all the more lucrative.
So on the 50th anniversary of the S/360 Mainframe, the Cobol world is now largely divided between IBM building its compilers for its hardware and Micro Focus, which covers pretty much everywhere else.
I got lots of crap for calling “peak Java pay” a while back, pointing out that the supply of Java programmers was ever increasing, but that the demand was not infinite – meaning more programmers each earning less as the ratio moved against them.
Talking to IBM and Micro Focus, there is a good argument that precisely the opposite is going on. We’re now in a world where most of what can be rationally moved off Cobol and its happy gang of CICS, IMS et al has already moved away, leaving a hard core (or as Micro Focus would call it, a “strong heart”) of tech that just works.
Or at least nearly. All code needs tweaking as the business logic evolves, and if you have 100,000 programs (that’s programs, not lines of code) in Cobol, you’re going to add functionality in Cobol – which is of course how we got to 100K in the first place. Those 100K programs (yeah, it scares me too) have captured a lot of business logic. They document how exactly the firm makes money (or, in the case of insurance firms, the lies it tells to keep its money) far better than the piles of unread documents all large firms accumulate.
The problem with being reliable is that for years now this huge bulk of code hasn’t been improved, which is of course one of the reasons it has been reliable. Doing the un-sexy stuff in the back office, or more likely under the back office, means few people are familiar with the overall structure or have picked up “the way we do things around here”, and to make it more fun, the demographics imply that about 14 per cent of them are expected to retire in the next five years.
IBM reckons there’s roughly a million Cobolers out there, so that’s 25,000 to 30,000 jobs a year needing filling just to keep us where we are. But as we all know, merely knowing the language isn’t enough: it takes time to get to the position where your changes reliably do more good than harm, especially if all you have to go on is documentation written under duress 15 years ago by a guy who is now going to play golf until he dies and isn’t answering the phone.
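For what it’s worth, the arithmetic behind that estimate is simple enough, taking IBM’s million-strong headcount and the 14 per cent five-year retirement figure above at face value:

```python
# Back-of-envelope check on the Coboler replacement rate.
cobol_programmers = 1_000_000   # IBM's rough estimate of Cobolers worldwide
retiring_fraction = 0.14        # share expected to retire in the next five years
years = 5

retiring_total = cobol_programmers * retiring_fraction
jobs_per_year = retiring_total / years
print(f"{retiring_total:,.0f} retirements over {years} years, "
      f"roughly {jobs_per_year:,.0f} vacancies a year")
```

which lands comfortably inside the 25,000 to 30,000 range.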
What's cool about Cobol?
Part of the joy of journalism is hearing about really mad stuff, as occurred when IBM fellow Kevin Stoodley introduced me to the idea of Cobol for mobile phones. Forms-based apps on any platform work with little change to the massive beasts that book tickets, provide insurance quotes, etc, but gesture-based interfaces require that the top-level mainframe code is refactored to be responsive to my ham-fisted gropings of a fondleslab.
This is already generating some useful paid employment for some of you. Fortunately IBM and Micro Focus have spent the necessary effort to ensure that Cobol can still access a good set of modern libraries, so you’re not restricted to ISAM, SNA, Embedded SQL and CICS.
But last week, at an F#unctional Londoners Meetup where the hip young programmers met to look at the Mbrace distributed processing framework, something quite interesting happened.
I felt a bit old when they started talking about the need for better transaction controls, and didn’t embarrass my son, who was there (yeah, I’m old), by saying what they needed was CICS – but it did rather look that way.
CICS/Cobol has been the centre of mega-scale transaction processing for 40 years and it does it rather well, mostly not requiring the programmers to think all that hard, and if you’ve had to debug maliciously complex multithreaded code, even in a language like Java, you should be able to appreciate this quality. The number of transactions handled by CICS per second makes the combination of Facebook, Twitter and LinkedIn look like the traffic for CreationistsWhoCanActuallyThink.com.
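To see why “not making programmers think too hard” is worth appreciating, here is a toy sketch in Python (purely illustrative – the ledger and names are invented, and a CICS programmer writes none of this plumbing by hand) of the locking and rollback a transaction monitor takes off your plate:

```python
import threading

class ToyLedger:
    """Hand-rolled transactional transfer: the kind of locking and
    rollback plumbing a TP monitor such as CICS handles for you."""

    def __init__(self, balances):
        self.balances = dict(balances)
        self._lock = threading.Lock()

    def transfer(self, src, dst, amount):
        with self._lock:                      # serialise concurrent transfers
            if self.balances[src] < amount:
                raise ValueError("insufficient funds")
            self.balances[src] -= amount      # debit first...
            try:
                self.balances[dst] += amount  # ...credit may fail (bad account)
            except KeyError:
                self.balances[src] += amount  # roll back the debit
                raise

ledger = ToyLedger({"A": 100, "B": 0})
ledger.transfer("A", "B", 40)
print(ledger.balances)  # {'A': 60, 'B': 40}
```

Multiply that by thousands of resource types, partial failures and restarts, and you get a feel for what CICS has been quietly doing since the 1970s.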
Part of the image problem for Cobol is the way we remember the development environments, based upon green screens and the gruesome 3270 terminal whose default reaction to almost any stimulus was to lock the keyboard. These days Cobol has (at last) caught up with what we have come to expect for Java and C++, so you get to use Eclipse and Visual Studio, albeit having to type faster to cope with the bulky syntax.
It probably doesn’t shock you to learn that few universities teach Cobol. On my degree it was treated rather like a 1950s public information film about venereal disease, and some courses barely even mention it. IBM runs competitions like “Master the Mainframe” to try to get bright young things into the game, but it’s hard work pushing uphill against the trendy languages.
There are apparently hundreds of universities that give it at least a skim, and people say good things about Marist College, but supply vs demand does seem to be drifting in favour of developers – even if Cobol ultimately does have more yesterdays than tomorrows. ®
Dominic Connor worked on getting Cobol to work properly under OS/2, which was so successful that he’s now a City headhunter.