It’s COBOL all the way down

COBOL has been a mainstay of government, business, and banking operations for nearly 60 years—but how long can it be maintained?
Part of Issue 5, April 2018: Programming Languages

Before the term “information technology” existed to label the field, learning COBOL (Common Business Oriented Language) was a sure way to secure a lifelong career in IT. Developed in 1959, in part from an earlier language designed by the legendary Grace Hopper, COBOL was an early attempt at write-once, run-anywhere code at a time when software was typically written in assembly language tightly coupled to a particular mainframe, the only kind of computer around. COBOL’s design allowed for describing business processes, like breaking down and accounting for financial transactions, in a verbose procedural syntax that borrowed aspects of English.

Sixty years later, you’d expect COBOL to be the stuff of looping narrative videos at computer-history museums (in fact, there’s part of a floor devoted to mainframes at Seattle’s Living Computers museum, where you can see the old giants in action); maybe you’d hear the name COBOL used in an introductory lecture on computer science to explain how far we’ve come, and chuckle.

Instead, COBOL remains widely and actively used across the financial system, with no good plan for transitioning to modern codebases, nor for keeping a viable coder workforce active. That’s a problem, because while some schools still teach COBOL and many outsourcing firms train employees in it to meet their employers’ needs, it’s not enough. Someone has to maintain an estimated hundreds of billions of lines of COBOL that remain in use, with billions more being written each year for maintenance and new features.

Companies involved in keeping COBOL-based systems working say that 95 percent of ATM transactions pass through COBOL programs, 80 percent of in-person transactions rely on them, and over 40 percent of banks still use COBOL as the foundation of their systems. “Our COBOL business is bigger than it has ever been,” said Chris Livesey, senior vice president and general manager at Micro Focus, a company that offers modern COBOL coding and development frameworks.

The Bank of New York Mellon told Computerworld in 2012 that it had 112,500 COBOL programs representing 343 million lines of code in active use. (And, yes, they’re still hiring COBOL coders in 2018.) The U.S. Social Security Administration (SSA) noted in a 2014 report that it “currently has roughly 60 million lines of COBOL in production that support the agency’s high transaction volume and enable the agency to meet its regulatory, benefit, and reporting requirements.” Starting in 2012, the Commonwealth Bank of Australia spent a reported US$750 million and five years migrating its core software away from COBOL on a mainframe to a modern platform (it’s not clear how that effort ended).

The language never died, though its early practitioners have faded away, and the generation of programmers who built systems towards the end of the predominant mainframe era in the 1970s and ’80s are largely near or past retirement age. Micro Focus estimates that about 2 million people worldwide actively work with COBOL, although how many directly write or modify code is likely a small proportion. That number is expected to decline rapidly over the next decade.

It’s a slow-moving crisis with no crackling deadline, like Y2K, to focus the minds of chief information officers. Some code has run effectively with relatively few changes for decades, making it hard to come up with a convincing return on the investment required to transition to something more modern. So many companies have used COBOL reliably for high-volume transaction processing for so long—starting long before microcomputer-driven database servers could even begin to keep up—that the cost of moving to newer languages and hardware has simply outweighed the benefits.

As the SSA’s then-CIO, Rob Klopp, said at a conference in 2016, “I’m pretty much of the opinion that what we need to do is understand the business rules and the business process that’s embedded in these legacy systems and just rewrite.” He also noted that billions of dollars would likely be needed to implement this kind of migration across multiple agencies. Though that money was never allocated, the recently passed omnibus spending bill includes over $2 billion for IT modernization; that figure, however, mixes in maintenance funds and is clearly inadequate.

Still, transitions must happen. “The legacy applications are not really sufficient to support modern business requirements,” said Dale Vecchio, the chief marketing officer at LzLabs, which makes “software-defined mainframes.” Vecchio started his work life as a mainframe application developer and spent roughly 20 years at Gartner managing strategy for application modernization before joining LzLabs in 2017. “You can’t just rewrite the whole thing,” he said—but you can’t leave it in place, either.

Livesey of Micro Focus noted that the best you could hope for is spending millions and millions of dollars fraught with risk to get back what you got: “The best case is back to square one.”

The crunch is coming, and it could sink businesses unprepared to deal with a lack of programmers.

Old code knew best

The oldest computer program still running is arguably the Department of Defense’s Mechanization of Contract Administration Services (MOCAS), which manages over a trillion dollars across hundreds of thousands of contracts. It’s written in COBOL, though because it was launched in 1958, it was likely first written in a similar predecessor language, Hopper’s FLOW-MATIC. MOCAS has been updated routinely over the past 60 years and moved to ever-faster hardware, but it’s still the same at its core, consisting of about 2 million lines of code. Proposals to update it are floated routinely, but it’s still ticking away. (Not quite as old? The IRS’s main taxpayer data system from around 1960, written in assembly code.)

In a modern version of COBOL, an archetypal “Hello, world!” program looks something like this (although back in the 1960s, it would have been somewhat more verbose):

000100 IDENTIFICATION DIVISION.
000200 PROGRAM-ID. HELLO-WORLD.
000300 PROCEDURE DIVISION.
000400 DISPLAY 'Hello, world!'.
000500 STOP RUN.

The language ostensibly self-documents, because it’s so verbose and requires so many explicit declarations about what it’s doing. That’s an optimistic statement about any language. But its design was tailored to meet business needs. A series of operations in a company back office—like selecting, sorting, and summarizing the contents of a series of index or punch cards—could be replicated as an algorithm in COBOL. However clunky it might seem, the approach wasn’t to reevaluate how the operations were performed, but to transform them into code.
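A minimal sketch gives the flavor of how those back-office operations translated into code. The program and field names here are hypothetical, but the style is faithful: fixed-decimal PICTURE clauses describe record fields the way a punch-card layout would, and verbose verbs like ADD and MOVE spell out each step of the calculation.

```cobol
       IDENTIFICATION DIVISION.
       PROGRAM-ID. DAILY-TOTALS.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
      * Record fields declared up front, mirroring a card layout:
      * seven digits, an implied decimal point, and two decimal places.
       01  TRANSACTION-AMOUNT   PIC 9(7)V99 VALUE 125.50.
       01  DAILY-TOTAL          PIC 9(9)V99 VALUE ZERO.
      * An edited picture for display, suppressing leading zeros.
       01  DISPLAY-TOTAL        PIC Z(8)9.99.
       PROCEDURE DIVISION.
      * Accumulate a transaction into the running total, then
      * format and print it.
           ADD TRANSACTION-AMOUNT TO DAILY-TOTAL.
           MOVE DAILY-TOTAL TO DISPLAY-TOTAL.
           DISPLAY 'DAILY TOTAL: ' DISPLAY-TOTAL.
           STOP RUN.
```

The data declarations double as documentation of the record format, which is part of why COBOL earned its self-documenting reputation, deserved or not.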

“When I started in the early 1960s, we went into an organization and most of the people there didn’t even know what a computer was,” said Bill Hinshaw, the founder and CEO of COBOL Cowboys, a consulting firm made up of folks like him who spent their lives in the mainframe mines and who can now earn hefty consulting fees for coming back to help. “You had to extract from the organization their business rules and requirements. You had to learn how to write the code in COBOL while learning what their business was.” Micro Focus’s Livesey said, “These applications are so incredibly complex and sophisticated, because they encapsulate decades of business process and know-how.”

As a result, many companies don’t know exactly how their systems run, because those rules extracted long ago are embedded in hundreds of thousands to tens of millions of lines of COBOL. Those 112,500 programs that the Bank of New York Mellon relies on? Most are tiny modules that carry out routine, easily duplicable tasks, like creating specific reports. But the interaction and batch processing carried out among them, coupled with the inevitable bugs and workarounds in language implementation and expression, mean that reproducing the system would take more than just feeding it into a COBOL to C# or Java converter.

COBOL also suffers from the flaws of early computer languages, such as allowing arbitrary jumps via GO TO. While it allowed for procedures, parameters couldn’t be passed, and all variables could be globally modified anywhere in the code. This makes it extremely difficult to predict how any segment of code works in isolation, because you never know whether variables that seem to have a local context are being modified elsewhere by intent or error. That in turn makes it almost impossible to have a gestalt view of an entire system. Replacing any piece of code could have completely unpredictable effects on parts not even known to interact with it. (Later revisions of COBOL made it more modern, including adding object-oriented support, but very little COBOL is written that way. And there are hundreds of other variations of COBOL from the past 60 years.)
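Those two flaws can be seen together in a short sketch (the program and field names are hypothetical): every paragraph in the PROCEDURE DIVISION can read and change the same WORKING-STORAGE data, and GO TO can transfer control anywhere, so no paragraph can be understood on its own.

```cobol
       IDENTIFICATION DIVISION.
       PROGRAM-ID. GLOBAL-STATE.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
      * One shared variable, visible to every paragraph below.
       01  ACCOUNT-BALANCE   PIC 9(7)V99 VALUE 1000.00.
       PROCEDURE DIVISION.
       MAIN-LOGIC.
           PERFORM PRINT-BALANCE.
      * An unconditional jump; without it, control would simply
      * fall through into the next paragraph in source order.
           GO TO ADJUST-BALANCE.
       PRINT-BALANCE.
           DISPLAY 'BALANCE: ' ACCOUNT-BALANCE.
       ADJUST-BALANCE.
      * No parameters, no local scope: this paragraph silently
      * mutates the same ACCOUNT-BALANCE that PRINT-BALANCE reads.
           SUBTRACT 50.00 FROM ACCOUNT-BALANCE.
           PERFORM PRINT-BALANCE.
           STOP RUN.
```

Scale this pattern up to millions of lines and thousands of paragraphs, and the difficulty of reasoning about any one change becomes clear.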

An article in a 1978 issue of the Journal of Systems Management, “Flow in Structured COBOL Programs,” which advocated for the use of structured programming standards, described the benefits as avoiding “‘rats-nest’ or ‘spaghetti-like’ programs.” And a classic 1994 Dilbert comic strip featured Dilbert and Wally talking to a one-shot character, Irv:

Dilbert: “I’ve never seen you do any real work around here, Irv. How do you get away with it?”
Irv: “I wrote the code for our accounting system back in the mid-eighties. It’s a million lines of undocumented spaghetti logic.”
Dilbert: “It’s the Holy Grail of technology!”

While that’s a cynical view of programmers keeping themselves employed, Dilbert creator Scott Adams worked at Pacific Bell and knew the mainframe culture. But more telling is that the strip ran 24 years ago: “Irv” is long retired, but may have been hired back, at an hourly rate several times his former salary, to work on his spaghetti.

Escaping the COBOL gravity well

COBOL programmers still get minted, but by all accounts not in great numbers. The Institute for Data Center Professionals program at Marist College in Poughkeepsie, New York, where IBM has had a large presence for decades, was founded in 2004 with funding and support from the National Science Foundation because of a concern that mainframe-computing subjects were no longer being taught.

Marist assistant dean and IDCP program director Susan Scanlon said most students are focused on z/OS, an IBM mainframe operating system. But they also provide non-credit COBOL courses, often taken by those who are already employed and looking to add skills. “They are already working for the company, perhaps in another area, or [they’re] new hires from traditional computer science programs,” she said. “Most of the companies we deal with are large financial institutions, insurance companies, and governmental agencies.”

Marilyn Zeppetelli, technical director of Large Systems Education at Marist and part of the faculty at IDCP, said she’d like to see more uptake of COBOL by undergraduates at Marist, “as it makes them extremely marketable to employers desperate for this skill.” She noted that while COBOL has more constraints than modern languages, “for students who have already acquired skills in multiple languages, they can usually adjust to COBOL.”

There’s some anecdotal evidence that entry-level COBOL programmers and those with mainframe expertise can earn more than programmers with skills in other areas, though that’s hard to gauge at a time when developer salaries in some areas of the economy are sky high. Still, given the installed base and shrinking expertise, a young person deciding to spend a life in COBOL could seemingly expect to have their skills valued even if Java, PHP, and C# become last year’s flavor.

Twenty-seven years ago, Micro Focus’s Livesey said, he was working at a bank after graduation, and was told by the IT manager about the system in place: “Don’t worry! This is all going away.” Today, Livesey said, “That bank is still running that mainframe with a team of COBOL programmers.” A 2014 report from the United States Office of Personnel Management noted that despite efforts to get away from mainframes, “We anticipate cost increases at magnitudes of 10 percent to 15 percent annually as personnel with the necessary coding expertise retire and cannot easily be replaced.” The government may be more upfront about these challenges than private industry, but it’s seemingly no better at overcoming them.

“You don’t change this problem in anything less than a decade. And that’s the problem: In a decade, the Boomers are going to be gone,” said LzLabs’s Vecchio. “I cannot see any scenario where the basic pool of skills is solved at a sufficient scale to solve this problem around the world.” Livesey was more sanguine about the problem, noting that system integrators and outsourcers train and maintain armies of COBOL programmers already, many of them much younger than the bulk of those who built the systems decades ago.

Both work for firms trying to alleviate pressure points, however. LzLabs’s software-defined mainframes let companies shift from IBM and other mainframe hardware, and the associated costs and licenses, to more commodity-style gear while retaining identical functionality. Micro Focus allows PC-based COBOL and mainframe programming using modern tools newly minted developers expect, like Eclipse for editing and .NET as a programming environment. That in turn allows for straightforward programming assistance such as syntax checking. Micro Focus can also let COBOL apps run directly on PCs or within cloud infrastructure, such as on Amazon Web Services and Microsoft Azure. All of this flexibility makes COBOL more accessible to a broader group of fresh coders.

This might help cushion transitions and prevent complete code migrations all at once. Those interviewed for this article agree, as do government and third-party reports, that, as with any large-scale software-system shift, the odds of failure grow with the size of the project. COBOL Cowboys’ Hinshaw recommended taking “the smallest applications first that will do the least damage if they fail.” Vecchio put it more sharply: “The marketplace is a graveyard of mainframe transition disasters.” He is of the mind that transitions away from COBOL need to rely on open-source innovation and well-understood and tested packages that don’t require reinventing every wheel from scratch. His clients don’t “want to replace 20 million lines of COBOL with 20 million lines of Java.”

The crisis that’s brewing may never boil over if gradual transitions continue to take place. And while it’s impossible to peer inside all of the large and small companies that rely on mainframe-based and migrated COBOL software to know where they’re at, the vise might clamp a little tighter each year as new features are needed or a regulatory or accounting change has to be put into effect.

Bill Hinshaw said he thought he would be out of the COBOL business by now. But “in my attempts to exit out of the software, I can’t seem to leave, since many organizations still depend on COBOL today.”

About the author

Glenn Fleishman writes about the price of type in 19th-century America, bitcoin, and nanosatellites. He is currently completing a collection of 100 sets of printing and type artifacts for the Tiny Type Museum and Time Capsule project.

@GlennF

Artwork by

Ola Niepsuj

behance.net/olaniepsuj
