Some observations on modernizing COBOL. Why we need Objects and Layers.

(c) Peter E. C. Dashwood - 2017


Is your COBOL worth saving?

 

Most sites that consider COBOL modernization do so because they have little option.

Over years (decades, usually...) the processes that provide the business functionality have been developed in COBOL. The people (both in the Business and in the IT department) who developed them have moved on, yet there are business rules and special cases, often critical to retaining good customers, that simply cannot realistically be discarded.

Information Technology (IT) has moved on and no longer embraces the paradigm that COBOL is based on (more on this in a minute). The world has access to mobile devices that provide instant information, and people are doing business in a way that is far removed from the formality and paper of the 20th century. Customer EXPECTATION is for instant transactional closure.

For Management, IT has become a minefield with no clear direction and a new "Messiah" every month. Technical gurus and experts are pressing for expensive innovation and there are no guarantees it will pay off.

For Developers, the technology is changing so rapidly that it is hard to keep up, and a "career" is made by ensuring that the right "buzzwords" are on your CV and moving to companies that can provide the "right" experience. The whole purpose of developing applications (to make life simpler for people in the Business and to provide better information and functionality to decision makers) is lost, and the Technology becomes an end in itself: get to use the latest systems and software so you can sell the new skill.

Here are the most probable options for sites that have a large investment in COBOL "legacy":

* 1 - Outsource the IT to an EDP bureau

PROs:
1. It leaves the Company free to focus on what they do best.
2. It is really very similar to using a package, but without requiring IT expertise.

CONs:
1. It can only be used by Companies who implement traditional (batch) processes.
2. It is almost extinct because of the customer expectations noted above.
* 2 - Manually convert the existing code and data base into a modern language
PROs:
1. Solves the problem of lack of COBOL skill availability.
2. Retains control of the Company's systems and flexibility.

CONs:
1. Learning curve for the new language.
2. POINTLESS unless it recognizes the paradigm shift from COBOL. (Generating Java from COBOL source, for example, does NOT improve your systems.)
* 3 - Buy a Package that can do our kind of business
PROs:
1. It means the IT effort comes down to "tailoring" the package. (A lighter load than full development.)
2. Architecture to support the modern paradigm is handled by the package.

CONs:
1. It can be expensive and is usually beyond the means of small Companies.
2. It is hard to leverage "competitive advantage" when your competitors are using the same package.
* 4 - Start over and redevelop what we need now, using modern methods and tools.
(This option is ONLY viable if you can salvage the existing Business rules, or you are starting a new business.)

PROs:
1. Ensures that systems use the modern paradigm but still retain Legacy information.
2. OLD and NEW share the same data repository. No pressure to "convert" or "Migrate". The OLD is GRADUALLY subsumed by the NEW.

CONs:
1. It is a lot of work to refactor the existing COBOL manually.
2. There is a learning curve for the new language and paradigm, but it can be acquired without pressure.

The paradigm shift that COBOL mostly missed

Many COBOL programmers are bewildered by the rapidity and ubiquity with which Object Orientation has taken over the world of application development. Back in the 1960s, COBOL had established itself as a good step up from Assembler languages and machine code; it had become the accepted "Best Practice" for programming business applications, and there seemed to be no need to change anything.

COBOL implements what is called a "Procedural Paradigm".

A von Neumann-style processor (CPU) executes a series of sequential steps (a "Procedure") and everything is controlled by the Processor's stored program (running under the Operating System (OS), of course...). The procedure iterates until there is no more input. This is fine for unattended operation, where there is no need for a Human to interact with the process.
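
The procedural loop described above can be sketched as follows (in Python, purely as an illustration of the idea; the record data and the `process` logic are hypothetical stand-ins, not real COBOL):

```python
# Procedural paradigm sketch: the PROGRAM drives everything.
# It works through its input in sequence and stops when the
# input is exhausted -- no Human interaction is involved.

def process(record):
    # Stand-in for the business logic applied to each record.
    return record.upper()

def run_batch(records):
    """Iterate until there is no more input, like a classic
    COBOL read-process loop (READ ... AT END)."""
    results = []
    for record in records:   # the program decides when data is fetched
        results.append(process(record))
    return results

if __name__ == "__main__":
    batch = ["invoice 001", "invoice 002", "invoice 003"]
    print(run_batch(batch))
```

Note that control never leaves the program: data is fetched only when the loop is ready for it, exactly as described above.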

Modern technology revolves around interaction with a Human. It implements a different paradigm, called "Event Driven".

The processing is happening on a device which could well be mobile. The Human interacts with the device and presses a button or clicks a mouse (or touches a screen or pad), or otherwise activates something to happen. This raises an "event". The device recognizes the event, and the OS vectors control to activate the appropriate program code, which deals with the event that was raised.

Unlike the procedural paradigm, the event driven paradigm means that code must execute IMMEDIATELY when events occur. This means that the CONTROL of when code will be required is no longer in the hands of the programmer. (The procedural program was controlled by what the programmer wrote, and nothing could happen unless the program said it could. Data could only be fetched when the program was ready to receive it, for example.) But in the event driven paradigm, if the user clicks a mouse (for example) the system must respond and handle that at once; the programmer has no control over when an event will be raised. (Event driven processors have a lot more "spare capacity" than procedural ones because a lot of their time is spent in the wait state. Various OSes have many clever ways to use this capacity.)
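
The inversion of control described above can be sketched like this (again in Python; the `Dispatcher` class and its method names are invented for illustration, standing in for the OS event machinery, not any real framework):

```python
# Event driven paradigm sketch: handlers are registered up front,
# and control is INVERTED -- the dispatcher (standing in for the OS)
# decides WHEN each block of code runs, not the programmer.

class Dispatcher:
    def __init__(self):
        self.handlers = {}

    def on(self, event_name, handler):
        # Register a small block of code against a named event.
        self.handlers.setdefault(event_name, []).append(handler)

    def raise_event(self, event_name, payload=None):
        # The "OS vectors control" to whatever code was registered.
        for handler in self.handlers.get(event_name, []):
            handler(payload)

if __name__ == "__main__":
    log = []
    d = Dispatcher()
    d.on("click", lambda pos: log.append(f"clicked at {pos}"))
    d.on("click", lambda pos: log.append("audit: click seen"))

    # Events arrive whenever the user acts; the programmer
    # does not control the order or the timing.
    d.raise_event("click", (10, 20))
    print(log)
```

The program no longer contains a master loop that fetches data when it is ready; it simply waits, and the registered code runs the moment the event is raised.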

As computers became connected and Networks developed, it became obvious that the event-driven paradigm was much better suited for Network processing. The Network consisted of many different processors (nodes) and the event processing load was not a heavy one. The procedural paradigm suited a very powerful single centralized processor. The NEW paradigm needed small blocks of code to handle various events, and these could even be flashed across the Network if necessary to "share the load" dynamically on other processors. (A process called "load levelling".) It became apparent that a different programming paradigm would be required in order to get the most from the event driven processing model. It arrived in the form of "Object Oriented Programming".
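
The fit between objects and events can be seen in a tiny sketch (Python again; the `Button` class is a made-up illustration, not a real GUI toolkit): an object bundles some state with the small blocks of code (methods) that handle events on it, which is exactly the shape the event driven model needs.

```python
# Object Oriented sketch: the event handler lives WITH the data
# it affects, so the object is a self-contained "small block of
# code" that could run wherever the event arrives.

class Button:
    def __init__(self, label):
        self.label = label
        self.clicks = 0

    def on_click(self):
        # Handler method: updates the object's own state.
        self.clicks += 1
        return f"{self.label} clicked {self.clicks} time(s)"

if __name__ == "__main__":
    ok = Button("OK")
    print(ok.on_click())
    print(ok.on_click())
```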

It was hardly a coincidence that the development of one of the first Object Oriented programming languages (Smalltalk), and the development of a pointing device that raised movement and click events, both happened at the Xerox Palo Alto Research Center (PARC), in California, within a few months of each other, in the early 1970s.


(Coming up: Why "Small is beautiful" and why Objects happened, despite objections...)
