50 COBOL Interview Questions and Answers (2026)

COBOL (Common Business-Oriented Language) processes an estimated $3 trillion in daily commerce and remains central to banking, insurance, government, and logistics systems worldwide. COBOL skills command premium salaries and strong job security — and a solid COBOL interview performance is your entry point.
This guide covers 50 essential COBOL interview questions with detailed answers, from fundamentals through advanced topics including DB2 embedded SQL and modernisation.
Fundamentals (Questions 1–12)
Question 1: What is COBOL and where is it used?
Answer
COBOL (Common Business-Oriented Language) is a high-level programming language designed for business data processing. Created in 1959 by a committee including Grace Hopper, it was designed to be readable by non-programmers and to process large volumes of structured business data efficiently.
COBOL runs primarily on IBM z/OS mainframes but also on Linux, Windows (via Micro Focus/Broadcom COBOL), and cloud environments. It powers:
- Banking (ATM transactions, core banking, settlements)
- Insurance (claims processing, actuarial calculations)
- Government (tax systems, social security, customs)
- Retail and logistics (inventory, order processing)
- Healthcare (billing, claims adjudication)
An estimated 800 billion lines of COBOL code are in production globally, processing $3 trillion per day.
Question 2: What are the four COBOL divisions?
Answer
Every COBOL program is organised into four divisions in a fixed order:
- IDENTIFICATION DIVISION — program metadata. PROGRAM-ID is the only required paragraph. Optional: AUTHOR, DATE-WRITTEN, INSTALLATION, DATE-COMPILED, SECURITY.
- ENVIRONMENT DIVISION — describes the hardware and software environment. Contains:
- CONFIGURATION SECTION (source and object computer)
- INPUT-OUTPUT SECTION (FILE-CONTROL paragraph, mapping file names to DD names)
- DATA DIVISION — declares all data structures. Contains:
- FILE SECTION (record layouts for files defined in ENVIRONMENT)
- WORKING-STORAGE SECTION (program variables, constants, work areas)
- LOCAL-STORAGE SECTION (variables re-initialised on each CALL — thread-safe)
- LINKAGE SECTION (data passed from calling programs or JCL PARM)
- PROCEDURE DIVISION — the executable code: business logic, I/O operations, calculations. Optionally divided into sections and paragraphs.
Question 3: What is the structure of a COBOL data item (PIC clause)?
Answer
A COBOL data item is defined with a level number, name, and PICTURE (PIC) clause describing its format.
PICTURE symbols:
- 9 — a single numeric digit
- A — a single alphabetic character
- X — a single alphanumeric character (any character)
- S — sign (leading or trailing); used for signed numbers
- V — implied decimal point (no actual decimal stored)
- P — assumed decimal position (scaling)
- Z — numeric digit, suppressed to space if zero
- $ , . - + — editing symbols for display formatting
Examples:
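A few illustrative declarations (all field names are made up):

```cobol
01  WS-CUST-NAME     PIC X(30).        *> 30-character alphanumeric
01  WS-QTY           PIC 9(5).         *> unsigned 5-digit integer
01  WS-AMOUNT        PIC S9(7)V99.     *> signed, two implied decimals
01  WS-AMOUNT-OUT    PIC Z,ZZZ,ZZ9.99. *> edited field for display
```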
Level numbers: 01 (record level), 02–49 (subordinate fields), 66 (RENAMES), 77 (independent items), 88 (condition names).
Question 4: What is the COMP / COMPUTATIONAL usage clause?
Answer
USAGE specifies how a data item is stored internally:
| Usage | Description | Best for |
|---|---|---|
| DISPLAY | Character representation (default) | Character fields, edited output |
| COMP / COMPUTATIONAL | Binary integer (2, 4, or 8 bytes) | Integer arithmetic, table indexes |
| COMP-3 / PACKED-DECIMAL | Packed decimal (2 digits per byte) | Financial calculations |
| COMP-1 | Single-precision floating point | Scientific calculations |
| COMP-2 | Double-precision floating point | High-precision scientific |
| COMP-4 | Same as COMP (binary) | Synonymous with COMP |
| COMP-5 | Native binary (no truncation) | High-performance integer math |
| INDEX | Internal table index | SEARCH and SET statements |
COMP-3 (Packed Decimal) is the most common usage for business arithmetic — faster than DISPLAY, exact decimal precision (unlike floating point), and compact storage.
Question 5: What is the difference between WORKING-STORAGE and LINKAGE SECTION?
Answer
WORKING-STORAGE SECTION:
- Data items belong to the program itself
- Storage is allocated and initialised when the program is first called
- Values persist between calls (if the program is loaded in memory as a resident module)
- Used for program variables, work areas, flags, counters, tables
LINKAGE SECTION:
- Data items do not own storage — they are pointers to storage owned by the caller
- Maps to data passed via CALL...USING (from a calling COBOL program) or via the JCL PARM parameter
- Changes made to LINKAGE SECTION items are visible to the caller
- Used for parameter passing between programs
Key distinction: if you initialise a WORKING-STORAGE item to ZEROS, it stays zero unless your code changes it. A LINKAGE SECTION item points to whatever the caller provided — its initial value depends entirely on the caller.
Question 6: What is a COPYBOOK and how is it used?
Answer
A COPYBOOK is a reusable source code fragment stored as a PDS member that is physically inserted into a COBOL program at compile time using the COPY statement.
Benefits:
- Consistency: All programs using the same COPYBOOK have identical record layouts — if the layout changes, only the copybook needs updating (then recompile all programs)
- Reuse: Define a data structure once, use in hundreds of programs
- Error reduction: Eliminates copy-paste mistakes in record definitions
The REPLACING clause allows text substitution within a COPY:
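A sketch of both forms (the copybook member name CUSTREC and the prefixes are illustrative):

```cobol
COPY CUSTREC.
COPY CUSTREC REPLACING ==CUST== BY ==PREV==.
```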
COPYBOOKS are stored in a source library (e.g., MY.COPY.LIB) referenced by the SYSLIB DD in the compile JCL step.
Question 7: What is the 88-level condition name?
Answer
Level 88 defines a condition name — a meaningful name assigned to one or more specific values of a parent data item. Using 88-levels makes code more readable and maintainable.
Definition:
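A sketch of a status field with condition names (names are illustrative, chosen to match the usage below):

```cobol
01  WS-STATUS              PIC X.
    88  STATUS-ACTIVE          VALUE 'A'.
    88  STATUS-INACTIVE        VALUE 'I'.
    88  STATUS-VALID           VALUE 'A' 'I' 'P'.
```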
Usage:
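Using the condition names in logic (paragraph name is illustrative):

```cobol
IF STATUS-ACTIVE
    PERFORM 2100-PROCESS-ACTIVE
END-IF
SET STATUS-INACTIVE TO TRUE
```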
Instead of IF WS-STATUS = 'A', you write IF STATUS-ACTIVE — self-documenting code. The SET...TO TRUE statement assigns the corresponding VALUE to the parent field.
Question 8: What is REDEFINES in COBOL?
Answer
REDEFINES allows multiple data names to occupy the same physical storage, enabling different interpretations of the same bytes.
Rules:
- The redefining item must immediately follow the item it redefines (at the same level)
- The redefining item cannot be larger than the redefined item
- REDEFINES cannot be used at the 01 level in File Section (use REDEFINES at 05 level within a record instead)
- 88-level items under a REDEFINES clause inherit the parent's storage
REDEFINES is commonly used for: date reformatting, union-style fields that hold different data types, and mapping packed/binary fields to display fields for printing.
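A typical date-reformatting sketch (field names are illustrative):

```cobol
01  WS-DATE-YYYYMMDD   PIC 9(8).
01  WS-DATE-PARTS REDEFINES WS-DATE-YYYYMMDD.
    05  WS-YEAR        PIC 9(4).
    05  WS-MONTH       PIC 9(2).
    05  WS-DAY         PIC 9(2).
```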
Question 9: What is the OCCURS clause and how are COBOL tables defined?
Answer
OCCURS defines a table (array) — a data item repeated a fixed or variable number of times.
Fixed-length table:
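For example (names are illustrative):

```cobol
01  WS-MONTH-TABLE.
    05  WS-MONTH-ENTRY OCCURS 12 TIMES.
        10  WS-MONTH-NAME   PIC X(9).
        10  WS-MONTH-DAYS   PIC 9(2).
```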
Variable-length table (OCCURS DEPENDING ON):
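A sketch where the occurrence count is held in a separate field:

```cobol
01  WS-ITEM-COUNT      PIC S9(4) COMP.
01  WS-ORDER-TABLE.
    05  WS-ORDER-ITEM OCCURS 1 TO 500 TIMES
            DEPENDING ON WS-ITEM-COUNT.
        10  WS-ITEM-CODE    PIC X(10).
        10  WS-ITEM-QTY     PIC 9(5).
```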
Accessing table elements:
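Elements are referenced with a literal or data-name subscript (continuing the month-table sketch above):

```cobol
MOVE WS-MONTH-NAME(3)     TO WS-PRINT-NAME  *> literal subscript
MOVE WS-MONTH-NAME(WS-SUB) TO WS-PRINT-NAME *> data-name subscript
```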
Multi-dimensional tables:
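A two-dimensional sketch, defining the WS-CELL item referenced below:

```cobol
01  WS-GRID.
    05  WS-ROW OCCURS 10 TIMES.
        10  WS-CELL OCCURS 20 TIMES PIC 9(3).
```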
Access: WS-CELL(row, col)
OCCURS INDEXED BY creates an index item for use with SEARCH and SET.
Question 10: What is the SEARCH and SEARCH ALL statement?
Answer
SEARCH and SEARCH ALL look up values in COBOL tables.
SEARCH — sequential (linear) search through a table:
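A sketch, assuming the table was declared with INDEXED BY WS-IX (all names illustrative):

```cobol
SET WS-IX TO 1
SEARCH WS-MONTH-ENTRY
    AT END
        MOVE 'N' TO WS-FOUND-FLAG
    WHEN WS-MONTH-NAME(WS-IX) = WS-TARGET-NAME
        MOVE 'Y' TO WS-FOUND-FLAG
END-SEARCH
```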
The table must have an INDEX item (INDEXED BY clause). SEARCH increments the index from its current value.
SEARCH ALL — binary search (much faster for large tables):
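A sketch, assuming OCCURS ... ASCENDING KEY IS WS-RATE-CODE INDEXED BY WS-IX (names illustrative):

```cobol
SEARCH ALL WS-RATE-ENTRY
    AT END
        PERFORM 9000-NOT-FOUND
    WHEN WS-RATE-CODE(WS-IX) = WS-TARGET-CODE
        MOVE WS-RATE-PCT(WS-IX) TO WS-RATE
END-SEARCH
```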
Requires the table to be sorted (ASCENDING/DESCENDING KEY clause on OCCURS). SEARCH ALL uses binary search — O(log n) vs SEARCH's O(n).
Question 11: What is the difference between MOVE CORRESPONDING and a regular MOVE?
Answer
MOVE source TO destination — moves an individual data item. If moving between different data types, COBOL performs implicit conversion (numeric to display, etc.).
MOVE CORRESPONDING (MOVE CORR) group1 TO group2 — automatically moves each data item in group1 to the data item with the same name in group2, if it exists.
Useful when two records share many field names but differ in some. Avoid when records differ significantly — the implicit matching can cause hard-to-spot bugs if naming conventions are inconsistent.
Question 12: What are the COBOL file organisations and access modes?
Answer
File Organisation (physical structure):
- SEQUENTIAL — records stored one after another. Must be read in order.
- INDEXED (VSAM KSDS) — has a key index allowing random access by key.
- RELATIVE — records accessed by relative record number.
Access Mode (how the program reads/writes):
- SEQUENTIAL — process records in physical order (read next, write next)
- RANDOM — access individual records by key or relative number
- DYNAMIC — can switch between sequential and random within the same program
Common combinations:
- Sequential file + Sequential access = flat file batch processing
- Indexed (VSAM KSDS) + Random = online-style lookup (find customer by ID)
- Indexed (VSAM KSDS) + Dynamic = read a range starting at a key, then continue sequentially
File operations: OPEN (INPUT/OUTPUT/I-O/EXTEND), READ (NEXT/INTO), WRITE, REWRITE (update in place), DELETE (from VSAM), CLOSE, START (position to a key in VSAM).
Procedure Division and Logic (Questions 13–25)
Question 13: What is the difference between PERFORM and CALL?
Answer
PERFORM — transfers control to a paragraph or section within the same program, then returns:
The paragraph/section being performed is part of the same compilation unit.
CALL — invokes a separate, independently compiled COBOL subprogram:
The called program is a separate load module. Data is passed via USING clause (BY REFERENCE — caller's storage is shared; BY CONTENT — a copy is passed; BY VALUE — value only, changes not visible to caller).
CALL...BY REFERENCE is the default and most common — the called program can modify the caller's data directly. Use BY CONTENT to protect the caller's data from modification.
Question 14: What is BY REFERENCE vs BY CONTENT vs BY VALUE in CALL?
Answer
These control how parameters are passed to a called subprogram:
BY REFERENCE (default):
- The address of the caller's data item is passed
- The subprogram works directly with the caller's storage
- Changes made by the subprogram are visible to the caller
- Most efficient (no copy)
BY CONTENT:
- A copy of the data item is passed to the subprogram
- The subprogram works on the copy
- Changes are NOT visible to the caller
- Protects the caller's data from accidental modification
- Slightly less efficient (copy is made)
BY VALUE:
- The value (not address) is passed
- Used primarily for interoperability with non-COBOL programs (C, Java via JNI)
- The called program receives the value but cannot return a modified value through this parameter
Question 15: What is the EVALUATE statement and when is it used?
Answer
EVALUATE is COBOL's structured equivalent of a switch/case statement — cleaner than nested IF/ELSE chains.
Simple form:
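For example (transaction codes and paragraph names are illustrative):

```cobol
EVALUATE WS-TRAN-CODE
    WHEN 'A'   PERFORM 2100-ADD
    WHEN 'D'   PERFORM 2200-DELETE
    WHEN 'U'   PERFORM 2300-UPDATE
    WHEN OTHER PERFORM 9000-INVALID-TRAN
END-EVALUATE
```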
Boolean form (EVALUATE TRUE):
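Each WHEN holds a full condition; the first true one wins (names illustrative):

```cobol
EVALUATE TRUE
    WHEN WS-AMOUNT < 100   MOVE 0.00 TO WS-DISCOUNT
    WHEN WS-AMOUNT < 1000  MOVE 0.05 TO WS-DISCOUNT
    WHEN OTHER             MOVE 0.10 TO WS-DISCOUNT
END-EVALUATE
```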
Multi-subject form:
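Multiple subjects are tested together with ALSO (names illustrative):

```cobol
EVALUATE WS-REGION ALSO WS-CUST-TYPE
    WHEN 'EU' ALSO 'RETAIL' PERFORM 3100-EU-RETAIL
    WHEN 'EU' ALSO ANY      PERFORM 3200-EU-OTHER
    WHEN OTHER              PERFORM 3900-DEFAULT
END-EVALUATE
```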
EVALUATE is preferred over deeply nested IFs — it is more readable, less error-prone, and maps naturally to business decision tables.
Question 16: What is the STRING and UNSTRING statement?
Answer
STRING — concatenates multiple data items or literals into a single receiving field:
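A name-concatenation sketch (field names illustrative):

```cobol
STRING WS-FIRST-NAME DELIMITED BY SPACE
       ' '           DELIMITED BY SIZE
       WS-LAST-NAME  DELIMITED BY SPACE
    INTO WS-FULL-NAME
END-STRING
```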
DELIMITED BY SPACE stops at the first space; DELIMITED BY SIZE uses the full field length.
UNSTRING — splits a single field into multiple items based on delimiters:
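A comma-separated parsing sketch (field names illustrative):

```cobol
UNSTRING WS-CSV-RECORD DELIMITED BY ','
    INTO WS-FIELD1 WS-FIELD2 WS-FIELD3
    TALLYING IN WS-FIELD-COUNT
END-UNSTRING
```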
UNSTRING is commonly used to parse delimited files (CSV, pipe-separated) in COBOL. The TALLYING phrase counts how many fields were extracted.
Question 17: What is the INSPECT statement?
Answer
INSPECT examines and optionally modifies a data item's contents — counting occurrences of characters or replacing characters.
INSPECT TALLYING — counts occurrences:
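For example (note that TALLYING adds to the counter, so zero it first; names illustrative):

```cobol
MOVE 0 TO WS-ZERO-CNT
INSPECT WS-ACCOUNT-NO TALLYING WS-ZERO-CNT FOR ALL '0'
```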
INSPECT REPLACING — replaces characters:
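For example, stripping commas from a numeric string (names illustrative):

```cobol
INSPECT WS-AMOUNT-TEXT REPLACING ALL ',' BY SPACE
```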
INSPECT CONVERTING — translates character sets:
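The classic lowercase-to-uppercase translation (name illustrative):

```cobol
INSPECT WS-NAME CONVERTING
    'abcdefghijklmnopqrstuvwxyz'
    TO 'ABCDEFGHIJKLMNOPQRSTUVWXYZ'
```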
INSPECT CONVERTING is often used to convert lowercase to uppercase, since COBOL on mainframes traditionally worked with uppercase only. The FUNCTION UPPER-CASE intrinsic function is more readable for this purpose in modern COBOL.
Question 18: What are COBOL intrinsic functions?
Answer
Intrinsic functions are built-in COBOL functions invoked with FUNCTION function-name(arguments).
Commonly used intrinsic functions:
| Function | Purpose |
|---|---|
| FUNCTION UPPER-CASE(str) | Convert to uppercase |
| FUNCTION LOWER-CASE(str) | Convert to lowercase |
| FUNCTION LENGTH(item) | Length of a data item |
| FUNCTION NUMVAL(str) | Convert numeric string to number |
| FUNCTION NUMVAL-C(str) | Convert currency string to number |
| FUNCTION TRIM(str) | Remove leading/trailing spaces |
| FUNCTION CURRENT-DATE | Returns 21-character current date/time |
| FUNCTION WHEN-COMPILED | Compilation date/time |
| FUNCTION INTEGER-OF-DATE(date) | Convert date to integer for arithmetic |
| FUNCTION DATE-OF-INTEGER(int) | Convert integer back to date |
| FUNCTION MOD(x,y) | Modulus |
| FUNCTION MAX(a,b,c...) | Maximum of values |
| FUNCTION RANDOM | Random number 0–1 |
Example:
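A sketch combining several of these (field names are illustrative; the date fields are assumed PIC 9(8) in YYYYMMDD form):

```cobol
MOVE FUNCTION UPPER-CASE(WS-NAME-IN) TO WS-NAME-OUT
COMPUTE WS-AMOUNT = FUNCTION NUMVAL-C(WS-CURRENCY-STR)
COMPUTE WS-AGE-DAYS =
    FUNCTION INTEGER-OF-DATE(WS-TODAY)
  - FUNCTION INTEGER-OF-DATE(WS-BIRTH-DATE)
```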
Question 19: What is the INITIALIZE statement?
Answer
INITIALIZE sets data items to their default values based on their PICTURE type:
- Alphabetic (A) and alphanumeric (X) fields → spaces
- Numeric (9, S9) fields → zeros
- Alphabetic-edited and numeric-edited fields → spaces/zeros per edit pattern
INITIALIZE is more concise than:
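For example (group and field names illustrative):

```cobol
*> Field-by-field reset:
MOVE SPACES TO WS-CUST-NAME
MOVE ZEROS  TO WS-CUST-BALANCE
MOVE SPACES TO WS-CUST-STATUS
*> Equivalent single statement for the whole group:
INITIALIZE WS-CUSTOMER-RECORD
```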
Especially valuable for initialising complex group items with mixed field types. However, INITIALIZE does not initialise items within OCCURS DEPENDING ON tables to their full maximum — only up to the current DEPENDING ON value.
Question 20: What is the difference between ADD, SUBTRACT, MULTIPLY, DIVIDE and COMPUTE?
Answer
COBOL provides both English-style arithmetic verbs and the more flexible COMPUTE statement:
Arithmetic verbs:
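For example (field names illustrative):

```cobol
ADD WS-QTY TO WS-TOTAL-QTY
SUBTRACT WS-FEE FROM WS-BALANCE
MULTIPLY WS-RATE BY WS-HOURS GIVING WS-GROSS-PAY
DIVIDE WS-TOTAL BY WS-COUNT GIVING WS-AVG REMAINDER WS-REM
```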
COMPUTE — handles complex expressions:
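For example, with rounding and overflow handling (names illustrative):

```cobol
COMPUTE WS-NET ROUNDED =
    (WS-GROSS * (1 - WS-TAX-RATE)) - WS-DEDUCTIONS
    ON SIZE ERROR PERFORM 9100-OVERFLOW-ERROR
END-COMPUTE
```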
** is the exponentiation operator. COMPUTE is more concise for multi-step calculations. All arithmetic statements support:
- ON SIZE ERROR — triggers if the result overflows the receiving field
- ROUNDED — rounds the result to the receiving field's precision
Always use COMPUTE for financial calculations involving multiple operations — it reduces intermediate rounding errors.
Question 21: What is the COBOL file status code?
Answer
File status is a two-character field that COBOL sets after every file operation (OPEN, READ, WRITE, etc.) to indicate the result.
Define it in WORKING-STORAGE and associate with a file in SELECT:
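A sketch for an indexed file (file, DD, and field names illustrative):

```cobol
SELECT CUSTOMER-FILE ASSIGN TO CUSTDD
    ORGANIZATION IS INDEXED
    ACCESS MODE IS RANDOM
    RECORD KEY IS CUST-ID
    FILE STATUS IS WS-CUST-STATUS.
*> In WORKING-STORAGE:
01  WS-CUST-STATUS     PIC XX.
```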
Common file status values:
| Status | Meaning |
|---|---|
| 00 | Successful |
| 10 | End of file (sequential READ) |
| 22 | Duplicate key (VSAM write) |
| 23 | Record not found (random READ) |
| 35 | File not found (OPEN) |
| 39 | DCB conflict (OPEN) |
| 46 | Sequential READ attempted with no valid next record (e.g. after end of file) |
| 97 | OPEN successful but file was not closed properly |
Always check file status after every I/O operation in production code. 10 on a sequential READ is the normal end-of-file signal (not an error).
Question 22: How does COBOL handle error conditions — ON SIZE ERROR, INVALID KEY, AT END?
Answer
COBOL provides declarative error handlers built into the syntax of each statement:
ON SIZE ERROR (arithmetic):
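For example (names illustrative):

```cobol
ADD WS-AMOUNT TO WS-TOTAL
    ON SIZE ERROR PERFORM 9100-OVERFLOW-ERROR
END-ADD
```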
AT END (sequential file READ):
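For example, assuming an 88-level END-OF-FILE flag (names illustrative):

```cobol
READ INPUT-FILE
    AT END SET END-OF-FILE TO TRUE
    NOT AT END PERFORM 2000-PROCESS-RECORD
END-READ
```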
INVALID KEY (indexed/relative file access):
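For example, a keyed read against a VSAM file (names illustrative):

```cobol
READ CUSTOMER-FILE
    INVALID KEY PERFORM 9200-RECORD-NOT-FOUND
END-READ
```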
Modern practice also checks the FILE STATUS code after every I/O for more specific error information. The USE AFTER ERROR declarative (DECLARATIVES section) provides a catch-all handler for file errors.
Question 23: What is the DECLARATIVES section in COBOL?
Answer
DECLARATIVES is an optional section at the beginning of the PROCEDURE DIVISION that defines error handlers called automatically when specific events occur.
USE AFTER STANDARD ERROR PROCEDURE is triggered when a file operation fails (non-zero file status not otherwise handled by INVALID KEY or AT END clauses). DECLARATIVES can also handle debugging events with USE FOR DEBUGGING.
Declarative sections end with END DECLARATIVES. Execution of the main procedure follows.
Question 24: What is SORT in COBOL (as opposed to JCL SORT)?
Answer
COBOL has a built-in SORT verb that sorts a file or table within the program itself, without requiring a separate JCL SORT step.
Sort a file:
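A sketch using USING/GIVING (file and key names illustrative):

```cobol
SORT SORT-WORK-FILE
    ON ASCENDING KEY SD-CUST-ID
    USING  INPUT-FILE
    GIVING OUTPUT-FILE
```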
Sort with INPUT PROCEDURE and OUTPUT PROCEDURE (for filtering/transforming during sort):
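A sketch of the procedure form (procedure and key names illustrative):

```cobol
SORT SORT-WORK-FILE
    ON ASCENDING KEY SD-CUST-ID
    INPUT PROCEDURE  IS 2000-SELECT-RECORDS
    OUTPUT PROCEDURE IS 3000-WRITE-SORTED
*> 2000-SELECT-RECORDS uses: RELEASE SORT-RECORD FROM WS-IN-REC
*> 3000-WRITE-SORTED uses:   RETURN SORT-WORK-FILE AT END ... END-RETURN
```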
In the INPUT PROCEDURE, use RELEASE (instead of WRITE) to pass records to the sort. In OUTPUT PROCEDURE, use RETURN (instead of READ) to retrieve sorted records.
The sort file (SORT-WORK-FILE) must be defined in FILE SECTION with SD (Sort Description) instead of FD, and a corresponding SORTWK DD in the JCL (or DFSORT handles it automatically).
Question 25: What are COBOL intrinsic date functions and why are they important?
Answer
Date arithmetic in COBOL is handled via integer-based intrinsic functions to avoid calendar complexity:
FUNCTION CURRENT-DATE returns a 21-character string:
- Positions 1–4: year, 5–6: month, 7–8: day, 9–10: hours, 11–12: minutes, 13–14: seconds, 15–16: hundredths, 17: GMT offset sign, 18–19: GMT hours, 20–21: GMT minutes
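A date-arithmetic sketch, assuming WS-TODAY and WS-DUE-DATE are PIC 9(8) YYYYMMDD fields (names illustrative):

```cobol
MOVE FUNCTION CURRENT-DATE(1:8) TO WS-TODAY
COMPUTE WS-DUE-DATE =
    FUNCTION DATE-OF-INTEGER(
        FUNCTION INTEGER-OF-DATE(WS-TODAY) + 30)
```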
These functions were crucial for the Year 2000 remediation effort and remain essential for any date-sensitive financial or scheduling logic.
DB2 Embedded SQL and Advanced Topics (Questions 26–40)
Question 26: How is DB2 SQL embedded in COBOL?
Answer
DB2 SQL statements are embedded within COBOL code using the EXEC SQL...END-EXEC delimiter. The COBOL source is first processed by the DB2 Precompiler (or COBOL with integrated DB2 syntax support), which replaces SQL with COBOL CALL statements.
Host variables (COBOL data items used in SQL) are prefixed with a colon (:). SQLCA (SQL Communication Area) provides SQLCODE and SQLERRM after each SQL statement.
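A minimal embedded-SQL sketch (table, column, and host-variable names are illustrative):

```cobol
EXEC SQL
    SELECT CUST_NAME, CUST_BALANCE
      INTO :WS-CUST-NAME, :WS-CUST-BALANCE
      FROM CUSTOMER
     WHERE CUST_ID = :WS-CUST-ID
END-EXEC
```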
Question 27: What is SQLCA and what is SQLCODE?
Answer
SQLCA (SQL Communication Area) is a data structure included in every DB2 COBOL program that holds information about the most recent SQL statement's execution. It is included with EXEC SQL INCLUDE SQLCA END-EXEC.
SQLCODE is the key field in SQLCA:
| SQLCODE | Meaning |
|---|---|
| 0 | Successful execution |
| +100 | Row not found (SELECT INTO, FETCH at end of cursor) |
| -803 | Duplicate key (INSERT violation) |
| -805 | Package not found in DB2 (program not bound) |
| -811 | More than one row returned by SELECT INTO |
| -904 | Resource unavailable (table locked) |
| -911 | Deadlock or timeout — transaction rolled back |
| -922 | Authorisation error |
SQLERRM contains a text description of the error.
After every SQL statement, check SQLCODE:
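One common pattern is an EVALUATE on SQLCODE (paragraph names illustrative):

```cobol
EVALUATE SQLCODE
    WHEN 0     CONTINUE
    WHEN 100   PERFORM 9300-ROW-NOT-FOUND
    WHEN -911  PERFORM 9400-RESTART-UNIT
    WHEN OTHER PERFORM 9999-SQL-ERROR-ABEND
END-EVALUATE
```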
Question 28: What is a DB2 cursor and when is it used?
Answer
A cursor is a named query result set that allows a COBOL program to process multiple rows returned by a SELECT statement, one row at a time.
Cursor lifecycle:
- DECLARE — define the query (compile time)
- OPEN — execute the query, position before first row
- FETCH — retrieve next row into host variables
- CLOSE — release the cursor
FOR READ ONLY on the cursor declaration allows DB2 to optimise (no locking, no update capability). FOR UPDATE OF column enables positioned updates/deletes via WHERE CURRENT OF cursor-name.
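Putting the lifecycle together (cursor, table, and field names are illustrative):

```cobol
EXEC SQL DECLARE CUST-CUR CURSOR FOR
    SELECT CUST_ID, CUST_NAME FROM CUSTOMER
     WHERE REGION = :WS-REGION
    FOR READ ONLY
END-EXEC.

EXEC SQL OPEN CUST-CUR END-EXEC
PERFORM UNTIL SQLCODE NOT = 0
    EXEC SQL FETCH CUST-CUR
        INTO :WS-CUST-ID, :WS-CUST-NAME
    END-EXEC
    IF SQLCODE = 0
        PERFORM 2000-PROCESS-CUSTOMER
    END-IF
END-PERFORM
EXEC SQL CLOSE CUST-CUR END-EXEC
```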
Question 29: What is the difference between SELECT INTO and a cursor?
Answer
SELECT INTO retrieves exactly one row directly into host variables:
- SQLCODE = 0: row found
- SQLCODE = +100: no row found (not an error — check for it)
- SQLCODE = -811: more than one row returned — program error, query must return at most 1 row
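A single-row sketch (table and column names illustrative):

```cobol
EXEC SQL
    SELECT COUNT(*)
      INTO :WS-ORDER-COUNT
      FROM ORDERS
     WHERE CUST_ID = :WS-CUST-ID
END-EXEC
```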
Cursor retrieves multiple rows iteratively via FETCH.
Use SELECT INTO when:
- The query logically returns one row (lookup by primary key)
- You need a single aggregated value (COUNT, SUM)
Use a cursor when:
- The query can return zero to many rows
- You need to process each row individually
Attempting SELECT INTO on a multi-row result set causes SQLCODE -811 — always use a cursor if there's any possibility of multiple rows.
Question 30: What is DB2 COMMIT and ROLLBACK in COBOL?
Answer
COMMIT and ROLLBACK control transaction boundaries in DB2:
EXEC SQL COMMIT END-EXEC — makes all changes since the last COMMIT permanent and releases locks.
EXEC SQL ROLLBACK END-EXEC — undoes all changes since the last COMMIT and releases locks.
Best practices for batch COBOL:
Commit frequency: commit too rarely → large rollback segments, long lock holds, deadlock risk. Commit too often → CPU overhead from repeated commit processing. Common intervals: every 1,000–10,000 rows for bulk batch processing.
If a DB2 deadlock or timeout occurs (SQLCODE -911), DB2 automatically rolls back the current unit of work — the program must handle this and restart from the last committed point.
Question 31: What is a NULL indicator in DB2 COBOL?
Answer
SQL NULL means "no value" — it is not zero, not spaces, not anything. COBOL host variables cannot directly represent NULL. Null indicators are COBOL PIC S9(4) COMP fields that communicate NULL status between COBOL and DB2.
Declaration:
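For example, a nullable amount with its indicator (names illustrative):

```cobol
01  WS-AMOUNT          PIC S9(7)V99 COMP-3.
01  WS-AMOUNT-IND      PIC S9(4) COMP.
```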
Usage in FETCH:
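The indicator follows its host variable (cursor name illustrative):

```cobol
EXEC SQL
    FETCH ORDER-CUR
        INTO :WS-AMOUNT :WS-AMOUNT-IND
END-EXEC
```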
Null indicator values:
- -1 — column value is NULL (WS-AMOUNT is undefined/unreliable)
- 0 — column has a value (WS-AMOUNT is valid)
- > 0 — value was truncated (for variable-length strings)
For INSERT/UPDATE, set the indicator to -1 to insert NULL, or 0 to use the COBOL field's value.
Question 32: What is the difference between static and dynamic SQL in COBOL?
Answer
Static SQL: SQL statements are fixed at compile time, precompiled into the DB2 DBRM (Database Request Module), and bound into a Package/Plan before execution. DB2 creates an access path at bind time.
Advantages: access path determined once (fast execution), syntax errors caught at compile/bind time, no runtime parsing overhead.
Dynamic SQL: SQL is built as a character string at runtime and prepared/executed during program execution.
Advantages: flexible (table names, WHERE clauses can vary), needed for ad-hoc query tools and report generators. Disadvantages: access path determined at runtime, higher CPU overhead, SQL errors only caught at runtime.
Most batch COBOL uses static SQL. Dynamic SQL is used in interactive tools, generic query programs, and where the table or column names are not known until runtime.
Question 33: What is the WORKING-STORAGE vs FILE SECTION for record processing?
Answer
FILE SECTION defines the record buffer directly in the I/O buffer. When you READ a record, the data goes directly into the FILE SECTION record area.
WORKING-STORAGE defines a separate copy of the record layout for program processing.
Best practice — use READ INTO / WRITE FROM:
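For example (file and record names illustrative):

```cobol
READ INPUT-FILE INTO WS-INPUT-RECORD
    AT END SET END-OF-FILE TO TRUE
END-READ

WRITE OUTPUT-RECORD FROM WS-OUTPUT-RECORD
```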
READ INTO copies the I/O buffer to WORKING-STORAGE after the read. WRITE FROM copies WORKING-STORAGE to the I/O buffer before the write.
Advantages: you can manipulate WS-INPUT-RECORD freely without interfering with the I/O buffer; the FILE SECTION record can remain a single PIC X field.
Question 34: What is COBOL structured programming and what are its benefits?
Answer
Structured COBOL uses end-scope terminators (END-IF, END-PERFORM, END-READ, END-EVALUATE, etc.) to create explicit scope, making code easier to read and maintain.
Old style (period-delimited scopes):
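A sketch of the hazard (names illustrative):

```cobol
IF WS-AMOUNT > 1000
    PERFORM 2100-LARGE-ORDER
    IF WS-REGION = 'EU'
        PERFORM 2200-EU-VAT.
*> The single period above closes BOTH IFs at once.
```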
A single misplaced period ends all open scopes — a notorious source of bugs.
Structured (scope terminators):
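The same logic with explicit terminators:

```cobol
IF WS-AMOUNT > 1000
    PERFORM 2100-LARGE-ORDER
    IF WS-REGION = 'EU'
        PERFORM 2200-EU-VAT
    END-IF
END-IF
```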
Benefits:
- Explicit scope — no ambiguity about where an IF ends
- Nested logic is clearly visible
- A misplaced period is a compile error, not a silent logic bug
- Supports inline PERFORM (code directly within PERFORM...END-PERFORM)
- More readable for multi-programmer teams and maintenance
Modern COBOL shops mandate structured style. Legacy code using period-terminated sentences is a maintenance hazard.
Question 35: What is the LOCAL-STORAGE SECTION?
Answer
LOCAL-STORAGE is a COBOL section (part of DATA DIVISION) where data is automatically re-initialised on every CALL to the program.
Contrast:
- WORKING-STORAGE: initialised once when the program is first loaded. Values persist between calls.
- LOCAL-STORAGE: initialised to the VALUE clauses (or default zeros/spaces) on every call entry.
LOCAL-STORAGE is critical for re-entrant programs — programs that may be called concurrently by multiple tasks (threads). WORKING-STORAGE is not thread-safe (shared between all callers). LOCAL-STORAGE is allocated per-call and therefore thread-safe.
Used in: CICS transaction programs (each transaction gets its own LOCAL-STORAGE), multi-threaded batch environments, and recursive COBOL programs.
Question 36: What are the differences between OPEN modes: INPUT, OUTPUT, I-O, EXTEND?
Answer
| Mode | Description | Operations allowed |
|---|---|---|
| INPUT | Open for reading only | READ |
| OUTPUT | Open for writing (creates/overwrites) | WRITE |
| I-O | Open for both reading and updating | READ, WRITE, REWRITE, DELETE |
| EXTEND | Open to append records to end of sequential file | WRITE (appends) |
Key points:
- OUTPUT creates the file if it doesn't exist; if it exists, it is overwritten (DISP=OLD or DISP=MOD in JCL)
- EXTEND requires the file to already exist; new records are added after the last existing record (DISP=MOD in JCL)
- I-O on sequential files allows REWRITE (update the last record read) but not random updates
- I-O on VSAM KSDS allows random READ, WRITE, REWRITE, and DELETE
For VSAM random update:
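A keyed read-then-rewrite sketch (file, record, and field names illustrative):

```cobol
OPEN I-O CUSTOMER-FILE
MOVE WS-TARGET-ID TO CUST-ID
READ CUSTOMER-FILE
    INVALID KEY PERFORM 9200-NOT-FOUND
    NOT INVALID KEY
        MOVE WS-NEW-BALANCE TO CUST-BALANCE
        REWRITE CUSTOMER-RECORD
END-READ
CLOSE CUSTOMER-FILE
```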
Question 37: What is a COBOL subprogram and how are parameters passed?
Answer
A COBOL subprogram (called module) is a separately compiled COBOL program invoked via CALL. The calling program passes data via the USING clause; the subprogram receives it in its LINKAGE SECTION.
Calling program:
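For example (program and field names illustrative):

```cobol
CALL 'SUBPROG' USING WS-CUST-ID WS-RESULT-AREA
```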
Called subprogram (SUBPROG):
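A matching subprogram skeleton (field names illustrative; the LINKAGE items map onto the caller's storage):

```cobol
IDENTIFICATION DIVISION.
PROGRAM-ID. SUBPROG.
DATA DIVISION.
LINKAGE SECTION.
01  LS-CUST-ID         PIC X(10).
01  LS-RESULT-AREA     PIC X(80).
PROCEDURE DIVISION USING LS-CUST-ID LS-RESULT-AREA.
    MOVE 'OK' TO LS-RESULT-AREA
    GOBACK.
```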
STOP RUN in a subprogram terminates the entire job. Use GOBACK or EXIT PROGRAM instead — these return control to the caller while keeping the caller running.
Question 38: What is the difference between STOP RUN, GOBACK, and EXIT PROGRAM?
Answer
| Statement | Behaviour in main program | Behaviour in subprogram |
|---|---|---|
| STOP RUN | Terminates the job | Terminates the entire job (not just the subprogram) — dangerous in subprograms |
| GOBACK | Terminates the program (same as STOP RUN in main) | Returns control to the calling program |
| EXIT PROGRAM | No effect in main program | Returns control to the calling program |
Best practice:
- Use GOBACK in all programs — it behaves correctly whether the program is a main program or a subprogram
- Never use STOP RUN in a subprogram — if the subprogram is called from a mainline COBOL program (or CICS), it will abruptly terminate the entire address space
Question 39: What is the COBOL CICS interface?
Answer
CICS (Customer Information Control System) is the online transaction processing middleware on IBM mainframes. COBOL CICS programs use the EXEC CICS...END-EXEC interface to interact with CICS services.
Common EXEC CICS commands:
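A sketch of a typical map-and-file interaction (map, mapset, file, and field names are illustrative):

```cobol
EXEC CICS RECEIVE MAP('CUSTMAP') MAPSET('CUSTSET') END-EXEC
EXEC CICS READ FILE('CUSTFILE')
    INTO(WS-CUST-RECORD) RIDFLD(WS-CUST-ID)
END-EXEC
EXEC CICS SEND MAP('CUSTMAP') MAPSET('CUSTSET') END-EXEC
EXEC CICS RETURN END-EXEC
```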
CICS COBOL programs do not use normal file I/O statements (OPEN/READ/WRITE/CLOSE) for VSAM files — all file access goes through EXEC CICS commands. Local-Storage is used instead of Working-Storage for thread safety across concurrent transactions.
Question 40: What is performance tuning in COBOL — key techniques?
Answer
Key COBOL performance techniques:
- Use COMP-3 (Packed Decimal) for all arithmetic fields — faster than DISPLAY and exact decimal
- Use COMP (Binary) for counters, subscripts, and loop variables
- Minimise I/O — buffer large batches, avoid random I/O on sequential files
- BLOCK CONTAINS 0 RECORDS — lets z/OS choose optimal block size
- Read INTO / Write FROM — avoids unnecessary data movement between sections
- Avoid unnecessary MOVE — don't move data that isn't needed
- SEARCH ALL over SEARCH — binary search is O(log n) vs O(n)
- Minimise DB2 SQL calls — bulk fetch with FETCH FOR n ROWS; avoid SELECT inside loops
- COMMIT at intervals — reduces lock contention in DB2 batch
- Use DFSORT in JCL for sorting — faster than COBOL SORT for large files
- PERFORM inline — modern compilers optimise inline code better than paragraph calls
- Review SYSOUT and SMF records — use IBM Fault Analyzer and OMEGAMON for profiling
The biggest gains usually come from I/O reduction (blocking) and DB2 access path optimisation (proper indexes, statistics up-to-date).
Modernisation and Best Practices (Questions 41–50)
Question 41: What is COBOL modernisation and what are the main approaches?
Answer
COBOL modernisation encompasses several strategies:
- Rehosting / Lift-and-shift: Move COBOL applications from mainframe to distributed platforms (Linux/x86) using emulation software (Micro Focus Enterprise Server, LzLabs, NTT Data Mainstar). No code changes — same COBOL, different hardware. Reduces mainframe costs.
- Re-platforming: Recompile COBOL with modern open-source compilers such as GnuCOBOL (formerly OpenCOBOL). May require minor code changes. Applications run on Linux/cloud.
- Wrapping / API-enabling: Expose COBOL business logic as REST APIs or microservices using IBM z/OS Connect or COBOL-based web service stubs. The COBOL core is preserved; modern frontends consume it via API.
- Refactoring: Modernise COBOL code style (structured programming, eliminate GOTOs, add unit tests with COBOL-Check or zUnit) while staying on z/OS.
- Rewriting: Convert COBOL to Java, Python, or Go. Highest risk and cost — business rules encoded in decades of COBOL are often poorly documented.
Most organisations choose a hybrid: keep critical, stable batch COBOL on z/OS, API-enable it for new digital channels, and rewrite only where the cost/benefit is clear.
Question 42: What is the GOTO statement and why is it discouraged?
Answer
GO TO transfers control unconditionally to a paragraph or section label, similar to a goto in C/C++.
Why GO TO is discouraged:
- Spaghetti code: unrestricted jumps make control flow impossible to follow
- Maintenance nightmare: changing a paragraph that is a goto target requires finding all callers
- Testing difficulty: programs with heavy GOTO are nearly impossible to unit test
- Structured alternatives: PERFORM, EVALUATE, IF/ELSE provide all the same logic without jumping
GO TO is still acceptable in one specific pattern: GO TO error-paragraph at the bottom of a section as a single forward jump to an error handler. Modern COBOL shops prohibit GO TO except in this pattern.
ALTER (which changes the target of a GO TO dynamically) is obsolete and never acceptable.
Question 43: What is the 77-level data item?
Answer
Level 77 declares an independent (elementary) working storage item that is not subordinate to any group item. It cannot have subordinate items.
Level 77 items are functionally equivalent to level 01 elementary items. Modern COBOL style prefers level 01 for standalone items:
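For example (names illustrative):

```cobol
77  WS-COUNTER         PIC S9(4) COMP.   *> legacy 77-level style
01  WS-COUNTER-NEW     PIC S9(4) COMP.   *> equivalent modern 01-level style
```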
Level 77 is considered legacy — most modern COBOL shops use 01 for all top-level items for consistency. Some coding standards prohibit 77 entirely. You will encounter it in legacy code.
Question 44: What is the COBOL REXX and how does REXX relate to mainframe COBOL?
Answer
REXX (Restructured Extended Executor) is a scripting language on IBM mainframes used for:
- JCL automation (building and submitting jobs dynamically)
- TSO/ISPF scripting
- Mainframe administration
- Glue scripts between COBOL batch jobs
REXX does not replace COBOL — they are complementary. A typical mainframe workflow might be:
- REXX script checks if input files exist and have records
- REXX builds and submits a JCL job that runs COBOL programs
- REXX checks the job return codes
- REXX routes output or sends notifications
COBOL can call REXX via IKJEFT01 in JCL, and REXX can submit COBOL jobs. In modern environments, COBOL programs can also call z/OS UNIX services and shell scripts.
For COBOL interview purposes, REXX knowledge demonstrates mainframe breadth beyond pure coding.
Question 45: What is COBOL unit testing?
Answer
Unit testing COBOL has historically been challenging due to lack of tooling, but modern frameworks now exist:
COBOL-Check (open-source): a unit test framework that allows writing test cases directly in COBOL syntax. Tests are co-located with source code.
zUnit (IBM Developer for z/OS): IBM's unit testing framework integrated into IBM Developer (Eclipse-based). Supports test stubs, mocks, and test suites.
Micro Focus Unit Testing: available in Micro Focus Enterprise Developer for COBOL on distributed platforms.
A basic test pattern:
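A basic pattern, sketched in COBOL-Check-style syntax (the suite name, paragraph, and fields are hypothetical):

```cobol
           TESTSUITE "DISCOUNT CALCULATION"

           TESTCASE "GOLD CUSTOMERS GET 10 PERCENT"
           MOVE "GOLD" TO WS-CUST-TYPE
           MOVE 100.00 TO WS-ORDER-AMOUNT
           PERFORM 2100-CALC-DISCOUNT
           EXPECT WS-DISCOUNT TO BE 10.00

           TESTCASE "UNKNOWN TYPE GETS NO DISCOUNT"
           MOVE "????" TO WS-CUST-TYPE
           PERFORM 2100-CALC-DISCOUNT
           EXPECT WS-DISCOUNT TO BE ZERO
```

Each TESTCASE sets up working-storage inputs, performs the paragraph under test, and asserts on the outputs — no real file or DB2 I/O involved.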
Key principles:
- Test individual paragraphs/sections in isolation
- Mock VSAM and DB2 I/O (avoid real I/O in unit tests)
- Test boundary conditions (zero, maximum values, NULL indicators)
- Use COPY REPLACING to inject test stubs for file operations
Question 46: What is a COBOL copybook and how does REPLACING work?
Answer
(Extended from Question 6 — the REPLACING clause detail)
COPY...REPLACING performs text substitution as the copybook is included: each occurrence of the matched text in the copybook is replaced before compilation, so the same layout can be brought in more than once with different field-name prefixes.
The ==...== delimiters mark the text to be substituted. This allows a single copybook to be used multiple times in the same program with different prefixes — for example, one copybook for both the input customer record and the output customer record, with different field name prefixes to avoid duplicate data-name errors.
This technique is standard in shops that have common record layouts shared across hundreds of programs.
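A minimal sketch of the pattern (copybook and field names hypothetical) — the copybook is written with a `:PFX:` placeholder, then included twice with different prefixes:

```cobol
      * Copybook CUST-LAYOUT
       01  :PFX:-CUSTOMER-REC.
           05  :PFX:-CUST-ID      PIC 9(6).
           05  :PFX:-CUST-NAME    PIC X(30).

      * In the program's DATA DIVISION:
           COPY CUST-LAYOUT REPLACING ==:PFX:== BY ==IN==.
           COPY CUST-LAYOUT REPLACING ==:PFX:== BY ==OUT==.

      * The compiler now sees IN-CUSTOMER-REC / IN-CUST-ID and
      * OUT-CUSTOMER-REC / OUT-CUST-ID, avoiding duplicate data-names.
```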
Question 47: What is the COBOL POINTER data type and ADDRESS OF?
Answer
COBOL supports pointer data items for working with dynamic memory allocation and interfacing with C or system routines.
ADDRESS OF data-item returns the virtual storage address of a COBOL data item as a pointer. SET pointer TO ADDRESS OF item assigns the address. SET ADDRESS OF linkage-item TO pointer makes a LINKAGE SECTION item point to arbitrary storage.
Used in:
- Dynamic memory allocation (COBOL ALLOCATE/FREE in later standards)
- Interfacing with C programs that pass pointers
- Processing variable-length records or dynamically structured data
Pointer arithmetic is not directly supported in standard COBOL. When it is needed, a common workaround is to REDEFINES the pointer as a binary integer and perform arithmetic (e.g. ADD) on that redefinition.
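A minimal sketch of the two SET forms (field names hypothetical):

```cobol
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  WS-FIELD            PIC X(100) VALUE SPACES.
       01  WS-PTR              USAGE POINTER.
       LINKAGE SECTION.
       01  LS-BUFFER           PIC X(100).
       PROCEDURE DIVISION.
      *    Capture the address of a working-storage item
           SET WS-PTR TO ADDRESS OF WS-FIELD
      *    Map a linkage item onto that same storage
           SET ADDRESS OF LS-BUFFER TO WS-PTR
      *    LS-BUFFER and WS-FIELD now reference the same bytes
           MOVE "HELLO" TO LS-BUFFER
           GOBACK.
```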
Question 48: What is the difference between COBOL standards (COBOL-85, COBOL-2002, COBOL-2014)?
Answer
COBOL has evolved through several ISO/ANSI standards:
COBOL-85 (ANSI X3.23-1985):
- End-scope terminators (END-IF, END-PERFORM, etc.)
- EVALUATE statement
- Improved PERFORM variants
- Still the baseline for much legacy code
COBOL-2002 (ISO 1989:2002):
- Object-Oriented COBOL (classes, methods, inheritance)
- Intrinsic functions (UPPER-CASE, CURRENT-DATE, etc.)
- Unicode support
- POINTER and dynamic memory
- Free-format source (no column restrictions)
- XML GENERATE and PARSE
COBOL-2014 (ISO 1989:2014):
- Floating-point decimal (IEEE 754 decimal)
- Conditional expressions
- JSON GENERATE and PARSE
- ALLOCATE/FREE for dynamic memory
- Improved OO features
IBM Enterprise COBOL for z/OS implements a subset of COBOL-2014 plus IBM extensions. Most mainframe shops run Enterprise COBOL 6.3+, which supports JSON, XML, UTF-8, 64-bit arithmetic, and modern structured syntax.
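As one concrete example of the modern feature set, Enterprise COBOL 6.x supports JSON GENERATE — a sketch with hypothetical field names:

```cobol
       01  WS-CUSTOMER.
           05  CUST-ID         PIC 9(6).
           05  CUST-NAME       PIC X(20).
       01  WS-JSON-OUT         PIC X(500).
       01  WS-JSON-LEN         PIC 9(5) COMP.

      *    Serialise the group item to JSON text
           JSON GENERATE WS-JSON-OUT FROM WS-CUSTOMER
               COUNT IN WS-JSON-LEN
               ON EXCEPTION
                   DISPLAY "JSON GENERATE FAILED"
           END-JSON
      *    WS-JSON-OUT now holds the JSON text,
      *    WS-JSON-LEN its character count
```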
Question 49: How do you handle large files efficiently in COBOL batch?
Answer
Processing multi-million-record files efficiently requires:
- Maximise blocking: use BLOCK CONTAINS 0 RECORDS (let z/OS optimise) or explicitly set a large BLKSIZE to reduce I/O operations.
- Sequential processing: avoid random I/O lookups within a sequential loop. Pre-sort lookup files and use a merge approach instead.
- Minimise data movement: use READ INTO only if needed; direct FILE SECTION access avoids a MOVE per record.
- Table lookups: load small reference tables into WORKING-STORAGE at the start and use SEARCH ALL (binary search) instead of hitting a VSAM file per record.
- DB2 bulk fetch: for programs with DB2 cursors, use multi-row FETCH, which reduces DB2 FETCH calls by up to 100x.
- Commit intervals: commit every N records to balance checkpoint overhead against lock hold time.
- Parallel processing: split the file into ranges and run multiple job steps in parallel using JCL step dependencies.
- DFSORT for pre-sorting: sort before processing to enable sequential instead of random access.
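The multi-row FETCH technique above can be sketched as follows (table and field names hypothetical); the cursor is declared WITH ROWSET POSITIONING and fetched into host-variable arrays:

```cobol
      * Host-variable arrays sized to the rowset
       01  WS-CUST-IDS.
           05  WS-CUST-ID      PIC S9(9) COMP-4 OCCURS 100 TIMES.
       01  WS-CUST-NAMES.
           05  WS-CUST-NAME    PIC X(30)        OCCURS 100 TIMES.
       01  WS-ROWS-FETCHED     PIC S9(9) COMP-4.

           EXEC SQL
               DECLARE CSR1 CURSOR WITH ROWSET POSITIONING FOR
               SELECT CUST_ID, CUST_NAME FROM CUSTOMER
           END-EXEC.

      *    One call returns up to 100 rows into the arrays
           EXEC SQL
               FETCH NEXT ROWSET FROM CSR1
                   FOR 100 ROWS
                   INTO :WS-CUST-ID, :WS-CUST-NAME
           END-EXEC
      *    SQLERRD(3) holds the number of rows actually returned
           MOVE SQLERRD(3) TO WS-ROWS-FETCHED
```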
Question 50: What questions should you ask in a COBOL interview to demonstrate expertise?
Answer
Asking good questions signals seniority and genuine interest. Strong questions to ask interviewers:
Technical depth:
- "What COBOL standard does the shop target — Enterprise COBOL 6.x? Are you using any modern features like JSON GENERATE or UTF-8 support?"
- "How do you handle unit testing of COBOL programs — do you use zUnit, COBOL-Check, or something else?"
- "What is your DB2 commit strategy for large batch jobs?"
Architecture:
- "Are you API-enabling existing COBOL logic with z/OS Connect, or keeping it purely batch?"
- "How is the COBOL code version-controlled — Endevor, Git, something else?"
Modernisation:
- "What is your roadmap for the COBOL estate — maintain on z/OS, rehost, or phased rewrite?"
- "Do you have a COBOL-to-Java or COBOL-to-Python conversion initiative underway?"
Culture:
- "What does code review look like for COBOL changes?"
- "What does the onboarding process look like for someone coming to this codebase for the first time?"
These questions demonstrate that you think beyond syntax and care about the broader engineering practices around the COBOL ecosystem.
Summary
These 50 COBOL interview questions cover the complete range from language fundamentals through DB2 integration and modernisation strategy. To stand out in COBOL interviews:
- Master the four divisions cold — interviewers always start here
- Understand COMP-3 vs COMP vs DISPLAY — storage and performance implications
- Know DB2 cursors, SQLCODE, and commit strategy — DB2/COBOL is the core enterprise pattern
- Be able to explain COPYBOOKS and record layouts — fundamental to real-world COBOL shops
- Have an opinion on modernisation — senior candidates are expected to understand the landscape
For hands-on practice, use IBM's free Z Trial environment, Hercules running MVS 3.8j (e.g. the TK4- turnkey system), or Micro Focus Visual COBOL (Personal Edition) to write, compile, and run real COBOL programs.
