What is Batch Processing?

A quick guide to the history of Batch Processing and its limitations in contrast to Job Scheduling and Workload Automation.


A short history of batch processing

Batch processing is the procedure by which computers automatically complete batches of jobs in sequential order with minimal human interaction. It is named after the early practice of queuing batches of punched cards to load data into the mainframe’s memory for processing. The term has evolved over the years, and the concept today goes by many names, including job scheduling and workload automation, to name but two. These successors have made batch processing more sophisticated and efficient, and embrace new disruptive technologies such as Cloud Computing, Big Data, AI and DevOps.

The Evolution of Batch Processing

Batch processing has been a major IT concept since the early years of computing technology. It established the basis for the later concepts of job scheduling and workload automation, and its development is tightly bound to the evolution of the computing industry as a whole.

In the early days, there were mainframe computers and something called a batch window. Since all computing activities were carried out on mainframes, finding the best way to utilize this finite resource was absolutely critical. Batch windows were nightly periods during which large numbers of batched jobs were run offline, a practice that was developed to free up computing power during the day. This allowed end-users to run transactions without the system drag caused by high-volume batch processing. In general, mainframe workloads were classified into batch workloads and online transaction processing (OLTP) workloads.

Batch Workloads

Batch jobs were used for processing large volumes of data for routine business processes such as

  • Monthly billing, fortnightly payrolls, etc.

Online Transactions (OLTP)

Online (interactive) transactions handled users’ action-driven interactive processing, meaning users no longer needed to wait for a batch run, but instead received an immediate response when requests were submitted.

In past years, IT operators would submit batch jobs based on an instruction book that told them not only what to do, but also how to handle certain conditions, such as when things went wrong. Jobs were typically scripted using job control language (JCL), the standard mainframe notation for defining how batch programs should be executed. For a mainframe computer, JCL:

  • Identified the batch job submitter
  • Defined what program to run
  • Stated the locations of input and output
  • Specified when a job was to be run
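As an illustration, the four elements above might appear in a minimal JCL job like the sketch below. The job, program, and dataset names are hypothetical, and real installations would add site-specific parameters:

```jcl
//* Identifies the batch job and its submitter (accounting info, programmer name)
//PAYROLL  JOB (ACCT42),'J SMITH',CLASS=A,MSGCLASS=X
//* Defines what program to run
//STEP1    EXEC PGM=PAYCALC
//* States the locations of input and output
//MASTER   DD  DSN=PAY.MASTER.FILE,DISP=SHR
//REPORT   DD  SYSOUT=*
```

When the job actually ran was typically governed not by the JCL itself but by its job class (CLASS=A above) and the installation’s initiators, and later by a dedicated job scheduler.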

When a submitted job finished running, the operator would submit another. This was effective at first, but quickly became problematic as the number of machines, jobs, and scheduling dependencies increased. Early job scheduling tools were meant to bring some level of automation to these tasks.

Limitations of Batch Processing

Because batch processing developed over a long period of time, it comes with limitations. Among other things, these include:

  • No automation, considerable manual intervention
  • No centralized management functions for controlling or monitoring overall workloads
  • No audit trail to verify the completion of jobs
  • No flexibility in scheduling rules
  • No ability to allow cross-platform dependencies
  • No automated restart/recovery of scheduled tasks

Learn how to overcome the traditional limitations of typical batch processing by introducing enterprise job scheduling to your data center.

Start Your Automation Initiative Now

Schedule a Live Demo with a Stonebranch Solution Expert

Further Reading

Static vs. Dynamic IT Automation, and How They Work Together

Although some organizations have completely evolved their IT operations using dynamic, event-based workload automation, others utilize both static and dynamic...


Dynamic IT Automation: Why "dynamic" is different and how it enables real-time automation.

In this whitepaper on dynamic IT automation, Stonebranch discusses how dynamic, real-time automation can help meet the modern challenges of digital...

What is Workload Automation (WLA)—and How is it Changing?

Read this blog post and learn why workload automation is the best practice for using software to schedule and manage tasks related to business processes and...


How To Install External Python Modules in Universal Agent

A technical how-to written by Colin Cocksedge, Director of Product Management at Stonebranch, on how to install external Python modules on Universal Agent (UA).