Moving Beyond Job Scheduling
Evolving Beyond Job Scheduling to File Orchestrations
Job scheduling is a key, often overlooked facility of modern financial systems. Although it may be considered 'low-tech' in this age of streaming video, mobile, big data, and high-speed messaging, much of today's financial processing is still performed as 'jobs.' These jobs may scan log files submitted for review by analytics engines, clear a collection of checks submitted for payment, or return a report of financial transactions to a law enforcement agency in response to a search warrant. Evolving from simply processing jobs to performing complete orchestrations of business information (i.e., file orchestrations) is a natural step toward more effective use of business resources and increased productivity.
Moving Beyond Job Scheduling to Orchestration
Job scheduling is often addressed on a fragmented, siloed, application-by-application basis. Each application has an isolated view of its jobs — those activities and resources needed for the application’s processing. Each application has its own private knowledge and understanding of the job.
A central facility directing the processing of jobs is seldom available — such job processing and scheduling is often buried in task schedulers, batch and process scripts, and sometimes even the job names themselves.
As the complexity of modern systems continues to increase, the need for centralized control and orchestration of an enterprise's jobs grows ever more pressing. Job processing, while almost as old as data processing itself, continues to evolve. Job design, scheduling, execution, and monitoring continue to move toward a more robust model of ‘orchestration.’ The service-oriented architecture and business process modeling communities both developed and refined the term 'orchestration' and the notion of ‘orchestration platforms.’
“An orchestration platform is dedicated to the effective maintenance and execution of business process logic. Modern-day orchestration environments are expected to support sophisticated and complex service composition logic that can result in long-running runtime activities.” See: SOA Patterns definition of orchestration
Refining this 'orchestration' definition to something specific for job scheduling and processing yields:
- Job Orchestration is the ability to control operational flows and activities based on business rules, especially in multi-application systems.
- Job Orchestration includes data transfer and the ability to apply automation to job processing such as triggers, schedules, explicit calls, and chained calls to solve a business problem.
- Job Orchestration involves complex workflows, service level agreements, transformations, messaging, alerts, and other attributes of complex enterprise systems.
- Finally, Job Orchestrations themselves can be described and assembled from a set of reusable and descriptive patterns describing their behavior and capabilities.
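The automation described above — triggers, schedules, and chained calls — can be sketched as a minimal job model. This is an illustrative sketch only; the `Job` class and its fields are hypothetical, not the API of any particular scheduler:

```python
from dataclasses import dataclass, field
from typing import Callable, List

# Hypothetical sketch: a job whose successful completion fires chained
# follow-on jobs, one of the automation styles described above.
@dataclass
class Job:
    name: str
    action: Callable[[], None]
    chained: List["Job"] = field(default_factory=list)

    def run(self) -> None:
        self.action()
        # Chained calls: downstream jobs fire only after this one succeeds.
        for job in self.chained:
            job.run()

# Assemble a two-step orchestration: transfer a file, then validate it.
events = []
validate = Job("validate", lambda: events.append("validated"))
transfer = Job("transfer", lambda: events.append("transferred"), chained=[validate])
transfer.run()
```

In a real orchestration platform, the same chaining would be declared in a central repository rather than wired up in application code, so that every application shares one view of the flow.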
The Business Benefit of Job Orchestration
Those who have moved from basic job scheduling to job orchestration have seen significant benefit in enterprise activities such as:
- Backup and restoration
- Disaster recovery/failover
- Service request/fulfillment
- Incident management/event response
- Data movement
A near-synonym for job orchestration is 'IT process automation.'
In addition, proponents cite many opportunities for savings. Their research shows opportunities to:
- Use fewer or less costly personnel
- Reduce or eliminate manual, unnecessary, or repetitive processes
- Stop missing SLAs, reducing customer dissatisfaction
- Meet regulatory and other compliance requirements at lower cost
- Purchase fewer or smaller quantities of software and licenses
- Reduce annual software maintenance costs
- Reduce homegrown software development, testing, and maintenance
- Run on fewer hardware, virtual, database, or OS platforms
File Processing Overview
An often overlooked area of support within job scheduling and generic business process orchestration engines is the domain of file processing. File processing involves many activities, including, in no particular order:
| File Splitting | File Extraction | File Replication |
| --- | --- | --- |
| File Conversion / Transcoding | File Reformatting | File Matching and Merging |
| File Concatenation | File Aggregation | File Validation |
| File Compare | File Parsing | File Summarization |
| File Archiving | File Sorting | File Loading (to databases and ETL tools) |
| File Generation/Creation | File Ingestion | File Injection |
| File Copy | File Processing | File Deletion |
| File Renaming | File Encrypting | File Versioning |
| File Scheduling | File Detection | File Transfer |
Each file process provides value to multiple enterprise business processes. The ‘File Compare’ process, for example, provides value in business processes involving change management, audit and compliance, and internal controls.
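As an illustration of how small these building blocks can be, here is a minimal sketch of a 'File Compare' process, reporting lines present in one file but not the other — the kind of check used in audit and internal-controls work. The function name and report shape are assumptions for illustration:

```python
# Illustrative 'File Compare' process: report which lines appear in only
# one of the two files, as an audit/internal-controls check.
def compare_files(path_a, path_b):
    with open(path_a) as a, open(path_b) as b:
        lines_a, lines_b = set(a), set(b)
    return {
        "only_in_a": sorted(lines_a - lines_b),
        "only_in_b": sorted(lines_b - lines_a),
    }
```

An orchestration platform would wrap a process like this with retry handling, monitoring, and access controls rather than leaving it as a bare script.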
Since these file processes are common and often reused, having a descriptive catalog of file processes from a job perspective is valuable. Such a catalog would contain, for each file process:
- A definition of the process
- The impact of the process, and its applicability, to business and technology usage
- Usage statistics of where the process is used
- A list of example usages within existing and future applications
- A definition of metrics and key performance indicators (KPIs) associated with the process
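The catalog entries above map naturally onto a simple record structure. The following is a sketch under assumed field names (the `FileProcessEntry` type and its example values are hypothetical):

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical catalog entry capturing the fields listed above.
@dataclass
class FileProcessEntry:
    name: str
    definition: str
    business_impact: str
    example_usages: List[str] = field(default_factory=list)
    kpis: List[str] = field(default_factory=list)

# Example catalog keyed by process name; values here are illustrative.
catalog = {
    "File Compare": FileProcessEntry(
        name="File Compare",
        definition="Report differences between two files.",
        business_impact="Supports change management, audit, and internal controls.",
        example_usages=["pre/post-release configuration audit"],
        kpis=["files compared per day", "mean compare latency"],
    )
}
```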
Reviewing and refining this file process catalog leads to insight into each process and how processes can be combined into process orchestrations and business workflows. With careful attention paid to the issues regarding high-volume and/or mission-critical processing, these process orchestrations lead to significantly simpler, more robust, and more reliable workflows.
File Processing + Job Scheduling = File Orchestration
Job scheduling involves initiating and controlling jobs. Often these jobs process files. The business value of a file orchestration platform lies in the facilities it provides to standardize and improve the reliability of fundamental file processing activities and to integrate job scheduling and monitoring into workflows involving files.
Financial institutions constantly grapple with disparate industry-standard and proprietary file formats, and varied channels to receive and deliver files. Challenges surface in managing the receipt, processing, conversion, and delivery of this diversity. Providing a consistent means of addressing this diversity is a key tenet of file orchestration.
In a manner similar to a web application server, a file orchestration platform wraps each of the above processes with retry and error handling capabilities, a storage mechanism to maintain each of the above patterns and make them reusable in multiple circumstances, load balancing and distributed processing, monitoring, and service level tracking and alerting.
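The retry and error-handling wrapper described above can be sketched in a few lines. This is a minimal illustration, not the mechanism of any specific platform; the function name and parameters are assumptions:

```python
import time

# Sketch of a retry wrapper: run a file process, retrying a fixed number
# of times with a delay between attempts, re-raising the last failure.
def with_retry(process, attempts=3, delay=0.0):
    last_error = None
    for _ in range(attempts):
        try:
            return process()
        except Exception as exc:
            last_error = exc
            time.sleep(delay)
    raise last_error
```

A production platform would add exponential backoff, dead-letter handling for files that never succeed, and alerting tied to service level tracking.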
A file orchestration platform also addresses the ‘no one size fits all’ paradox. An approach that works well for processing small text files may not fit for processing very large (gigabytes in size) binary files containing images or digital media. Any effective file orchestration platform must provide a robust tool set of capabilities to address the diversity of processing it may encounter. Key attributes of a file orchestration platform consist of, for example:
- Load balancing of work
- Clustering across multiple physical and virtual processors
- Fine-grained partitioning and allocation of work across a cluster — allowing work to be processed on machines most appropriate for the file processing required
- Parallel execution of processes
- The facility to delegate tasks to other resources
- Built-in and user-defined facilities for file transfer, file processing, and file validation
- File-by-file process monitoring
- Automated retry and recovery of file processing exceptions
- Centralized storage and management of file workflows
- Secure processing of files and fine-grained access controls to the workflows and processing engines that govern file processing
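Parallel execution of processes, one of the attributes listed above, can be illustrated with a worker pool. The file names and `process_file` function here are hypothetical stand-ins for real file processing steps:

```python
from concurrent.futures import ThreadPoolExecutor

# Placeholder for a real file process (conversion, validation, etc.).
def process_file(name: str) -> str:
    return f"processed {name}"

# Fan the files out across a small worker pool; map preserves input order.
files = ["a.csv", "b.csv", "c.csv"]
with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(process_file, files))
```

A full platform extends this idea across a cluster of machines, partitioning work so each file lands on hardware suited to its processing.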
Taken in their entirety, these facilities comprise the business value available from a file orchestration platform that is not otherwise achieved in existing application servers or homegrown solutions.
Consider a common file orchestration: routing a payload of information (such as a payments file or a report) to a destination based on routing information maintained in an enterprise database. Such file orchestrations are common in numerous systems, such as the delivery of disclosure forms for financial transactions and the delivery of statements from a healthcare system.
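That routing orchestration can be sketched as a lookup followed by a transfer. In this illustration a dictionary stands in for the enterprise database, and the payload types, destinations, and file names are all hypothetical:

```python
# Stand-in for the enterprise routing database described above.
ROUTING_TABLE = {
    "payments": "sftp://bank-gateway/inbound",
    "disclosures": "https://regulator.example/upload",
}

def route_file(filename: str, payload_type: str) -> str:
    """Look up the destination for a payload type and route the file."""
    destination = ROUTING_TABLE.get(payload_type)
    if destination is None:
        raise ValueError(f"no route configured for {payload_type}")
    # A real platform would perform the transfer here; we return the route.
    return f"{filename} -> {destination}"
```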
Job orchestration is the key next step in supporting highly reliable and repeatable business processes involving jobs. A job orchestration platform standardizes and provides a repository for a catalog of common processes that are then designed into executable workflows supporting robust orchestrations. A job orchestration system that incorporates file processing into its model has significant value for a wide variety of application domains.
The Flux software platform orchestrates file transfers and batch processing workflows for banking and finance. First released in 2000, Flux has grown into a financial platform that the largest US, UK, and Canadian banks and financial services organizations rely on daily for their mission-critical financial systems. Flux provides Electronic Bank Account Management (eBAM) solutions for banks. Electronic bank account management replaces slow paper-based processes with electronic efficiencies, reducing human errors and providing greater transparency into bank and corporate operations.
Banks that offer an eBAM solution possess a critical market advantage in their efforts to expand and retain their corporate customer base.