Data Collection Made Easy for a Global Healthcare Provider Using Flux
With Growth Comes Added Workflow Complexity
As businesses grow, once-simple tasks often become unmanageable, and handling that growth effectively is a competitive advantage. Data collection tasks that were once straightforward and easy to manage often spiral in complexity as a business’s customers and data centers begin to span time zones and continents.
A global leader in the healthcare industry faced exactly this predicament. At a set time each day, their markets around the world would begin reporting data, sending files containing information about the day’s activity to a central FTP location. From there, the files would be parsed and key information saved to a database system before the files were archived.
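The daily flow just described (parse incoming report files, save key information to a database, then archive) can be sketched in a few lines. This is a generic illustration of the pattern, not Flux’s API; the comma-separated record format, the table schema, and the in-memory SQLite database are all assumptions made for demonstration.

```python
import sqlite3
from datetime import date

def parse_report(text):
    """Parse a daily report file of hypothetical "market,metric,value" lines."""
    rows = []
    for line in text.strip().splitlines():
        market, metric, value = line.split(",")
        rows.append((market, metric, float(value)))
    return rows

def store_rows(conn, report_date, rows):
    """Save parsed rows to the database before the source file is archived."""
    conn.executemany(
        "INSERT INTO daily_activity (report_date, market, metric, value) "
        "VALUES (?, ?, ?, ?)",
        [(report_date, m, k, v) for m, k, v in rows],
    )
    conn.commit()

# Demonstration with an in-memory database and a two-line sample report.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE daily_activity "
    "(report_date TEXT, market TEXT, metric TEXT, value REAL)"
)
sample = "EMEA,orders,120\nAPAC,orders,95"
store_rows(conn, date(2024, 1, 15).isoformat(), parse_report(sample))
count = conn.execute("SELECT COUNT(*) FROM daily_activity").fetchone()[0]
```

In a real deployment the text would come from the central FTP location and the rows would go to the production database, but the parse-store-archive shape stays the same.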
Early on, these tasks could be scheduled simply and independently, but as time passed and the complexity of the system (and the volume of data) grew, they went looking for a new solution. Among their requirements, the new system had to:
- schedule on time-based events spanning multiple time zones with ease;
- manage the transfer of thousands of files per day;
- invoke local code and web services, using data collected from the same files;
- update databases, via SQL and stored procedure invocations, to store the data after processing;
- send notifications on both success and error conditions;
- and provide reporting and high auditability, with historical records of all the processing performed (and resulting data output) on the system.
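The first requirement, scheduling across multiple time zones, is worth a concrete illustration. The sketch below is a generic Python example built on the standard library’s `zoneinfo` module, not Flux’s scheduler; the market names and the 17:00 local reporting hour are assumptions.

```python
from datetime import datetime, timedelta, timezone
from zoneinfo import ZoneInfo

# Hypothetical markets, each reporting at 17:00 in its own local time.
MARKET_ZONES = {"London": "Europe/London", "Tokyo": "Asia/Tokyo"}

def next_run_utc(zone_name, hour, now_utc):
    """Next occurrence of `hour`:00 local time in `zone_name`, as a UTC instant."""
    tz = ZoneInfo(zone_name)
    local_now = now_utc.astimezone(tz)
    run = local_now.replace(hour=hour, minute=0, second=0, microsecond=0)
    if run <= local_now:
        # Today's slot has passed locally; fire at the same wall-clock
        # time tomorrow (timedelta arithmetic keeps the local time).
        run += timedelta(days=1)
    return run.astimezone(timezone.utc)

# At 12:00 UTC on 2024-01-15, each market's next 17:00 local slot:
now_utc = datetime(2024, 1, 15, 12, 0, tzinfo=timezone.utc)
runs = {name: next_run_utc(zone, 17, now_utc) for name, zone in MARKET_ZONES.items()}
```

London (UTC in January) fires at 17:00 UTC the same day, while Tokyo (UTC+9) has already passed 17:00 locally and fires the next day at 08:00 UTC, which is the kind of per-zone bookkeeping the scheduler must do automatically.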
Flux proved a perfect fit for their requirements. Working closely with Flux’s workflow experts, they quickly assembled a workflow template to perform all of their file transfer and data processing, tasks that had previously spanned multiple systems, demanded constant monitoring, and proved nearly impossible to manage.
These disparate tasks were coalesced into a single workflow template that not only distilled a complex web of tasks into a single, easy-to-manage interface, but also provided visual dependency tracking and the ability to rapidly develop new tasks and dependencies as they were needed.
Flux’s robust error handling meant the system no longer required countless operator hours to watch and maintain; automatic notifications and simple error reporting freed up resources, allowing operators not only to monitor the system but also to engage in its development to meet new challenges.
Workflow Reuse Seen as a Key Benefit
With so much data processed at a multitude of international locations, the template workflows had to be easily reusable, with shared data stored in a single location to avoid tedious rework and scattered updates whenever an FTP site changed or a database moved. Our customer leveraged Flux’s robust runtime configuration capabilities to store their shared data at a single point, making new development and updates to existing workflows a breeze.
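The single-point-of-configuration idea can be sketched generically. The example below illustrates the pattern, not Flux’s configuration API; the setting names and the workflow template string are hypothetical.

```python
from string import Template

# Hypothetical single point of shared configuration: every workflow
# references logical names, so moving an FTP site or a database means
# editing this one mapping rather than dozens of workflow definitions.
SHARED_CONFIG = {
    "central_ftp_host": "ftp.example.com",
    "activity_db_dsn": "activity-prod",
}

# A reusable workflow template with placeholders for shared settings.
WORKFLOW_TEMPLATE = Template(
    "fetch from $central_ftp_host; load into $activity_db_dsn; archive"
)

def render_workflow():
    """Instantiate the template from the single shared configuration."""
    return WORKFLOW_TEMPLATE.substitute(SHARED_CONFIG)
```

Because each workflow resolves its endpoints at runtime, relocating an FTP server or database is a one-line change that every dependent workflow picks up automatically.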
Dynamic Data and Configuration Are an Added Benefit
For data generated at runtime, Flux’s repository allowed the customer to keep their workflow templates reusable and, through Flux’s flow chart capabilities, to invoke those templates dynamically with quick, automatic population of runtime data.
Once everything was put together, Flux was an immediate success: within a few short weeks it was up and running in production, and it has continued to provide the strong, reliable backbone for their operations for nearly half a decade.