The Advantages of a Pipelined Approach for Build and Deployment Automation

Summary:

Automation is required to build and deploy software applications consistently and rapidly. While build and deployment automation is essential for modern software development, not all approaches to automation produce the same results.

A pipelined approach to build and deployment automation is an effective design technique because it models the end-to-end process. Unlike other approaches to automation, this approach automates both the build and deployment workflow and the instructions that comprise each step in the workflow. To understand why the pipelined approach is superior, consider the alternatives for build and deployment automation.

Definition of a Pipelined Approach for Build and Deployment Automation
Perhaps the easiest way to describe a pipelined approach is to explain what it is not.

Unlike a collection of helper scripts, the pipelined approach is not fragmented. This means individual scripts do not run in isolation to implement the various steps needed to build and deploy a software application.

Unlike a single end-to-end build and deployment script, the pipelined approach is not monolithic. This means one large script does not implement all the steps needed to build and deploy a software application.

Instead, a pipelined approach is a modular technique that sequences and executes the steps to build and deploy a software application. Ideally, a workflow engine manages the sequencing and execution of the build and deployment steps. The workflow begins with steps that are prerequisites to the build, such as labeling the source code files and retrieving them from the version control system. The workflow ends with steps that complete the deployment or perform post-deployment operations on the software application. Smoke testing the build is an example of a common post-deployment operation.

Example Build and Deployment Steps
To understand how the pipelined approach compares to other approaches, consider the following simplified steps to build and deploy a software application.

    1. Log on to the build server using the appropriate account.
    2. Configure the build environment with the environment variables needed to run the build script(s).
    3. Label, tag, or snapshot the files from the version control system that will be used to create the build.
    4. Populate a workspace on the build server with the files included in the label, tag, or snapshot.
    5. Execute the command to compile the application.
    6. Execute the command to package the application into deployable units.
    7. Copy the packaged application from the build server to the target server(s).
    8. Log on to the target server(s) using the appropriate account(s).
    9. Configure the deployment environment with the environment variables needed to run the deployment script(s).
    10. Execute the command(s) on the target server(s) to deploy the application.
    11. Verify that the application deployed as expected by running a smoke test.

The steps listed above comprise the end-to-end build and deployment process. To automate this process, both the execution of each step and the sequencing of the steps must be automated.
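
As a small illustration, the Python sketch below records the steps above as ordered data. The step names and commands are placeholders rather than real tools; the point is simply that the automation has to capture two things in one place: the command behind each step and the order in which the steps run.

    # Sketch only: the steps above captured as ordered data. The step names and
    # commands are placeholders, not tools named in this article.
    BUILD_AND_DEPLOY_STEPS = [
        ("label-sources",      "vcs label rel-1.0"),
        ("populate-workspace", "vcs checkout --label rel-1.0 /build/workspace"),
        ("compile",            "build-tool compile"),
        ("package",            "build-tool package"),
        ("copy-artifacts",     "scp -r /build/workspace/dist app-server-01:/opt/releases"),
        ("deploy",             "ssh app-server-01 deploy-tool install /opt/releases/dist"),
        ("smoke-test",         "smoke-test http://app-server-01/health"),
    ]

    if __name__ == "__main__":
        # Print the plan: each step's name and the command that implements it.
        for name, command in BUILD_AND_DEPLOY_STEPS:
            print(f"{name}: {command}")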

To understand why a pipelined approach is ideal for automating the sequencing and execution of build and deployment steps, consider three alternatives: helper scripts, a monolithic script, and a top-level script.

Using Helper Scripts for Build and Deployment Automation

The purpose of a helper script is to automate a repetitive task. The helper script reduces the tedium that occurs when executing the same task frequently. The helper script speeds the execution of the task and reduces the chance of errors.

A script that copies artifacts from the build server to the target servers is an example of a helper script. In this case, the helper script executes the sequence of commands to navigate to the artifacts on the build server and to transfer these artifacts to the target servers.
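
As a hedged sketch of such a helper script, the Python example below copies an artifact directory to a list of target servers; the directory, server names, and use of scp are illustrative assumptions, not details prescribed here.

    #!/usr/bin/env python3
    """Helper script: copy build artifacts from the build server to target servers.

    Sketch only -- the artifact directory, target hosts, and reliance on scp are
    illustrative assumptions.
    """
    import subprocess
    import sys

    ARTIFACT_DIR = "/build/workspace/dist"                 # hypothetical artifact location
    TARGET_SERVERS = ["app-server-01", "app-server-02"]    # hypothetical target servers
    REMOTE_DIR = "/opt/releases/incoming"

    def copy_artifacts() -> None:
        for server in TARGET_SERVERS:
            # Copy the whole artifact directory to the target server.
            result = subprocess.run(["scp", "-r", ARTIFACT_DIR, f"{server}:{REMOTE_DIR}"])
            if result.returncode != 0:
                sys.exit(f"Copy to {server} failed with exit code {result.returncode}")

    if __name__ == "__main__":
        copy_artifacts()

Even so, the operator still has to run the script from the build server, with an account and network access that can reach the targets, which is exactly the limitation described next.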

Helper scripts are a vast improvement over an entirely manual process. However, a collection of helper scripts is not end-to-end build and deployment automation. Specific knowledge is needed to run each script on the right server using the appropriate arguments.

Using the example steps above, helper scripts may create the build, move the build artifacts to the target servers, and execute the deployment commands. However, the scripts must be used by someone who has knowledge of, and access to, the build and deployment servers. Often, manual instructions accompany the scripts so that they can be executed properly.

Using a Monolithic Build and Deployment Script
The build and deployment steps listed above seem simple enough to be included in, or controlled by, a single script. A member of the development team who is familiar with the application could craft a script to build and deploy the application.

An Ant script that retrieves files from the version control system, builds a Java application, and deploys that application is an example of a monolithic script. A monolithic script keeps the automation in one place. Only one script needs to run to execute the end-to-end build and deployment process. While well suited for integration builds by developers, a monolithic script does not scale to support deployments to test and production environments.
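
The shape of a monolithic script can be sketched in a few lines of Python; the commands, paths, and server name below are placeholders standing in for the version control, build, and deployment tools.

    #!/usr/bin/env python3
    """Monolithic build-and-deploy script: every step is coded inline in one file.

    Sketch only -- the commands, paths, and server name are placeholders.
    """
    import subprocess

    def run(cmd: list[str]) -> None:
        # Stop the whole script as soon as any step fails.
        subprocess.run(cmd, check=True)

    def main() -> None:
        # Retrieve the labeled sources from version control.
        run(["vcs", "checkout", "--label", "rel-1.0", "/build/workspace"])
        # Compile and package the application.
        run(["build-tool", "compile"])
        run(["build-tool", "package"])
        # Copy the packaged application to the target server.
        run(["scp", "-r", "/build/workspace/dist", "app-server-01:/opt/releases"])
        # Deploy on the target server.
        run(["ssh", "app-server-01", "deploy-tool", "install", "/opt/releases/dist"])
        # Smoke test the deployed application.
        run(["smoke-test", "http://app-server-01/health"])

    if __name__ == "__main__":
        main()

Every step, from checkout to smoke test, lives in this one file.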

The monolithic script would work fine if each step automated by the script never changed. With some additional effort, the script could accommodate changes to the steps. However, what happens to the monolithic script if the build and deployment steps are more complex? How does the script need to evolve to build the same application from different codelines? What changes are required to deploy different builds to different environments?

The limitations of the single script become apparent each time it needs to be modified to support changes to the build and deployment steps and/or environment. The build and deployment script needs to be modified to support ongoing changes to the software application and test environments, yet the script must be stable so that it builds and deploys reliably. Additionally, the script must be easy to understand so that it can be modified to meet changing needs.

Using a Top-Level Build and Deployment Script
To the operator, the top-level script appears identical to the monolithic script. To the maintainer, however, the difference is clear.

The top-level script acts as a basic control mechanism. It invokes other scripts that perform specific tasks. For example, the top-level script may invoke four lower-level scripts. The first script retrieves the files from the version control system. The second script executes the build and creates the build artifacts. The third script transfers the build artifacts to the target servers. The fourth script deploys the build artifacts on the target servers.
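
A minimal sketch of this structure, assuming the four lower-level scripts exist under hypothetical names, looks like the following in Python.

    #!/usr/bin/env python3
    """Top-level script: sequences the lower-level scripts but implements no step itself.

    Sketch only -- the lower-level script names are hypothetical.
    """
    import subprocess

    LOWER_LEVEL_SCRIPTS = [
        "retrieve_sources.py",    # pull the labeled files from version control
        "build_artifacts.py",     # compile and package the application
        "transfer_artifacts.py",  # copy the packaged application to the target servers
        "deploy_artifacts.py",    # deploy the application on the target servers
    ]

    if __name__ == "__main__":
        for script in LOWER_LEVEL_SCRIPTS:
            # Each step is delegated to its own script; a failure stops the sequence.
            subprocess.run(["python3", script], check=True)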

A well-designed top-level script can be easier to maintain than a monolithic script, provided its only role is to invoke the appropriate lower-level script for each step in the build and deployment process. Changes to one or more steps can then be isolated to the lower-level scripts that implement them.

Compared to helper scripts and a monolithic script, the top-level script is superior. However, the top-level script still falls short in comparison to the pipelined approach.

Using the Pipelined Approach to Automate Workflow and Job Execution
The pipelined approach to build and deployment automation improves on the scalability and maintainability of a top-level script. This approach combines workflow with job execution to automate the end-to-end build and deployment process.

The workflow defines the sequence of steps required to build and deploy the software application. The workflow identifies dependencies between steps. The workflow knows when steps must be executed in sequence and when they can be executed in parallel.
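
One way to picture this, without assuming any particular workflow engine or its format, is as a set of steps and their dependencies. The Python sketch below groups the steps into waves, where every step in a wave has all of its dependencies satisfied and may run in parallel with the others.

    # Sketch only: a workflow expressed as steps and their dependencies.
    # The step names are illustrative; no particular workflow engine is implied.
    WORKFLOW = {
        "checkout":       [],
        "compile":        ["checkout"],
        "package":        ["compile"],
        "copy-to-app-01": ["package"],
        "copy-to-app-02": ["package"],
        "deploy-app-01":  ["copy-to-app-01"],
        "deploy-app-02":  ["copy-to-app-02"],
        "smoke-test":     ["deploy-app-01", "deploy-app-02"],
    }

    def execution_waves(workflow: dict[str, list[str]]) -> list[list[str]]:
        """Group steps into waves; steps in the same wave may run in parallel."""
        done: set[str] = set()
        waves: list[list[str]] = []
        while len(done) < len(workflow):
            ready = sorted(step for step, deps in workflow.items()
                           if step not in done and all(d in done for d in deps))
            if not ready:
                raise ValueError("The workflow contains a cyclic dependency")
            waves.append(ready)
            done.update(ready)
        return waves

    if __name__ == "__main__":
        for wave in execution_waves(WORKFLOW):
            print(wave)

In this illustrative workflow, the copy and deploy steps for the two target servers land in the same waves and can run in parallel, while the smoke test waits for both deployments to finish.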

Job execution is the automation of each step in the workflow. Job execution is more than executing the commands to complete a step. Job execution entails configuring the environment so that the commands run properly. This is a major difference between the pipelined approach and helper scripts. While a helper script relies on the operator to know how to run the script, a job execution engine handles the setup for the operator.
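
A minimal sketch of that idea, with placeholder variable names and commands: the job carries its own environment, and the engine applies it before running the command, so the operator does not prepare the server by hand.

    import os
    import subprocess

    # Sketch only: a job bundles its command with the environment it needs. The
    # variable names and command below are placeholders.
    def run_job(command: list[str], environment: dict[str, str]) -> None:
        env = os.environ.copy()
        env.update(environment)    # configure the environment for this job
        subprocess.run(command, env=env, check=True)

    if __name__ == "__main__":
        run_job(
            command=["build-tool", "package"],
            environment={"JAVA_HOME": "/opt/java", "BUILD_NUMBER": "42"},
        )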

Implementing a Pipelined Approach
The most effective way to implement the pipelined approach is by using a robust framework. AnthillPro is an example of a commercial tool that automates both workflow and job execution. Using AnthillPro, an automation engineer specifies workflows that define the end-to-end build and deployment process. Each workflow consists of one or more jobs. Each job contains a sequence of job steps. Each job step is a command or script that is executed by an AnthillPro agent from the command line.

AnthillPro ensures that jobs run in the right order in the correct environments. While each job step executes, AnthillPro logs the output and checks for successful completion. Only when the step completes successfully does AnthillPro invoke the next step.

Since AnthillPro automates both the workflow and the jobs, automation engineers can focus on specifying the actual commands that comprise each step in the build and deployment process. Additionally, an automation engineer can implement jobs in a job library so that the same job can be used in different workflows. For example, the job that deploys a Java EAR file to an application server can be placed in the job library and used by all workflows that deploy applications packaged in an EAR file.
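
The sketch below is a generic illustration of a job library, not AnthillPro's actual configuration model; the job and workflow names are placeholders. The reusable deployment job is defined once and referenced from two workflows.

    # Sketch only: a generic illustration of a job library, not AnthillPro's actual
    # configuration model. The job and workflow names are placeholders.
    JOB_LIBRARY = {
        "deploy-ear": [
            "copy the EAR file to the application server",
            "stop the application",
            "install the new EAR file",
            "start the application",
        ],
    }

    WORKFLOWS = {
        "build-and-deploy-storefront": ["checkout", "build", "deploy-ear", "smoke-test"],
        "build-and-deploy-billing":    ["checkout", "build", "deploy-ear", "smoke-test"],
    }

    def steps_for(workflow_name: str) -> list[str]:
        """Expand a workflow, pulling shared jobs such as deploy-ear from the library."""
        expanded: list[str] = []
        for job in WORKFLOWS[workflow_name]:
            expanded.extend(JOB_LIBRARY.get(job, [job]))
        return expanded

    if __name__ == "__main__":
        print(steps_for("build-and-deploy-storefront"))

A fix to the shared deployment job is made once and picked up by every workflow that references it.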

Conclusion
Consider the different approaches before automating a build and deployment process. The quick and simple approach is not likely to be the most extensible and maintainable. Helper scripts, a monolithic script, and a top-level script are approaches for build and deployment automation. While these approaches are preferred to a manual process, the pipelined approach is ideal because it automates both the workflow and the jobs that comprise the workflow. The combination of workflow and job execution can be used to implement build and deployment automation that is flexible and extensible.
