Years ago, I learned of commercial tools that automated builds and deployments by executing a combination of command line instructions and scripts. At the time, I questioned the value that these tools could provide. Even though the tools were promoted for build and deployment automation, they did not generate the command line instructions and scripts to actually build and deploy applications. The new tools did not replace existing build tools like Ant and make. Instead, the tools required that their users provide the build and deployment scripts. Surely, I could create one top-level script to build my application and another top-level script to deploy it. Why would I need a commercial framework to run my build and deployment commands and scripts?
Years later, I understand and appreciate the effort that can be involved in building and deploying a distributed application. Following manual build and deployment procedures is not optimal: the work is time-consuming and error-prone when steps must be completed by hand. Executing manual steps, even when following written instructions, is not a reliable way to achieve a repeatable process. Automating those steps with scripts that can be executed from the command line is a far better approach.
Widely used build tools, like Ant and make, provide the framework for build automation. Scripting languages, like UNIX and DOS shells, Perl, and Python, enable deployment automation. However, creating the build and deployment scripts is only part of the solution. The scripts need to run from the appropriate accounts on the servers that build and run the application. Often multiple servers, such as the web server, application server, and database server, must be accessed from different accounts to deploy the components of a distributed application. Additionally, the environment needs to be configured consistently on each server so that commands and scripts run properly. Finally, the output of each command and script needs to be captured and processed. When a build or deployment command or script fails, the failure needs to be handled in a way that does not leave an environment in an unstable state.
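These requirements (capture each command's output, check its return code, and stop before the environment is left in an unstable state) can be sketched in a few lines of shell. The step commands below are echo placeholders for real build commands, and the log file name is an assumption:

```shell
#!/bin/sh
# Minimal sketch of a top-level build wrapper: each command's output is
# appended to a log and its exit status is checked, so a failure stops
# the run instead of leaving the environment in an unstable state.
LOG=build.log
: > "$LOG"                  # start a fresh log for this run

run_step() {
    echo "== $*" >> "$LOG"
    if "$@" >> "$LOG" 2>&1; then
        echo "OK:   $*"
    else
        status=$?
        echo "FAIL: $* (exit $status)" >&2
        exit "$status"
    fi
}

# echo placeholders stand in for real commands such as 'ant dist' or 'make all'
run_step echo "checking out sources"
run_step echo "compiling application"
run_step echo "packaging artifacts"
```

A real script would substitute commands like `ant dist` or `make all` for the placeholders; the point is that every command's output and return code are handled in one place.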
As the user of a commercial build and deployment tool, I now appreciate the value of these tools. Even though I must supply the command line instructions and scripts to build and deploy applications, the tool organizes and manages them so that they run in the appropriate environment, in the proper sequence, and according to the desired schedule. Additionally, the tool allows other authorized users to run the build and deployment commands and scripts even if these users do not have access to the required servers. Finally, the tool serves as an information radiator by reporting the status and history of builds and deployments. The tool even emails the results of scheduled and on-demand builds and deployments.

Commercial Tools that Execute User-Developed Scripts
AnthillPro, Build Forge, and ElectricCommander are established commercial tools that model, configure, and execute everything that is needed to automate builds and deployments. Cruise is a new commercial tool from an established consultancy that helped make continuous integration a common development practice.
In 2007, I examined AnthillPro, Build Forge, and ElectricCommander. As I became familiar with these tools, I realized that they followed a similar approach to build and deployment automation. The similarities of these tools begin with their architecture.
All three commercial build and deployment tools share a server-agent architecture. The server is the software component that processes requests for builds and deployments. The agent is the software component that runs the scripts or command line instructions to complete a request. Typically, a software development team, or even an entire software development organization, needs one server and several agents to provide build and deployment automation.
Users interact with the server from a web browser. The server's web-based GUI allows users to configure and execute the build and deployment commands and scripts. Even though the build and deployment commands and scripts are specified on the server, the actual build and deployment work is done by the agents. The server processes requests to build and deploy. For each request, the server determines which agent is required and available to process the request. The server sends the request to the appropriate agent and waits for a response as the agent works the request.
The agents run on the physical servers used to build and deploy the applications. Each agent runs from the account, or as the user name, needed to execute the build and deployment commands or scripts. The agent logs the output from executing the command line instructions or scripts. The agent sends the output log back to the server so that the server can report this information. Additionally, the agent passes back the return code from executing the instructions or script at the command line. The server uses the return code to determine if the request completed successfully or failed.
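The agent's contract with the server (run the command, capture the log, report the return code) can be illustrated with a short shell sketch. This shows the pattern in miniature, not any vendor's actual protocol; the function and file names are invented:

```shell
#!/bin/sh
# Illustration of what an agent does for each request: run the command,
# log its combined output, and report the exit code so the server can
# mark the request as succeeded or failed. Names here are invented.
AGENT_LOG=agent-output.log

run_request() {
    "$@" > "$AGENT_LOG" 2>&1   # capture stdout and stderr for the server
    rc=$?
    # A real agent would transmit the log and return code to the server;
    # here we just print the summary the server would receive.
    echo "request='$*' exit=$rc log=$AGENT_LOG"
    return "$rc"
}

run_request date                    # succeeds: server records a pass
run_request ls /no/such/directory || echo "server records a failure"
```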
The three established commercial tools use the paradigm of a project as the top-level container for build and deployment automation. Typically, a project in the tool maps to a software development project, or to one of many codelines from a software development project. Depending on the tool, the project is, or contains, the unit of automation. When the project is not the unit of automation, the automation may be implemented by a workflow or job.
The web-based GUI of each tool allows users to configure a project with the command line instructions and environment settings needed to build and deploy the software application. The commercial tools provide for the build and deployment instructions to be specified as a series of steps. Depending on the tool, the steps are part of the project, or part of a component of the project.
Using the steps in a project, or project component, the build and deployment can be implemented at any level of granularity. While it is possible to create one step to build the application and another step to deploy the application, this approach does not exploit the full capabilities of the tool. Ideally, each step should specify only one action. When the steps are used to implement the build and deployment at a fine level of granularity, the steps document the build and deployment process. As each step is executed by the tool, the GUI can show the time required to complete the step and the output from executing only one action. If a step fails during execution, it is easier to identify the command that failed.
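A home-grown equivalent of fine-grained steps might look like the following shell sketch; the step names and the echo stand-ins for real commands are illustrative, and the per-step timing mimics what the tools report in their GUIs:

```shell
#!/bin/sh
# Sketch of fine-grained steps: each step performs exactly one action, so
# a failure pinpoints the command at fault, and each step's duration can
# be reported. Step names and commands are illustrative.
step() {
    name=$1; shift
    start=$(date +%s)
    if "$@"; then
        echo "PASS $name ($(( $(date +%s) - start ))s)"
    else
        echo "FAIL $name"
        exit 1
    fi
}

# echo stand-ins for commands like 'svn update' or 'ant compile'
step "update-sources" echo "updating working copy"
step "compile"        echo "compiling"
step "unit-tests"     echo "running tests"
step "package"        echo "building distribution"
```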
Environment Variables and Properties
Defining the commands to execute is a prerequisite for build and deployment automation. Determining the environment configuration required to execute these commands is another precondition. The commercial tools enable build and deployment automation because they capture the commands to execute, and they maintain the required environment configuration.
Environment variables, and their required values, are stored by the tool and set before executing the build or deployment commands. This ensures that the commands are configured to run properly every time a build or deployment request is triggered.
In addition to setting environment variables, commercial tools may use properties so that the same build and deployment steps can be used for different builds or deployments. For example, the parameters that are passed to a build or deployment script might be stored in properties.
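In a home-grown script the same idea reduces to environment variables set before the work begins, plus parameter-like variables with overridable defaults. JAVA_HOME, APP_NAME, and TARGET_ENV below are illustrative names, not properties from any particular tool:

```shell
#!/bin/sh
# Sketch of a deploy step parameterized by "properties" so one step
# definition can serve several applications and environments. All
# variable names and paths here are illustrative assumptions.

# Environment settings a tool would restore before every run:
JAVA_HOME=${JAVA_HOME:-/opt/java}
export JAVA_HOME
PATH="$JAVA_HOME/bin:$PATH"
export PATH

# Properties supplied per project or per request, with defaults:
APP_NAME=${APP_NAME:-petstore}
TARGET_ENV=${TARGET_ENV:-integration}

deploy() {
    echo "deploying $APP_NAME to $TARGET_ENV (JAVA_HOME=$JAVA_HOME)"
    # real work would go here, e.g. copying "$APP_NAME.war" to the target
}

deploy
```

Running the script with APP_NAME and TARGET_ENV set in the environment would deploy a different application to a different environment without changing the step itself.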
Scheduled, On-Demand, and Continuous Integration Builds and Deployments
The three established commercial tools support scheduled, on-demand, and continuous integration (CI) builds and deployments. Scheduled activities are likely to include daily builds and deployments to test environments. On-demand builds are likely to be triggered when the latest code change needs to be integrated in preparation for testing. CI builds run as code is committed to the version control system to ensure that the latest change does not break the build.
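With home-grown scripts, the scheduled case is typically handled by cron; the commercial tools provide the equivalent scheduling (plus on-demand and commit-triggered runs) through their GUIs. The paths and times below are assumptions:

```crontab
# nightly build at 2 a.m., followed by deployment to the test environment
0 2 * * *    /opt/build/nightly-build.sh && /opt/build/deploy-to-test.sh
# poll version control every 10 minutes and build if anything changed (CI)
*/10 * * * * /opt/build/ci-build-if-changed.sh
```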
Notification and Reporting
Whether a build or deployment is performed to validate a code change or to deliver software to an environment, there is always a need to know if the build or deployment completed successfully. The three commercial tools display the status of each build and deployment on the server's GUI. Additionally, the tools can notify interested parties when a build or deployment succeeds or fails.
Self-Service Enabled by Role-Based Security
AnthillPro, Build Forge, and ElectricCommander are designed to allow team members other than build and release engineers to build applications and to deploy builds to test environments. This concept is known as self-service. It is intended to relieve the bottleneck that can occur when there are not enough build and release engineers to support the demand for builds and deployments.
The commercial build and deployment tools employ role-based security so that each user can be granted the appropriate level of access to perform a build or deployment. Build and release engineers can be granted access to define and execute builds and deployments to all environments. Developers can be granted access to execute builds and to deploy them to a development integration environment. QA team members can be given access to deploy builds to test environments. Operations staff can be granted access to deploy builds to production.
To enforce the access rights, the tools authenticate users with a built-in or external mechanism. If the tool's native authentication mechanism is not sufficient, an authentication mechanism like LDAP can be used.
AnthillPro, Build Forge, and ElectricCommander each have features that distinguish them from the other products. The value of each feature depends on the manner in which the tool will be used.
AnthillPro is based on the concept that one build should be created from each label or snapshot in the version control system. Once created, the build is stored in AnthillPro's Codestation repository. Secondary processes operate on the build to test it and to deploy it to different environments as the build is promoted from development, through QA, to production.
Build Forge provides the ability to run pre-flight builds. Prior to committing code changes to the version control system, developers can run a pre-flight build from an IDE to validate the changes in a remote build environment. Build Forge copies the files from the developer's local workspace to the remote server where the build executes. If the pre-flight build executes successfully, the developer can commit the code changes knowing that the actual build will complete on the target server.
ElectricCommander uses an easy-to-understand, yet conceptually powerful abstraction to model builds and deployments. The paradigm is general enough that it can be used to automate any activity that can be performed on a remote environment. Since different accounts are often needed to run commands on a remote server, ElectricCommander supports the concept of user impersonation. Even though the ElectricCommander agent may run from one account, the commands executed by that agent can run from a different account. The credentials of the other account are stored by ElectricCommander so that the user can be impersonated by the agent to run the command on the remote server.
Commercial Tool vs. Home-Grown Scripts
While a commercial tool is not required for build and deployment automation, AnthillPro, Build Forge, and ElectricCommander offer advantages over a home-grown scripted solution.
Each of these tools implements a framework for build and deployment automation. The framework allows tool users to work at a higher level of abstraction than is possible by using the core constructs of a scripting language. By working at a higher level of abstraction, tool users can concentrate on defining the steps that comprise the build or deployment. Build and deployment steps can be generalized by using properties and other features of the framework. This means one set of steps may be able to build or deploy many different applications.
Too often, software development organizations focus on the creation of build and deployment automation without realizing the need to maintain that automation. Maintenance is an area where the framework provided by a commercial tool offers an advantage. When the steps in each build and deployment are defined at a fine level of granularity, it is easy to see the function of each step. When changes are required for the build or deployment, it is easy to identify the steps that require changes. Contrast this to the maintenance of a home-grown scripted solution. If critical logic is buried deep in scripts, it may be difficult to uncover that logic. If the developers of the script are no longer around, considerable effort may be required to understand how the home-grown scripts work.
While it is possible to build your own solution from scratch, you need to determine whether the cost to build is less than the cost to buy a commercial tool and configure it. Begin your analysis by obtaining an evaluation copy of AnthillPro, Build Forge, Cruise, and ElectricCommander, and then configure each tool to build an existing application and deploy it to an integration or test environment. As you use each tool, you will identify features that suit your needs and others that do not. Only then can you judge whether configuring one of these tools is a more practical approach than building your own.
The ability of a software development organization to build and deploy consistently is a measure of that organization's value. Because of this, build and deployment automation is a pressing need for many software development organizations. A development organization can create the automation from scratch or configure a commercial build and deployment tool. AnthillPro, Build Forge, and ElectricCommander are established tools that follow a similar approach to build and deployment automation. Cruise is a newcomer that is likely to compete with these established products. Even though commercial tools require users to provide the commands and scripts that build and deploy applications, the tools provide a feature-rich framework to model, configure, and execute everything that is needed for build and deployment automation.