
Automate Anything, Run Anywhere is our motto!

What better place to start making it real than from our very own Workload Automation test environment?

We have been applying it in our Test Lab for Workload Automation version 9.5 since its general availability in 2019, and to all the subsequent fix packs, including the latest one, Fix Pack 3.

Let’s start by discovering how we can leverage a Workload Automation deployment to run automation jobs that build our code and prepare packages for both our HCL and IBM brand offerings. These automation jobs trigger several automation suites that run for each brand.

Every day, we run about 20,000 automated test scenarios that cover all the Workload Automation components (Dynamic Workload Console, master domain manager and its backup, the Rest APIs layer, dynamic agents and fault-tolerant agents).

Rest assured that we always uphold backward compatibility and detect instability issues sooner rather than later! Our main goal is to avoid injecting defects and instability into the base functionalities used by most of our customers.

Let’s take a deep dive into the automation world!

How many test automation suites?


Fig. 1 – Automation suites daily flow

 

The Workload Automation solution includes the following test automation suites:

  • BVT suites (Build Verification Tests that validate the health of the most recent available product component packages)
  • Installation and upgrade suite
  • Rest APIs suite
  • Dynamic Workload Console (web-based User Interface) suite
  • Server suite
  • Agent suite

 

The latest results of the automation suites are available on our Jenkins site and are gathered in the Aggregate Test Report. The Aggregate Test Report is a giant matrix where each row represents a Jenkins job that groups a list of test scenarios belonging to a specific suite, and each column corresponds to the date when the Jenkins job runs. The color of each cell is updated each time the Jenkins job completes its daily or weekly run, and it indicates the percentage of failed test cases with respect to the total number of test cases.
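As an illustration, the color of a cell can be thought of as a simple function of the job’s pass/fail counts. The following is only a minimal sketch; the thresholds and color names are assumptions, not the actual Aggregate Test Report configuration.

```python
# Minimal sketch of how a report cell color can be derived from a job run.
# The thresholds and color names below are illustrative assumptions, not the
# real Aggregate Test Report settings.
def cell_color(failed: int, total: int) -> str:
    """Map the failure percentage of a Jenkins job run to a color band."""
    if total == 0:
        return "grey"              # the job did not run on that date
    failure_pct = 100.0 * failed / total
    if failure_pct == 0:
        return "green"             # all test cases passed
    if failure_pct <= 10:          # assumed tolerance band
        return "yellow"
    return "red"                   # too many failures, needs analysis

print(cell_color(5, 266))          # e.g. 5 failures out of 266 cases -> "yellow"
```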

The matrix has continually evolved, from the test phase of the version 9.5 General Availability through the test phase of the latest fix pack.


Fig. 2 – Extract of the Aggregate Test Report

 

Let’s look at the number of scenarios that automatically run every day for each automation suite.

 

BVT automation suites

BVT suites:

  • Docker Installation (2 test cases): Runs a subset of the scenarios of the CVT installation suite for Docker images.
  • Server on-prem (341 test cases): Runs a subset of the scenarios of the CVT suite on a standalone server with different agents.
  • Server Docker (277 test cases): Runs a subset of the server suite scenarios for Docker container images on a simple topology (a master, a backup master and agents) with different agents.
  • Dynamic Workload Console (5 test cases): Runs a subset of the scenarios of the CVT suite on a standalone console that is connected to a standalone server.
  • Agent (1036 test cases): Runs the full set of CVT scenarios on a standalone agent.
  • Rest API (266 test cases): Runs a subset of the scenarios of the CVT suite.
  • Docker server for the CI/CD container pipeline (277 test cases): Runs a subset of server scenarios for Docker container images on a simple topology (a server, a backup server and agents). This Jenkins job validates the server and agent containers for the CI/CD pipeline made available to fix the security vulnerabilities in the latest fix pack release.

Total: 2204 test cases

 


Fig. 3 – The BVT Aggregate Test Report section

Installation and upgrade automation suites

Installation and upgrade suites:

  • Dynamic agent, z-centric agent and fault-tolerant agent (IBM brand) (731 test cases):
      - Runs fresh installation coverage and variation scenarios across all supported operating systems. The scenarios vary in the installation input parameters used, which can be default or non-default, as well as required or optional.
      - Runs coverage and variation restore scenarios across all supported operating systems.
  • Dynamic agent and z-centric agent (HCL brand) (557 test cases):
      - Runs fresh installation coverage and variation scenarios across all supported operating systems. The scenarios vary in the installation input parameters used, which can be default or non-default, as well as required or optional.
      - Runs coverage and variation restore scenarios across all supported operating systems.
  • Server (IBM and HCL brand) (913 test cases):
      - Runs coverage and variation installation scenarios for the master domain manager, backup master domain managers and dynamic domain managers, distributed across multiple supported operating systems and relational database types. The scenarios vary in the installation input parameters used, which can be default or non-default, as well as required or optional.
      - Runs coverage and variation upgrade/downgrade scenarios that install a previous version and apply a new fix pack version, or update from the previous fix pack version to the new one, for some relational database types.
      - Runs installation and upgrade scenarios with SSL customization and custom certificates.
      - Runs installation and upgrade scenarios for Workload Automation Docker images saved in an internal shared repository.
  • Console (IBM and HCL brand) (335 test cases): Runs coverage and variation installation and upgrade scenarios for the console component, distributed across multiple supported operating systems and relational database types.

Total: 2536 test cases

 


Fig. 4 – The agent and Z Workload Automation agent (z-centric) Aggregate Test Report section


Fig. 5 – The Server installation and upgrade Aggregate Test Report section


Fig. 6 – The console installation and upgrade Aggregate Test Report section

 

REST APIs automation suite

Rest APIs suites:

  • Rest API CVT (2504 test cases): Runs a set of CVT test cases on odd days to cover the basic core functionalities of the product exposed by the master domain manager Rest API.
  • Rest API CVT Extended (2462 test cases): Runs an additional set of CVT test cases on even days to cover new functionalities of the product exposed by the master domain manager Rest API, which are added over time, release after release or fix pack after fix pack.

Total: 4966 test cases
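As a side note, the odd/even split between the two suites boils down to a simple day-parity check. The sketch below is purely illustrative: it assumes "odd days" means odd days of the month, and the job names are placeholders, not the real Jenkins job names.

```python
# Illustrative only: choose which Rest API suite to trigger based on
# day-of-month parity. Job names are placeholders.
from datetime import date

def rest_api_suite_for(day: date) -> str:
    return "rest-api-cvt" if day.day % 2 == 1 else "rest-api-cvt-extended"

print(rest_api_suite_for(date.today()))
```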

 


Fig. 7 – The Rest APIs Aggregate Test Report section

 

Server automation suite

Server suites:

  • Server (IBM brand) (3348 test cases): Implements a set of coverage and variation test cases that cover the major functionalities of the server component. The Jenkins jobs that run the suite cover the available operating systems and RDBMSs for the server for the IBM brand, and also test the server Docker image.
  • Server (HCL brand) (1340 test cases): Implements a set of coverage and variation test cases that cover the major functionalities of the server component. The Jenkins jobs that run the suite cover the available operating systems and RDBMSs for the server for the HCL brand, and also test the server Docker image.

Total: 4688 test cases

 


Fig. 8 – The server Aggregate Test Report section

 

Dynamic Workload Console automation suite

Console automation suite:

  • Console (206 test cases × 4 jobs = 824): Exploiting the HCL OneTest tools, we implemented a set of 206 coverage and variation test cases that exercise the major functionalities of the Dynamic Workload Console. We created 4 Jenkins jobs that mix the available operating systems and supported RDBMSs for the console for both the IBM and HCL brands. The tests are executed using the most widely used browsers supported by Workload Automation.

Total: 824 test cases

 


Fig. 9 – The Console Aggregate Test Report section

 

Dynamic agent automation suite

Dynamic agent automation suites (a.k.a. LAR):

  • Dynamic agent CVT suites (IBM brand) (4273 test cases): Runs a set of coverage and variation scenarios for the dynamic agent, including the scenarios for embedded job plug-ins or Automation Hub plug-ins. The available Jenkins jobs cover all the supported operating systems for the IBM brand.
  • Dynamic agent CVT suites (HCL brand) (1272 test cases): Runs a set of coverage and variation scenarios for the dynamic agent, including the scenarios for embedded job plug-ins or Automation Hub plug-ins. The available Jenkins jobs cover all the supported operating systems for the HCL brand.

Total: 5545 test cases

 


Fig. 10 – The agent suites Aggregate Test Report section
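Adding up the per-suite totals listed above gives a quick consistency check against the roughly 20,000 scenarios per day mentioned at the beginning of the article:

```python
# Per-suite totals taken from the tables above; their sum matches the
# ~20,000 scenarios that run every day.
suite_totals = {
    "BVT": 2204,
    "Installation and upgrade": 2536,
    "Rest APIs": 4966,
    "Server": 4688,
    "Dynamic Workload Console": 824,
    "Dynamic agent": 5545,
}
print(sum(suite_totals.values()))  # 20763
```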

 

What is the flow of the daily WA test automation suites?

 


Fig. 11 – Detailed Continuous testing journey

 

Every day, when a new product build is available, the automatic Jenkins workflow starts and performs the following steps:

  • The code for the test automation suites is extracted from the code repository (GitHub) and then a build for each test automation suite is performed.
  • The Workload Automation daily build and the build for the test automation suites are combined into the test repository.
  • The Workload Automation daily build is installed automatically on all the test machines that are part of the test automation infrastructure.
  • The test automation suites are deployed automatically on all the test machines that are part of the test automation infrastructure.
  • The execution of the test automation suites is invoked automatically.
  • At the end of each execution of the test automation suites, test logs are collected, and test results are published on the Jenkins Aggregate Report to be analysed by a test automation specialist.
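
Reduced to its essence, the daily flow looks like the following minimal sketch. All function names, repository references and machine names are placeholders used only to mirror the steps above; this is not the team’s actual Jenkins pipeline code.

```python
# Pseudo-orchestration of the daily flow; every function is a stub and every
# name is a placeholder that mirrors the steps listed above.
def checkout_and_build_suites(repo_url: str) -> str:
    print(f"Building test automation suites from {repo_url}")
    return "test-suites-build"

def publish_to_test_repository(*artifacts: str) -> None:
    print(f"Publishing {artifacts} to the test repository")

def prepare_machine(machine: str, product_build: str, suites_build: str) -> None:
    print(f"{machine}: installing {product_build}, deploying {suites_build}")

def run_suites_and_publish(machines: list) -> None:
    for machine in machines:
        print(f"{machine}: running suites and collecting logs")
    print("Publishing results to the Jenkins Aggregate Test Report")

def daily_flow(product_build: str, machines: list) -> None:
    suites_build = checkout_and_build_suites("github.com/<internal-repo>")  # step 1
    publish_to_test_repository(product_build, suites_build)                 # step 2
    for machine in machines:
        prepare_machine(machine, product_build, suites_build)               # steps 3-4
    run_suites_and_publish(machines)                                        # steps 5-6

daily_flow("WA-9.5-FP3-daily-build", ["lab-master-01", "lab-agent-01"])
```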

 

During the development phase (when the new product features are developed), the test automation suites are enriched to also cover the features that are being developed over time. The main objective of test automation during this phase is to increase the test coverage to include the new functionalities that are being added in the product version under test.

During the formal System Verification Test (SVT) phase, the new automation test cases created during the development phase are merged into the pre-existing test automation suites.

The daily runs of these enriched test automation suites check the stability of the code, provide greater coverage as new tests are added to the regression package, and reduce test time compared to manual testing. The main objective of test automation during this phase is to demonstrate the stability of the product and to prevent potential backward compatibility issues from affecting the base features.

 

At the end of this journey, specific success criteria must be met before the quality of the product is considered good enough for release to the market: each test automation suite must achieve a success rate higher than the result obtained for the previous release.

 

At the end of the Workload Automation 9.5 Fix Pack 3 development cycle, the success criteria were satisfied with the following values:

  • Dynamic Workload Console suite success rate: 100%
  • Installation suite success rate: 100%
  • Rest API Suite success rate: 99%
  • Agent suite success rate: 99%
  • Server suite success rate: 99%
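
Expressed as a simple check, the release gate compares each suite’s success rate with the corresponding result from the previous release. In the sketch below the current values come from the list above, while the previous-release baseline values are placeholders; a non-strict comparison is assumed so that a 100% baseline can still pass.

```python
# Release gate sketch: every suite must at least match the success rate
# recorded for the previous release. Current values come from the text above;
# the previous-release baseline values are illustrative placeholders.
current = {"Console": 100, "Installation": 100, "Rest API": 99,
           "Agent": 99, "Server": 99}
previous_release = {"Console": 99, "Installation": 100, "Rest API": 98,
                    "Agent": 98, "Server": 97}   # placeholder baseline

gate_passed = all(current[s] >= previous_release[s] for s in current)
print("Release criteria satisfied" if gate_passed else "Release blocked")
```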

 

Moreover, the test automation framework described in this article allowed the Workload Automation team to find and fix around 240 defects during the 9.5 Fix Pack 3 development cycle.

 

We hope this article helps you understand the ecosystem in which the Workload Automation solution takes shape and evolves to satisfy our customers’ needs.

 

Learn more about Workload Automation here and get in touch with us by writing to HWAinfo@hcl.com.

 

 AUTHORS:

Serena Girardini

Serena is the Test and UX manager for the Workload Automation product in distributed environments. She joined IBM in 2000 as a Tivoli Workload Scheduler developer and was involved in the product relocation from the San Jose Lab to the Rome Lab during a short-term assignment in San Jose (CA). For 14 years, Serena gained experience in the Tivoli Workload Scheduler distributed product suite as a developer, customer support engineer, tester and information developer. For a long time she covered the role of L3 fix pack release Test Team Leader and, in this period, she acted as a facilitator during critical situations and upgrade scenarios at customer sites. In her last 4 years at IBM, she became the IBM Cloud Resiliency and Chaos Engineering Test Team Leader. She joined HCL in April 2019 as an expert Tester and was recognized as the Test Leader for the product porting to the most important Cloud offerings on the market. She has a Bachelor’s Degree in Mathematics.

Linkedin: https://www.linkedin.com/in/serenagirardini/

 

Valentina Fusco

Valentina joined HCL in July 2019 as a Junior Software Developer, working with the Verification Test team as an automation tester of the Workload Automation suite in both distributed and cloud-native environments. She has a Master’s Degree in Electronic Engineering.

Linkedin: www.linkedin.com/in/valentinafusco1

 

 

Giorgio Corsetti

Giorgio works as a Performance Test Engineer in the Workload Automation team. In this role, he identifies bottlenecks in the architecture under analysis, assesses how the overall system handles specific loads, and helps increase customer satisfaction by publishing technical documents with feedback on performance improvements when new product releases become available. Since April 2018, he has also held the role of Test Architect for the Workload Automation product family. Giorgio has a degree in Physics and is currently based in the HCLSoftware Rome software development laboratory.

Linkedin: https://www.linkedin.com/in/giorgio-corsetti-8b13224/

 

 
