Risk-Driven Management
Stories from Projects
This article describes the course of testing on a project in a compliance-driven setting and the key obstacles encountered along the way. The integration testing team joined the project very late and faced challenging deadlines. The article highlights meetings, working sessions, micromanagement, and individualized support as means of mitigating these obstacles.
Who It Is For
- Senior Test Consultants
- Test Managers
- Senior Test Analysts
- Test Analysts
- Test Engineers
The project aimed to provide a simpler solution for identity management within the organization as a replacement for the current solution, which is to be decommissioned. The TO-BE solution should eliminate dependency on the AS-IS solution's supplier, improve documentation, and automate previously manual processes.
Our team joined an ongoing project where the development of the TO-BE solution had been underway for 2 years and was nearing completion. Testing was about to start. Our team was assigned the task of System Integration Testing, which would validate UAT readiness. At that time, the first wave of the COVID-19 pandemic was at its strongest, resulting in remote-only onboarding and work.
The project introduced significant changes across multiple systems. The AS-IS solution had been in place and maintained for 15 years, with key knowledge scattered across outdated documents, source code, and the memories of employees who might have already left the organization. The project teams had limited experience with projects of this scale and even less experience with integration testing.
System Integration Testing was not initially part of the scope but was added to the schedule later. The schedule was adjusted by parallelizing tasks. The scope of SIT was to cover the integration of 11 systems within the project, as well as their basic integration with upstream and downstream solutions.
How the Project Unfolded: Beginning, Progress, Conclusion
The initial survey uncovered that the requirements were poorly documented and difficult to test. The teams had little experience with software testing, but they had prepared, reviewed, and approved test cases. We realized that the individual teams had been working almost independently from one another and from business representatives, which had prevented the discovery of misunderstandings or gaps in the design. Suppliers completed factory testing without significant findings.
An early smoke integration test in the development environment identified the extent of design gaps, leading to a 3-month postponement of the testing phase. During those three months, all teams worked together to understand key use cases and ensure that the solution supported them as intended, without undesired side effects.
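A smoke integration test of this kind boils down to quickly verifying that each system in the chain is reachable and minimally functional before deeper testing begins. The sketch below is a minimal illustration of that idea; the system names and health checks are hypothetical stand-ins, not the project's actual components.

```python
# Minimal smoke-test runner sketch (hypothetical systems and checks):
# each system contributes a health-check callable, and the runner
# reports which systems are ready before integration testing starts.

def run_smoke(checks):
    """Run each named health check; return {name: 'OK' or failure message}."""
    results = {}
    for name, check in checks.items():
        try:
            check()  # a real check would call the system's health endpoint
            results[name] = "OK"
        except Exception as exc:  # any failure marks the system as not ready
            results[name] = f"FAILED: {exc}"
    return results

def _unreachable():
    # Stub standing in for a system that cannot be reached.
    raise ConnectionError("timeout")

if __name__ == "__main__":
    checks = {
        "identity-core": lambda: None,  # stub: pretend the check passed
        "hr-feed": _unreachable,
    }
    for name, status in run_smoke(checks).items():
        print(f"{name}: {status}")
```

On the project, a run like this surfaced the design gaps early enough to reschedule; the value lies in failing fast on basic connectivity rather than discovering it mid-test-cycle.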
Testing in the common integration environment proved more difficult than expected due to frequent solution performance issues, overall environment instability, and the low quality of a key component of the TO-BE solution. The decision to run SIT and UAT in parallel helped mitigate delays in project delivery milestones, but it required extra effort because UAT uncovered design flaws that blocked deployment to production and triggered subsequent changes.
All teams involved in testing faced administrative overheads due to changes in documentation practices. The testing was concluded within the revised schedule, meeting all the conditions necessary for the Go-Live decision to allow solution deployment.
Overcoming Major Obstacles
Issue No. 1: Key knowledge about the system being replaced was scattered across outdated documents, source code, and employees who had already left the company. We introduced exploratory testing sessions led by experienced testers, involving business representatives and individual system component experts. These sessions reverse-engineered the processes and documented their originally intended purposes.
Issue No. 2: Constantly changing interfaces, data flows, and communication models hindered automation and reduced testing efficiency per interface. We segregated duties to minimize overlap between factory testing and integration testing. Suppliers became responsible for testing interfaces, internal data flows, and communication models with the nearest systems. The integration testing team focused on end-to-end data propagation.
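End-to-end data propagation checks of this sort typically reduce to: create a record upstream, then poll the downstream system until the record appears or a timeout is reached. The helper below is a minimal sketch of such a polling check; the `fetch` callable is a hypothetical stand-in for a downstream lookup (for example, querying the target system's API for an identity created upstream).

```python
import time

def wait_for_propagation(fetch, predicate, timeout=30.0, interval=1.0):
    """Poll fetch() until predicate(result) holds or timeout elapses.

    Returns the first matching result; raises TimeoutError otherwise.
    `fetch` stands in for a downstream-system lookup, `predicate` for
    the check that the upstream change has actually arrived.
    """
    deadline = time.monotonic() + timeout
    while True:
        result = fetch()
        if predicate(result):
            return result
        if time.monotonic() >= deadline:
            raise TimeoutError("record did not propagate within timeout")
        time.sleep(interval)
```

With the duties split as described above, suppliers asserted on point-to-point interfaces while the integration team ran checks like this one across the whole chain, so a change to any single interface did not invalidate the end-to-end test logic.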
Issue No. 3: We addressed the lack of cooperation between project teams by organizing separate status and defect meetings with individual teams, as well as common status meetings. Micromanagement was necessary and provided by Project, Test, and Defect Managers.
Issue No. 4: We supported the inexperienced testing teams by assigning experienced testers to assist them and by dedicating specific time slots for consultation, issue resolution, and work review.
Issue No. 5: We communicated the changes in minimal acceptable documentation practices to all affected parties and allowed extra time to implement them. Where gathering the required details retrospectively was impossible, or the associated effort posed a risk to timely project delivery, we handled the gap through discrepancy reports. Each report explained why the discrepancy occurred and what was done to prevent it in the future, and was signed by the responsible individuals.
Issue No. 6: The compliance framework, tailored to a waterfall project, significantly limited our options for speeding up progress and posed a risk of additional effort. We made it possible to apply an agile approach by producing the same artifacts the original approach would have produced, or by producing them separately on the side.
We successfully led the project testing teams to a productive conclusion of the testing phase. Over a period of 4 months, we identified over 500 issues, with more than 200 of them being critical. We ensured that these issues were corrected, retested, and properly recorded or included in the backlog for the next project phase. The solution was deemed viable for deployment and is expected to replace the previous solution soon.