Large Program Turnaround

Our client, a $2 billion software company, was struggling to implement a SaaS-based sales force automation solution. After two years of work, they had pulled the plug on go-live for the second time—and faced a budget variance of nearly 100%. The extremely frustrated management team had to understand what was wrong and get the program back on track. Failure would force them to cancel the program entirely.

The CIO called on Artizen to perform a Program Assessment engagement. The initial four-week engagement revealed a host of problems, but three stood out.

First, the program scope was a moving target. No formal business requirements, software gap documents, or functional specifications had been signed off by the business. As a result, no baseline existed for the work’s scope or deliverable functionality. That left users going in circles. As they executed their User Acceptance Testing (UAT), they were seeing the solution for the first time—and working out the functional designs at the same time. They logged change requests as “defects.” In some cases, the changes were very large. Project managers were not tracking scope change or going through any formal change approval process.

Second, managers had outsourced the development of a very large, complex set of software integrations between the SaaS application and their ERP system. The integrations had significant quality problems that appeared to stem from design flaws, not just code quality.

Third, no weekly project status reports were being produced. As project timelines began to slip, IT managers and the executive team were not aware of the problems. This compounded the scope management and quality problems by hiding them until it was too late to take corrective action.

The lack of scope management and the metrics to measure change had left both managers and stakeholders completely in the dark about the real problems. As a result, business executives responsible for the program were outraged by what they perceived as poor quality, when in reality they had failed to get the requirements right up front.

Artizen recommended critical actions to be taken immediately:

  • Implement a formal scope change review-and-approval process. In addition, provide weekly scope change metrics to management, clarifying how many new changes were being requested, the size of the new requests in workdays, and the status of those requests.
  • Begin reporting status weekly, using tools that 1) flag scope, schedule, resource, and budget status (Red/Yellow/Green dashboards), 2) show schedule baseline-versus-actual, and 3) supply scope change metrics, defect counts, and defect types (design, code, etc.).
  • Conduct a defect history and design-impact review on all integrations, assessing whether integration “quality problems” were the result of scope change, design flaws, or code quality. Finally, examine the pipeline of open changes and, if necessary, initiate a formal redesign phase.
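The weekly scope-change metrics described above can be sketched in a few lines of code. This is a minimal illustration, not Artizen's actual tooling; the record fields (week logged, effort in workdays, status) and the effort budget used for the Red/Yellow/Green flag are assumptions for the example.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical change-request record; field names are illustrative only.
@dataclass
class ChangeRequest:
    week: int           # reporting week the request was logged
    effort_days: float  # estimated size in workdays
    status: str         # "requested", "approved", or "rejected"

def weekly_scope_metrics(requests: List[ChangeRequest], week: int) -> dict:
    """Summarize scope-change activity for one reporting week."""
    this_week = [r for r in requests if r.week == week]
    return {
        "new_requests": len(this_week),
        "total_effort_days": sum(r.effort_days for r in this_week),
        "approved": sum(1 for r in this_week if r.status == "approved"),
        "pending": sum(1 for r in this_week if r.status == "requested"),
    }

def scope_flag(total_effort_days: float, budget_days: float = 10.0) -> str:
    """Red/Yellow/Green status against an assumed weekly effort budget."""
    if total_effort_days <= 0.5 * budget_days:
        return "Green"
    if total_effort_days <= budget_days:
        return "Yellow"
    return "Red"
```

Fed by the change-request log each week, a report like this gives management exactly the visibility the program had been missing: how much new scope is arriving, how big it is, and whether it is being formally approved.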

Tellingly, no further redesign was needed for the integrations. The mere presence of a scope management process and management’s new ability to see changes reduced the flow of approved requests to a trickle. Weekly status reporting improved management’s understanding and increased trust.

Two months after Artizen’s recommendations were implemented, the application entered a production pilot with 10 per cent of the sales force. Three months after the pilot, the global rollout was complete. The sales force responded very positively to the application.