Finding the Solution to Crappy Software

By Bob Doyle


Organizations are increasingly dependent on software to run daily operations and generate revenue, and this dependence has heightened executive awareness of the need to improve software quality. It’s no surprise, then, that application development organizations have always sought ways to improve the scalability, performance, security, integrity and reliability of the software they deliver.


Applications that are created without foresight (or even regard) for functional and non-functional quality face a very real threat of failure. With technology’s increasingly strategic role within business, a software failure can take an organization down with it. The FAA’s software problems that led to nationwide delays and groundings are just one high-profile example of issues that become nearly unavoidable when organizations rely on antiquated software development processes.


The responsibility for effective software quality rests with CIOs. Quality needs to be ingrained into the development culture every step of the way. As I used to tell my IT associates, our customers do not expect us to produce junk for them to operate and manage the business with.


The problem is that most software development organizations aren’t sure how to achieve this goal. As with most things related to application delivery, software quality is still viewed as a “black box.” Sure, there is testing as part of every software project, but what is really going on under the hood? The path to improvement varies from business to business, so there are no clear business rules or principles to follow. As a result, CIOs inadvertently focus their efforts in areas that often yield little return.


For example, companies often make costly investments in test automation tooling but don’t see the expected quality improvements. Or they develop processes that pass capability maturity model integration (CMMI) appraisals but don’t yield measurable improvements in delivered software. It’s no surprise, then, that for many companies investing in quality improvement seems like an endless cycle of unproductive spending.


Finding the Solution


What organizations need is a set of principles to help identify the most critical steps to improving quality. By making process and automation choices that deliver tangible, measurable benefits and that stem from a careful analysis of an organization’s unique needs, software quality can be improved quickly and significantly. The good news is that there are identifiable, quantifiable stages and steps in the software delivery process that can guide an organization in transforming the quality of its software.


However, the first step is to understand where you are today. What areas are going well? Which are ripe for improvement? Only with this understanding can you identify where you would like to be regarding software quality and what it will take to get there.


To this end, I would like to introduce a framework that can help organizations identify the most critical steps to improving quality. It empowers you to make process and automation choices that deliver the most appropriate and impactful benefits based on a careful analysis of your organization’s unique needs. This “quality maturity curve” defines a clear path for lasting change by providing five stages that represent a template for gradual quality improvement.

Maturity levels are relative to each organization’s size and goals. By following this framework, you can identify and characterize the capabilities of your software quality practices. Then, the framework can suggest steps that you can take to begin improving. By realizing where your organization fits within the five stages, you are empowered to adopt a more proactive and structured approach to software quality throughout the application lifecycle.


Let’s take a look at the stages and see what each means. I have also included a few real-world examples from my experience with various organizations.


Traditional Testing: A mostly reactive and ad hoc approach with very few formal test processes and unpredictable outcomes. Manual testing prevails and quality assurance (QA) teams are small with minimal budgets for tools.


Example: The Aerospace Manufacturing Division of Lear Siegler was clearly in this stage. Results were only as good as the individual developers’ testing expertise. As a rule, we experienced more failures than successes. Confidence in IT-produced software was almost nonexistent and users relied on IT as a last resort!


The solution was to implement simple, standard application lifecycle management (ALM) procedures that included quality checks at critical points in the development process. All developers followed this standard, reporting progress and sharing results against the same baseline. As a result, individual program quality improved dramatically, and within two years we reached Level 4 operational lifecycle quality management (LQM) and successfully installed an integrated manufacturing resource planning system.
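To make the idea of “quality checks at critical points” concrete, here is a minimal sketch in Python of the kind of automated gate a lifecycle checkpoint might run before a build is promoted. The metric names and thresholds are hypothetical illustrations, not the actual procedures used at Lear Siegler.

    # Minimal sketch of a quality gate an ALM pipeline might run at a
    # checkpoint (e.g. before promoting a build). Thresholds are hypothetical.
    import sys

    metrics = {
        "unit_test_pass_rate": 0.998,    # fraction of tests passing on this build
        "open_critical_defects": 0,
        "code_coverage": 0.72,
    }

    gates = {
        "unit_test_pass_rate": lambda v: v >= 0.99,
        "open_critical_defects": lambda v: v == 0,
        "code_coverage": lambda v: v >= 0.70,
    }

    # Collect every metric that fails its gate.
    failures = [name for name, check in gates.items() if not check(metrics[name])]

    if failures:
        print("Quality gate failed:", ", ".join(failures))
        sys.exit(1)
    print("Quality gate passed; build may be promoted.")

The point is not the specific thresholds but that every developer reports against the same baseline checks at the same points in the lifecycle.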


Test Automation: Testing is managed with some automated procedures. Tests are repeatable and predictable, and regression and load/performance testing are routinely performed. However, there is little focus on process, and any improvement initiatives happen on a task-by-task basis.
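As a rough illustration of what “repeatable and predictable” looks like in practice, the following Python sketch shows a small automated regression test written with pytest. The module under test, the function and the expected values are hypothetical, not drawn from any system described in this article.

    # Minimal sketch of an automated regression test (pytest).
    # The module, function and expected values are hypothetical examples.
    import pytest

    from pricing import calculate_invoice_total  # hypothetical module under test


    @pytest.mark.parametrize(
        "line_items, expected_total",
        [
            ([("widget", 2, 9.99)], 19.98),                           # simple case
            ([("widget", 2, 9.99), ("gadget", 1, 5.00)], 24.98),      # multiple lines
            ([], 0.0),                                                # regression: empty invoice once failed
        ],
    )
    def test_invoice_total_regression(line_items, expected_total):
        # The same inputs always produce the same asserted outputs,
        # so the test can run unattended on every build.
        assert calculate_invoice_total(line_items) == pytest.approx(expected_total)

Because the test encodes its inputs and expected outputs, it can be rerun on every build without a tester in the loop, which is what makes regression and load/performance runs routine at this stage.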


Quality Management: Test process automation is utilized to drive improved testing efficiency. Requirements are better defined and there is clearer visibility of results. The QA function has a project or team focus, and is able to respond effectively to change. Often there is a drive to consolidate test tools, and to achieve a targeted return on investment (ROI) on any new tool purchase.
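To show how such an ROI target might be checked before a tool purchase, here is a back-of-the-envelope sketch in Python. Every figure in it is a hypothetical placeholder, not a benchmark from any of the organizations discussed here.

    # Back-of-the-envelope ROI sketch for a test-tool purchase.
    # All figures are hypothetical placeholders, not benchmarks.
    tool_cost_per_year = 50_000        # license plus maintenance
    hours_saved_per_release = 120      # manual test hours replaced by automation
    releases_per_year = 12
    loaded_hourly_rate = 75            # fully loaded cost of a tester hour

    annual_savings = hours_saved_per_release * releases_per_year * loaded_hourly_rate
    roi = (annual_savings - tool_cost_per_year) / tool_cost_per_year

    print(f"Annual savings: ${annual_savings:,.0f}")
    print(f"ROI: {roi:.0%}")           # about 116% with these placeholder numbers

The value of the exercise is less the number itself than forcing the tool decision to be justified against a measurable target rather than bought on faith.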


Example 1: The development organizations I found in health care and retail food distribution fell into this level of testing maturity. For their critical business systems (in health care, claim processing systems and, in retail, food scheduling/distribution systems), they had established a separate quality control team with rigorous testing processes to ensure system integrity. However, for all other development efforts it was back to Level 1, maybe Level 2 at best. The goal was to bring the integrity of the critical business systems to a satisfactory level.


The challenge was that the existing process was very unresponsive. It could take six months before a new release of the software was placed into production, regardless of the magnitude of the changes. Additionally, all other development quality efforts were unreliable and dependent upon the individual developer, resulting in very frustrated users.


The solution was to first implement a standard ALM that integrated and streamlined the process used by the existing QA management function. Release turnaround improved to bi-monthly implementations. Then all developers were trained to use the same standard ALM with built-in quality processes and performance checkpoints. This caused an almost immediate improvement in software quality.

Example 2: A food service company had begun deployment of a customized standard foodservice distribution system (a $400+ million project) and was experiencing significant quality issues that threatened to derail the effort. The system was to be placed in over 50 distribution centers, but technical and functional quality issues put the entire project on hold. Users revolted, and the remaining distribution centers refused to accept the system.


The solution was to focus our efforts on stabilizing the application, utilizing a standard ALM with built-in quality control checks. We then established and trained two implementation teams, each using the same QA implementation process to install and quality-test the system and ensure user satisfaction before going live.


The result was the successful rollout of a standard system in all locations across the country. Subsequently, the development organization matured and reached Level 5, where quality was fully integrated into the ALM and applied throughout every aspect of IT services.


Operational Lifecycle Quality Management (LQM): The principles of LQM are applied to cross-project and/or cross-departmental improvement programs. There is executive-level commitment to upstream quality, plus cross-organizational initiatives, such as requirements-based testing (RBT), to ensure that testing supports business objectives. At this stage of the evolution path there is some level of integration with ALM. Processes are in place to facilitate the sharing of best practices across geographically distributed teams. The organization will have implemented a form of process re-engineering to make software development and testing more effective.
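One concrete piece of an RBT initiative is checking that every requirement is traced to at least one test. The following Python sketch shows the idea; the requirement IDs, test names and the source of the mapping are hypothetical.

    # Minimal sketch of a requirements-based testing (RBT) traceability check.
    # Requirement IDs and test names are hypothetical.
    requirements = {"REQ-101", "REQ-102", "REQ-103"}

    # Mapping of automated tests to the requirements they exercise,
    # e.g. harvested from test metadata or a test-management tool export.
    test_coverage = {
        "test_login_lockout": {"REQ-101"},
        "test_claim_submission": {"REQ-102"},
    }

    covered = set().union(*test_coverage.values())
    uncovered = requirements - covered

    if uncovered:
        print("Requirements with no supporting tests:", sorted(uncovered))
    else:
        print("Every requirement is traced to at least one test.")

Run regularly across projects, a check like this keeps the test effort anchored to business objectives rather than to whatever individual teams happen to test.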


Example: The software development function within General Dynamics was clearly at this level. Operational LQM was applied universally across projects, across departments, and even across the different divisions of the company. It was common to find integrated development “red” teams formed from the aerospace, electronics and submarine divisions. We operated using LQM principles and a standard ALM. A single human resources system was developed and implemented in this environment, as well as several common data center management and service center systems. Proceeding along this path naturally moved General Dynamics up to Level 5.


Enterprise LQM: Testing is tightly integrated with other lifecycle functions. There is full integration with ALM and a systemic approach to cost and risk reduction. CIO-driven programs implement continuous improvement through initiatives such as standardization of testing processes and technology; the use of virtualization techniques for resource sharing; and the establishment of a center of excellence for QA to provide quality services to other parts of the organization, including system, integration, regression, platform/environmental and load/performance testing.


Example: General Dynamics’ software development function evolved to operate at this level. When General Dynamics divested itself of its IT organization, this corporate department became part of Computer Sciences Corp. The company had a robust but flexible ALM with a quality management program fully integrated throughout the process. A QA function worked in close harmony with all the development and operational units to ensure the latest quality tools and services were made available and applied. The QA function was an active participant in all project reviews, which covered budget, schedule and quality performance.




Once you’ve assessed where your organization falls on the curve, there is a set of actions you can take to see some immediate improvement while incrementally making progress towards the end goal of “software quality nirvana,” Level 5. For instance, an organization that falls clearly into stage one, Traditional Testing, can begin adopting automated testing in certain projects. Defect rates will start to drop almost immediately as test coverage increases.
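A simple way to verify that this trend is actually materializing is to track coverage and reported defects release over release. The Python sketch below illustrates the idea; the release names and figures are hypothetical.

    # Sketch: tracking test coverage against reported defects per release.
    # The release data below is hypothetical, for illustration only.
    releases = [
        {"name": "1.0", "coverage_pct": 18, "defects_reported": 42},
        {"name": "1.1", "coverage_pct": 35, "defects_reported": 27},
        {"name": "1.2", "coverage_pct": 55, "defects_reported": 14},
    ]

    # Compare each release with the one before it.
    for prev, curr in zip(releases, releases[1:]):
        cov_gain = curr["coverage_pct"] - prev["coverage_pct"]
        defect_change = curr["defects_reported"] - prev["defects_reported"]
        print(f"{prev['name']} -> {curr['name']}: "
              f"coverage {cov_gain:+d} pts, defects {defect_change:+d}")

Even a rough trend line like this gives the CIO evidence that the investment in automation is paying off, rather than relying on anecdote.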


As development organizations move through the five stages, implementing “baby steps” as they continue to improve, the accrued benefits multiply. Quality management delivers team improvements, reducing the cost and time of testing while enabling each team to be more efficient and effective.


Operational LQM has an organizational focus and moving to this quality level brings cross-project value in matching projects to your business needs and eliminating redundancy. It also drives consistent project quality, predictable time-to-market and reduced testing costs across the entire organization.


Enterprise LQM enables global economization. By centralizing skills and sharing workload across distributed testing teams, you can maximize the opportunity for cost reduction and efficiency improvements.


Once you have mapped out where you are and where you want to get to, the steps become clear. Devising a solution and planning its implementation is then just an exercise in figuring out the pace of change that your organization can handle. After completion of each major part of the program, I recommend that you perform another assessment to determine your organization’s new level of capability. This also enables you to keep track of progress and verify that the required improvements have been made.


The Need for Change


The pressures on software development organizations are rising. Increasingly, the world relies on software to conduct its day-to-day activities. The Internet and the global marketplace have combined to produce a 24x7x365 expectation of service. I have seen this first hand in the foodservice, health care, aerospace, marine, defense and technology industries.


Consumers are quick to complain and change their allegiance if software lets them down. VPs and managers know that their organizations must produce high-quality software faster and at lower cost, which means development and testing must become more productive and efficient. Plenty of time, resources and conversations have been dedicated to defining the problem. What organizations need now are concrete, pragmatic solutions.


The quality maturity approach provides organizations with a structure and process for improving software quality. The framework is derived from long and wide-ranging experience in improving the efficiency and effectiveness of software development and testing processes. No, it isn’t a “silver bullet” for the problem of software quality; challenges this thorny rarely have a simple solution. The quality maturity curve is a framework: a tangible, practical guide that can help organizations understand what they need to do to improve quality and how to go about doing it.

Bob Doyle has over 35 years of experience with Fortune 100 and smaller companies. His areas of expertise include strategic planning, supply chain management, ERP system implementations, process reengineering, IT organizational development and offshore outsourcing. He has served as a Partner at Tatum CIO Partners and as CIO at Fleming Companies, Alliant Foodservice (formerly Kraft Foodservice) and Community Mutual Insurance Company (Blue Cross/Blue Shield in Ohio), among others.