Capability Maturity Model

The Capability Maturity Model (CMM) is a process capability maturity model that aids in defining and understanding an organization's processes.

The CMM was first described in Managing the Software Process by Watts Humphrey, and hence was also known as "Humphrey's CMM". Humphrey based it on the earlier work of Phil Crosby. Active development of this model began in 1986 at the US Dept. of Defense Software Engineering Institute located at Carnegie Mellon University in Pittsburgh.

The CMM was originally intended as a tool for objectively assessing the ability of government contractors' processes to perform a contracted software project. Though it comes from the area of software development, it can be (and has been, and still is being) applied as a general model to assist in understanding the process capability maturity of organizations in diverse areas, for example software engineering, systems engineering, project management, software maintenance, risk management, system acquisition, information technology (IT), and personnel management. It has been used extensively for avionics software and government projects around the world.

The CMM has been superseded by a variant, the Capability Maturity Model Integration (CMMI). The original CMM was renamed the Software CMM (SW-CMM), and organizations' appraisals based on the SW-CMM expired on 31 December 2007.

Variants of maturity models derived from the CMM have emerged over the years, including, for example, the Systems Security Engineering CMM (SSE-CMM) and the People Capability Maturity Model.

Maturity models have been internationally standardized as part of ISO/IEC 15504.

Maturity model

A maturity model can be described as a structured collection of elements that describe certain aspects of maturity in an organization. A maturity model may provide, for example:

  • a place to start
  • the benefit of a community’s prior experiences
  • a common language and a shared vision
  • a framework for prioritizing actions
  • a way to define what improvement means for your organization.

A maturity model can be used as a benchmark for comparison and as an aid to understanding - for example, for comparative assessment of different organizations where there is something in common that can be used as a basis for comparison. In the case of the CMM, for example, the basis for comparison would be the organizations' software development processes.

Structure of the CMM

The CMM involves the following aspects:

  • Maturity Levels: a five-level process maturity continuum, where the uppermost (fifth) level is a notional ideal state in which processes would be systematically managed by a combination of process optimization and continuous process improvement.
  • Key Process Areas: A Key Process Area (KPA) identifies a cluster of related activities that, when performed collectively, achieve a set of goals considered important.
  • Goals: The goals of a key process area summarize the states that must exist for that key process area to have been implemented in an effective and lasting way. The extent to which the goals have been accomplished is an indicator of how much capability the organization has established at that maturity level. The goals signify the scope, boundaries, and intent of each key process area.
  • Common Features: Common features include practices that implement and institutionalize a key process area. There are five types of common features: Commitment to Perform, Ability to Perform, Activities Performed, Measurement and Analysis, and Verifying Implementation.
  • Key Practices: The key practices describe the elements of infrastructure and practice that contribute most effectively to the implementation and institutionalization of the KPAs.
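The structure above (maturity levels contain key process areas; each KPA has goals, common features, and key practices) can be sketched as a simple data model. This is an illustrative sketch only; the class names and the example KPA content are assumptions for demonstration, not definitions taken from the CMM itself:

```python
from dataclasses import dataclass, field

@dataclass
class KeyProcessArea:
    """A cluster of related activities that together achieve a set of goals."""
    name: str
    goals: list[str] = field(default_factory=list)  # states that must exist
    # The five common-feature types named by the CMM, each holding key practices:
    common_features: dict[str, list[str]] = field(default_factory=lambda: {
        "Commitment to Perform": [],
        "Ability to Perform": [],
        "Activities Performed": [],
        "Measurement and Analysis": [],
        "Verifying Implementation": [],
    })

@dataclass
class MaturityLevel:
    """One step on the five-level maturity continuum."""
    number: int                # 1..5
    name: str                  # e.g. "Repeatable"
    key_process_areas: list[KeyProcessArea] = field(default_factory=list)

# Hypothetical fragment of Level 2 with one illustrative KPA and goal
level2 = MaturityLevel(2, "Repeatable", [
    KeyProcessArea("Software Project Planning",
                   goals=["Estimates are documented for planning and tracking"]),
])
```

The point of the sketch is the containment relationship: goals and common features belong to a KPA, and KPAs belong to a maturity level, which is how the model's assessment checklists are organized.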

Levels of the CMM

(See also chapter 2, page 11, of the March 2002 edition of the CMMI from the SEI.)

There are five levels defined along the continuum of the CMM, and, according to the SEI: "Predictability, effectiveness, and control of an organization's software processes are believed to improve as the organization moves up these five levels. While not rigorous, the empirical evidence to date supports this belief."

The levels are:

Level 1 - Initial (Ad hoc, Chaotic)

Processes at this level are typically undocumented and in a state of dynamic change, tending to be driven in an ad hoc, uncontrolled, and reactive manner by users or events. This provides a chaotic or unstable environment for the processes.

Organizational implications:
(a) Because institutional knowledge tends to be scattered in such environments (there being little structured approach to knowledge management), not all of the stakeholders or participants in the processes may know or understand all of the components that make up the processes. As a result, process performance in such organizations is likely to be variable (inconsistent) and to depend heavily on the institutional knowledge, competence, or heroic efforts of relatively few people or small groups.

(b) Despite the chaos, such organizations manage to produce products and services. In doing so, however, they run a significant risk of exceeding any estimated budgets or schedules for their projects: it is difficult to estimate what a process will do when you do not fully understand it in the first place, and therefore cannot control or manage it effectively.

(c) Due to the lack of structure and formality, organizations at this level may over-commit, or abandon processes during a crisis, and it is unlikely that they will be able to repeat past successes. There tends to be limited planning, limited executive commitment or buy-in to projects, and limited acceptance of processes.

Level 2 - Repeatable

At this level, some processes are repeatable, possibly with consistent results.

Process discipline is unlikely to be rigorous, but where it exists it may help to ensure that existing processes are maintained during times of stress.

Organizational implications:
(a) Processes and their outputs could be visible to management at defined points, but results may not always be consistent. For project/program management processes, for example, even where basic processes are established to track cost, schedule, and functionality, and where a degree of process discipline is in place to repeat earlier successes on projects with similar applications and scope, there could still be a significant risk of exceeding cost and time estimates.

Level 3 - Defined

At this level, sets of defined and documented standard processes are established and subject to some degree of improvement over time. These standard processes are in place (i.e., they are the AS-IS processes) and are used to establish consistency of process performance across the organization.

Organizational implications:
(a) Process management starts to occur using defined, documented processes with mandatory process objectives, and ensures that these objectives are appropriately addressed.

Level 4 - Managed

At this level, management can effectively control the AS-IS process (e.g., for software development) using process metrics. In particular, management can identify ways to adjust and adapt the process to particular projects without measurable losses of quality or deviations from specifications. Process capability is established from this level.

Organizational implications:
(a) Quantitative quality goals tend to be set for process output - e.g., software or software maintenance.
(b) Using quantitative/statistical techniques, process performance is measured and monitored, and is generally predictable and controllable.

Level 5 - Optimized

At this level, the focus is on continually improving process performance through both incremental and innovative technological changes and improvements.

Organizational implications:
(a) Quantitative process-improvement objectives for the organization are established, continually revised to reflect changing business objectives, and used as criteria in managing process improvement. Thus, process improvements that address common causes of process variation and measurably improve the organization's processes are identified, evaluated, and deployed.
(b) The effects of deployed process improvements are measured and evaluated against the quantitative process-improvement objectives.
(c) Both the defined processes and the organization’s set of standard processes are targets for measurable improvement activities.
(d) A critical distinction between maturity level 4 and maturity level 5 is the type of process variation addressed.
At maturity level 4, processes are concerned with addressing statistical special causes of process variation and providing statistical predictability of the results, and though processes may produce predictable results, the results may be insufficient to achieve the established objectives.
At maturity level 5, processes are concerned with addressing statistical common causes of process variation and changing the process (for example, shifting the mean of the process performance) to improve process performance. This would be done at the same time as maintaining the likelihood of achieving the established quantitative process-improvement objectives.
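The special-cause vs common-cause distinction can be illustrated with a basic Shewhart-style control-limit calculation of the kind a level-4 organization might apply to a process metric. The metric (defect density per build), the data, and the three-sigma threshold below are purely illustrative:

```python
import statistics

def control_limits(baseline, k=3.0):
    """Control limits at k standard deviations, derived from stable baseline data."""
    mean = statistics.mean(baseline)
    sd = statistics.stdev(baseline)
    return mean - k * sd, mean + k * sd

def special_causes(samples, baseline, k=3.0):
    """Indices of samples outside the baseline control limits: candidate special causes."""
    lo, hi = control_limits(baseline, k)
    return [i for i, x in enumerate(samples) if x < lo or x > hi]

# Hypothetical defect densities: limits come from a stable baseline period,
# then new builds are checked against them.
baseline = [4.1, 3.9, 4.3, 4.0, 4.2, 3.8, 4.1, 4.0, 4.2, 3.9]
new_builds = [4.0, 4.3, 9.8, 4.1]
print(special_causes(new_builds, baseline))  # flags the 9.8 build (index 2)
```

In these terms, a level-4 organization would investigate the flagged point as a special cause; a level-5 organization would additionally work on the process itself, shifting the baseline mean or narrowing its spread, which is what addressing common causes means.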

Extensions

Some versions of the CMMI from the SEI indicate a "level 0", characterized as "Incomplete". Some authors leave this level out as redundant or unimportant, but Pressman and others take note of it. See page 18 of the August 2002 edition of the CMMI from the SEI.

Anthony Finkelstein extrapolated that negative levels are necessary to represent environments that are not only indifferent, but actively counterproductive, and this was refined by Tom Schorsch as the Capability Immaturity Model.

Software process framework for SEI's Capability Maturity Model

The software process framework documented here is intended to guide those wishing to assess an organization's or project's consistency with the CMM. For each maturity level there are five checklist types:

Type Description
Policy Describes the policy contents and KPA goals recommended by the CMM.
Standard Describes the recommended content of select work products described in the CMM.
Process Describes the process information content recommended by the CMM. The process checklists are further refined into checklists for:
  • roles
  • entry criteria
  • inputs
  • activities
  • outputs
  • exit criteria
  • reviews and audits
  • work products managed and controlled
  • measurements
  • documented procedures
  • training
  • tools

Procedure Describes the recommended content of documented procedures described in the CMM.
Level Overview Provides an overview of an entire maturity level. The level overview checklists are further refined into checklists for:
  • KPA purposes (Key Process Areas)
  • KPA goals
  • policies
  • standards
  • process descriptions
  • procedures
  • training
  • tools
  • reviews and audits
  • work products managed and controlled
  • measurements

History

The Capability Maturity Model was initially funded through military research: the United States Air Force funded a study at the Carnegie Mellon Software Engineering Institute to create an abstract model for the military to use as an objective evaluation of software subcontractors. The result was the Capability Maturity Model, published as Managing the Software Process in 1989. The CMM has since been superseded by the Capability Maturity Model Integration (CMMI).

Context

In the 1970s, the use of computers became more widespread, more flexible, and less expensive. Organizations began to adopt computerized information systems, and the demand for software development grew significantly. The processes for software development were in their infancy, with few standard or "best practice" approaches defined.

As a result, the growth was accompanied by growing pains: project failure was common, the field of computer science was still in its infancy, and ambitions for project scale and complexity exceeded the market's capability to deliver. Individuals such as Edward Yourdon, Larry Constantine, Gerald Weinberg, Tom DeMarco, and David Parnas began to publish articles and books with research results in an attempt to professionalise the software development process.

Watts Humphrey's Capability Maturity Model (CMM) was described in Managing the Software Process. The CMM as conceived by Humphrey was based on the earlier work of Phil Crosby, who published the Quality Management Maturity Grid in his 1979 book Quality is Free. Active development of the model by the US Department of Defense Software Engineering Institute (SEI) began in 1986.

The CMM was originally intended as a tool to evaluate the ability of government contractors to perform a contracted software project. Though it comes from the area of software development, it can be, has been, and continues to be widely applied as a general model of the maturity of processes (e.g., IT Service Management processes) in IS/IT (and other) organizations.

Note that the first application of a staged maturity model to IT was not by the CMM/SEI but by Richard L. Nolan, who in 1973 published the stages-of-growth model for IT organisations.

The model identifies five levels of process maturity for an organization:

  1. Initial (chaotic, ad hoc, heroic): the starting point for use of a new process.
  2. Repeatable (project management, process discipline): the process is used repeatedly.
  3. Defined (institutionalized): the process is defined/confirmed as a standard business process.
  4. Managed (quantified): process management and measurement take place.
  5. Optimising (process improvement): process management includes deliberate process optimization/improvement.

Within each of these maturity levels are Key Process Areas (KPAs) which characterise that level, and for each KPA there are five definitions identified:

  1. Goals
  2. Commitment
  3. Ability
  4. Measurement
  5. Verification

The KPAs are not necessarily unique to the CMM; they represent the stages that organizations must go through on the way to becoming mature.

Process assessment is best led by an appropriately skilled/competent lead assessor. The organisation's process maturity level is assessed, and then a specific plan is developed to get to the next level. Skipping levels is not allowed.
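The "no skipping levels" rule can be sketched as a small staged-assessment function: an organization's level is the highest level for which it, and every level below it, satisfies all KPAs. The KPA names below are drawn from the SW-CMM's published key process areas, but the function itself is an illustrative simplification, not an official appraisal method:

```python
def assessed_level(satisfied_kpas, kpas_by_level):
    """Highest maturity level whose KPAs, and all lower levels' KPAs, are satisfied.

    Level 1 has no KPAs, so every organization is at least level 1;
    an unsatisfied level blocks all higher levels (no skipping).
    """
    level = 1
    for lvl in sorted(kpas_by_level):
        if set(kpas_by_level[lvl]) <= set(satisfied_kpas):
            level = lvl
        else:
            break  # a gap at this level blocks everything above it
    return level

# Illustrative subset of SW-CMM KPAs per level
kpas_by_level = {
    2: ["Requirements Management", "Software Project Planning"],
    3: ["Organization Process Definition", "Peer Reviews"],
}
print(assessed_level(["Requirements Management", "Software Project Planning"],
                     kpas_by_level))  # -> 2
```

Note that satisfying only the level-3 KPAs while missing a level-2 KPA still yields level 1 under this rule, which is exactly why an assessment plan targets the next level rather than a distant one.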

N.B.: The CMM was originally intended as a tool to evaluate the ability of government contractors to perform a contracted software project, and it may be well suited to that purpose. When it became a general model for software process improvement, however, it attracted many critics.

Shrinkwrap companies, also called commercial-off-the-shelf (COTS) firms or software package firms, include Claris, Apple, Symantec, Microsoft, and Lotus, amongst others. Many such companies rarely, if ever, managed their requirements documents as formally as the CMM prescribes for level 2, and so would probably fall into level 1 of the model.

Origins

In the 1980s, several military projects involving software subcontractors ran over-budget and were completed much later than planned, if they were completed at all. In an effort to determine why this was occurring, the United States Air Force funded a study at the SEI. The result of this study was a model for the military to use as an objective evaluation of software subcontractors. In 1989, the Capability Maturity Model was published as Managing the Software Process. The basis for the model is the Quality Management Maturity Grid introduced by Philip Crosby in his 1979 book 'Quality is Free'.

Timeline

  • 1987: SEI-87-TR-24 (the SW-CMM questionnaire) released.
  • 1989: Managing the Software Process published.
  • 1990: SW-CMM v0.2 released (first external release).
  • 1991: SW-CMM v1.0 released.
  • 1993: SW-CMM v1.1 released.
  • 1997: SW-CMM revisions halted in favor of the CMMI.

Current state

Although the CMM model proved useful to many organizations, the use of multiple models has been problematic. Applying multiple models that are not integrated within and across an organization could be costly in terms of training, appraisals, and improvement activities. The CMM Integration (CMMI) project was formed to sort out the problem of using multiple CMMs.

Future direction

With the release of the CMMI Version 1.2 Product Suite, the possibility of multiple CMMI models was created. (Refer to CMMI for further information.)

Controversial aspects

The software industry is diverse and volatile. All methodologies for creating software have supporters and critics, and the CMM is no exception.

Pros

  • Prior to the introduction of Humphrey's CMM, there was no theoretical basis applicable to process maturity for IT-related processes. (Action in the absence of theory as a basis is, by definition, irrational.)
  • CMM has been shown to be well-suited for organizations wishing to define their key processes.

Cons

  • The objective of scientifically managing the software process using defined metrics is difficult to achieve until Level 4. Prior to that level, Activity-Based Costing (ABC) is difficult to apply to validate process cost-savings, except by empirical means.
  • The CMM does not help to define the structure of an effective software development organization. The CMM contains behaviors or best practices that successful projects have demonstrated. Thus, being CMM-compliant does not guarantee that a project will be successful, although compliance may increase a project's chances of success.
  • Critical analysis of the CMM has been published in at least two papers. Bach raises questions about the validity of CMM benchmarks for "good" software development processes. Bollinger and McGowan discuss flaws in the CMM approach where it may use "assembly-line" process models. They suggest that manufacturing is fundamentally different from software development, as the former is primarily concerned with replication and the latter with design.

CMM Levels 2 and 3 can have beneficial aspects

  • Creation of a Software Specification, stating what is going to be developed, combined with formal sign-off, an executive sponsor, and an approval mechanism. This is NOT a living document; additions are placed in a deferred or out-of-scope section for later incorporation into the next cycle of software development.
  • A Technical Specification, stating precisely how the thing specified in the Software Specification is to be developed. This is a living document.
  • Peer Review of Code (Code Review) with metrics that allow developers to walk through an implementation, and to suggest improvements or changes. (Note - This is problematic because the code has already been developed, and a bad design potentially cannot be fixed by "tweaking".) The Code Review gives complete code a formal approval mechanism.
  • Version Control - a very large number of organizations have no formal revision control mechanism or release mechanism in place.
  • The idea that there is a "right way" to build software, that it is a scientific process involving engineering design and that groups of developers are not there to simply work on ad hoc problems.
