The Internal Consistency of the ISO/IEC 15504 Software Process Capability Scale

Khaled El Emam
Fraunhofer Institute for Experimental Software Engineering
Sauerwiesen 6, Kaiserslautern, Germany

Abstract

ISO/IEC 15504 is an emerging international standard for software process assessment. It has recently undergone a major change in the rating scale used to measure the capability of processes. The objective of this paper is to present a follow-up evaluation of the internal consistency of this process capability scale. Internal consistency is a form of reliability of a subjective measurement instrument. A previous study evaluated the internal consistency of the first version of the ISO/IEC 15504 document set (also known as SPICE version 1). In the current study we evaluate the internal consistency of the second version (also known as ISO/IEC PDTR 15504). Our results indicate that the internal consistency of the capability dimension did not deteriorate, and that it is still sufficiently high for practical purposes. Furthermore, we identify that the capability scale has two dimensions, which we have termed Process Implementation and Quantitative Process Management.

1. Introduction

Over the last five years there has been an on-going effort to develop an international standard for software process assessment [6]. The emerging standard has been designated as ISO/IEC 15504. During the development of ISO/IEC 15504, a series of empirical trials have been, and continue to be, conducted (see [9] for an overview of these trials). These trials are being conducted by the SPICE Project (1). The trials have been divided into three broad phases. Currently, phase 2 of the SPICE Trials is coming to a close. Phase 1 of the SPICE Trials evaluated what is known as version 1 of the SPICE document set (see [6] for the full version 1 document set). Based on feedback from the Phase 1 trials, as well as comments from the formal ISO balloting procedure, SPICE version 1 has been revised.
The revised version is called ISO/IEC PDTR 15504 (2), but is also known as SPICE version 2. Phase 2 of the SPICE Trials is empirically evaluating ISO/IEC PDTR 15504. The overall architecture of ISO/IEC PDTR 15504 is two-dimensional. One dimension consists of the processes that are assessed. The second dimension is the actual scale that is used to evaluate the capability of processes; this is the capability dimension. The general architecture has not changed from SPICE version 1; however, the elements that make up the process and capability dimensions have gone through revisions.

One of the enduring issues in the SPICE Trials has been the reliability of software process assessments [4]. Reliability is defined in general as the extent to which the same measurement procedure will yield the same results on repeated trials [1]. Lack of reliability is caused by random measurement error. There are different types of reliability, but the one that is the focus of this paper is termed internal consistency. Internal consistency measures the extent to which the components of an instrument have been constructed to the same or to consistent content specifications of what the instrument is supposed to measure. This type of reliability accounts for ambiguity and inconsistency

(1) Software Process Improvement and Capability Determination.
(2) PDTR stands for Proposed Draft Technical Report. This is one of the stages that a document has to go through before becoming an International Standard.

amongst indicators or subsets of indicators in an assessment instrument as sources of error. A review of internal consistency and common ways of calculating internal consistency coefficients is provided in [7].

In a previous report we evaluated the internal consistency of the SPICE v1 capability dimension [7]. As a comparison point, we had also evaluated the internal consistency of the SEI's 1987 Maturity Questionnaire. The results of that study indicated that the SPICE v1 capability dimension was highly reliable, that it is usable in practice, and that its reliability was comparable to that of an existing instrument. These results were intended to serve as a baseline against which to compare future versions (and also other assessment instruments).

In this paper we report on a follow-up investigation evaluating the internal consistency of the ISO/IEC PDTR 15504 capability dimension. This investigation was intended to answer two questions: (1) did the changes to SPICE v1 deteriorate the internal consistency of the ISO/IEC PDTR 15504 capability dimension compared to version 1? (2) is the internal consistency of ISO/IEC PDTR 15504 still sufficiently high that it is usable in practice? The importance of these questions stems from the fact that the number of items in the capability dimension has been reduced drastically from SPICE v1 to ISO/IEC PDTR 15504 (from 26 to 9). This means that an assessor has to make fewer ratings in order to evaluate the capability of a process. One of the motivations for this was to reduce the costs of assessments. It is known, however, that the internal consistency of a measurement instrument is affected by the number of items it has [10]. Therefore, the general concern is whether this reduction in the number of items has also caused a reduction in the internal consistency of the capability dimension.
Briefly, our results indicate that the internal consistency of the ISO/IEC PDTR 15504 capability dimension did not deteriorate substantially from SPICE version 1, and that it is still of sufficiently high internal consistency to be usable in practice. Furthermore, we determined that software process capability is a two-dimensional construct. The latter finding may be of value for future empirical research that measures process capability.

In the following section we give an overview of the rating scheme in ISO/IEC PDTR 15504. In Section 3 we describe the method that was used to collect the data for our study. Section 4 provides the results of the internal consistency evaluation. This is followed in Section 5 by a summary and directions for future research.

2. Overview of the ISO/IEC PDTR 15504 Rating Scheme

In ISO/IEC 15504, there are 5 levels of capability that can be rated, from Level 1 to Level 5. A Level 0 is also defined, but this is not rated directly. These 6 levels are shown in Table 1. In Level 1, one attribute is directly rated. There are 2 attributes in each of the remaining 4 levels. The attributes are also shown in Table 1 (also see [6]). The rating scheme consists of a 4-point achievement scale for each attribute. The four points are designated F, L, P, and N, for Fully Achieved, Largely Achieved, Partially Achieved, and Not Achieved. A summary of the definition of each of these response categories is given in Table 2. The unit of rating in an ISO/IEC PDTR 15504 process assessment is the process instance. A process instance is defined as "a singular instantiation of a process that is uniquely identifiable and about which information can be gathered in a repeatable manner" [6].

Level 0 - Incomplete Process: There is general failure to attain the purpose of the process. There are no easily identifiable work products or outputs of the process.

Level 1 - Performed Process: The purpose of the process is generally achieved. The achievement may not be rigorously planned and tracked. Individuals within the organization recognize that an action should be performed, and there is general agreement that this action is performed as and when required. There are identifiable work products for the process, and these testify to the achievement of the purpose.
  Attribute: 1.1 Process performance attribute.

Level 2 - Managed Process: The process delivers work products of acceptable quality within defined timescales. Performance according to specified procedures is planned and tracked. Work products conform to specified standards and requirements. The primary distinction from the Performed Level is that the performance of the process is planned and managed and progressing towards a defined process.
  Attributes: 2.1 Performance management attribute; 2.2 Work product management attribute.

Level 3 - Established Process: The process is performed and managed using a defined process based upon good software engineering principles. Individual implementations of the process use approved, tailored versions of standard, documented processes. The resources necessary to establish the process definition are also in place. The primary distinction from the Managed Level is that the process of the Established Level is planned and managed using a standard process.
  Attributes: 3.1 Process definition attribute; 3.2 Process resource attribute.

Level 4 - Predictable Process: The defined process is performed consistently in practice within defined control limits, to achieve its goals. Detailed measures of performance are collected and analyzed. This leads to a quantitative understanding of process capability and an improved ability to predict performance. Performance is objectively managed. The quality of work products is quantitatively known. The primary distinction from the Established Level is that the defined process is quantitatively understood and controlled.
  Attributes: 4.1 Process measurement attribute; 4.2 Process control attribute.

Level 5 - Optimizing Process: Performance of the process is optimized to meet current and future business needs, and the process achieves repeatability in meeting its defined business goals. Quantitative process effectiveness and efficiency goals (targets) for performance are established, based on the business goals of the organization. Continuous process monitoring against these goals is enabled by obtaining quantitative feedback, and improvement is achieved by analysis of the results. Optimizing a process involves piloting innovative ideas and technologies and changing non-effective processes to meet defined goals or objectives. The primary distinction from the Predictable Level is that the defined process and the standard process undergo continuous refinement and improvement, based on a quantitative understanding of the impact of changes to these processes.
  Attributes: 5.1 Process change attribute; 5.2 Continuous improvement attribute.

Table 1: Overview of the capability levels and attributes.

The scope of an assessment is an Organizational Unit (OU) [6]. An OU deploys one or more processes that have a coherent process context and operates within a coherent set of business goals.

The characteristics that determine the coherent scope of activity (the process context) include the application domain, the size, the criticality, the complexity, and the quality characteristics of its products or services. An OU is typically part of a larger organization, although in a small organization the OU may be the whole organization. An OU may be, for example, a specific project or set of (related) projects, a unit within an organization focused on a specific life cycle phase (or phases), or a part of an organization responsible for all aspects of a particular product or product set.

Not Achieved (N): There is no evidence of achievement of the defined attribute.
Partially Achieved (P): There is some achievement of the defined attribute.
Largely Achieved (L): There is significant achievement of the defined attribute.
Fully Achieved (F): There is full achievement of the defined attribute.

Table 2: The four-point attribute rating scale.

3. Research Method

The overall objective of this study is to evaluate the internal consistency of the ISO/IEC PDTR 15504 process capability measurement scale, and to compare it to that of SPICE v1. In doing so, we specifically aim to answer the two questions posed in Section 1.

The data used for this study was obtained from Phase 2 of the SPICE Trials. During the trials, organizations contribute their assessment ratings data to an international trials database located in Australia, and also fill in a series of questionnaires after each assessment. The questionnaires collect information about the organization and about the assessment. There is a network of SPICE Trials coordinators around the world who interact directly with the assessors and the organizations conducting the assessments. This interaction involves ensuring that assessors are qualified, making questionnaires available, answering queries about the questionnaires, and following up to ensure the timely collection of data.
At the time of writing, a total of 30 assessments had been conducted, 14 in Australia and 16 in Europe. In total, 332 process instances were assessed. Of these, 88 were in Australia and 244 were in Europe. Since more than one assessment may have occurred in a particular OU (e.g., multiple assessments each looking at a different set of processes), the organizations involved break down into 15 in Europe and 8 in the Southern Asia Pacific region, giving a total of 23 different organizations.

To calculate internal consistency coefficients we only need the assessment ratings data. The coefficient of internal consistency that we use is Cronbach's alpha [2]. A description of this coefficient in the context of process assessments is provided in [7]. Cronbach's alpha ranges from 0 to 1: a value of 0 indicates the worst internal consistency, and a value of 1 indicates perfect reliability. In order to compute the Cronbach's alpha coefficient, we converted the F, L, P, N ratings into a numerical scale by assigning the values 4, 3, 2, 1 respectively. This is the same approach that was used in [7], and is a common approach for assigning numbers to subjective scales [11]. All 29 processes defined in the ISO/IEC PDTR 15504 reference model were covered by these 332 process instances. The processes and their purpose statements are given in Table 3.
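As a concrete illustration of the computation just described, the following sketch maps F, L, P, N ratings to 4, 3, 2, 1 and computes Cronbach's alpha, together with the Spearman-Brown adjustment used later in Section 4.2 to project reliability for a lengthened instrument. The ratings below are hypothetical, not SPICE Trials data:

```python
# Sketch with hypothetical ratings (not SPICE Trials data): Cronbach's alpha
# and the Spearman-Brown adjustment for a lengthened instrument.
import numpy as np

def cronbach_alpha(ratings: np.ndarray) -> float:
    """ratings: rows are process instances, columns are items (attributes)."""
    k = ratings.shape[1]                          # number of items
    item_vars = ratings.var(axis=0, ddof=1)       # per-item variances
    total_var = ratings.sum(axis=1).var(ddof=1)   # variance of the total score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def spearman_brown(alpha: float, k: float) -> float:
    """Projected reliability when the instrument is lengthened by factor k."""
    return (k * alpha) / (1 + (k - 1) * alpha)

# F, L, P, N ratings are mapped to 4, 3, 2, 1 as described in the text.
scale = {"F": 4, "L": 3, "P": 2, "N": 1}
raw = [["F", "L", "L"], ["L", "L", "P"], ["P", "N", "N"], ["F", "F", "L"]]
data = np.array([[scale[r] for r in row] for row in raw], dtype=float)

alpha = cronbach_alpha(data)
print(round(alpha, 2))                            # alpha for this toy 3-item scale
print(round(spearman_brown(alpha, 26 / 9), 2))    # projected for a 26-item version
```

Note that lengthening the instrument (k greater than 1) always increases the projected reliability, which is why the reduction from 26 to 9 items raised the concern addressed by this study.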

CUS.1 Acquire software: To obtain the product and/or service that will satisfy the need expressed by the customer. The acquisition process is enacted by the acquirer. The process begins with the identification of a customer need and ends with the acceptance of the product and/or service needed by the customer.

CUS.2 Manage customer needs: To manage the gathering, processing, and tracking of ongoing customer needs and requirements throughout the operational life of the software; to establish a software requirements baseline that serves as the basis for the project's software work products and activities, and to manage changes to this baseline.

CUS.3 Supply software: To package, deliver, and install the software at the customer site and to ensure that quality software is delivered as defined by the requirements.

CUS.4 Operate software: To support the correct and efficient operation of the software for the duration of its intended usage in its installed environment.

CUS.5 Provide customer service: To establish and maintain an acceptable level of service to the customer to support effective use of the software.

ENG.1 Develop system requirements and design: To establish the system requirements (functional and nonfunctional) and architecture, identifying which system requirements should be allocated to which elements of the system and to which releases.

ENG.2 Develop software requirements: To establish the requirements of the software component of the system.

ENG.3 Develop software design: To define a design for the software that accommodates the requirements and can be tested against them.

ENG.4 Implement software design: To produce executable software units and to verify that they properly reflect the software design.

ENG.5 Integrate and test software: To integrate the software units with each other, producing software that will satisfy the software requirements.

ENG.6 Integrate and test system: To integrate the software component with other components, such as manual operations or hardware, producing a complete system that will satisfy the users' expectations expressed in the system requirements.

ENG.7 Maintain system and software: To manage modification, migration, and retirement of system components (such as hardware, software, manual operations, and network, if any) in response to user requests.

SUP.1 Develop documentation: To develop and maintain documents, recording information produced by a process or activity within a process.

SUP.2 Perform configuration management: To establish and maintain the integrity of all the work products of a process or project.

SUP.3 Perform quality assurance: To ensure that work products and activities of a process or project comply with all applicable standards, procedures, and requirements.

SUP.4 Perform work product verification: To confirm that each work product of a process or project properly reflects the requirements for its construction.

SUP.5 Perform work product validation: To confirm that the specific requirements for a particular intended use of the work product are fulfilled.

SUP.6 Perform joint reviews: To maintain a common understanding with the customer of the progress against the objectives of the contract and what should be done to help ensure development of a product to satisfy the customer.

SUP.7 Perform audits: To confirm independently that the products and processes employed conform with the specific requirements defined.

SUP.8 Perform problem resolution: To ensure that all discovered problems are analyzed and removed, and that trends are identified.

MAN.1 Manage the project: To define the processes necessary to establish, coordinate, and manage a project and the resources necessary to produce a product.

MAN.2 Manage quality: To manage the quality of the project's products and services and to ensure that they satisfy the customer.

MAN.3 Manage risks: To continuously identify and mitigate the project risks throughout the life cycle of a project.

MAN.4 Manage subcontractors: To select qualified subcontractor(s) and manage their performance.

ORG.1 Engineer the business: To provide the individuals in the organization and projects with a vision and culture that empowers them to function effectively.

ORG.2 Define the process: To build a reusable library of process definitions (including standards, procedures, and models) that will support stable and repeatable performance of the software engineering and management process.

ORG.3 Improve the process: To improve continually the effectiveness and efficiency of the processes used by the organization in line with the business need, as a result of successful implementation of the process.

ORG.4 Provide skilled human resources: To provide the organization and projects with individuals who possess skills and knowledge to perform their roles effectively and to work together as a cohesive group.

ORG.5 Provide software engineering infrastructure: To provide a stable and reliable environment with an integrated set of software development methods and tools for use by the projects in the organization, consistent with and supportive of the defined process.

Table 3: Processes included in the study and their purpose statements.

It should be noted, however, that the Cronbach's alpha coefficient assumes that the construct being measured is unidimensional [7]. If the ISO/IEC PDTR 15504 capability scale is in fact multidimensional, then it would be more appropriate to compute the internal consistency coefficient for each dimension separately. As done in [5][3], it is possible to evaluate the dimensionality of a particular construct using principal components analysis [8]. We follow this approach to identify the dimensions of the capability of software processes.

4. Results

We first provide a descriptive summary of the organizations and assessed processes for which we have collected data, to establish the context for interpreting the results. Subsequently, the results of the dimensionality and internal consistency analysis are presented.
4.1 Descriptive Summary

Approximately 52% of the OUs were concerned with the production of software or other IT products or services, 14% were in defense, and 14% were in the financial sector. The data for the approximate number of staff and the approximate number of IT staff in the OUs are shown in Figure 1 and Figure 2 for 21 of the 23 OUs (3). The questions corresponding to these data both asked for approximate numbers of staff, rounded to a suitable number. As can be seen from these data, there was good variation in the sizes (both small and large) of the OUs that participated in the trials thus far.

Figure 1: Approximate number of OU staff in participating OUs.

(3) The remaining two observations are missing data.

Figure 2: Approximate number of IT staff in participating OUs.

The variation in the number of process instances rated per assessment is shown in the box and whisker plot of Figure 3. These range from 1 to 30; the median is 7 process instances per assessment, with quartiles at 6 (25%) and 18 (75%).

Figure 3: Distribution of process instances per assessment.

The ISO/IEC 15504 documents do not specify a particular assessment method, nor do they specify a particular distribution of effort that must be spent on each assessment activity. To characterize the assessment methods that were followed, we collected data on the amount of effort that was spent on each of the following assessment activities:

- Preparing the assessment input
- Briefing the sponsor and/or OU staff about the methodology to be used
- Collecting evidence (e.g., reviewing documentation, interviewing OU staff/management)
- Producing and verifying ratings
- Preparing the assessment results
- Presenting the results to management

The average effort distribution by activity is shown in Figure 4. As can be seen, evidence collection consumed the greatest amount of effort, followed by the briefing to the sponsor and/or OU staff about ISO/IEC 15504 and the method to be used. The effort spent on the preparation of the inputs to the

assessment (e.g., defining the scope) is not negligible, consuming approximately 15% of the effort on average.

Figure 4: Distribution of assessment effort by activity (averages): Evidence Collection 32.0%, Briefing 17.0%, Input Preparation 15.0%, Result Preparation 14.0%, Production/Verification of Ratings 14.0%, Other 5.0%, Presentation 3.0%.

In Figure 5 we show the distribution of the cost of assessments per process instance in US$. As can be seen, the range is quite large, from $81 to $2777 per process instance. The median cost is $833 per process instance. The total cost of the assessments alone from which we have data exceeds $250K.

Figure 5: Cost of assessment per process instance (non-outlier minimum = $81, 25% = $200, median = $833, non-outlier maximum = $2777).

4.2 Internal Consistency Results

As noted in [7], the internal consistency of an instrument assumes unidimensionality. Therefore, we first identify the dimensions of process capability. To determine if process capability is multidimensional, we performed a principal components analysis with varimax rotation [8] on all nine attributes. The emergent factor loadings are shown in Table 4. Loadings greater than 0.7 have been bolded in the table. As can be seen, there is a clear two-dimensional structure, with the attributes from Levels 1 to 3 in one dimension, and the attributes from Levels 4 and 5 in the second dimension. We have termed these two dimensions Process Implementation and Quantitative Process Management, respectively.

Table 4: Results of principal components analysis (78% of variation explained).

During an assessment, it is not always the case that all of the attributes up to Level 5 are rated (in ISO/IEC 15504 it is not obligatory to do so). In fact, from the data available thus far, 80 process instances that were rated up to Level 3 were not rated at Levels 4 and 5. This is shown more clearly in Figure 6. It is also seen that there is a large drop in the number of observations after Level 3. This indicates that, in general, assessments either rate up to Level 3 (only one dimension), or go all the way up to Level 5 (both dimensions). Therefore, from a practical perspective it would seem reasonable to treat process capability as a two-dimensional scale, since this is congruent with the way the scale is used in practice.

Figure 6: Cumulative number of process instances rated.
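The dimensionality analysis described above can be sketched as follows. The ratings are simulated (not SPICE Trials data) so that two latent traits drive two groups of attributes, mimicking the reported two-factor structure; the `varimax` helper and the data-generation step are illustrative assumptions:

```python
# Sketch: principal components analysis with varimax rotation on simulated
# attribute ratings (the data and helper here are illustrative assumptions).
import numpy as np

def varimax(loadings, n_iter=100, tol=1e-6):
    """Orthogonal varimax rotation of a factor-loading matrix."""
    p, k = loadings.shape
    R = np.eye(k)
    d = 0.0
    for _ in range(n_iter):
        L = loadings @ R
        u, s, vt = np.linalg.svd(
            loadings.T @ (L ** 3 - L @ np.diag((L ** 2).sum(axis=0)) / p))
        R = u @ vt
        if s.sum() < d * (1 + tol):   # converged: criterion stopped improving
            break
        d = s.sum()
    return loadings @ R

rng = np.random.default_rng(0)
# Hypothetical ratings: 100 process instances x 9 attributes, built so that
# attributes 0-4 share one latent trait and attributes 5-8 share another.
t1, t2 = rng.normal(size=(100, 1)), rng.normal(size=(100, 1))
X = np.hstack([t1 + 0.4 * rng.normal(size=(100, 5)),
               t2 + 0.4 * rng.normal(size=(100, 4))])

# PCA via the correlation matrix; keep the two largest components.
corr = np.corrcoef(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1][:2]
loadings = eigvecs[:, order] * np.sqrt(eigvals[order])

rotated = varimax(loadings)
print(np.round(np.abs(rotated), 2))   # simple structure: one block per factor
```

With data of this kind, each attribute loads strongly (above 0.7) on exactly one rotated factor, which is the pattern reported in Table 4 for the trials data.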

estimated the internal consistency for a 26-item instrument for each dimension (4). We obtain 0.98 for the Process Implementation dimension, and a comparably high value for the Quantitative Process Management dimension. These results can be compared in Table 7 to the results obtained in [7] for SPICE v1. This comparison indicates that, taking each dimension separately, the internal consistency is quite similar to that obtained for SPICE v1. It is reasonable then to conclude that the process capability scale, from an internal consistency perspective, has not deteriorated between SPICE v1 and ISO/IEC PDTR 15504. This answers the first question posed earlier. Furthermore, each dimension individually has a sufficiently high internal consistency that it is usable in practice, which answers the second question posed earlier. This is a substantial advantage since, potentially, the reduction in the number of ratings that need to be made (from 26 to 9) contributes to a reduction in the cost of conducting an assessment.

Table 7: Internal consistency results of the baseline: SPICE v1 Data Set 1 [7] (full-length instrument) and SPICE v1 Data Set 2 [7] (full-length instrument).

If one wishes to use a single measure of process capability, for example for research purposes, then the two dimensions can be summed to produce an overall capability measure. Nunnally [10] has shown how to compute the internal consistency of composite scales that measure different traits. Following this formulation, we obtain an internal consistency for the composite that is, again, close to the 0.9 threshold. However, it is clear that each dimension individually has a slightly higher internal consistency.

5. Conclusions

ISO/IEC 15504 is an emerging international standard for software process assessment. Alongside the development of ISO/IEC 15504, the SPICE Project is conducting a series of trials to empirically evaluate the ISO/IEC 15504 document set as it evolves (known as the SPICE Trials).
Using data collected from the second phase of the SPICE Trials, we evaluated the internal consistency of the ISO/IEC PDTR 15504 (also known as SPICE version 2) capability dimension. The motivation for this study was to find out whether the internal consistency of version 2 had deteriorated as a consequence of the changes from version 1, and whether the internal consistency remains sufficiently high for practical application. We used the results from a previous investigation of the internal consistency of SPICE version 1 as a baseline.

One result that emerged from this study is the determination that software process capability, as measured by ISO/IEC PDTR 15504, is a two-dimensional construct. This can be useful for future researchers measuring process capability, whereby each of the two dimensions could be treated separately. Our results, based on a large data set, indicate that the internal consistency of version 2 did not suffer any deterioration, and that it still has sufficiently high internal consistency to be usable in practice. Such internal consistency studies will continue to be performed as the ISO/IEC 15504 document set evolves into an international standard.

6. Acknowledgements

This study would not have been possible without the contributions of many people involved in the SPICE Trials. In particular, we wish to thank Inigo Garro, Stuart Hope, Robin Hunter, Munish

(4) We use the Spearman-Brown formula [7] to estimate the internal consistency if the length of the instrument is increased to 26 items.
