
Master of Business Administration Semester-III

ASSIGNMENT

MI0033 – Software Engineering

Set- I

SUBMITTED BY: 

NAME : VIJAY KUMAR SHARMA

ROLL NO : 520933061

COURSE : MBA

CENTRE CODE : 3293

CENTRE CITY : NEW DELHI

Note: Each question carries 10 Marks. Answer all the questions.


  Q1. Quality and reliability are related concepts but are fundamentally different in a

number of ways. Discuss them.

Answer:

The quality movement started in the 1940s with a major contribution on quality

aspects from W. Edwards Deming. One of the major benefits of quality has been the saving

in the overall cost of production. A system of continuous improvement helps in achieving

good quality. Kaizen refers to a system of continuous process improvement. The purpose of 

kaizen is to develop a process that is visible, repeatable, and measurable. After kaizen comes

atarimae hinshitsu, which refers to examination of intangibles that affect the process and

works to optimize their impact. Both kaizen and atarimae hinshitsu focus on processes.

The next stage is kansei, which leads to improvement in the product itself and, potentially, to the process that created it. The final stage is miryokuteki hinshitsu, which broadens the

management concern beyond the immediate product.

Quality Concepts:

It is a well-known fact that all engineered and manufactured parts exhibit some variation or the other. The variations may not always be clearly visible. They are sometimes microscopic and can be identified only by means of equipment that measures geometrical attributes, electrical characteristics, and so on.

Quality:

Designers specify the characteristics of the quality of a product. The grade of materials used

in the product development and product characteristics, permissible tolerances, and

performance specifications contribute to the quality of design. For higher grades of materials

the tolerances are very small. When the tolerance is set to a very low level the expected

design characteristics would be of high quality. When greater levels of performance are

specified, there is an increase in the design quality of a product and the manufacturing

processes and the product specification are set according to the specified quality norms.

Quality of conformance is expressed as the degree to which the design specifications are

followed during the process of manufacturing. If the degree of conformance is high, then the level of quality of conformance is also deemed to be high.

Quality of conformance is mainly focused on the implementation of the software.

Quality Control:

Quality is the buzz word of every organization today. But how does one work towards

achieving quality in the organization and within it at the various process levels?

There are a number of ways of achieving quality. One can consider the fundamental step of 

quality where the variations are measured with respect to the expected values in any process

or characteristics of the product. The first step towards quality is to see that the variations are minimized. Controlling quality can be done by means of measuring various characteristics of


the product and understanding the behavior of the product towards changes in the product

characteristics. It involves a series of inspections, reviews, and tests on the software

processes. A feedback mechanism in the process will help in constantly reviewing the performance and enhancing it.

 A combination of the measurement and the feedback allows the software developer to refine

the software process and tend to approach perfection.

It is possible to automate these steps in the quality control process of the software system.

One of the concepts of quality control is that every process can be measured. The

measurement will tell whether there has been any improvement in the process or not.

Quality Assurance:

Quality assurance is a process of auditing various areas and identifying the nonconformances in such areas. A nonconformance is reported if a deviation is observed in the

actual performance when compared with the planned performance against certain

expectation. The expectations are listed based on the requirements of certain standards and

norms. The nonconformances are reported area-wise or process-wise. The report based on

the audit provides the management with the information that is necessary for them to take

suitable actions.

Cost of Quality:

There are many activities involved in a software project leading to the completion of the

intended service or the product. Every such activity is associated with some cost. And associated with every process is quality, which again comes with a certain cost. The total

cost of quality means the sum total of all the costs involved in setting up a quality process or 

a quality activity and additional resources procured towards maintaining and running the

quality process. The main categories under which the quality costs may be listed are the ones

dealing with processes towards prevention, processes towards appraisal, and processes

towards maintenance. The main components contributing towards the cost are the cost

component of quality planning, cost component of formal technical reviews and the cost

component pertaining to the test equipment.

Software Reliability:

The need for quality is there in the minds of everybody associated with the software project.

One of the key issues pertaining to the quality aspect is the reliability of the software product.

There are a number of methods to ensure the reliability of the product, which depend upon the

characteristics of the product and its features and the expectations from the product and its

services. One of the tasks before the software engineer or the software manager is to

establish the relevant reliability measures well in advance before the implementation so that

the quality is assured. A series of audits may be conducted to keep a tab on the deviations if 

they tend to occur.

Statistically, software reliability may be defined as the probability that a computer program operates free of failure, for a specified time and in a specified environment. Failure refers to nonconformance to the


stated requirements of the software. One of the simple measures of reliability is to express it as the mean time between failures (MTBF), which is the sum of the mean time to failure (MTTF) and the mean time to repair (MTTR). It is necessary to identify and assess the hazards in software projects that affect the software performance. If it is possible to

identify the hazards in the early stages of the software project, then a module to counteract

such hazards could be developed or built into the software, which will then be able to rectify

errors leading to hazards. Suitable models could be used to achieve this safety.
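
As a quick illustration of the MTBF relation given above (MTBF = MTTF + MTTR), the short Python sketch below computes MTBF from hypothetical field data. The failure and repair times are made-up values, and the availability ratio at the end is a commonly used companion measure, not something stated in this text.

# Illustrative sketch only: hypothetical failure/repair data, not from the text.
def mean(values):
    return sum(values) / len(values)

# Hours of failure-free operation observed between successive failures (hypothetical).
times_to_failure = [120.0, 95.0, 150.0, 110.0]
# Hours spent repairing after each failure (hypothetical).
times_to_repair = [2.0, 4.0, 3.0, 3.0]

mttf = mean(times_to_failure)   # mean time to failure
mttr = mean(times_to_repair)    # mean time to repair
mtbf = mttf + mttr              # MTBF = MTTF + MTTR, as in the text above

# Availability = MTTF / (MTTF + MTTR) is a common companion measure
# (an assumption here, not stated in the text).
availability = mttf / mtbf

print(f"MTTF = {mttf:.1f} h, MTTR = {mttr:.1f} h, MTBF = {mtbf:.1f} h")
print(f"Availability = {availability:.3f}")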

Background Issues:

The quality assurance processes are very vital in establishing quality features in the product.

Various standard mechanisms are developed in the companies to focus on the quality of the

product. These mechanisms have to undergo improvements from time to time in order to remain competitive in the market. The product has to be viewed from the user's point of view. A satisfaction note on the various features of the product needs to be reviewed to bring a

change in the product to enhance it and to make it a quality product.

Q2. Discuss the Objective & Principles Behind Software Testing.

Answer:

Testing Objectives

Glen Myers states a number of rules that can serve well as testing objectives:

1. Testing is a process of executing a program with the intent of finding an error.

2. A good test case is one that has a high probability of finding an as-yet-undiscovered error.

3. A successful test is one that uncovers an as-yet-undiscovered error.

Testing Principles

Davis suggests a set of testing principles that have been adapted:

All tests should be traceable to customer requirements: As we have seen, the objective

of software testing is to uncover errors. It follows that the most severe defects (from the customer's point of view) are those that cause the program to fail to meet its requirements.

Tests should be planned long before testing begins: Test planning can begin as soon as

the requirements model is complete. Detailed definition of test cases can begin as soon as

the design model has been solidified. Therefore, all tests can be planned and designed

before any code has been generated.

The Pareto principle applies to software testing: Stated simply, the Pareto principle

implies that 80 percent of all errors uncovered during testing will most likely be traceable to

20 percent of all program components. The problem, of course, is to isolate these suspect components and to thoroughly test them.
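
To make the 80/20 claim concrete, here is a small, purely illustrative Python sketch: the component names and defect counts are hypothetical, and the snippet simply ranks components by defect count and finds the smallest set accounting for roughly 80 percent of the defects.

# Hypothetical defect counts per component (illustrative only).
defects = {
    "parser": 45, "db": 36, "ui": 6, "auth": 4, "report": 3,
    "cache": 2, "export": 2, "search": 1, "config": 1, "logging": 0,
}

total = sum(defects.values())
ranked = sorted(defects.items(), key=lambda kv: kv[1], reverse=True)

# Walk down the ranking until about 80% of all defects are covered.
covered, suspects = 0, []
for name, count in ranked:
    suspects.append(name)
    covered += count
    if covered / total >= 0.8:
        break

share = len(suspects) / len(defects)
print(f"{covered}/{total} defects ({covered/total:.0%}) come from "
      f"{len(suspects)}/{len(defects)} components ({share:.0%}): {suspects}")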


Testing should begin “in the small” and progress toward testing “in the large”: The first

tests planned and executed generally focus on individual components. As testing progresses,

focus shifts in an attempt to find errors in integrated clusters of components and ultimately in the entire system.

Exhaustive testing is not possible: The number of path permutations for even a

moderately sized program is exceptionally large. For this reason, it is impossible to execute

every combination of paths during testing. It is possible, however, to adequately cover 

program logic and to ensure that all conditions in the component-level design have been

exercised.

To be most effective, testing should be conducted by an independent third

party: By most effective, we mean testing that has the highest probability of finding errors (the primary objective of testing).

For reasons that have been introduced earlier in this unit, the software engineer who created

the system is not the best person to conduct all tests for the software.

 Q3. Discuss the CMM 5 Levels for Software Process.

Answer:

The Capability Maturity Model (CMM) is a theoretical process capability maturity model. The

CMM was originally developed as a tool for objectively assessing the ability of government contractors' processes to perform a contracted software project. For this reason, it has been used extensively for avionics software and government projects around the world. The five-level structure of the CMM is described below.

  Although the CMM comes from the area of software development, it can be (and has

been and still is being) applied as a generally applicable model to assist in understanding the

process capability maturity of organisations in areas as diverse as, for example: software engineering, system engineering, project management, software maintenance, risk management, system acquisition, information technology (IT), and personnel management.

Page 5 of 16Vijay Kumar Sharma

7/29/2019 MB0033 ok_1.doc

http://slidepdf.com/reader/full/mb0033-ok1doc 6/16

Master of Business Administration – Semester-III MI0033 – Software Engineering Assignment Set- I

The CMM was first described in the book Managing the Software Process (1989) by Watts Humphrey, and hence was also known as "Humphrey's CMM". Humphrey had started

developing the model at the SEI (the US Department of Defense's Software Engineering Institute) in 1986, basing it on the earlier work of Phil Crosby, who had earlier published the Quality Management Maturity Grid in his book Quality is Free (1979). The SEI was at Carnegie Mellon University in Pittsburgh.

The CMM has been superseded by a variant, the Capability Maturity Model Integration (CMMI), the old CMM being renamed the Software Engineering CMM (SE-CMM). Accreditations based on the SE-CMM expired on 31 December 2007.

Variants of maturity models derived from the CMM emerged over the years, including, for example, the Systems Security Engineering CMM (SSE-CMM) and the People Capability Maturity Model.

(Note that maturity models generally started to become part of international standards as part of ISO 15504.)

The CMM involves the following aspects:

• Maturity Levels: A 5-level process maturity continuum, where the uppermost (5th) level is a notional ideal state where processes would be systematically managed by a combination of process optimization and continuous process improvement.

• Key Process Areas: Within each of these maturity levels are Key Process Areas

(KPAs) which characterise that level, and for each KPA there are five definitions identified:

o Goals

o Commitment

o  Ability

o Measurement

o Verification

The KPAs are not necessarily unique to CMM, representing, as they do, the stages that organizations' processes will need to pass through as they progress up the CMM continuum.

Goals: The goals of a key process area summarize the states that must exist for that key process area to have been implemented in an effective and lasting way. The extent to which the goals have been accomplished is an indicator of how much capability the organization has established at that maturity level. The goals signify the scope, boundaries, and intent of each key process area.

Common Features: Common features include practices that implement and institutionalize a key process area. There are five types of common features: Commitment to Perform, Ability to Perform, Activities Performed, Measurement and Analysis, and Verifying Implementation.

Key Practices: The key practices describe the elements of infrastructure and practice

that contribute most effectively to the implementation and institutionalization of the KPAs.


Levels of the CMM:

There are five levels defined along the continuum of the CMM, and, according to the SEI:

"Predictability, effectiveness, and control of an organization’s software processes arebelieved to improve as the organization moves up these five levels. While not rigorous, theempirical evidence to date supports this belief."

Level 1 - Initial

Processes are usually ad hoc and the organization usually does not provide a stable environment. Success in these organizations depends on the competence and heroics of the people in the organization and not on the use of proven processes. In spite of this ad hoc, chaotic environment, maturity level 1 organizations often produce products and services that work; however, they frequently exceed the budget and schedule of their projects.

Organizations are characterized by a tendency to overcommit, to abandon processes in times of crisis, and to be unable to repeat their past successes.

Software project success depends on having quality people.

Level 2 - Repeatable

Software development successes are repeatable. The processes may not repeat for all the projects in the organization. The organization may use some basic project management to track cost and schedule.

Process discipline helps ensure that existing practices are retained during times of stress. When these practices are in place, projects are performed and managed according to their documented plans.

Project status and the delivery of services are visible to management at defined points (for example, at major milestones and at the completion of major tasks).

Basic project management processes are established to track cost, schedule, and functionality. The minimum process discipline is in place to repeat earlier successes on projects with similar applications and scope. There is still a significant risk of exceeding cost

and time estimates.

Level 3 - Defined

The organization’s set of standard processes, which is the basis for level 3, is established and improved over time. These standard processes are used to establish consistency across the organization. Projects establish their defined processes by tailoring the organization’s set of standard processes according to tailoring guidelines.

The organization’s management establishes process objectives based on the organization’s set of standard processes and ensures that these objectives are appropriately addressed.

A critical distinction between level 2 and level 3 is the scope of standards, process descriptions, and procedures. At level 2, the standards, process descriptions, and


procedures may be quite different in each specific instance of the process (for example, on a particular project). At level 3, the standards, process descriptions, and procedures for a

project are tailored from the organization’s set of standard processes to suit a particular project or organizational unit.

Level 4 - Managed

Using precise measurements, management can effectively control the software development effort. In particular, management can identify ways to adjust and adapt the process to particular projects without measurable losses of quality or deviations from specifications. At this level, the organization sets quantitative quality goals for both the software process and software maintenance.

Subprocesses are selected that significantly contribute to overall process performance. These selected subprocesses are controlled using statistical and other quantitative techniques.

A critical distinction between maturity level 3 and maturity level 4 is the predictability of process performance. At maturity level 4, the performance of processes is controlled using statistical and other quantitative techniques, and is quantitatively predictable. At maturity level 3, processes are only qualitatively predictable.

Level 5 - Optimizing

Level 5 focuses on continually improving process performance through both incremental and innovative technological improvements. Quantitative process-improvement objectives for the organization are established, continually revised to reflect changing business objectives, and used as criteria in managing process improvement. The effects of deployed process improvements are measured and evaluated against the quantitative process-improvement objectives. Both the defined processes and the organization’s set of standard processes are targets of measurable improvement activities.

Process improvements to address common causes of process variation and measurably improve the organization’s processes are identified, evaluated, and deployed.

Optimizing processes that are nimble, adaptable and innovative depends on the

participation of an empowered workforce aligned with the business values and objectives of the organization. The organization’s ability to rapidly respond to changes and opportunities is enhanced by finding ways to accelerate and share learning.

A critical distinction between maturity level 4 and maturity level 5 is the type of process variation addressed. At maturity level 4, processes are concerned with addressing special causes of process variation and providing statistical predictability of the results. Though processes may produce predictable results, the results may be insufficient to achieve the established objectives. At maturity level 5, processes are concerned with addressing common causes of process variation and changing the process (that is, shifting the mean of the process performance) to improve process performance (while maintaining statistical predictability) to achieve the established quantitative process-improvement objectives.

Q4. Discuss the Waterfall Model for Software Development.


Answer:

Water fall model:

The simplest software development life cycle model is the waterfall model, which states that the phases are organized in a linear order. A project begins with feasibility analysis. On the successful demonstration of the feasibility analysis, the requirements analysis and project planning begin.

The design starts after the requirements analysis is done, and coding begins after the design is done. Once the programming is completed, the code is integrated and testing is done. On successful completion of testing, the system is installed. After this, the regular operation and maintenance of the system takes place. The following figure demonstrates the steps involved

in the waterfall life cycle model.

 

The Waterfall Software Life Cycle Model 

With the waterfall model, the activities performed in a software development project are requirements analysis, project planning, system design, detailed design, coding and unit testing, and system integration and testing. The linear ordering of activities has some important consequences. First, the end of a phase and the beginning of the next must be clearly identified. Some certification mechanism has to be employed at the end of each phase. This is usually done by some verification and validation, which ensures that the output of a phase is consistent with its input (which is the output of the previous phase) and that the output of the

phase is consistent with the overall requirements of the system.

The consequence of the need for certification is that each phase must have some defined output that can be evaluated and certified. Therefore, when the activities of a phase are completed, there should be an output product of that phase, and the goal of a phase is to produce this product. The outputs of the earlier phases are often called intermediate products or design documents. For the coding phase, the output is the code. From this point of view, the output of a software project is not just the final program along with the user documentation, but also the requirements document, design document, project plan, test plan, and test results.

Another implication of the linear ordering of phases is that after each phase is completed and its outputs are certified, these outputs become the inputs to the next phase and should not be changed or modified. However, changing requirements cannot be avoided and must be faced. Since changes performed in the output of one phase affect the later phases that might


already have been performed, these changes have to be made in a controlled manner after evaluating the effect of each change on the project. This brings us to the need for configuration control or configuration management.

The certified output of a phase that is released for the next phase is called a baseline. Configuration management ensures that any changes to a baseline are made after careful review, keeping in mind the interests of all parties that are affected by it. There are two basic assumptions justifying the linear ordering of phases in the manner proposed by the waterfall model:

For a successful project resulting in a successful product, all phases listed in the waterfall model must be performed anyway.

 Any different ordering of the phases will result in a less successful software product.

Q5. Explain the Advantages of the Prototype Model and Spiral Model in Contrast to the Waterfall Model.

Answer:

Prototype Model Advantages

Creating software using the prototype model also has its benefits. One of the key

advantages prototype modeled software has is the time frame of development. Instead of 

concentrating on documentation, more effort is placed in creating the actual software. This

way, the actual software could be released in advance. The work on prototype models could

also be spread to others since there are practically no stages of work in this model. Everyone

has to work on the same thing and at the same time, reducing man-hours in creating

the software. The work will be even faster and more efficient if developers collaborate more

regarding the status of a specific function and develop the necessary adjustments in time for 

the integration.

 Another advantage of having prototype modeled software is that the software is

created using lots of user feedback. With every prototype created, users can give their

honest opinion about the software. If something is unfavorable, it can be changed. Slowly the

program is created with the customer in mind.

The waterfall model is a sequential design process, often used in software

development processes, in which progress is seen as flowing steadily downwards (like a

waterfall) through the phases of Conception, Initiation, Analysis, Design, Construction,

Testing, Production/Implementation and Maintenance.


The unmodified "waterfall model". Progress flows from the top to the bottom, like a waterfall.

The waterfall development model originates in the manufacturing and construction 

industries: highly structured physical environments in which after-the-fact changes are

prohibitively costly, if not impossible. Since no formal software development methodologies

existed at the time, this hardware-oriented model was simply adapted for software

development.

The first known presentation describing use of similar phases in software engineering

was held by Herbert D. Benington at Symposium on advanced programming methods for 

digital computers on 29 June 1956.[1] This presentation was about the development of 

software for SAGE. In 1983 the paper was republished [2] with a foreword by Benington

pointing out that the process was not in fact performed in a strict top-down fashion, but depended on a

prototype.

The first formal description of the waterfall model is often cited as a 1970 article by

Winston W. Royce,[3] though Royce did not use the term "waterfall" in this article. Royce

presented this model as an example of a flawed, non-working model (Royce 1970). This, in

fact, is how the term is generally used in writing about software development—to describe a

critical view of a commonly used software practice.[4]

Q6. Explain the COCOMO Model & Software Estimation Technique.

 Answer:


The COCOMO cost estimation model is used by thousands of software project

managers, and is based on a study of hundreds of software projects. Unlike other cost

estimation models, COCOMO is an open model, so all of the details are published, including:

• The underlying cost estimation equations

• Every assumption made in the model (e.g. "the project will enjoy good management")

• Every definition (e.g. the precise definition of the Product Design phase of a project)

• The costs included in an estimate are explicitly stated (e.g. project managers are

included, secretaries aren't)

Because COCOMO is well defined, and because it doesn't rely upon proprietary estimation algorithms, Costar offers these advantages to its users:

• COCOMO estimates are more objective and repeatable than estimates made by

methods relying on proprietary models

• COCOMO can be calibrated to reflect your software development environment, and to

produce more accurate estimates

Costar is a faithful implementation of the COCOMO model that is easy to use on small

projects, and yet powerful enough to plan and control large projects.

Typically, you'll start with only a rough description of the software system that you'll be

developing, and you'll use Costar to give you early estimates about the proper schedule and

staffing levels. As you refine your knowledge of the problem, and as you design more of the

system, you can use Costar to produce more and more refined estimates.

Costar allows you to define a software structure to meet your needs. Your initial

estimate might be made on the basis of a system containing 3,000 lines of code. Your second

estimate might be more refined so that you now understand that your system will consist of 

two subsystems (and you'll have a more accurate idea about how many lines of code will be

in each of the subsystems). Your next estimate will continue the process -- you can use

Costar to define the components of each subsystem. Costar permits you to continue this

process until you arrive at the level of detail that suits your needs.

One word of warning: It is so easy to use Costar to make software cost estimates, that it's

possible to misuse it -- every Costar user should spend the time to learn the underlying


COCOMO assumptions and definitions from Software Engineering Economics and

Software Cost Estimation with COCOMO II.

Introduction to the COCOMO Model 

The most fundamental calculation in the COCOMO model is the use of the Effort

Equation to estimate the number of Person-Months required to develop a project. Most of the

other COCOMO results, including the estimates for Requirements and Maintenance, are

derived from this quantity.

Source Lines of Code

The COCOMO calculations are based on your estimates of a project's size in Source

Lines of Code (SLOC). SLOC is defined such that:

• Only Source lines that are DELIVERED as part of the product are included -- test

drivers and other support software are excluded

• SOURCE lines are created by the project staff -- code created by applications

generators is excluded

• One SLOC is one logical line of code

• Declarations are counted as SLOC

• Comments are not counted as SLOC

The original COCOMO 81 model was defined in terms of Delivered Source

Instructions, which are very similar to SLOC. The major difference between DSI and SLOC is

that a single Source Line of Code may be several physical lines. For example, an "if-then-

else" statement would be counted as one SLOC, but might be counted as several DSI.

The Scale Drivers

In the COCOMO II model, some of the most important factors contributing to a

project's duration and cost are the Scale Drivers. You set each Scale Driver to describe your 

project; these Scale Drivers determine the exponent used in the Effort Equation.

The 5 Scale Drivers are:

• Precedentedness

• Development Flexibility


•  Architecture / Risk Resolution

• Team Cohesion

• Process Maturity

Note that the Scale Drivers have replaced the Development Mode of COCOMO 81.

The first two Scale Drivers, Precedentedness and Development Flexibility, actually describe

much the same influences that the original Development Mode did.

Cost Drivers

COCOMO II has 17 cost drivers; you assess your project, development

environment, and team to set each cost driver. The cost drivers are multiplicative factors that

determine the effort required to complete your software project. For example, if your project

will develop software that controls an airplane's flight, you would set the Required Software

Reliability (RELY) cost driver to Very High. That rating corresponds to an effort multiplier of 

1.26, meaning that your project will require 26% more effort than a typical software project.

COCOMO II defines each of the cost drivers, and the Effort Multiplier associated with

each rating. Check the Costar help for details about the definitions and how to set the cost

drivers.

COCOMO II Effort Equation

The COCOMO II model makes its estimates of required effort (measured in Person-Months

, PM) based primarily on your estimate of the software project's size (as measured in

thousands of SLOC, KSLOC):

Effort = 2.94 * EAF * (KSLOC)^E

Where

EAF is the Effort Adjustment Factor derived from the Cost Drivers.

E is an exponent derived from the five Scale Drivers.

 As an example, a project with all Nominal Cost Drivers and Scale Drivers would have

an EAF of 1.00 and exponent, E, of 1.0997. Assuming that the project is projected to consist

of 8,000 source lines of code, COCOMO II estimates that 28.9 Person-Months of effort is

required to complete it:


Effort = 2.94 * (1.0) * (8)^1.0997 = 28.9 Person-Months

Effort Adjustment Factor 

The Effort Adjustment Factor in the effort equation is simply the product of the effort

multipliers corresponding to each of the cost drivers for your project.

For example, if your project is rated Very High for Complexity (effort multiplier of 1.34),

and Low for Language & Tools Experience (effort multiplier of 1.09), and all of the other cost

drivers are rated to be Nominal (effort multiplier of 1.00), the EAF is the product of 1.34 and

1.09.

Effort Adjustment Factor = EAF = 1.34 * 1.09 = 1.46

Effort = 2.94 * (1.46) * (8)^1.0997 = 42.3 Person-Months
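
A minimal Python sketch of the effort calculation described above, assuming the coefficient 2.94 and the all-nominal exponent E = 1.0997 quoted in the text; the effort multipliers are the example values given for Complexity and Language & Tools Experience. It reproduces both worked examples (28.9 and 42.3 Person-Months).

# Sketch of the COCOMO II Effort Equation as quoted above:
#   Effort = 2.94 * EAF * (KSLOC)^E
# The constant 2.94 and E = 1.0997 (all-nominal Scale Drivers) are taken from
# the text; a calibrated tool such as Costar may use different values.

def cocomo_effort(ksloc: float, effort_multipliers: list[float], e: float = 1.0997) -> float:
    eaf = 1.0
    for multiplier in effort_multipliers:   # EAF is the product of the cost-driver multipliers
        eaf *= multiplier
    return 2.94 * eaf * ksloc ** e

# All-nominal project, 8,000 SLOC (8 KSLOC): about 28.9 Person-Months.
print(round(cocomo_effort(8, [1.00]), 1))

# Very High Complexity (1.34) and Low Language & Tools Experience (1.09):
# EAF = 1.46, giving about 42.3 Person-Months.
print(round(cocomo_effort(8, [1.34, 1.09]), 1))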

COCOMO II Schedule Equation

The COCOMO II schedule equation predicts the number of months required to

complete your software project. The duration of a project is based on the effort predicted by

the effort equation:

Duration = 3.67 * (Effort)^SE

Where

Effort is the effort from the COCOMO II effort equation

SE is the schedule equation exponent derived from the five Scale Drivers

Continuing the example, and substituting the exponent of 0.3179 that is calculated

from the scale drivers, yields an estimate of just over a year, and an average staffing of 

between 3 and 4 people:

Duration = 3.67 * (42.3)^0.3179 = 12.1 months

 Average staffing = (42.3 Person-Months) / (12.1 Months) = 3.5 people
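
Continuing in the same vein, the short Python sketch below applies the schedule equation as quoted above, taking the coefficient 3.67 and the schedule exponent SE = 0.3179 directly from the text, and derives the average staffing figure the example quotes.

# Sketch of the COCOMO II Schedule Equation as quoted above:
#   Duration = 3.67 * (Effort)^SE
# The constant 3.67 and SE = 0.3179 are the values given in the text.

def cocomo_duration(effort_pm: float, se: float = 0.3179) -> float:
    return 3.67 * effort_pm ** se

effort = 42.3                          # Person-Months, from the effort example above
duration = cocomo_duration(effort)     # about 12.1 months
staffing = effort / duration           # about 3.5 people on average

print(f"Duration = {duration:.1f} months, average staffing = {staffing:.1f} people")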


The SCED Cost Driver 

The COCOMO cost driver for Required Development Schedule (SCED) is unique, and requires a special explanation.

The SCED cost driver is used to account for the observation that a project developed

on an accelerated schedule will require more effort than a project developed on its optimum

schedule. A SCED rating of Very Low corresponds to an Effort Multiplier of 1.43 (in the

COCOMO II.2000 model) and means that you intend to finish your project in 75% of the

optimum schedule (as determined by a previous COCOMO estimate). Continuing the

example used earlier, but assuming that SCED has a rating of Very Low, COCOMO produces these estimates:

Duration = 75% * 12.1 Months = 9.1 Months

Effort Adjustment Factor = EAF = 1.34 * 1.09 * 1.43 = 2.09

Effort = 2.94 * (2.09) * (8)^1.0997 = 60.4 Person-Months

Average staffing = (60.4 Person-Months) / (9.1 Months) = 6.7 people

Notice that the calculation of duration isn't based directly on the effort (number of

Person-Months); instead it's based on the schedule that would have been required for the

project assuming it had been developed on the nominal schedule. Remember that the SCED

cost driver means "accelerated from the nominal schedule".
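
A last small Python sketch, under the same assumptions as the earlier ones, reproduces the SCED example above: the accelerated duration is 75% of the previously estimated nominal duration, while the effort is recomputed with the SCED multiplier of 1.43 included in the EAF.

# SCED example from the text: Very Low SCED adds a 1.43 effort multiplier and
# compresses the schedule to 75% of the nominal duration (12.1 months above).

nominal_duration = 12.1                 # months, from the nominal-schedule example
eaf = 1.34 * 1.09 * 1.43                # Complexity, Language & Tools, SCED (about 2.09)
effort = 2.94 * eaf * 8 ** 1.0997       # about 60.4 Person-Months
duration = 0.75 * nominal_duration      # about 9.1 months, NOT recomputed from the new effort
staffing = effort / duration            # about 6.7 people

print(f"Effort = {effort:.1f} PM, duration = {duration:.1f} months, staffing = {staffing:.1f}")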

------------------- End -------------------------
