It is assumed that collected data are:
reliable, accurate, valid
Define Reliable Data
Data that are collected and tabulated according to the defined rules of measurement and metric; no error was committed in the actual recording and subsequent tabulation of the data.
The level of data accuracy is predetermined by:
The unit of measurement that was defined prior to data collection.
Because most software attributes are measured in a __________ form, the topic of significant figures is not much of an issue in software project management.
discrete (countable)
Define Accurate Data
Data that are collected and tabulated according to the defined level of precision of measurement and metric.
____________ addresses the applicability of the data to assess the particular issue or to measure the particular attribute.
validity
Define Valid Data
Data that are collected, tabulated, and applied according to the defined intention of applying the measurement.
T/F: Software project managers need to be extra careful in considering the validity of the data when those data are utilized in the analysis of some attribute.
TRUE
T/F: In analyzing data, the validity issue is very important.
TRUE
What common analysis methods can software managers undertake to yield beneficial evaluations of data?
Distribution (Skews, Trends, Range of Values)
Centrality & Dispersion (Averages, Medians, Standard Deviation)
Data Smoothing
Data Correlations
Data Normalization
Define Data Distribution
A description of a collection of data that shows the spread of the values and the frequency of occurrences of the values of the data.
Assuming that data collection was reliable, then one of the simplest forms of analysis is to look at the __________ of the collected data.
distribution
By viewing the __________, one may be able to readily detect some problems or trends.
spread
Software project managers may improve their understanding of the project’s status during the monitoring phase by evaluating the data distribution through:
analysis of the skew of the distribution, the range of data values, and trends in data.
Give an example of Data Distribution Skew
Looking at a graph of the known problems by severity level and seeing that most problems are skewed to a severity of 5.
Give an example of Data Distribution Range of Data Values
We examine known defects grouped by functional area. The highest count reveals 8 defects in one functional area; the lowest reveals 0 defects. This implies a range from 0 to 8, which we compare against similar projects to draw inferences.
Give an example of Data Distribution Trends
We examine known defects across a period of time. We see that the number of known defects is decreasing over time. This would be a desirable trend as fewer problems are detected in the same functional area as the project progresses.
Why do trends offer a powerful way to analyze data?
In trend analysis in software projects, managers are often looking for some form of stabilization, whether in the schedule, the budget, or some other attribute.
Define Centrality Analysis
An analysis of a data set to find the typical value to represent that data set; evaluates the central tendency of the data distribution.
What is the benefit of centrality analysis?
It provides a convenient way to compare groups of data.
Analyzing the centrality and the dispersion of data provides software project managers:
A way to characterize a set of related data, whether those data deal with product quality, project productivity, or some other attribute.
The most common of the centrality analysis methods is the ___________.
computation of the average value
Define Average Value
One type of centrality analysis that estimates the typical (or middle) value of a data set by summing all the observed data values and dividing the sum by the number of data points.
It is well known that the average value may be influenced greatly by:
the inclusion of one or two extreme data points.
How do you find the median of the observed data?
All observed data are placed in ordered sequence.
The value that divides the collected data into upper and lower halves (i.e., the middle value) is the median.
Define Median
A value used in centrality analysis to estimate the typical (or middle) value of a data set. After the data values are organized, it is the data value that splits the data set into upper and lower halves.
A very common dispersion measurement is the _____________.
standard deviation
Define Standard Deviation
A metric used to define and measure the dispersion of data from the average value in a data set.
The larger the standard deviation, the greater the:
variability or dispersion from the average value.
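The centrality and dispersion measures above can be sketched with Python's standard library. The weekly defect counts are hypothetical, with one extreme value (30) included to show how the average is pulled toward an outlier while the median is not:

```python
from statistics import mean, median, pstdev

# Hypothetical weekly defect counts; 30 is an extreme data point.
defects = [4, 7, 5, 6, 30, 5, 6]

avg = mean(defects)    # average: pulled up toward the extreme value
mid = median(defects)  # median: middle value of the ordered data, robust to the outlier
sd = pstdev(defects)   # population standard deviation: dispersion from the average

print(f"average={avg:.2f}, median={mid}, std dev={sd:.2f}")
```

Here the average (9.0) exceeds every value except the outlier, while the median (6) better represents the typical week, illustrating why both measures are worth computing.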
In quality control of nonsoftware areas, such as manufacturing, ________________ are used to assess whether the average of any particular group falls within the range of “acceptable” limits.
control charts
Define Control Chart
A chart used to assess and control the variability of some process or product characteristic. It usually involves establishing the upper and lower limits (the control limits) of data variations from the data set’s average value. If an observed data value falls outside the control limits, then it would trigger evaluation of the characteristic.
How do you make a control chart?
Establish the upper and lower limits of data variations from the data set’s average value (the control limits).
If an observed data value falls outside the control limits, then it would trigger evaluation of the characteristic.
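The two steps above can be sketched as follows; the data and the choice of ±3 standard deviations for the control limits are assumptions (3-sigma limits are a common convention, not mandated by the text):

```python
from statistics import mean, pstdev

# Hypothetical measurements from a stable baseline period.
baseline = [10.1, 9.8, 10.3, 9.9, 10.0, 10.2, 9.7]

avg = mean(baseline)
sd = pstdev(baseline)
upper, lower = avg + 3 * sd, avg - 3 * sd  # the control limits

new_obs = 13.5  # a newly observed data value
if not lower <= new_obs <= upper:
    print(f"{new_obs} is outside ({lower:.2f}, {upper:.2f}): trigger evaluation")
```

An observation inside the limits is treated as normal process variation; one outside the limits triggers evaluation of the characteristic, as the definition states.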
How can control charts + statistical process control be helpful?
They may help us improve and diminish the variations in the implementations of a defined software process.
How can control charts + standard deviations be helpful?
May be applied to tracking and observing a specific characteristic of a product or a methodology.
What is becoming quite common in quality management?
The application of the average value and the dispersion from the average value in control charts.
T/F: Data taken at a specific time provide only an instantaneous view.
TRUE
Define Moving Average
A technique for expressing data by computing the average of a fixed grouping (e.g., data for a fixed period) of data values; it is often used to suppress the effects of one extreme data point.
How is data smoothing accomplished?
By combining data points and viewing the aggregated values.
Define Data Smoothing
A technique used to decrease the effects of individual, extreme variability in data values.
Why do test managers use moving averages in long software system tests?
Because the variability in test data values may be substantial.
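Data smoothing with a moving average can be sketched as below: each smoothed point is the average of a fixed window of consecutive values. The daily defect counts and the window size of 3 are assumptions for illustration:

```python
from statistics import mean

# Hypothetical daily defect counts with one extreme data point (40).
daily_defects = [5, 6, 4, 40, 5, 6, 5]
window = 3  # fixed grouping size for the moving average

# Average each consecutive window of `window` values.
smoothed = [mean(daily_defects[i:i + window])
            for i in range(len(daily_defects) - window + 1)]
print(smoothed)
```

The spike of 40 still shows in the smoothed series, but spread across several windows at a much lower magnitude, which is exactly the suppression effect described above.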
T/F: Data correlation speaks only to the potential existence of a relationship between attributes; it does not necessarily imply cause and effect.
TRUE
Define Data Correlation
A technique that analyzes the degree of relationship between sets of data.
One popular way to examine data correlation is to:
Analyze whether a linear relationship exists. Two sets of data may be plotted and the resulting graph reviewed to see how related they are. A more formal method, called linear regression analysis, may also be applied.
Define Linear Regression
A technique that estimates the relationship between two sets of data by fitting a straight line to the two sets of data values.
How can linear regression be used?
To correlate other project or product attributes. Based on the projected value of the correlated attribute, certain adjustment actions may be applied to the ongoing project.
T/F: Extrapolation of the linear relationship outside of the range of the data points may be erroneous.
TRUE
A _____________ is one of the most easily identifiable relationships that may exist between two sets of data.
linear relationship
T/F: A pure comparison of the raw data sometimes does not provide an accurate comparison.
TRUE
Define Normalizing Data
A technique used to bring data characterizations to some common or standard level so that comparisons become more meaningful.
T/F: In many types of analysis, normalized data should be used. In all cases, the normalization factors must be well understood and defined.
TRUE
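Normalization can be sketched as below: raw defect counts from projects of different sizes are not directly comparable, so each count is divided by a well-understood normalization factor, assumed here to be project size in KLOC. The project data are hypothetical:

```python
# Hypothetical raw data for two projects of different sizes.
projects = {
    "A": {"defects": 120, "kloc": 40.0},
    "B": {"defects": 90,  "kloc": 15.0},
}

for name, p in projects.items():
    rate = p["defects"] / p["kloc"]  # normalized metric: defects per KLOC
    print(f"project {name}: {rate:.1f} defects/KLOC")
```

Project B has fewer raw defects than project A but twice the normalized defect rate, showing why a pure comparison of raw data can mislead.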
The _______ and __________ phases of the software project management should have properly defined and prepared the measurement schemes for reliable, accurate, and valid data.
planning, organizing
T/F: Measurement schemes for reliable, accurate, and valid data should be revisited one more time during the monitoring phase prior to the actual evaluation of collected data.
TRUE
The distribution of a set of collected data may be analyzed for:
Extreme values, skew, and trends.
Centrality and dispersion analysis of groups of data may be performed by:
Computing averages, median values, or standard deviations from the central value.
Data-smoothing techniques, involving the evaluation of trends through moving averages, are often used to:
Lessen the impact of exceptional data points (outliers).
The correlation of groups of data may be examined through many different methods. One very simple but popular method is:
linear regression
Normalizing the data ensures:
Groups of data are properly compared
The fundamental principle of project monitoring is comparing ______ versus _______.
planned, actual
In the project-monitoring phase of Earned Value Management (EVM), the major activity is to:
Assess the status of a project by reviewing what was planned and expected against what is really occurring.
Define Earned Value Management (EVM)
A tool, or a methodology, that allows project managers to first track the planned against the actual project status and then to perform an analysis of that information.
What does EVM use to allow us to analyze both the cost variance and the schedule variance?
Task-effort units as cost and a set of defined measurement units.
EVM uses task-effort units as cost in conjunction with a set of defined measurement units that will allow us to analyze both the _______ variance and the ________ variance.
cost, schedule
Three very important activities must be performed in advance of carrying out EVM:
Project tasks must be well defined.
Effort for each task must be well estimated.
Actual effort for each task must be tracked.
Define Task Effort
A central unit used for measuring project cost; may be in person-hours, person-days, or some similar unit.
What are key definitions of EVM that depend on the notion of task effort?
Budgeted Cost of Work (BCW)
Budgeted Cost of Work Scheduled (BCWS)
Budget at Completion (BAC)
Planned Value (PV)
Budgeted Cost of Work Performed (BCWP)
Actual Cost of Work Performed (ACWP)
Budgeted Cost of Work (BCW):
Estimated work effort for each task defined in the project.
Budgeted Cost of Work Scheduled (BCWS):
The sum of the estimated work effort for all the tasks that were scheduled to be completed by a specified time.
**This is the sum of only those BCWs that were planned or scheduled to be completed by that specific date of interest; not the sum of actually completed tasks.
Budget at Completion (BAC):
The estimate of the total project effort that will be expended at the end of the project.
**This is the sum of all the BCWs for the entire project.
Planned Value (PV):
The percentage of the total estimated effort that is assigned to the particular task, j, or (BCW of j)/BAC.
Budgeted Cost of Work Performed (BCWP):
The sum of the estimated effort of the tasks that have actually been accomplished by the specified time; the sum of the planned effort numbers for the completed tasks.
Actual Cost of Work Performed (ACWP):
The sum of the actual efforts expended for the tasks that have been completed by the specified time; the sum of the actual effort numbers for the completed tasks.
T/F: Budgeted Cost of Work Scheduled (BCWS) includes the estimated effort of the tasks that were planned to be completed at some time of interest. Thus, this number is not the sum of actually completed tasks at that time of interest.
TRUE
T/F: Budgeted Cost of Work Scheduled (BCWS) varies by the actual time when the measurement is taken.
TRUE
Define Earned Value (EV)
An indicator of how much of the total project is completed at a specific time of interest.
= BCWP / BAC
What are two indices that may provide an even better indicator for earned value?
Schedule Performance Index (SPI)
Cost Performance Index (CPI)
Define Schedule Performance Index (SPI)
Index indicator of schedule. Defines the estimated effort of all tasks actually completed as compared to the estimated effort of all those tasks that were estimated to be completed by a specific date.
= BCWP/BCWS
Define Cost Performance Index (CPI)
Index indicator of cost. Compares the estimated effort of all tasks actually completed by a specific date against the actual effort spent for those tasks.
= BCWP / ACWP
If SPI = 1, then project is:
on schedule.
If SPI > 1, then the project is:
ahead of schedule
the estimated effort of all tasks actually completed is larger than the estimated effort of all those tasks planned to be completed.
If SPI < 1, then the project is:
behind schedule.
If CPI > 1, then the project is:
under budget; estimated effort is greater than what was actually expended.
If CPI = 1, then the project is:
on target.
If CPI < 1, then the project is:
over budget.
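The EVM quantities and indices defined above can be sketched as follows. Task names, budgeted efforts (BCW), the status-date flags, and actual efforts are all hypothetical, in person-days:

```python
# Each task: (BCW, scheduled to be done by the status date?, completed?, actual effort)
tasks = {
    "design": (10, True,  True,  12),
    "code":   (20, True,  True,  18),
    "test":   (15, True,  False, None),
    "deploy": (5,  False, False, None),
}

BAC  = sum(bcw for bcw, _, _, _ in tasks.values())               # all BCWs: 50
BCWS = sum(bcw for bcw, sched, _, _ in tasks.values() if sched)  # scheduled: 45
BCWP = sum(bcw for bcw, _, done, _ in tasks.values() if done)    # completed: 30
ACWP = sum(act for _, _, done, act in tasks.values() if done)    # actual: 30

EV  = BCWP / BAC   # fraction of the total project completed
SPI = BCWP / BCWS  # < 1: behind schedule
CPI = BCWP / ACWP  # = 1: on budget
print(f"EV={EV:.2f}, SPI={SPI:.2f}, CPI={CPI:.2f}")
```

With these numbers the project has earned 60% of its value, its SPI of about 0.67 shows it is behind schedule, and its CPI of 1.0 shows it is exactly on budget.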
SPI and CPI indices may also be viewed through a different perspective using:
variances
What are the two project variances?
Schedule, Cost
Instead of SPI, the _____________ can be used.
schedule variance (SV)
Define Schedule Variance (SV)
= BCWP – BCWS
If SV > 0, then:
The project is ahead of schedule.
The estimated effort of all the actually completed tasks is more than the estimated effort of all the tasks that were planned to be completed on that specific date.
Instead of using CPI, a _____________may be used.
cost variance (CV)
Define Cost Variance (CV)
= BCWP – ACWP
If CV > 0, then:
The project is under budget.
The estimated effort of all the actually completed tasks is greater than the actual effort spent for all those same completed tasks.
T/F: The two variance indicators serve the same purpose as the two indices, but the variance indicators provide a more specific number in terms of units of effort.
TRUE
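The variance form of the same indicators can be sketched as below; unlike the SPI and CPI ratios, SV and CV come out directly in units of effort. The BCWS/BCWP/ACWP values are hypothetical, in person-days:

```python
BCWS = 45  # estimated effort of tasks scheduled to be done by the status date
BCWP = 30  # estimated effort of the tasks actually completed
ACWP = 33  # actual effort spent on those completed tasks

SV = BCWP - BCWS  # negative: behind schedule, by this many person-days of planned work
CV = BCWP - ACWP  # negative: over budget, by this many person-days of effort
print(f"SV={SV} person-days, CV={CV} person-days")
```

Here SV = −15 and CV = −3: the project is behind schedule and over budget, and the variances say by how much in effort units, which is the extra specificity the card above describes.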
Earned Value Management is a project-monitoring and control technique based on the concept of:
Comparing what was planned with what was actually accomplished.
Each major task in EVM is broken down into subtasks whose effort is estimated via some technique, such as the:
Work Breakdown Structure
In EVM, each estimated task effort is the:
Budgeted Cost of Work (BCW)
By the end of the project, the total amount of estimated effort expended will be the sum of all the _______, called the _________.
BCWs, Budget at Completion (BAC).
On any specific status date, the estimated effort for all the tasks that are planned to be completed is defined as the:
Budgeted Cost of Work Scheduled (BCWS)
On any specific status date, the total estimated effort for all those tasks that are actually completed is called the:
Budgeted Cost of Work Performed (BCWP)
On any specific status date, the total actual effort expended for those actually completed tasks is called the:
Actual Cost of Work Performed (ACWP)
Earned Value, or EV, is defined as:
BCWP/BAC