SV

11 - Software Metrics

Quantitative Software Metrics

  • Code Metrics
    • Static
    • Dynamic
  • Management Metrics
    • Cost
    • Duration, Time
    • Staffing
  • Testing Metrics
  • Productivity Metrics
  • Design Metrics
  • Maintainability Metrics

Software Quality Management

  • Ensures required quality levels within budget & time constraints.
  • Quality software must have minimal defects and meet standards (maintainability, reliability, etc.).
  • Requires a framework at both organizational and project levels.

Software Measurement

  • Derives numeric values for attributes of software products/processes for objective comparisons.
  • Limited systematic use in many organizations; few established standards exist.

Software Metrics

  • Any measurement that relates to a software system, process, or its documentation, e.g.:
    • Lines of code
    • Fog readability index
    • Person-days required to develop a component
  • Used for predicting product attributes and controlling the software process.
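
The first two example metrics can be sketched in a few lines of Python. This is an illustrative implementation, not a standard one: the LOC count uses a simple non-blank, non-comment rule, and the Gunning Fog index uses the usual formula 0.4 × (average sentence length + percentage of three-plus-syllable words), with a crude vowel-group heuristic standing in for real syllable counting.

```python
import re

def lines_of_code(source: str) -> int:
    """Count non-blank, non-comment lines (one simple LOC definition)."""
    return sum(1 for line in source.splitlines()
               if line.strip() and not line.strip().startswith("#"))

def fog_index(text: str) -> float:
    """Gunning Fog readability index:
    0.4 * (words per sentence + percentage of 'complex' words).
    A word is 'complex' here if it has 3+ vowel groups (rough syllable proxy)."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    complex_words = [w for w in words
                     if len(re.findall(r"[aeiouy]+", w.lower())) >= 3]
    return 0.4 * (len(words) / len(sentences)
                  + 100 * len(complex_words) / len(words))

doc = "The metric is simple. Complexity estimation, however, is considerably harder."
print(lines_of_code("x = 1\n\n# comment\ny = 2"))  # 2
print(round(fog_index(doc), 1))  # 18.0
```

Note that even these "simple" metrics embed debatable choices (what counts as a line, a sentence, a syllable), which is why definitions must be fixed before comparisons are made.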

Types of Process Metrics

  • Time Metrics: Duration to complete processes.
  • Resource Metrics: Efforts in person-days, costs, etc.
  • Event Occurrences: Track defects, requirements changes, etc.
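
As a sketch of how these three kinds of process metric might be computed from project records (the event and effort data below are hypothetical):

```python
from collections import Counter
from datetime import date

# Hypothetical process records: (event_type, date) pairs and per-task efforts.
events = [("defect", date(2024, 3, 1)), ("defect", date(2024, 3, 4)),
          ("requirement_change", date(2024, 3, 2)), ("defect", date(2024, 3, 9))]
efforts_person_days = {"design": 12, "coding": 30, "testing": 18}

# Time metric: elapsed days between first and last recorded event.
duration = (max(d for _, d in events) - min(d for _, d in events)).days

# Resource metric: total effort in person-days.
total_effort = sum(efforts_person_days.values())

# Event-occurrence metric: counts per event type.
occurrences = Counter(kind for kind, _ in events)

print(duration, total_effort, occurrences["defect"])  # 8 60 3
```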

Measurement Objectives

  • Understand development processes:
    • Cost-effectiveness of inspections
    • Effect of pair programming on productivity.
    • Project completion rates (on time/budget).

Predictor and Control Measurements

  • Control Metrics: Support management and control of the software process (e.g., effort or elapsed-time measurements).
  • Predictor Metrics: Help predict characteristics of the product (e.g., complexity as a predictor of maintainability) and so inform management decisions.

Use of Measurements

  • Assign values to system quality attributes (e.g., LOC, complexity).
  • Identify sub-standard components by measuring characteristics.
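
One way to measure such characteristics is a static scan of the source. The sketch below uses Python's `ast` module with a rough cyclomatic-complexity proxy (decision points + 1); the source snippet and the chosen decision node types are illustrative assumptions, not a standard definition:

```python
import ast

SOURCE = '''
def classify(x):
    if x < 0:
        return "negative"
    for _ in range(3):
        if x > 10:
            return "large"
    return "small"

def identity(x):
    return x
'''

# Node types treated as decision points (an approximation).
DECISIONS = (ast.If, ast.For, ast.While, ast.BoolOp, ast.ExceptHandler)

def complexity(func: ast.FunctionDef) -> int:
    """Cyclomatic-complexity approximation: decision points + 1."""
    return 1 + sum(isinstance(node, DECISIONS) for node in ast.walk(func))

tree = ast.parse(SOURCE)
scores = {f.name: complexity(f)
          for f in tree.body if isinstance(f, ast.FunctionDef)}
print(scores)  # {'classify': 4, 'identity': 1}
```

Per-function scores like these can then be compared against a project threshold to shortlist components for review.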

Metrics Assumptions

  • Assumes that relevant software properties can be measured accurately.
  • Assumes a relationship exists between what can be measured and the attributes of interest, and that this relationship has been formalized and validated.

Product Metrics

  • Dynamic Metrics: Collected by measurements made of a program in execution.
  • Static Metrics: Collected by measurements made of system representations (design, program, or documentation).
  • Quality metrics should predict product quality.

Dynamic vs. Static Metrics

  • Dynamic Metrics: Relate directly to quality attributes (e.g., measured response time, failure counts).
  • Static Metrics: Relate to quality only indirectly (e.g., code complexity as a proxy for maintainability).
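
A minimal sketch of a dynamic metric: timing a function's response during execution with `time.perf_counter`. The workload (`sorted` on a large list) is an arbitrary stand-in for a real system operation:

```python
import statistics
import time

def timed(fn, *args, repeats=5):
    """Measure wall-clock response time of fn over several runs
    (a simple dynamic metric); returns (mean, worst) in seconds."""
    samples = []
    for _ in range(repeats):
        start = time.perf_counter()
        fn(*args)
        samples.append(time.perf_counter() - start)
    return statistics.mean(samples), max(samples)

mean_s, worst_s = timed(sorted, list(range(100_000)))
print(f"mean={mean_s * 1000:.2f} ms, worst={worst_s * 1000:.2f} ms")
```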

Software Component Analysis

  • Analyze components using various metrics; compare values to identify issues.
  • Establish thresholds and monitor anomalous measurements.
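
A threshold rule can be as simple as flagging measurements that lie well above the mean. The component names, values, and the 1.5-standard-deviation cutoff below are all illustrative choices:

```python
import statistics

# Hypothetical per-component measurements (e.g., cyclomatic complexity).
complexity_by_component = {
    "parser": 12, "lexer": 9, "emitter": 11, "optimizer": 10, "driver": 48,
}

values = list(complexity_by_component.values())
mean, stdev = statistics.mean(values), statistics.pstdev(values)

# Flag components well above the mean as candidates for closer
# inspection -- anomalous, not automatically "bad".
threshold = mean + 1.5 * stdev
outliers = [name for name, v in complexity_by_component.items()
            if v > threshold]
print(outliers)  # ['driver']
```

A flagged component is a prompt for inspection, since an unusual value may have a legitimate explanation.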

Measurement Ambiguity

  • The context of collected data is crucial; incorrect interpretations can lead to misleading insights.

Software Context

  • Measured processes/products are affected by changing business environments, requiring careful analysis of results.

Measurement Surprises

  • Improving one attribute can have surprising effects: e.g., increasing reliability may increase help-desk calls, because users trust the system more, use it for more tasks, and therefore raise more usage questions.

Problems in the Software Industry

  • ROI quantification of metrics programs is challenging.
  • Processes remain poorly standardized and poorly defined across companies.

Empirical Software Engineering

  • Validates software engineering methods and techniques through systematic data collection and controlled experiments.

Software Analytics

  • Focuses on understanding and improving software systems through data analysis and insights.

Analytics Tools Requirements

  • User-friendly, quick outputs, comprehensive measurement capabilities, and interactive exploratory analysis.

Status of Software Analytics

  • Still maturing; its effectiveness remains uncertain and depends heavily on which metrics are collected and on data quality.

Metrics Areas Empirically Researched

  • Covers productivity, design, testing, maintainability, management, and code metrics.

Deciding What to Measure

  • Identify customer goals and relevant metrics to assess success accordingly.

Measurement Caveats

  • Software data is often imprecise; manage the impact of measurement on the process itself, and balance collection costs against benefits.