Measurements in Healthcare

Importance of Measurements in Healthcare

  • Patient Safety:

    • Accurate measurements are crucial in healthcare, impacting patient safety directly.

    • Medication dosage errors as small as a few milligrams can lead to serious consequences.

    • Measurements include patient weight and laboratory test results that influence diagnosis and treatment.

    • Precision is essential.


Significance of Laboratory Measurements

  • Accurate Diagnosis:

    • Laboratory results provide objective data confirming or ruling out suspected conditions.

    • Example: A blood glucose level of 250 mg/dL is well above the random-glucose diagnostic threshold (200 mg/dL) and strongly suggests diabetes.

  • Treatment Monitoring:

    • Regular measurements track patient responses to treatments.

    • Example: Declining white blood cell counts may signal successful chemotherapy.

  • Patient Safety:

    • Measurements can detect dangerous changes before symptoms arise.

    • Example: Rising cardiac enzymes may indicate heart damage prior to chest pain.


Physical Quantities Measured in Laboratories

  • Length:

    • Measured in meters; examples include cell size, tissue samples, and bacterial growth zones.

  • Volume:

    • Blood samples, reagent solutions, and culture media are measured in liters (in practice, usually milliliters or microliters).

  • Time:

    • Reaction durations, centrifugation, and growth periods are tracked in seconds.

  • Mass:

    • Measured in kilograms (in practice, grams or milligrams); used for reagent preparation and sample weighing.

  • Temperature:

    • Influential in incubation conditions, sample storage, and reaction kinetics.

  • Light Properties:

    • Measured optical parameters include absorbance, fluorescence, and turbidity.

  • Electrochemical Properties:

    • Measured parameters include pH and conductivity.


Seven Basic SI Units

  • The SI system defines seven fundamental units forming the basis of measurements in science and medicine:

    • Length: Meter (m)

    • Mass: Kilogram (kg)

    • Time: Second (s)

    • Temperature: Kelvin (K)

    • Amount of Substance: Mole (mol)

    • Electric Current: Ampere (A)

    • Luminous Intensity: Candela (cd)


Understanding Metric Prefixes in Medicine

  • Micro- (μ):

    • Used for very small measurements like medication doses (e.g., 250 μg of vitamin B12).

  • Milli- (m):

    • Common for liquid medications and lab values (e.g., 5 mL of cough syrup, 100 mg of aspirin).

  • Base Unit:

    • Standard measurement without prefix (e.g., 1 gram of tissue sample, 2 liters of IV fluid).

  • Kilo- (k):

    • Used for larger measurements like body weight (e.g., 70 kg patient weight).
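
The prefix steps above are just multiplications by powers of ten; a minimal Python sketch (the `convert` helper and `PREFIX_FACTORS` table are illustrative, not a standard API):

```python
# Illustrative: metric prefix conversion as scaling by powers of ten,
# using factor values from the SI prefix table.
PREFIX_FACTORS = {"micro": 1e-6, "milli": 1e-3, "base": 1.0, "kilo": 1e3}

def convert(value, from_prefix, to_prefix):
    """Convert a value between prefixed forms of the same base unit."""
    return value * PREFIX_FACTORS[from_prefix] / PREFIX_FACTORS[to_prefix]

# 250 micrograms of vitamin B12 expressed in milligrams
print(convert(250, "micro", "milli"))  # 0.25 mg
```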


Importance of Understanding Units

  • Laboratory results convey crucial information only when units are understood properly.

  • Tests use different units based on substance type and measurement method.

  • Fluency in both SI and conventional units is essential for accurate clinical measurements.


Concentration Units

  • Units used to express the amount of substance dissolved in a fluid:

    • mol/L or mmol/L:

    • Commonly used for electrolytes, glucose, lactate.

    • mg/dL:

    • Used for glucose, cholesterol.

    • μg/dL:

    • Applied for hormones, lead levels.
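
Conventional (mg/dL) and SI (mmol/L) concentration units are related through the analyte's molar mass; a hedged sketch using glucose (molar mass ≈ 180.16 g/mol) as the example:

```python
GLUCOSE_MOLAR_MASS = 180.16  # g/mol (approximate)

def mgdl_to_mmoll(mg_dl, molar_mass_g_per_mol):
    """mg/dL -> mg/L (x10) -> mmol/L (divide by molar mass in g/mol)."""
    return mg_dl * 10 / molar_mass_g_per_mol

# A fasting glucose of 100 mg/dL in SI units
print(round(mgdl_to_mmoll(100, GLUCOSE_MOLAR_MASS), 2))  # 5.55 mmol/L
```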


Cell Counts

  • Used to quantify blood cells or express them as proportions:

    • Cells/μL:

    • For white blood cells (WBCs), red blood cells (RBCs).

    • %:

    • For hematocrit, lymphocyte differential.

    • g/dL:

    • For hemoglobin.
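
Counts reported in cells/μL are often also reported in the SI convention of 10⁹ cells/L; a small illustrative conversion:

```python
def cells_per_ul_to_si(cells_per_ul):
    """Convert cells/uL to the SI reporting unit of 10^9 cells/L."""
    # 1 uL = 1e-6 L, so cells/uL * 1e6 = cells/L; divide by 1e9 to report in 10^9/L
    return cells_per_ul * 1e6 / 1e9

# A WBC count of 7000 cells/uL
print(cells_per_ul_to_si(7000))  # 7.0 (i.e., 7.0 x 10^9 cells/L)
```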


Enzyme & Protein Units

  • Used to measure enzyme activity or protein concentration:

    • IU/L:

    • Common for liver enzymes (ALT, AST, ALP).

    • g/L:

    • For total protein, albumin.

    • ng/mL:

    • For troponin, PSA.


Converting Between Metric and English

  • Key conversions:

    • 1 kilogram = 2.2 pounds

    • 1 meter = 3.28 feet

    • 1 centimeter = 0.394 inches

    • 1 liter = 1.057 quarts

    • 1 gram = 0.035 ounces

  • Example: Patient weight conversion from pounds to kilograms (150 lbs ≈ 68 kg) and height conversion from feet/inches to centimeters (5 ft 8 in ≈ 173 cm).
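
The conversions above can be sketched in Python, using the factor values from the list and rounding as in the worked example:

```python
LB_PER_KG = 2.2   # 1 kilogram = 2.2 pounds
CM_PER_IN = 2.54  # 1 centimeter = 0.394 inches  =>  1 inch = 2.54 cm

def pounds_to_kg(lb):
    return lb / LB_PER_KG

def height_to_cm(feet, inches):
    # Total height in inches, then convert to centimeters
    return (feet * 12 + inches) * CM_PER_IN

print(round(pounds_to_kg(150)))   # 68 kg
print(round(height_to_cm(5, 8)))  # 173 cm
```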


Errors in Measurement

  • Systematic Errors:

    • Consistent, predictable deviations often due to calibration issues, reagent deterioration, or methodological flaws.

  • Random Errors:

    • Unpredictable variations caused by environmental fluctuations or human variability (e.g., pipetting technique differences).

  • Pre-analytical Errors:

    • Occur before testing begins; issues like patient preparation, improper sample handling, etc.

  • Post-analytical Errors:

    • Occur after testing; include transcription errors, incorrect interpretation, or reporting delays.


Introduction to Lab Measurement Tools

  • Manual Instruments:

    • Pipettes: Glass, adjustable, multi-channel.

    • Balances: Analytical (0.0001g), precision (0.01g).

    • Glassware: Volumetric flasks, graduated cylinders.

    • Thermometers: Digital, infrared, temperature probes.

    • Microscopes: Bright-field, phase-contrast, fluorescence.

  • Automated Systems:

    • Analyzers: Chemistry, hematology, immunoassay.

    • Mass Spectrometers: LC-MS/MS, GC-MS.

    • Flow Cytometers: Cell sorting and characterization.

    • Automated Liquid Handlers: High-throughput pipetting.

    • Automated Microscopy: Digital imaging systems.


Spectrophotometry

  • A quantitative lab technique to measure light absorption at specific wavelengths.

  • How It Works:

    • A light source emits a beam at a chosen wavelength.

    • The beam passes through a cuvette containing the solution.

    • A detector measures how much light is transmitted vs. absorbed.

    • The absorbance value is converted to analyte concentration using Beer’s Law (A = εlc).
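
Beer’s Law (A = εlc) rearranges to c = A/(εl); a minimal sketch, using the well-known molar absorptivity of NADH at 340 nm (≈ 6220 L·mol⁻¹·cm⁻¹) as the example analyte:

```python
def concentration_from_absorbance(absorbance, molar_absorptivity, path_length_cm=1.0):
    """Beer's Law: A = epsilon * l * c  =>  c = A / (epsilon * l), in mol/L."""
    return absorbance / (molar_absorptivity * path_length_cm)

# NADH at 340 nm, epsilon ~ 6220 L/(mol*cm), standard 1 cm cuvette
c = concentration_from_absorbance(0.311, 6220)
print(round(c * 1e6, 1))  # concentration in umol/L
```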


Blood Gas Analyzers

  1. pH Measurement:

    • Utilizes a hydrogen ion-selective glass electrode for blood acidity assessment.

    • Normal range: 7.35-7.45.

    • Critical for determining acid-base balance.

  2. Oxygen Partial Pressure (pO₂):

    • Measures dissolved oxygen via Clark electrode with an oxygen-permeable membrane.

    • Normal arterial range: 80-100 mmHg, essential for evaluating respiratory function.

  3. Carbon Dioxide Partial Pressure (pCO₂):

    • Severinghaus electrode measures CO₂ via pH change in bicarbonate solution.

    • Normal arterial range: 35-45 mmHg, a key indicator of ventilation adequacy.

  4. Electrolytes & Metabolites:

    • Modern analyzers measure Na⁺, K⁺, Cl⁻, Ca²⁺, glucose, and lactate simultaneously using ion-selective electrodes and enzymatic sensors for comprehensive assessments.
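
The measured pH, pCO₂, and bicarbonate are linked by the Henderson-Hasselbalch relation for the bicarbonate buffer system; an illustrative calculation (6.1 and 0.03 are the standard clinical constants for pKa and CO₂ solubility):

```python
import math

def blood_ph(hco3_mmol_l, pco2_mmhg):
    """Henderson-Hasselbalch for the bicarbonate system:
    pH = 6.1 + log10([HCO3-] / (0.03 * pCO2))"""
    return 6.1 + math.log10(hco3_mmol_l / (0.03 * pco2_mmhg))

# Typical normal values: HCO3- 24 mmol/L, pCO2 40 mmHg
print(round(blood_ph(24, 40), 2))  # 7.4, mid-normal range
```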


Cell Counters and Pulse Oximetry

Automated Hematology Analyzers

  • Employ multiple technologies:

    • Electrical Impedance: Measures cell size as cells pass through an aperture and disrupt the current.

    • Light Scatter: Differentiates cell types by size and granularity.

    • Fluorescence Flow Cytometry: Labels specific cell markers for detailed classification.

  • Parameters measured include RBC, WBC, platelets, hemoglobin, hematocrit, and differential WBC counts.

  • Advanced systems flag abnormal cells for microscopic review.

Pulse Oximetry

  • Non-invasive method measuring oxygen saturation (SpO₂):

    • Utilizes two light-emitting diodes (660 nm red and 940 nm infrared).

    • Photodetector measures light absorption differences.

    • Normal SpO₂: 95-100%. Values below 90% indicate significant hypoxemia requiring intervention.


Electrophoresis and PCR

Electrophoresis

  • Separates molecules based on size and charge in an electric field.

  • Applications include:

    • Serum protein electrophoresis

    • Hemoglobin electrophoresis

    • Lipoprotein electrophoresis

    • DNA/RNA gel electrophoresis

Polymerase Chain Reaction (PCR)

  • Amplifies specific DNA sequences exponentially for detection and quantification.

  • Measurement approaches:

    • Qualitative PCR

    • Quantitative PCR

    • Digital PCR

    • Cycle threshold


Measuring Glucose and Hemoglobin A1c

  • Glucose Measurement:

    • Reference range (fasting): 70-99 mg/dL (3.9-5.5 mmol/L).

  • Hemoglobin A1c Measurement:

    • Reported as a percentage of total hemoglobin, often with an estimated average glucose (eAG).

    • Diagnostic criteria (ADA):

    • < 5.7% (normal)

    • 5.7-6.4% (prediabetes)

    • ≥ 6.5% (diabetes); a common treatment target for known diabetes is < 7.0%.
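
The eAG reported alongside A1c is commonly derived from the published ADAG linear regression, eAG (mg/dL) = 28.7 × A1c − 46.7; a small sketch:

```python
def estimated_average_glucose(a1c_percent):
    """ADAG regression: eAG (mg/dL) = 28.7 * A1c - 46.7."""
    return 28.7 * a1c_percent - 46.7

# An A1c of 7.0% corresponds to an eAG of roughly 154 mg/dL
print(round(estimated_average_glucose(7.0), 1))
```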


Enzyme Assays in the Clinical Lab

  • Liver Function Enzymes:

    • Normal Ranges:

    • ALT (Alanine aminotransferase): 7-56 U/L

    • AST (Aspartate aminotransferase): 5-40 U/L

    • ALP (Alkaline phosphatase): 44-147 U/L

    • GGT (γ-glutamyl transferase): 8-61 U/L


Water Grades

  • Impact of Water Quality in Medical Labs:

    • Analytical results can be significantly compromised by trace impurities, leading to incorrect diagnoses and harmful treatment decisions.

    • Contaminants inhibit enzymatic reactions, alter chemical equilibria, and create background signals that mask true results in sensitive diagnostic assays.

    • Diagnostic integrity is crucial; poor water quality is a risk to patient care and safety.

Water Types Overview

  • Type I Water:

    • Ultra-high purity for critical applications. Rigorous quality control.

  • Type II Water:

    • High quality for most laboratory analyses; less stringent than Type I but maintains excellent purity.

  • Type III Water:

    • Moderate purity; used for less sensitive procedures and as feed water for Type I systems.

  • Type IV Water:

    • Basic laboratory grade; minimal purification for non-critical applications.

Applications of Different Water Types

  • Type I Water:

    • Molecular diagnostics and PCR, sensitive enzyme assays, trace element analysis.

  • Type II Water:

    • General laboratory testing, media and reagent preparation, clinical chemistry analyzers, microbiological culture media.

  • Type III Water:

    • Glassware rinsing, feed water for Type I/II systems, basic solution preparation, water bath applications.

  • Type IV Water:

    • Basic cleaning procedures, initial feed water for higher-grade purification, non-critical laboratory processes.


Water Purification Systems

  1. Pretreatment:

    • Removes particulates, chlorine, and hardness minerals.

  2. Primary Purification:

    • Reverse osmosis can remove up to 99% of contaminants.

  3. Secondary Purification:

    • Deionization or distillation further removes ions.

  4. Polishing Technologies:

    • Address specific contaminants (e.g., UV oxidation, ultrafiltration).


Distillation Process and Properties

  • Distillation:

    • Involves boiling water and condensing the steam into a clean container, effectively removing most common minerals.


Reverse Osmosis Water Purification

  • The semi-permeable membrane technique removes contaminants larger than water molecules, including bacteria, colloids, and particulates.


Challenges in Water Storage and Distribution

  • Quality degrades over time due to:

    • CO₂ absorption from air.

    • Contaminant leaching from storage materials.

    • Bacterial growth and biofilm formation.

    • Accumulation of environmental particles.

  • Usage Recommendations: Type I/Grade 1 water should be used immediately or stored under controlled conditions for minimal time.


Impact of Poor Water Quality

  • Analytical Interference:

    • Increased background readings in assays, false results, shifting calibration curves.

  • Equipment Damage:

    • Mineral scale buildup, corrosion, clogging of instrument pathways.


Chemical Grades

Definition

  • Chemical Grades:

    • Refers to the purity levels of reagents in testing; determines suitability for specific uses.

Determining Factors

  • Purity Standards:

    • Governing bodies like ISO maintain specifications for chemical grades.

  • Certification Documentation:

    • Manufacturers provide quality certificates detailing purity and testing methods.

Most Common Chemical Grades

  • Technical Grade:

    • Lowest purity; for industrial use.

  • Laboratory Grade:

    • Suitable for general lab work.

  • ACS Grade:

    • Meets American Chemical Society purity standards; suitable for clinical applications.

  • Analytical Grade:

    • Essential for precision in clinical chemistry.

  • Pharmaceutical Grade:

    • Meets pharmacopeial (USP/NF) standards; used in drug production and testing.

  • Specialty Grades:

    • For molecular biology.


Effects of Grade on Laboratory Results

  • Low-Grade Reagents:

    • Can introduce variability and noise leading to false results.

  • High-Grade Reagents:

    • Enhance consistency, reproducibility, and accurate quantitation.


Regulatory Standards for Medical Labs

CLIA Requirements

  • Documentation for reagent quality and evidence of compliance with manufacturer’s claims.

  • Records of reagent handling and verification of quality.

ISO 15189 Standards

  • Procedures must be documented, reagents tracked, and performance verified to ensure accuracy and reproducibility in test results.


Reagents in Clinical Labs

Definition

  • A reagent:

    • A substance added to a system to cause or test for a chemical reaction.

Roles of Reagents

  • Identification:

    • React with specific substances to identify diseases or conditions.

  • Quantification:

    • Determine precise concentrations of substances in samples.

  • Quality Control:

    • Verify that instruments and processes function correctly for reliable results.

Organic vs Inorganic Reagents

  • Organic Reagents:

    • Carbon-based, used in organic chemistry and biochemical assays.

  • Inorganic Reagents:

    • Mineral-based compounds for basic reactions and tests.


Reagent Storage and Handling

  • Temperature Control:

    • Store at recommended temperatures and prevent light exposure.

  • Contamination Prevention:

    • Use airtight containers and never return unused reagent to the stock container.

  • Labeling System:

    • Clearly mark date, concentration, and preparer's initials.


Types of Laboratory Reagents

  • Acids & Bases:

    • pH adjustment and digestion agents.

  • Salts:

    • Used in buffers, culture media.

  • Solvents:

    • Dissolve samples for analysis.

  • Buffers:

    • Maintain stable pH in reactions.

  • Biological Stains:

    • Visualize cells in microscopy.

  • Commercial Kits:

    • Pre-packaged and quality-controlled for consistency.


Common Reagents

70% Alcohol

  • Properties:

    • Mixture of ethanol and purified water; 70% kills microbes more effectively than pure alcohol because the water aids membrane penetration and slows evaporation.

  • Laboratory Applications:

    • Disinfecting surfaces, skin antisepsis, preserving specimens.

Acid Alcohol

  • Composition:

    • 3% hydrochloric acid in ethanol; used as a decolorizer in staining procedures.


Solvents in Medical Labs

  • Water:

    • Universal solvent; various grades used based on requirements.

  • Ethanol:

    • Used for disinfection and extraction; common for fixing specimens.

  • Acetone:

    • Strong solvent for cleaning glassware; highly flammable.


Lot-to-Lot Reagent Variation

  • Sources of Variation:

    • Variability can arise from differences in raw materials or storage.

  • Validation Procedures:

    • New lots should be tested alongside existing ones to ensure result accuracy.


Essential Solution Terminology

  • Solution:

    • Homogeneous mixture created by combining solute and solvent.

  • Solute:

    • Substance dissolved in a solution, usually in smaller quantities.

  • Solvent:

    • Dissolving agent, typically water in the lab.

  • Dilution:

    • The process of making a solution less concentrated by adding solvent.
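
Dilutions follow the relation C₁V₁ = C₂V₂; a minimal sketch of the volume calculation (the function name is illustrative):

```python
def dilution_volume_needed(c_stock, c_final, v_final):
    """C1*V1 = C2*V2  =>  V1 = C2*V2 / C1 (volume of stock to pipette)."""
    return c_final * v_final / c_stock

# Prepare 100 mL of 0.1 mol/L solution from a 1.0 mol/L stock
v1 = dilution_volume_needed(1.0, 0.1, 100)
print(round(v1, 2))        # mL of stock needed
print(round(100 - v1, 2))  # mL of solvent to add
```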


Controls and Standards

Reconstitution

  • Quality controls are provided as lyophilized reagents needing reconstitution with the correct reagent water for accurate use.


Buffers

Definition

  • A buffer resists pH changes when acids or bases are added, critical for maintaining test integrity.

    • Composed of weak acid or base with its conjugate, stabilizing pH.

Importance in Laboratory

  • Enzyme Functionality:

    • Most enzymes need stable pH for activity; deviations can denature proteins and compromise assay reliability.

  • Result Accuracy:

    • Consistent pH leads to reliable test results crucial for clinical decisions.

  • Method Standardization:

    • Buffers ensure procedures can be compared reliably across laboratories.


Chemistry of Buffer Systems

  • Acidic Buffers:

    • Pair of weak acid and conjugate base.

    • Example: Acetic Acid (CH₃COOH) + Sodium Acetate (CH₃COONa).

  • Basic Buffers:

    • Pair of weak base and conjugate acid.

    • Example: Ammonia (NH₃) + Ammonium Chloride (NH₄Cl).


Buffer Capacity

  • Buffer Capacity:

    • The amount of acid/base a buffer can neutralize before pH changes significantly.

  • Preparing a Buffer Solution:

    • Steps include selecting components, calculating concentrations, weighing/dissolving, checking pH, and final volume adjustments.
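
Rearranging the Henderson-Hasselbalch equation gives the conjugate-base-to-acid ratio needed to hit a target pH, which underlies the concentration-calculation step above; an illustrative sketch:

```python
def base_to_acid_ratio(target_ph, pka):
    """Rearranged Henderson-Hasselbalch: [A-]/[HA] = 10^(pH - pKa)."""
    return 10 ** (target_ph - pka)

# Acetate buffer (pKa ~ 4.76): ratio of acetate to acetic acid for pH 5.0
print(round(base_to_acid_ratio(5.0, 4.76), 2))  # ~1.74
```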


pH Measurement

Definition

  • pH:

    • Measurement of hydrogen ion concentration, typically shown on a scale from 0 to 14:

    • 0-6.9: Acidic

    • 7.0: Neutral

    • 7.1-14: Alkaline/Basic
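
The scale comes from the definition pH = −log₁₀[H⁺]; a one-line sketch:

```python
import math

def ph_from_h_concentration(h_molar):
    """pH = -log10([H+] in mol/L)."""
    return -math.log10(h_molar)

print(round(ph_from_h_concentration(1e-7), 1))  # 7.0, neutral
print(round(ph_from_h_concentration(1e-3), 1))  # 3.0, acidic
```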


pH Measurement Techniques

  • Litmus Paper:

    • Provides qualitative pH estimates through color changes.

  • pH Meters:

    • Deliver accurate, quantitative readings.

  • Electrodes:

    • Measure electrical potentials correlated with hydrogen ion concentration.


Accumet AB pH Meters

  • The Accumet AB series, especially the AB315 model, is favored in labs for its accuracy and functionality.


Temperature Compensation in pH Measurement

  • pH measurements are temperature-dependent, affecting electrode sensitivity and ionic movement.

  • Accurate calibration is needed for reliable results in clinical applications.


Performing pH Measurement

  1. Calibration:

    • Use appropriate buffer solutions for calibration.

  2. Preparation:

    • Rinse electrode with deionized water, do not wipe.

  3. Measurement:

    • Immerse the electrode in the sample, ensuring the sensing bulb is fully covered.

    • Document values meticulously following protocols.


Upcoming Quiz

  • Next Week: Quiz in Clinical Laboratory Science, Chapter 4 planned.