L07 - Support Vector Machine for Diagnosis


16 Terms

1

Maintenance

Combination of all technical, administrative and managerial actions during the life cycle of an item intended to retain it in, or restore it to, a state in which it can perform the required function.

Maintenance: All actions (technical, administrative, managerial) to keep an item working or restore it so it can work.

2

Inspection

Examination for conformity by measuring, observing, or testing the relevant characteristics of an item.

Inspection: Checking an item’s characteristics by measuring, observing, or testing.

3

Service

Measures to delay the reduction of the existing wear-out reserve.

Actions to slow down the wear and tear of an item.

4

Line Maintenance

Regular, routine maintenance tasks performed on an aircraft while it is in service, designed to keep the aircraft in good working condition and ensure its safety and reliability.

Line Maintenance: Routine tasks done while the aircraft is in service (e.g., pre-flight check, small repairs).

5

Base Maintenance

A more extensive type of maintenance, involving tasks such as major repairs, overhauls, and modifications to the aircraft. Unlike line maintenance, base maintenance is typically performed on a scheduled basis.

Base Maintenance: Larger tasks like overhauls and major repairs, usually on a schedule.

6

Diagnosis

A judgment about what a particular illness or problem is, made after examining it.

Making a judgment about a problem after examination.

Involves steps from raw data to assessing the health of a system/component.

7

OSA-CBM (Open System Architecture for Condition-Based Maintenance)

  1. Data Acquisition – Collect raw data

  2. Data Manipulation – Process data

  3. State Detection – Identify condition

  4. Health Assessment – Evaluate health

  5. Prognostics Assessment – Predict remaining life (RUL)

  6. Advisory Generation – Maintenance recommendations

  7. Presentation – Show results

  • Key idea: Without diagnosis, you can’t make an accurate prognosis.
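
A minimal sketch of the seven layers as an ordered data-flow pipeline; all function names, bodies, and values are made-up placeholders, and only the ordering of the layers reflects OSA-CBM:

```python
def data_acquisition():              # 1. collect raw data
    return [0.1, 0.4, 0.9]           # hypothetical raw sensor samples

def data_manipulation(raw):          # 2. process the data
    return [v * 10 for v in raw]

def state_detection(features):       # 3. identify the condition
    return max(features)

def health_assessment(state):        # 4. evaluate health (diagnosis)
    return "degraded" if state > 5 else "healthy"

def prognostics_assessment(health):  # 5. predict remaining useful life (RUL)
    return 30 if health == "degraded" else 120   # hypothetical hours

def advisory_generation(rul):        # 6. maintenance recommendation
    return "schedule maintenance" if rul < 50 else "no action"

def presentation(advice):            # 7. show results
    print("Advisory:", advice)

# Data flows from layer 1 to layer 7; the prognosis in layer 5 depends
# on the diagnosis produced in layers 3-4 (the "key idea" above).
raw      = data_acquisition()
features = data_manipulation(raw)
state    = state_detection(features)
health   = health_assessment(state)
rul      = prognostics_assessment(health)
advice   = advisory_generation(rul)
presentation(advice)
```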

8

Support Vector Machines

Linear classification: Which Hyperplane?

SVM finds the optimal line by maximizing the distance from the closest points (support vectors).

  • Many possible separating lines exist for classification.

  • Some methods find a line, but not the best one.

  • If no points are near the line, decisions are more certain.

9

Support Vector Machines

Maximising the margin

  • SVM chooses the line with the largest margin (gap between classes).

  • Only the support vectors (closest points) matter for determining the line.

  • Solving SVM is a quadratic programming problem.

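A minimal sketch of margin maximization with scikit-learn, assuming a linear kernel and made-up 2-D toy data; a large C approximates the hard-margin case:

```python
import numpy as np
from sklearn.svm import SVC

# Made-up 2-D toy data: two linearly separable classes.
X = np.array([[1, 1], [2, 1], [1, 2],    # class 0
              [4, 4], [5, 4], [4, 5]])   # class 1
y = np.array([0, 0, 0, 1, 1, 1])

# A large C approximates a hard-margin SVM; internally this is solved
# as a quadratic programming problem.
clf = SVC(kernel="linear", C=1e6).fit(X, y)

# Only the support vectors (the closest points) define the hyperplane.
print("support vectors:\n", clf.support_vectors_)

# For the separating hyperplane w^T x + b = 0, the margin width is 2 / ||w||.
w = clf.coef_[0]
print("margin width:", 2 / np.linalg.norm(w))
```
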
10

Classification with SVMs for a given new point x

  • For a new point x, calculate score = w^T x + b

  • If score > 0 → one class, score < 0 → other class.

  • Can set a threshold t for confidence:

    • score > t → yes

    • score < -t → no

    • otherwise → don’t know

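A small sketch of the scoring rule with hypothetical weights w, bias b, and an arbitrary threshold t; in practice w and b would come from a fitted linear SVM (e.g. clf.coef_[0] and clf.intercept_[0]):

```python
import numpy as np

# Hypothetical weights, bias, and confidence threshold.
w = np.array([0.5, 0.5])
b = -2.5
t = 1.0

def classify(x, w, b, t):
    score = w @ x + b        # score = w^T x + b
    if score > t:
        return "yes"
    if score < -t:
        return "no"
    return "don't know"      # low-confidence region near the boundary

print(classify(np.array([6.0, 6.0]), w, b, t))  # far on the positive side -> "yes"
print(classify(np.array([2.0, 3.0]), w, b, t))  # on the boundary -> "don't know"
```
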
11

Linear SVMs: Summary

  • SVM classifier is a separating hyperplane.

  • Only the support vectors (closest training points) define it.

  • Solved with quadratic optimization and Lagrange multipliers.

  • In the dual form, the model uses only inner products between points.
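
The dual-form point can be checked numerically with scikit-learn, which exposes dual_coef_ (the products α_i·y_i) and support_vectors_; a sketch on the same kind of made-up toy data:

```python
import numpy as np
from sklearn.svm import SVC

X = np.array([[1, 1], [2, 1], [1, 2], [4, 4], [5, 4], [4, 5]])
y = np.array([0, 0, 0, 1, 1, 1])
clf = SVC(kernel="linear", C=1e6).fit(X, y)

x_new = np.array([3, 3])

# Dual form: score(x) = sum_i alpha_i * y_i * <x_i, x> + b, summed over
# support vectors only (dual_coef_ holds alpha_i * y_i).
score_dual = clf.dual_coef_ @ (clf.support_vectors_ @ x_new) + clf.intercept_

# Agrees with the library's own decision function.
print(score_dual, clf.decision_function([x_new]))
```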

12

Soft Margin Classification

  • If data is not perfectly separable, allow errors with slack variables ξ_i.

  • Some points can be misclassified for flexibility.

  • Minimize both margin violation and training errors.

  • C parameter controls trade-off between margin size and errors.

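A short sketch of the C trade-off on made-up overlapping data; the specific C values are arbitrary:

```python
import numpy as np
from sklearn.svm import SVC

# Made-up overlapping 2-D blobs (not perfectly separable).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 2)),
               rng.normal(2.0, 1.0, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

for C in (0.01, 1, 100):
    clf = SVC(kernel="linear", C=C).fit(X, y)
    # Small C: wide margin, many margin violations (slack) tolerated.
    # Large C: training errors penalised heavily, narrower margin.
    print(f"C={C:>6}: support vectors={clf.n_support_.sum()}, "
          f"train accuracy={clf.score(X, y):.2f}")
```
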
13

Non-linear SVMs

  • Works well for linearly separable data (even with some noise).

  • If separation is too hard, map data to a higher-dimensional space.

  • This may make it linearly separable in the new space.

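A tiny illustration of the mapping idea on made-up 1-D data: no threshold on x separates the classes, but the mapping φ(x) = (x, x²) makes them linearly separable:

```python
import numpy as np

# Made-up 1-D data: class 1 sits near the origin, class 0 on both sides,
# so no single threshold on x separates the classes.
x = np.array([-3.0, -2.0, -0.5, 0.0, 0.5, 2.0, 3.0])
y = np.array([0, 0, 1, 1, 1, 0, 0])

# After the mapping phi(x) = (x, x^2), the rule "x^2 < 2" separates them,
# i.e. the data becomes linearly separable in the new 2-D space.
phi = np.column_stack([x, x ** 2])
pred = (phi[:, 1] < 2).astype(int)
print(pred)                 # [0 0 1 1 1 0 0]
print((pred == y).all())    # True
```
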
14

The Kernel Trick

  • Map data into higher dimensions without explicitly computing coordinates.

  • Use a kernel function K(x_i, x_j) to calculate inner products in the new space.

  • Common kernels:

    • Linear

    • Polynomial

    • RBF (Radial Basis Function) – infinite-dimensional space

  • γ parameter controls how far a single data point’s influence reaches.

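A small sketch with scikit-learn's RBF kernel on the same kind of non-separable 1-D data; the γ values are arbitrary and only meant to show the parameter being varied:

```python
import numpy as np
from sklearn.svm import SVC

# Made-up 1-D data that is not linearly separable (class 1 near the origin).
X = np.array([[-3.0], [-2.0], [-0.5], [0.0], [0.5], [2.0], [3.0]])
y = np.array([0, 0, 1, 1, 1, 0, 0])

for gamma in (0.1, 1.0, 10.0):
    # RBF kernel: K(x_i, x_j) = exp(-gamma * ||x_i - x_j||^2);
    # larger gamma -> each point's influence reaches less far.
    clf = SVC(kernel="rbf", gamma=gamma, C=1.0).fit(X, y)
    print(f"gamma={gamma:>4}: train accuracy={clf.score(X, y):.2f}")
```
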
15

Safety vs. False Alarms

  • Positive = Faulty, Negative = Healthy.

  • Safety goal: No real fault should be missed → False Negatives (FN) = 0.

  • Reduce false alarms: Avoid alarms for healthy machines → False Positives (FP) = 0.

16

Safety vs. False Alarms: translation into evaluation metrics

  • True Positive (TP): The model says faulty and it’s actually faulty → correct alarm.

  • False Positive (FP): The model says faulty but it’s actually healthy → false alarm.

  • True Negative (TN): The model says healthy and it’s actually healthy → correct no alarm.

  • False Negative (FN): The model says healthy but it’s actually faulty → missed fault.

Precision = TP / (TP + FP)

Of all the alarms given, how many were actually correct?

Recall = TP / (TP + FN)

Of all the real faults, how many did we detect?

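A quick sketch computing precision and recall from made-up confusion-matrix counts, with the fault-detection reading of each cell:

```python
# Made-up counts; Positive = faulty, Negative = healthy.
TP = 40   # said faulty,  actually faulty  -> correct alarm
FP = 10   # said faulty,  actually healthy -> false alarm
TN = 50   # said healthy, actually healthy -> correct no alarm
FN = 2    # said healthy, actually faulty  -> missed fault

precision = TP / (TP + FP)   # of all alarms given, how many were correct?
recall    = TP / (TP + FN)   # of all real faults, how many were detected?

print(f"precision = {precision:.2f}")  # 40/50 = 0.80
print(f"recall    = {recall:.2f}")     # 40/42 ≈ 0.95 (safety favours high recall)
```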