5: Dimensional Analysis by Co-Occurrence


30 Terms

1
New cards

Latent Semantic Analysis

A technique for identifying associations among words in documents, often used in natural language processing.

  • Removes the requirement of a square matrix that exists in PCA

  • Instead adds the requirement for co-occurrence of events

2
New cards

Occurrence Matrix

Describes how frequently each term occurs in each document.

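As a rough illustration, an occurrence matrix can be built by counting terms per document. The two-document corpus below is made up for the example:

```python
from collections import Counter

# Hypothetical two-document corpus; each document is a bag of terms (events).
docs = [
    "the cat sat on the mat".split(),
    "the dog sat on the log".split(),
]

# One row per term, one column per document.
terms = sorted(set(t for d in docs for t in d))
counts = [Counter(d) for d in docs]

# occurrence[i][j] = how often term i appears in document j
occurrence = [[c[t] for c in counts] for t in terms]
```

Each row of `occurrence` is then the term's profile across documents, which is what LSA decomposes.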
3
New cards

Terms

Events.

4
New cards

Documents

Random variables.

5
New cards

One-Mode Factor Analysis

Every value in the matrix represents the same type of variable (i.e. they’re all covariances of the variables).

6
New cards

Requirement for covariance to be a good estimator of relations

All variables are unimodal or roughly centred.

7
New cards

Unimodality (of a variable)

The distribution has a single mode (one highest peak).

8
New cards

Reason to use LSA

It’s another way of looking at relations between variables, specifically co-occurrence of events; central tendency is generally not meaningful for multi-modal distributions.

9
New cards

Covariance Matrix Format

Where:

  • m is the set of random variables

  • n is the number of dimensions (rows)

10
New cards

Covariance Matrix - Columns

A vector detailing how relevant the given document (dⱼ) is to each event.

11
New cards

Covariance Matrix - Rows

A vector corresponding to a term (tᵢ) and how related it is to each document (random variable).

12
New cards

Argand Basis

A coordinate basis used to represent complex numbers.

13
New cards

Complex Conjugate

A number with an imaginary part equal in magnitude but opposite in sign - i.e. z = a + bi and z* = a - bi

14
New cards

Complex Conjugate Properties

  • The product of a complex number and its conjugate is a real number: a² + b²

  • Conjugation is distributive over the four main operations for any two complex numbers.

  • Conjugation does not change the modulus of a complex number - |z*| = |z|

  • (z*)* = z

  • Conjugation commutes with taking powers, i.e. (zⁿ)* = (z*)ⁿ
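These properties can be checked directly with Python's built-in complex type; the sample values z and w below are chosen arbitrarily:

```python
# Arbitrary sample values to exercise the identities.
z, w = 3 + 4j, 1 - 2j

def close(a, b, tol=1e-12):
    # Tolerant comparison for operations subject to rounding.
    return abs(a - b) < tol

# z · z* is real and equals a² + b²
assert z * z.conjugate() == 25 + 0j            # 3² + 4²

# Conjugation distributes over the four main operations
assert (z + w).conjugate() == z.conjugate() + w.conjugate()
assert (z - w).conjugate() == z.conjugate() - w.conjugate()
assert (z * w).conjugate() == z.conjugate() * w.conjugate()
assert close((z / w).conjugate(), z.conjugate() / w.conjugate())

# Modulus preserved; conjugation is an involution; commutes with powers
assert abs(z.conjugate()) == abs(z)
assert z.conjugate().conjugate() == z
assert close((z ** 3).conjugate(), z.conjugate() ** 3)
```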

15
New cards

Complex Conjugate Operation - Addition

(z + w)* = z* + w*

16
New cards

Complex Conjugate Operation - Subtraction

(z - w)* = z* - w*

17
New cards

Complex Conjugate Operation - Multiplication

(zw)* = z*w*

18
New cards

Complex Conjugate Operation - Division

(z / w)* = z* / w*, w ≠ 0

19
New cards

Complex Conjugate Application

De-rotation - multiplying a vector by 1 + i rotates it by 45 degrees (and scales it by √2); multiplying by the conjugate 1 - i reverses the rotation.
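A quick sketch of de-rotation with Python complex numbers; note the round trip scales the vector by |1 + i|² = 2, so only the direction is restored:

```python
import cmath

v = 2 + 0j                      # vector along the positive real axis
rotated = v * (1 + 1j)          # rotate by 45° (and scale by √2)
assert abs(cmath.phase(rotated) - cmath.pi / 4) < 1e-12

derotated = rotated * (1 - 1j)  # the conjugate reverses the rotation...
assert cmath.phase(derotated) == 0.0
assert derotated == 4 + 0j      # ...while the magnitude picks up |1 + i|² = 2
```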

20
New cards

Conjugate Transpose Matrix

Transposing a matrix and applying the complex conjugate to each item within.

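In NumPy this is `.conj().T` (a small illustrative matrix, values chosen arbitrarily):

```python
import numpy as np

A = np.array([[1 + 2j, 3 - 1j],
              [0 + 1j, 2 + 0j]])

# Conjugate transpose (A^H): transpose, then conjugate every entry.
A_H = A.conj().T

assert A_H[0, 1] == -1j                 # conjugate of A[1, 0] = i
assert np.array_equal(A_H, A.T.conj())  # the two steps commute
assert np.array_equal(A_H.conj().T, A)  # (A^H)^H = A
```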
21
New cards

Inverse Matrix

The matrix A⁻¹ satisfying AA⁻¹ = A⁻¹A = I.

22
New cards

Cofactor Matrix

The minor of an element in a matrix (Mᵢⱼ) multiplied by (-1)ⁱ⁺ʲ.

23
New cards

Adjoint Matrix

The transposed cofactor matrix - adj(A) = cof(A)ᵀ.

24
New cards

Calculating the Inverse Matrix

Dividing the adjoint matrix by the determinant: A⁻¹ = adj(A) / det(A), provided det(A) ≠ 0.
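A minimal sketch for the 2×2 case (hypothetical values), combining the cofactor, adjoint, and determinant steps from the preceding cards:

```python
import numpy as np

# Hypothetical 2×2 example, following the cofactor → adjoint → inverse chain.
A = np.array([[4.0, 7.0],
              [2.0, 6.0]])

# Cofactor matrix: each minor M_ij times (-1)^(i+j).
cof = np.array([[ A[1, 1], -A[1, 0]],
                [-A[0, 1],  A[0, 0]]])

adj = cof.T                                   # adj(A) = cof(A)^T
det = A[0, 0] * A[1, 1] - A[0, 1] * A[1, 0]   # det(A) = 10, must be non-zero

A_inv = adj / det                             # A^-1 = adj(A) / det(A)
assert np.allclose(A @ A_inv, np.eye(2))      # AA^-1 = I
```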

25
New cards

Lanczos Algorithm

An iterative method to find the m “most useful” eigenvalues and eigenvectors of a Hermitian matrix, where m is often but not necessarily much smaller than n.
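A bare-bones Lanczos sketch in NumPy, assuming a real symmetric A and a random starting vector; a production implementation adds re-orthogonalisation to counter the instability the deck mentions:

```python
import numpy as np

def lanczos_eigvals(A, m, seed=0):
    """Bare-bones Lanczos sketch (no re-orthogonalisation): project a
    Hermitian A onto an m-step Krylov basis, giving a tridiagonal T
    whose eigenvalues approximate the extreme eigenvalues of A."""
    n = A.shape[0]
    v = np.random.default_rng(seed).standard_normal(n)
    v /= np.linalg.norm(v)
    V = np.zeros((n, m))
    alpha, beta = np.zeros(m), np.zeros(m - 1)
    V[:, 0] = v
    w = A @ v
    alpha[0] = v @ w
    w -= alpha[0] * v
    for j in range(1, m):
        beta[j - 1] = np.linalg.norm(w)
        V[:, j] = w / beta[j - 1]
        w = A @ V[:, j] - beta[j - 1] * V[:, j - 1]
        alpha[j] = V[:, j] @ w
        w -= alpha[j] * V[:, j]
    T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
    return np.linalg.eigvalsh(T)

# With m = n the Krylov space is full, so T reproduces A's spectrum.
A = np.diag([1.0, 2.0, 3.0, 10.0])
approx = lanczos_eigvals(A, m=4)
assert np.allclose(approx, [1.0, 2.0, 3.0, 10.0], atol=1e-6)
```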

26
New cards

Lanczos Algorithm - Properties

Efficient but numerically unstable.

27
New cards

One-Sided Jacobi Algorithm

An iterative algorithm that transforms matrix A into a matrix with orthogonal columns: A_new ← A_old J(p, q, θ), where J(p, q, θ) is a rotation matrix.
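A sketch of this in NumPy, with the rotation angle θ chosen so each rotated column pair becomes orthogonal (the standard one-sided Jacobi choice; the sample matrix is made up):

```python
import numpy as np

def one_sided_jacobi(A, sweeps=10):
    """Sketch of one-sided Jacobi: rotate pairs of columns of A until they
    are mutually orthogonal; the column norms are then singular values."""
    A = A.astype(float).copy()
    n = A.shape[1]
    for _ in range(sweeps):
        for p in range(n - 1):
            for q in range(p + 1, n):
                gamma = A[:, p] @ A[:, q]          # current inner product
                if abs(gamma) < 1e-15:
                    continue                       # already orthogonal
                alpha = A[:, p] @ A[:, p]
                beta = A[:, q] @ A[:, q]
                # tangent t of the angle theta that zeroes the inner product
                tau = (beta - alpha) / (2 * gamma)
                t = (1 if tau >= 0 else -1) / (abs(tau) + np.hypot(1, tau))
                c = 1 / np.hypot(1, t)
                s = c * t
                A[:, [p, q]] = A[:, [p, q]] @ np.array([[c, s], [-s, c]])
    return A

A = np.array([[2.0, 1.0],
              [0.0, 1.0]])
B = one_sided_jacobi(A)
assert abs(B[:, 0] @ B[:, 1]) < 1e-10            # columns now orthogonal
assert np.allclose(sorted(np.linalg.norm(B, axis=0)),
                   sorted(np.linalg.svd(A, compute_uv=False)))
```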

28
New cards

Two-Sided Jacobi Algorithm

An iterative algorithm that generalises the Jacobi eigenvalue algorithm, where A_new ← G(p, q, θ)ᵀ A_old J

  • G(p, q, θ) is the Givens rotation matrix

  • J is the Jacobi rotation matrix

29
New cards

Rank

The number of linearly independent columns (equivalently, linearly independent rows) of a matrix.
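A quick NumPy check that rank counts linearly independent columns rather than all columns (example matrix chosen for the illustration):

```python
import numpy as np

# Three columns, but columns 2 and 3 are multiples of column 1.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])

assert A.shape[1] == 3                  # three columns...
assert np.linalg.matrix_rank(A) == 1    # ...but only one independent direction
```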

30
New cards

Latent Semantic Analysis as Singular Value Decomposition

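A sketch of LSA as a truncated SVD of the term-document occurrence matrix (toy values); the rank-k truncation is the best rank-k approximation by the Eckart–Young theorem:

```python
import numpy as np

# Toy term-document occurrence matrix X (rows: terms, columns: documents).
X = np.array([[2.0, 0.0, 1.0],
              [1.0, 1.0, 0.0],
              [0.0, 2.0, 1.0]])

# Full SVD, then keep only the k largest singular values (truncated SVD).
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
X_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# X_k is the best rank-k approximation of X; by Eckart-Young its
# spectral-norm error is exactly the first discarded singular value.
assert np.linalg.matrix_rank(X_k) <= k
assert np.isclose(np.linalg.norm(X - X_k, 2), s[k])
```

The rows of U[:, :k] @ np.diag(s[:k]) give k-dimensional latent coordinates for terms, and the columns of np.diag(s[:k]) @ Vt[:k, :] do the same for documents.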