Coding and Cryptography

32 Terms

1

Define a communication channel

Accepts a string of symbols from a finite alphabet and outputs a string of symbols from another finite alphabet. Modelled by the probabilities P(y1…yn received | x1…xn sent).

2

Define a discrete memoryless channel

The probabilities p_ij = P(bj received | ai sent) are the same for each channel use and independent of all past and future uses.

3

Define channel matrix of a discrete memoryless channel

The r × s stochastic matrix P = (p_ij), where p_ij = P(bj received | ai sent).
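
As a concrete illustration (a binary symmetric channel with flip probability p — my own example, not one of the cards), the channel matrix and a check that it is stochastic:

```python
# Channel matrix of a binary symmetric channel with flip probability p:
# entry P[i][j] = P(b_j received | a_i sent).
p = 0.1
P = [[1 - p, p],   # a_0 sent: correct with prob 1-p, flipped with prob p
     [p, 1 - p]]   # a_1 sent

# A channel matrix is stochastic: every row sums to 1.
assert all(abs(sum(row) - 1.0) < 1e-12 for row in P)
```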

4

What’s the information rate of a code

rho(C) = (1/n) log2(m), where n is the number of letters sent per message and m is the number of messages.
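
For example, the binary repetition code sending each of m = 2 messages as 000 or 111 has n = 3, so rho(C) = (1/3) log2(2) = 1/3.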

5

What is the error rate of a code

e(C) = max over x in M of P(error | x sent).

6

When can a channel transmit reliably at rate R

When there is a sequence of codes Cn of length n for which lim as n → ∞ of rho(Cn) = R and lim as n → ∞ of e(Cn) = 0 (information rate tends to R and error rate tends to 0).

7

What is the capacity of a channel

Supremum of all reliable transmission rates

8

Define a code

A function c: A → B*, where A and B are alphabets and B* is the set of finite strings over B.

9

Define the codewords of c

Elements in Im(c)

10

When is a code c decipherable

If the extension c*: A* → B* (encode each letter and concatenate) is injective.
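
For example, the code c(x) = 0, c(y) = 1, c(z) = 01 is not decipherable: the received string 01 could be c*(xy) or c*(z).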

11

Give some codes which are always decipherable

Block codes, comma codes, prefix-free codes

12

State Kraft’s inequality

Sum from i=1 to m of a^(-li) <= 1, where c: A → B* is a code with codewords of lengths l1,…,lm, m = |A| and a = |B|.
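
A minimal sketch of a checker for Kraft's inequality (the function name is my own):

```python
def satisfies_kraft(lengths, a=2):
    """Check sum of a^(-l_i) <= 1 for codeword lengths l_i over an a-ary alphabet."""
    return sum(a ** -l for l in lengths) <= 1

print(satisfies_kraft([1, 2, 3, 3]))  # True: 1/2 + 1/4 + 1/8 + 1/8 = 1
print(satisfies_kraft([1, 1, 1, 1]))  # False: four binary words of length 1 cannot exist
```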

13

Give an equivalent condition to a prefix-free code existing

Kraft’s inequality holds

14

Prove a prefix-free code exists iff Kraft’s inequality holds.

Rewrite Kraft's inequality. (Forward direction) Consider all strings of length s over B and those with a codeword as a prefix; form the sum and divide by a^s. (Backward direction) Induction on s: the case s = 1 is clear; use the expanded Kraft inequality multiplied by a^s, count the strings of length s already taken up by shorter codewords as prefixes, and conclude there is room to add n_s codewords of length s.
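
The backward direction is effectively constructive; here is a sketch (binary by default, function name my own) that greedily assigns codewords in order of increasing length, skipping any string with an existing codeword as a prefix:

```python
from itertools import product

def prefix_free_code(lengths, a=2):
    """Greedily build a prefix-free code over an a-ary alphabet with the
    given codeword lengths; succeeds whenever Kraft's inequality holds."""
    digits = [str(d) for d in range(a)]
    codewords = []
    for l in sorted(lengths):
        # Try candidate strings of length l in lexicographic order.
        for candidate in ("".join(t) for t in product(digits, repeat=l)):
            if not any(candidate.startswith(c) for c in codewords):
                codewords.append(candidate)
                break
        else:
            raise ValueError("lengths violate Kraft's inequality")
    return codewords

print(prefix_free_code([1, 2, 3, 3]))  # ['0', '10', '110', '111']
```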

15

What does any decipherable code satisfy

Kraft’s inequality

16

Prove any decipherable code satisfies Kraft’s inequality.

Send R codewords in a row: (sum of a^(-li))^R = sum over l of b_l a^(-l), where b_l counts the length-l strings formed from R codewords. Decipherability gives b_l <= |B^l| = a^l, so the whole sum is at most R·l_max; take the Rth root, and as R → ∞ the right-hand side tends to 1.

17

A decipherable code with prescribed word length exists if and only if…

a prefix-free code with the same word lengths exists

18

Define the entropy of a random variable

H(X) = H(p1, …, pn) = -sum from i=1 to n of (pi log(pi)) = E[-log p(X)] (logs base 2), where x1, …, xn are the values the RV X takes and p1, …, pn their probabilities.
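
A short sketch of the entropy formula as code (helper name my own):

```python
from math import log2

def entropy(ps):
    """H(p_1, ..., p_n) = -sum of p_i * log2(p_i), with 0 log 0 read as 0."""
    return -sum(p * log2(p) for p in ps if p > 0)

print(entropy([0.5, 0.5]))   # 1.0 bit: a fair coin
print(entropy([0.9, 0.1]))   # ~0.47 bits: a biased coin
print(entropy([0.25] * 4))   # 2.0 bits: uniform on four outcomes
```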

19

State Gibbs inequality

-sum(pi log(pi)) <= -sum(pi log(qi)), where (p1,…,pn) and (q1,…,qn) are discrete probability distributions.

20

When does equality hold for Gibbs’ inequality

Iff pi = qi for all i.

21

Prove Gibbs’ inequality

Write log x = ln x / ln 2 and let I = {i : pi > 0}. Apply ln x <= x − 1 with x = qi/pi: summing over I gives sum(pi ln(qi/pi)) <= sum(qi − pi) <= 0, which rearranges to the result. Equality iff sum over I of qi = 1 and qi/pi = 1 for all i in I, i.e. pi = qi for all i.

22

Give an inequality regarding H(p1,…,pn), when does equality hold?

H(p1,…,pn) <= log n, with equality iff p1=…=pn
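
For example, a fair six-sided die attains H = log2(6) ≈ 2.58 bits, while any biased die has strictly smaller entropy.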

23

When is a code called optimal

When it has the smallest possible expected word length (sum pi li = E[S]) of all decipherable codes

24

State Shannon’s Noiseless Coding Theorem

Expected word length of a decipherable code satisfies H(X)/log a <= E[S] < H(X)/log a +1

25

Prove Shannon’s Noiseless Coding Theorem

Lower bound: let qi = a^(-li)/D, where D = sum(a^(-lj)), so sum qi = 1; apply Gibbs' inequality and simplify; by McMillan's inequality D <= 1, so log D <= 0, giving the first result (equality when D = 1 and pi = qi = a^(-li)).

Upper bound: construct a Shannon–Fano code with li = ceiling(-log_a pi), so -log_a pi <= li < -log_a pi + 1; reverse the inequality and exponentiate to see Kraft's inequality holds, so a prefix-free code with these lengths exists; calculate E[S] to get the bound.
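
A sketch of the Shannon–Fano construction from the upper-bound proof, with a numeric check of the theorem's bounds (here a = 2, so log a = 1; the probabilities are my own example):

```python
from math import ceil, log2

def shannon_fano_lengths(ps):
    """Word lengths l_i = ceil(-log2 p_i) from the upper-bound proof (a = 2)."""
    return [ceil(-log2(p)) for p in ps]

ps = [0.5, 0.25, 0.15, 0.1]
lengths = shannon_fano_lengths(ps)            # [1, 2, 3, 4]
H = -sum(p * log2(p) for p in ps)             # entropy, ~1.74 bits
ES = sum(p * l for p, l in zip(ps, lengths))  # expected word length, 1.85
assert H <= ES < H + 1                        # the theorem's bounds
```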

26

When does equality hold in Shannon’s Noiseless Coding Theorem?

Iff pi = a^(-li) for all i (so that sum(a^(-li)) = 1).

27

If we have some messages in A with probabilities p1,…,pm and c is an optimal prefix-free code with word lengths l1,…,lm, then give two statements we can deduce and prove them.

If pi > pj then li <= lj; otherwise swapping the ith and jth codewords gives a smaller expected word length.

Among all codewords of maximal length, there must be two which differ only in the last digit; otherwise we could delete the last digit of some maximal-length codeword, and as the code is prefix-free we would get another prefix-free code with smaller expected word length.

28

What can we say Huffman codes are?

Optimal

29

Prove that Huffman codes are optimal.

Induction on m; the base case m = 2 is clear. Suppose the result holds for m − 1, and combine the two least likely of our m probabilities to get m − 1; then E[S_m] = E[S_{m−1}] + p_{m−1} + p_m.

Take any other optimal code c'_m; wlog it is prefix-free and, by the previous lemma, two of its codewords of maximal length differ only in the final symbol (so c'_m(mu_{m−1}) = y0 and c'_m(mu_m) = y1 for some y in {0,1}*). Build from it a prefix-free code c'_{m−1} for X_{m−1}; then E[S'_m] = E[S'_{m−1}] + p_{m−1} + p_m. By induction c_{m−1} is optimal, so E[S_{m−1}] <= E[S'_{m−1}], hence E[S_m] <= E[S'_m], and c_m is optimal.
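
A compact sketch of the Huffman construction using a heap (binary alphabet; the 0/1 assignment at each merge is a free choice, so other optimal codes with the same lengths are possible):

```python
import heapq
from itertools import count

def huffman_code(ps):
    """Binary Huffman code for probabilities ps; returns codewords in the
    same order as ps. Repeatedly merges the two least likely nodes."""
    tie = count()  # unique tie-breaker so equal probabilities compare cleanly
    heap = [(p, next(tie), [i]) for i, p in enumerate(ps)]
    heapq.heapify(heap)
    codes = [""] * len(ps)
    while len(heap) > 1:
        p0, _, left = heapq.heappop(heap)
        p1, _, right = heapq.heappop(heap)
        for i in left:   # prepend 0 on one branch...
            codes[i] = "0" + codes[i]
        for i in right:  # ...and 1 on the other
            codes[i] = "1" + codes[i]
        heapq.heappush(heap, (p0 + p1, next(tie), left + right))
    return codes

print(huffman_code([0.4, 0.3, 0.2, 0.1]))  # ['0', '10', '111', '110']
```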

30

Define the joint entropy of X and Y

H(X,Y)= - sum over x in A, sum over y in B of [P(X=x,Y=y)logP(X=x,Y=y)]

31

Give an inequality with joint entropy, when does equality hold?

H(X,Y)<=H(X) + H(Y), equality iff X and Y are independent
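
A numeric sketch (the joint table is my own example) computing H(X,Y) and the marginal entropies to check the inequality:

```python
from math import log2

def H(ps):
    """Entropy of a distribution given as a list of probabilities."""
    return -sum(p * log2(p) for p in ps if p > 0)

# Joint distribution P(X = x, Y = y) as a table, rows indexed by x.
joint = [[0.3, 0.1],
         [0.2, 0.4]]
flat = [p for row in joint for p in row]
px = [sum(row) for row in joint]        # marginal distribution of X
py = [sum(col) for col in zip(*joint)]  # marginal distribution of Y

print(H(flat))        # H(X,Y) ~ 1.85
print(H(px) + H(py))  # H(X) + H(Y) ~ 1.97: strict, as X and Y are dependent
```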

32

Prove H(X,Y)<=H(X) + H(Y), equality iff X and Y are independent

Define A, B, p_ij, pi and qj. Apply Gibbs' inequality to (p_ij) and (pi qj); expand with log rules into the result. Equality iff p_ij = pi qj for all i, j, i.e. iff X and Y are independent.