IR Exam 2 trivia

  • PageRank developed by Stanford grad students Brin and Page

    • Brin is Russian

    • submitted to the SIGIR conference but got rejected because the experiments were weak and cherry-picked

  • Tim Berners-Lee invented the web

  • Winston Churchill - democracy is the worst form of government, except for all the others that have been tried

  • William of Ockham - 13th/14th-century Franciscan friar known for Occam’s razor, the problem-solving principle that recommends searching for explanations constructed with the smallest possible set of elements

    • an example of bias

    • finding a simple hypothesis helps ensure generalization

  • David Hume - Scottish Enlightenment philosopher - problem of induction: the sun is not guaranteed to rise tomorrow morning. Nothing guarantees you’ll be right on held-out data; you just pick your bias, and do well when it’s right and poorly when it’s wrong

  • Spam is a meat product

    • origin of spam as junk - Monty Python sketch

  • Bayes didn’t prove Bayes’ theorem.

    • independently discovered and proved by Laplace

  • Frank Rosenblatt - created the perceptron

  • Marvin Minsky

    • Turing winner

    • a founding father of symbolic ai

    • dismissed the perceptron because a single perceptron can only learn linearly separable functions (e.g., it can’t learn XOR); underestimated how easy it is to generalize the perceptron to multilayer networks
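The linear-separability issue above can be seen in a few lines. This is a minimal sketch (not from the notes): a single perceptron trained with the classic update rule learns AND, which is linearly separable, but can never get XOR right, since no line separates XOR’s classes.

```python
# Toy perceptron: learns AND (linearly separable), fails on XOR.
def train_perceptron(data, epochs=25, lr=1.0):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), y in data:
            pred = 1 if w[0]*x1 + w[1]*x2 + b > 0 else 0
            err = y - pred  # perceptron update rule
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    # return number of misclassified points after training
    return sum(1 for (x1, x2), y in data
               if (1 if w[0]*x1 + w[1]*x2 + b > 0 else 0) != y)

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
print(train_perceptron(AND))  # 0 errors: converges
print(train_perceptron(XOR))  # >0 errors: XOR is not linearly separable
```

A multilayer network with one hidden layer solves XOR easily, which is the generalization Minsky underestimated.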

  • “greed is good”

    • from Wall Street movie

  • Leibniz, Newton

    • inventors of calculus; Newton did it first but didn’t publish

    • dy/dx is Leibniz’s notation

    • ẏ (the dot over the variable) is Newton’s notation

  • Geoffrey Hinton

    • just won the 2024 Nobel Prize in Physics

    • godfather of AI

    • ’86 paper on backprop, along with Rumelhart and Williams

  • George Box - “All models are wrong, but some are useful”

    • referring to probabilistic modeling

    • e.g., Naive Bayes as a generative model is wrong because the features are most definitely not all conditionally independent given the class. But it’s useful because it yields good results
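The “wrong but useful” point above can be made concrete. A hedged sketch (the word tables are made-up toy numbers, not from the notes): Naive Bayes scores a class by multiplying per-feature probabilities, exactly as if the features were independent given the class.

```python
# Naive Bayes sketch: joint likelihood = product of per-word
# probabilities given the class (the "wrong" independence assumption).
from math import prod

# P(word | class) tables -- hypothetical toy numbers for illustration
p_word = {
    "spam": {"free": 0.8, "meeting": 0.1},
    "ham":  {"free": 0.2, "meeting": 0.7},
}
p_class = {"spam": 0.5, "ham": 0.5}

def nb_score(words, cls):
    # independence assumption: multiply the conditional marginals
    return p_class[cls] * prod(p_word[cls][w] for w in words)

doc = ["free", "meeting"]
scores = {c: nb_score(doc, c) for c in p_class}
total = sum(scores.values())
post = {c: s / total for c, s in scores.items()}  # posteriors
```

The factorization is certainly wrong about real text, yet the argmax over classes is often right, which is all a classifier needs.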

  • Claude Shannon

    • first to come up with the idea of a language model; probabilistically predicting the next word

    • father of information theory
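Shannon’s next-word idea reduces to counting. A minimal sketch (the tiny corpus is invented for illustration): a maximum-likelihood bigram model estimates P(next word | previous word) from co-occurrence counts.

```python
# Bigram language model sketch: P(word | prev) = count(prev, word) / count(prev)
from collections import Counter

corpus = "the cat sat on the mat the cat ran".split()
bigrams = Counter(zip(corpus, corpus[1:]))
unigrams = Counter(corpus[:-1])  # contexts (last token never a context)

def p_next(prev, word):
    # maximum-likelihood estimate; assumes prev was seen as a context
    return bigrams[(prev, word)] / unigrams[prev]

p_next("the", "cat")  # 2/3: "the" is followed by "cat" in 2 of its 3 uses
```

Modern neural language models replace the count table with a learned network, but the objective, predicting the next word probabilistically, is Shannon’s.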

  • deep learning won 2 Nobel Prizes (2024)

    • Hinton in physics

    • Demis Hassabis (cofounder of DeepMind, AlphaGo) in chemistry - protein folding, AlphaFold

  • DeepMind was bought out by Google

  • Hinton, Yoshua Bengio, and Yann LeCun won the 2018 Turing Award

    • Hinton - backprop

    • LeCun - CNNs

  • US government funded a lot of machine translation (MT) research between English and Russian in the 50s

  • Firth: You shall know a word by the company it keeps

  • Jeffrey Elman - developed the Simple Recurrent Network; PhD at UT

  • Ilya Sutskever - Alexnet, cofounded OpenAI, tried to pull a coup on Sam Altman, now at Safe Superintelligence

  • Cambridge Analytica mined data from Facebook to influence Brexit and the 2016 US election

  • 15 minutes of fame - Andy Warhol

    • 15 Minutes of Shame - HBO documentary on cancel culture, produced by Monica Lewinsky

  • genocide against the Rohingya in Myanmar by Buddhists, enabled by Facebook

  • Betteridge’s Law

    • Betteridge's law of headlines is an adage that states: "Any headline that ends in a question mark can be answered by the word no." It is based on the assumption that if the publishers were confident that the answer was yes, they would have presented it as an assertion; by presenting it as a question, they are not accountable for whether it is correct or not. 

  • Nylah Anderson - teenager who died doing the blackout challenge on TikTok

  • Norton

    • Symantec bought up Norton when viruses started being a thing

    • Symantec hand-wrote a grammar mapping natural language (NL) to a meaning representation (MR) for answering database questions

    • founder of Symantec got PhD from UT

  • WSD - word sense disambiguation

  • grawlix - taking profane words and turning them into @$@^#!

    • you can’t cram the meaning of a whole #&@$ sentence into a single #!^$@ vector - Mooney

  • Turkish is an agglutinative language with complex morphology

  • Fred Jelinek - “Every time I fire a linguist, the performance of the speech recognizer goes up”