Sometimes innovation is a matter of timing.
A big idea comes along at just the moment when the technology exists to implement it.
Other times, timing is out of kilter.
Charles Babbage published his paper about a sophisticated computer in 1837, but it took a hundred years to achieve the scores of technological advances needed to build one.
Progress comes not only in great leaps but also from hundreds of small steps.
Advances came from a combination of capabilities, ideas, and needs that coincided in multiple places.
Example: Herman Hollerith's tabulator, which used punch cards to process the 1890 US census.
Hollerith's company eventually became part of IBM.
Digital: calculations using digits, i.e. discrete and distinct integers.
Analog: works by analogy, representing quantities with continuously varying physical values.
A slide rule is analog, an abacus is digital.
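A rough illustration of the difference, as a Python sketch of my own (not from the source): a slide rule multiplies by adding continuous lengths proportional to logarithms, while a digital device works with exact, discrete digits.

```python
import math

# Analog principle behind a slide rule: each number is represented by a
# continuous length proportional to its logarithm; sliding the scales adds
# the lengths, and the product is read off the scale only approximately.
def slide_rule_multiply(a, b):
    return math.exp(math.log(a) + math.log(b))

print(slide_rule_multiply(3, 4))   # roughly 12, precise only to the "scale" (floating point here)

# Digital principle behind an abacus: numbers are discrete digits (bead
# positions), so 3 * 4 is exactly 12, never "about 12".
print(3 * 4)                       # 12
```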
Example: Lord Kelvin and his brother James Thomson invented a machine that solved differential equations by integration.
Vannevar Bush at MIT built on it so it could solve equations with multiple variables.
Properties that define modern computing:
Digital: based on digital computation rather than analog.
Analog computing was revived in the 2010s to try to mimic the human brain.
Binary: using 1s and 0s, with circuits composed of on-off switches.
Electronic: in the 1930s, Tommy Flowers pioneered the use of vacuum tubes as on-off switches.
Later transistors and microchips.
Their operations are far faster than those of electromechanical switches.
General purpose: machines can be programmed and reprogrammed.
They could handle tasks beyond just maths.
Alan Turing was born into a British family with little wealth.
In his early years he was raised by another family while his parents were in India.
At 13, went to boarding school.
He discovered he was gay; his first love, Christopher Morcom, died young of tuberculosis.
Won a scholarship to King's College, Cambridge, to read mathematics.
Turing was particularly interested in the math at the core of quantum physics, which describes how events at the subatomic level are governed by statistical probabilities rather than laws that determine things with certainty.
He hoped that this uncertainty and indeterminacy at the subatomic level permitted humans to exercise free will.
He would try to determine if the human mind was different from a deterministic machine.
David Hilbert, who arrived at the mathematical formulation of general relativity at nearly the same time as Einstein, posed three fundamental questions about any formal system of mathematics:
(1) Was its set of rules complete, so that any statement could be proved (or disproved) using only the rules of the system?
(2) Was it consistent, so that no statement could be proved true and also proved false?
(3) Was there some procedure that could determine whether a particular statement was provable, rather than allowing the possibility that some statements were destined to remain in undecidable limbo?
Hilbert believed the answer to all three would be yes.
Then Kurt Gödel showed that there are statements that can neither be proved nor disproved.
Example: "This statement is unprovable".
Turing expressed the last question as:
Is there a “mechanical process” that can be used to determine whether a particular logical statement is provable?
This question is known as the Entscheidungsproblem, or "decision problem".
He conceived of an imaginary device he called a Logical Computing Machine (now known as a Turing machine).
It scanned an endless paper tape of symbols and read, wrote, and moved according to a table of instructions.
Any number defined by a mathematical rule could be computed by such a machine.
But noncomputable numbers also exist, and no general procedure can determine whether a given machine will ever finish its work (the halting problem).
Therefore, the answer to Hilbert's third question was no.
No mechanical procedure can determine the provability of every mathematical statement.
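A minimal sketch of the idea in Python (the instruction table is my own toy example, not Turing's notation): a tape of symbols, a read/write head, and a table that says what to write, where to move, and which state to enter next.

```python
# A toy Turing machine: an (effectively) endless tape of symbols, a read/write
# head, and a table of instructions mapping (state, symbol read) to
# (symbol to write, direction to move, next state).

def run(tape, table, state="start", halt="halt"):
    cells = dict(enumerate(tape))          # sparse tape; unwritten cells read as blank " "
    head = 0
    while state != halt:
        symbol = cells.get(head, " ")
        write, move, state = table[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip()

# Instruction table for one tiny task: scan right, flipping 0 <-> 1, and halt
# at the first blank cell.
flip_bits = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", " "): (" ", "R", "halt"),
}

print(run("1011", flip_bits))   # prints "0100"
```

The halting problem asks for a general procedure that predicts, for any such table and tape, whether this loop will ever terminate; Turing proved no such procedure exists.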
Turing tended to be a loner.
In 1936, he went to Princeton, where Einstein and John von Neumann were also working.
In 1937, MIT graduate student Claude Shannon turned in a master's thesis later called the Magna Carta of the Information Age: A Symbolic Analysis of Relay and Switching Circuits.
He worked under Vannevar Bush.
Later he went to work at Bell Labs, the research facility run by AT&T.
He realized that the phone system's relay switches could carry out Boolean operations.
Switches and relays could be organized to perform logical operations (AND, OR).
This is what he wrote his thesis on.
Since logic was related to the way human minds reason, a machine that performed logical tasks could, in theory, mimic the way humans think.
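A sketch of the insight in Python, with booleans standing in for relays; the half adder at the end is my own illustrative example rather than anything from the thesis.

```python
# Shannon's insight, sketched with Python booleans standing in for relays:
# switches in series behave like AND, switches in parallel behave like OR.

def AND(a, b):   # two switches in series: current flows only if both are closed
    return a and b

def OR(a, b):    # two switches in parallel: current flows if either is closed
    return a or b

def NOT(a):      # a normally closed relay: energizing it breaks the circuit
    return not a

# Combining such "circuits" already does arithmetic: a half adder for two bits.
def half_adder(a, b):
    carry = AND(a, b)
    total = AND(OR(a, b), NOT(carry))   # XOR built from AND, OR, NOT
    return carry, total

print(half_adder(True, True))   # (True, False): 1 + 1 = binary 10
```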
George Stibitz of Bell Labs built on these ideas to make the Complex Number Calculator.
Howard Aiken was a doctoral student at Harvard in 1937.
He needed a machine to help with complicated physics calculations.
He found a fragment of Babbage's Difference Engine at Harvard and wanted to carry Babbage's idea forward into a working computer.
He got IBM to build the Mark I based on his instructions.
He served in the US Navy for a time.
He convinced the Navy to take over the Mark I so he could finish building it.
His computer took six seconds to perform a multiplication, compared to one second for Stibitz's machine.
However, it was fully automatic.
All of these pioneers were beaten by a German engineer, Konrad Zuse.
He built a binary calculator that could read instructions from a punched tape, the Z1, completed in 1938.
He wanted to mechanize the tedious process of solving mathematical equations.
He later switched from paper tape to discarded movie film.
The machine was clunky and jammed because all the components were handmade.
Unlike Aiken, he did not have IBM engineers to build it better for him.
However, it proved that his logic would work.
He didn't use electronic vacuum tubes because they were expensive, but if he had, he would have built the first modern computer.
For the Z2 and Z3, he used mechanical relays.
In 1942, Zuse and his friend Helmut Schreyer proposed building the machine with vacuum tubes for the German Army.
The Army replied that it expected to win the war before the two years it would take to build such a machine.
Zuse was sent back to work on engineering airplanes.
In 1943, his computers and designs were destroyed in the Allied bombing of Berlin.
Starting in 1937, John Vincent Atanasoff was also working on a calculator that used vacuum tubes.
It didn't work reliably, but it was the first partly electronic digital computer.
His father was an engineer working for Thomas Edison.
His mom was a math teacher.
He studied logarithms in his youth.
He finished high school in two years.
Then he studied electrical engineering and went on to a doctorate in physics.
Because he had gone to Iowa State instead of Harvard, he also didn't have a team of engineers to help him out.
He tried analog calculators:
A bigger slide rule.
A contraption that moulded paraffin to calculate partial differential equations.
He used Babbage's concept of a store, or "memory".
He liked vacuum tubes but they were expensive, so he used condensers (capacitors).
While slower, they were cheaper.
And he used binary.
The idea came to him during a long drive; he loved cars.
He faced the problem of recharging the capacitors.
He put them on rotating cylinder drums so they would regularly come into contact with brushlike wires that recharged them.
He enlisted an engineering graduate student, Clifford Berry, as his assistant.
It could solve linear equations with up to 29 variables.
It would process two equations at a time, eliminate one variable, and punch the resulting equations onto cards, which were then fed back into the machine.
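A minimal sketch of that elimination step in Python; the actual machine did this with capacitor drums and punch cards rather than code, and the example equations are made up.

```python
# Gaussian elimination, one pair of equations at a time: cancel one variable
# and keep the equation that results. Each equation is a coefficient list
# [a1, ..., an, b] standing for a1*x1 + ... + an*xn = b.

def eliminate(eq1, eq2, var=0):
    factor = eq2[var] / eq1[var]                       # how much of eq1 to subtract
    return [c2 - factor * c1 for c1, c2 in zip(eq1, eq2)]

# 2x + 3y = 8 and 4x - y = 2  ->  cancel x, leaving -7y = -14, so y = 2
print(eliminate([2, 3, 8], [4, -1, 2]))   # [0.0, -7.0, -14.0]
```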
His patent application was sent to a Chicago lawyer, who neglected it and never filed it.
When he left for wartime work with the Navy, his machine was relegated to the basement at Iowa State and forgotten.
A student who didn't know what it was dismantled it because he needed the space.
While its vacuum tubes made for lightning-fast calculations, its mechanical parts slowed it down considerably.
Additionally, it was not programmable; it was built only to solve linear equations.
As a boy, John Mauchly helped his dad make calculations on a desktop adding machine, and he grew up loving data-driven meteorology and electrical circuits.
He studied at Johns Hopkins University and went on to a PhD in physics.
Later became a physics professor.
He loved to talk and was a showman as a professor.
He was inspired by the 1939 New York World's Fair and other such demonstrations.
He saw IBM's punch card calculator and Stibitz's Complex Number Calculator.
He worked with the MIT Differential Analyzer.
So he decided to build a vacuum-tube computer.
He discovered that neon tubes were slower but cheaper than vacuum tubes and could also serve as switches.
In December 1940, he met Atanasoff and asked to see his machine.
He was impressed by the condensers.
But its mechanical parts made it too slow, and it couldn't be programmed.
So he gathered ideas from lots of different places and then combined them into his own.
There was later a legal battle over whether he had stolen Atanasoff's ideas.
J. Presper Eckert, who as a student had built a machine that measured the passion of a kiss, became Mauchly's engineering partner.
“A physicist is one who’s concerned with the truth,” he later said. “An engineer is one who’s concerned with getting the job done.”
War mobilizes science.
WW2 provided the impetus to fund Mauchly's device.
To create a firing table for one category of shell fired from one gun:
One method used Differential Analyzers modeled on the one Bush had built at MIT.
More than 170 women known as "computers" used desktop adding machines to solve equations.
It took more than a month to produce a single table, far too slow to be effective.
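A rough Python sketch of the kind of step-by-step trajectory calculation behind a single firing-table entry; the drag model and constants are illustrative assumptions, not the Army's actual ballistics.

```python
import math

# Numerically integrate a shell's flight in small time steps: update velocity
# for gravity and a simple air-drag term, then update position, until the
# shell comes back down. The range for one firing angle is one table entry.
def range_of_shot(v0, angle_deg, drag=0.0001, dt=0.01, g=9.81):
    angle = math.radians(angle_deg)
    x, y = 0.0, 0.0
    vx, vy = v0 * math.cos(angle), v0 * math.sin(angle)
    while y >= 0.0:
        speed = math.hypot(vx, vy)
        vx -= drag * speed * vx * dt            # drag opposes horizontal motion
        vy -= (g + drag * speed * vy) * dt      # gravity plus drag on vertical motion
        x += vx * dt
        y += vy * dt
    return x

print(round(range_of_shot(v0=800, angle_deg=45)))   # approximate range in metres for one angle
```

A full table required a great many such trajectories, one per combination of angle, charge, and conditions, which is why each one took weeks.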
Mauchly proposed a device to speed this up; it was endorsed by Lt. Herman Goldstine, and the Army funded it.
Mauchly was the visionary, Goldstine oversaw the budget, and Eckert was chief engineer.
Mauchly made people feel loved; Eckert pushed them for precision.
The project began in June 1943.
ENIAC was digital, but it used a decimal system rather than binary.
It could use conditional branching (a capability Ada Lovelace had envisioned) to hop around a program and repeat blocks of code, as sketched below.
It was fully operational in November 1945.
The size of a three-bedroom apartment.
Could do 5,000 additions and subtractions in one second.
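The conditional branching mentioned above, in modern terms (an illustrative Python sketch, not ENIAC's actual wiring): the machine can test a value while running and decide whether to repeat a block of operations or move on.

```python
# A conditional branch lets a program loop: keep repeating a block of work
# until a test fails, instead of executing one fixed sequence of steps.

def sum_until(limit):
    total, n = 0, 0
    while total < limit:   # the branch: repeat the block below, or fall through
        n += 1
        total += n         # the repeated block of work
    return n

print(sum_until(100))      # 14, since 1 + 2 + ... + 14 = 105 is the first sum >= 100
```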
In 1937, Turing constructed the first stages of a coding machine that turned letters into binary numbers and encrypted them.
The German Enigma machine encrypted military messages with a cipher that changed the formula for substituting letters after every keystroke.
Turing designed "the bombe" to decode it by exploiting subtle weaknesses in the traffic, such as predictably repeated German words.
The Germans later adopted an even more complex cipher for their highest-level messages, one the electromechanical bombes were too slow to break.
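A toy Python sketch of the "substitution changes after every keystroke" idea; one stepping scramble of the alphabet, far simpler than the real Enigma's rotors and plugboard.

```python
import string

ALPHABET = string.ascii_uppercase
SCRAMBLE = "EKMFLGDQVZNTOWYHXUSPAIBRCJ"   # one fixed shuffling of the 26 letters

def encrypt(message, offset=0):
    out = []
    for ch in message:
        if ch in ALPHABET:
            i = (ALPHABET.index(ch) + offset) % 26
            out.append(SCRAMBLE[i])
            offset += 1        # the "rotor" steps, so the next letter uses a new mapping
        else:
            out.append(ch)
    return "".join(out)

print(encrypt("AAAA"))   # EKMF: the same plaintext letter encrypts differently each time
```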
In 1943, in London, another vacuum-tube computer, Colossus, was secretly built to decrypt these messages.
By Max Newman and Tommy Flowers, with input from Alan Turing.
It helped during D-Day by confirming that Hitler was not sending extra troops to Normandy.
Almost all 1950s computers trace their roots to ENIAC.