History of Computers, Modern Computers, and AI

32 Terms

1. The Turing Machine

A theoretical machine that can compute anything that is computable.
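
To make the idea concrete, here is a tiny Python sketch of a Turing-machine simulator (illustrative only; the tape, states, and rule table are invented for this example). The rules below increment a binary number:

    # (state, symbol) -> (write, move, next_state)
    def run_turing_machine(tape, state, head, rules):
        tape = dict(enumerate(tape))            # sparse, unbounded tape
        while state != "halt":
            symbol = tape.get(head, "_")        # "_" is the blank symbol
            write, move, state = rules[(state, symbol)]
            tape[head] = write
            head += -1 if move == "L" else 1
        cells = range(min(tape), max(tape) + 1)
        return "".join(tape[i] for i in cells).strip("_")

    rules = {
        ("inc", "1"): ("0", "L", "inc"),    # carry the 1 leftward
        ("inc", "0"): ("1", "L", "halt"),   # absorb the carry
        ("inc", "_"): ("1", "L", "halt"),   # ran off the left edge
    }

    print(run_turing_machine("1011", "inc", head=3, rules=rules))  # -> 1100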

2. Augmented Reality (AR)

A technology that enhances the real world by overlaying digital content—like images, sounds, or information—onto it. This content is typically displayed on devices such as smartphones, tablets, or AR glasses.

3. Virtual Reality (VR)

A simulated experience that uses technology to create immersive, interactive environments.

4. Haptic Feedback

A technology that uses touch or vibrations to give physical responses to users when they interact with a device.

5. What is AI?

The ability of a computer or machine to perform tasks that normally require human intelligence.

6. Machine Learning

A branch of artificial intelligence (AI) where computers learn from data instead of being explicitly programmed.
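
A minimal sketch of "learning from data" in Python (the data points are invented): instead of hard-coding the rule y = 2x, the program estimates it from examples using least squares:

    xs = [1.0, 2.0, 3.0, 4.0]
    ys = [2.1, 3.9, 6.2, 7.8]            # noisy measurements of roughly y = 2x

    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    w = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    b = mean_y - w * mean_x

    print(f"learned rule: y = {w:.2f}x + {b:.2f}")   # close to y = 2x
    print(f"prediction for x = 5: {w * 5 + b:.2f}")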

7. Generative AI

A type of artificial intelligence that can create new content—like text, images, music, or code—based on patterns it has learned from data.
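
As a toy flavour of "creating new content from learned patterns" (real generative AI uses vastly larger neural models, but the sketch shows the idea):

    import random

    corpus = "the cat sat on the mat. the cat ate. the mat sat."
    follows = {}
    for a, b in zip(corpus, corpus[1:]):
        follows.setdefault(a, []).append(b)     # pattern: char -> possible next chars

    random.seed(0)
    ch, out = "t", ["t"]
    for _ in range(40):
        ch = random.choice(follows[ch])         # sample new text from the patterns
        out.append(ch)
    print("".join(out))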

8. What is narrow AI?

A type of artificial intelligence that is designed to perform a specific task or a limited set of tasks.

9. Artificial General Intelligence

A type of AI that can understand, learn, and apply knowledge across a wide range of tasks, much as a human does, rather than being limited to one specific task.

10. Computer Vision

A field of artificial intelligence (AI) that enables computers to see, understand, and interpret images or videos, just like humans do.

11. What do percentages mean in computer vision?

Percentages usually represent how confident the AI model is about what it sees or predicts. When a computer vision system says, for example, “This is a cat (95%)”, it means the system is 95% confident that the object in the image is a cat.
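
Those percentages typically come from a softmax over the model's raw scores. A small Python sketch (the labels and scores are made up):

    import math

    labels = ["cat", "dog", "rabbit"]
    logits = [4.2, 1.1, 0.3]                   # raw model outputs ("logits")

    exps = [math.exp(s) for s in logits]
    probs = [e / sum(exps) for e in exps]      # softmax: probabilities sum to 1

    for label, p in zip(labels, probs):
        print(f"{label}: {p:.0%}")             # e.g. "cat: 94%"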

12. Bias in AI

Occurs when an artificial intelligence system makes unfair or inaccurate decisions because the data it learned from was skewed or incomplete. Example: a hiring algorithm that favors men because the training data mostly included male applicants.

13. Supervised learning

A type of machine learning where a computer is trained using labeled data, meaning each example in the training set includes the correct answer.
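
A tiny Python sketch of training on labeled data (the data set is invented): a 1-nearest-neighbour classifier, where every example carries its correct answer:

    # (height_cm, weight_kg) -> species; every point is labeled
    labeled_data = [
        ((25.0, 4.0), "cat"),
        ((30.0, 5.5), "cat"),
        ((60.0, 25.0), "dog"),
        ((70.0, 30.0), "dog"),
    ]

    def classify(point):
        def dist(example):
            (x, y), _ = example
            return (x - point[0]) ** 2 + (y - point[1]) ** 2
        return min(labeled_data, key=dist)[1]   # answer of the closest example

    print(classify((28.0, 5.0)))                # -> cat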

14. Semi-supervised learning

A type of machine learning that uses a small amount of labeled data and a large amount of unlabeled data to train a model.

Simple explanation:

It’s like giving the computer a few examples with answers and many without, and it learns from both.
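
One common version of this is "self-training", sketched below in Python (the data is invented): the model labels the unlabeled points itself, then learns from both sets:

    labeled = [((1.0,), "small"), ((9.0,), "large")]   # few labeled examples
    unlabeled = [(2.0,), (2.5,), (8.0,), (8.5,)]       # many unlabeled ones

    def nearest_label(point, data):
        return min(data, key=lambda ex: abs(ex[0][0] - point[0]))[1]

    # Step 1: pseudo-label the unlabeled points using the labeled ones.
    pseudo = [(p, nearest_label(p, labeled)) for p in unlabeled]
    # Step 2: the final model learns from labeled + pseudo-labeled data.
    model_data = labeled + pseudo

    print(nearest_label((3.0,), model_data))           # -> small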

15. Reinforcement learning

A type of machine learning where an AI learns by trial and error, receiving rewards or punishments based on its actions.
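
A minimal Python sketch of trial-and-error learning (the payout probabilities are invented): an agent repeatedly picks one of two slot machines and learns from the rewards which one pays off more:

    import random

    random.seed(1)
    payout_prob = {"A": 0.3, "B": 0.7}          # hidden from the agent
    value = {"A": 0.0, "B": 0.0}                # the agent's reward estimates
    pulls = {"A": 0, "B": 0}

    for step in range(1000):
        if random.random() < 0.1:               # explore 10% of the time
            action = random.choice(["A", "B"])
        else:                                   # otherwise exploit the best guess
            action = max(value, key=value.get)
        reward = 1 if random.random() < payout_prob[action] else 0
        pulls[action] += 1
        value[action] += (reward - value[action]) / pulls[action]  # running mean

    print(value)    # the estimate for "B" ends up higher, so the agent prefers it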

16. Neural Network

A type of artificial intelligence modeled after the human brain. It’s made up of layers of connected “neurons” that work together to recognize patterns and make decisions. Examples: image recognition and voice recognition.
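
A sketch of a single artificial "neuron" in Python (the weights are made up): it sums weighted inputs plus a bias and passes the result through an activation function; layers of such neurons, wired together, form a network:

    import math

    def neuron(inputs, weights, bias):
        total = sum(i * w for i, w in zip(inputs, weights)) + bias
        return 1 / (1 + math.exp(-total))        # sigmoid activation

    inputs = [0.5, 0.8]
    layer = [neuron(inputs, [0.9, -0.4], bias=0.1),    # a tiny two-neuron "layer"
             neuron(inputs, [-0.7, 0.6], bias=0.0)]
    print(layer)                                 # two activations between 0 and 1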

17. Turing complete

A system (programming language, machine, or computational model) is Turing-complete if it can simulate any Turing machine—that is, given enough time and memory, it can compute any function that’s computable.

18. Turing test

A simple way to check if a computer can “think” like a human: if, in a text-only conversation, a human judge can’t reliably tell the machine from a real person, the machine is said to have passed the test.

19. What is deep learning?

A branch of machine learning where computers use many-layered neural networks to automatically learn how to recognize patterns and make decisions directly from raw data.

20. What was the word “computer” originally used for?

People who were relied on to work out complex mathematical calculations by hand.

21. Difference Engine (1830s)

A mechanical calculator, designed by Charles Babbage, that used gears to step numbers and was built to tabulate polynomial functions automatically.
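
The engine mechanised the method of finite differences: once the first differences are set up, each new value of a polynomial needs only additions. A short Python sketch for f(x) = x^2 + x + 1:

    values = [3, 7, 13]            # f(1), f(2), f(3)
    d1 = values[-1] - values[-2]   # first difference: 6
    d2 = 2                         # second difference (constant for a quadratic)

    for x in range(4, 9):
        d1 += d2                   # addition only, no multiplication
        values.append(values[-1] + d1)

    print(values)   # [3, 7, 13, 21, 31, 43, 57, 73] = f(1)..f(8)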

22. Analytical Engine (1830s)

Babbage’s design for the first fully automatic, general-purpose digital computer; it could store data in a memory unit and produce hard-copy printouts.

23. What is the significance of Ada Lovelace?

Considered the world’s first computer programmer: she published the first algorithm intended to be carried out by a machine (Babbage’s Analytical Engine).

24. Advantages and Disadvantages of Vacuum Tubes

Advantages:

  • Suitable for high voltage

  • Simplicity

  • Resistant to EMPs

Disadvantages:

  • High power consumption and high levels of heat wastage

  • Low physical strength

  • High cost

25. The ENIAC

First programmable, electronic, general-purpose digital computer. It was one of the first and most famous computers to employ vacuum tubes.

26. How did transistors revolutionise computing?

  • Far smaller than vacuum tubes.

  • Far more reliable than vacuum tubes.

  • Allowed for the creation of more intricate circuits.

  • Easily mass-produced.

27. Modern computers

An integrated system with hardware, software, and a user interface, used for calculations and logic.

28. Supercomputers

A powerful computing device whose processing speed is measured in floating-point operations per second (FLOPS), used to perform complex calculations and simulations, usually in research, AI, and big-data computing.
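
A crude illustration of the FLOPS unit in Python (real benchmarks such as LINPACK are far more careful): time a batch of floating-point additions and divide by the elapsed seconds:

    import time

    n = 10_000_000
    x = 0.0
    start = time.perf_counter()
    for _ in range(n):
        x += 1.0                         # one floating-point operation
    elapsed = time.perf_counter() - start

    # Pure Python manages tens of megaflops; today's fastest
    # supercomputers reach exaflops (10**18 FLOPS).
    print(f"~{n / elapsed:,.0f} FLOPS")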

29. The Internet of Things (IoT)

The Internet of Things (IoT) means everyday objects—like fridges, lights, or watches—are connected to the internet so they can collect and share data. This lets them work smarter, like turning off lights when no one’s in a room or tracking your steps.

30. Quantum Computing

Quantum computing is a new kind of computing that uses the weird rules of quantum physics to solve certain problems much faster than regular computers.

Instead of bits (which are 0 or 1), quantum computers use qubits, which can be in a superposition of 0 and 1 at the same time; this lets them explore many possibilities at once.
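
A minimal classical simulation of one qubit in Python (a sketch of the maths only): the qubit’s state is a pair of amplitudes, and measurement probabilities are their squared magnitudes:

    import math

    state = [1.0, 0.0]                  # a definite |0>

    # A Hadamard gate puts the qubit into an equal superposition.
    h = 1 / math.sqrt(2)
    state = [h * state[0] + h * state[1],
             h * state[0] - h * state[1]]

    probs = [round(a * a, 3) for a in state]
    print(probs)                        # [0.5, 0.5]: both outcomes equally likely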

31. Assistive Technology (AT)

A device or software designed to support people with disabilities or additional learning needs in accessing information, learning, or communication.

32. Key functions/examples of assistive tech

  • Text to speech (TTS)

  • Speech Recognition (speech to text)

  • Screen readers

  • Magnifiers