Flashcards covering key vocabulary and concepts from the Introduction to Information Technology lecture.
Information Technology (IT)
The application of computing devices to create, store, manipulate, and exchange data.
Technology
A means for career advancement, a skill set needed to survive in society, and a way to make an impact beyond your own life, accelerating change around the world; a necessity, not a luxury.
Information Technology Advantages
Make us more productive, save time, provide entertainment, allow us to create things we never thought we could, easier communications, allow us to be creative and artistic, save lives.
QR (quick response) codes
Two-dimensional machine-readable barcodes, typically scanned with a smartphone camera, used as a marketing strategy to link printed material to online content.
Sharing Economy
Sharing a resource using networks instead of owning it; for instance, cars, homes, etc.
Computer Literate
Understanding capabilities and limitations of computers. Knowing how to use computers wisely, safely and efficiently. Enables you to make informed purchasing decisions. Understand ethical, legal, and societal implications of technology.
Charles Babbage
“Father of Computing” who imagined the Analytical Engine
First Generation Computers (1939 – 1955)
Used Vacuum Tubes. Large mainframes developed during the war era.
ENIAC
Electronic Numerical Integrator and Computer; large and unwieldy, filling an entire room; high electricity consumption; high failure rate.
Second Generation Computers (1956 – 1963)
Used Transistors – 1/10th the size of vacuum tubes. Faster and smaller than first generation computers; Computers produced less heat; Used punch cards for Input/Output
Third Generation Computers (1964 – 1970)
Used Integrated Circuits – small chip with 1000s of transistors. Increased reliability; Smaller size; Higher speed; Higher efficiency - less electric power; Lower cost
Fourth Generation Computers (1971 – present)
Used Microprocessors. Fuelled the development of PCs
Moore’s Law
Formulated by Intel co-founder Gordon Moore: the number of transistors that can be packed into a silicon chip at the same price roughly doubles every two years.
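The doubling described by Moore's Law can be sketched with simple arithmetic. The starting count and time span below are hypothetical round numbers chosen for illustration, not historical figures from the lecture.

```python
def transistors(start_count: int, years: int, doubling_period: int = 2) -> int:
    """Project a transistor count forward under Moore's Law:
    the count doubles once per doubling_period (two years)."""
    return start_count * 2 ** (years // doubling_period)

# A hypothetical chip with 1,000 transistors, projected 20 years ahead:
# 20 years = 10 doublings, so 1,000 * 2**10 = 1,024,000 transistors.
print(transistors(1_000, 20))  # 1024000
```

The exponential form makes the key point of the card concrete: after a decade of doublings the count has grown by a factor of about a thousand, not by a fixed amount.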
Embedded Systems
A microprocessor used as a component in special-purpose computers dedicated to performing specific tasks. Embedded devices store their program, known as firmware, on the silicon chip itself; many cannot be altered.
Servers
Central computers that provide services to other PCs over the network. Faster processing power; Serves multiple clients/users simultaneously - timesharing; More memory & storage capacity; Connected via high-speed network connection
Supercomputer
A computer with a high level of computing performance, which may consist of a group of servers on one grid. Thousands of cores (microprocessors); speeds measured in petaflops (quadrillions of floating-point operations per second).
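To make the petaflop figure tangible, the sketch below converts a workload size into run time. The workload and machine rating are hypothetical examples, not values from the lecture.

```python
# One petaflop/s = 10**15 floating-point operations per second.
PETAFLOP = 10 ** 15

def seconds_for(operations: float, petaflops: float) -> float:
    """Ideal run time, in seconds, for a given number of floating-point
    operations on a machine rated at `petaflops` PFLOP/s (ignores
    memory and communication overheads)."""
    return operations / (petaflops * PETAFLOP)

# A hypothetical workload of 10**18 operations on a 100 PFLOP/s machine:
print(seconds_for(1e18, 100))  # 10.0
```

A desktop CPU running at tens of gigaflops would need roughly a year for the same hypothetical workload, which is why tasks like weather forecasting run on supercomputers.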
The Internet
A vast network of connected machines transmitting a wide variety of content enabled by advanced web browsers that support new formats
What is Information Technology (IT)?
IT is the application of computing devices to create, store, manipulate, and exchange data. It refers to a world of combined technologies integrated to deliver services.
Why does Information Technology matter?
It’s essential for productivity, creativity, communication, and even saving lives. It’s a necessity, not a luxury.
What are some advantages of IT?
Productivity, time-saving, entertainment, creativity, better communication, and improved healthcare.
How has technology changed society?
Changed consumption (e.g., the sharing economy), marketing (e.g., QR codes, online payments), and everyday life.
How does technology impact health?
Vaccine development, prosthetics, and 3D printed limbs.
How does technology impact the environment?
Use of atmospheric sensors and smart sprinklers for water conservation.
How has tech changed art?
Digital paintings, 3D modeling, and AI-generated images.
How has tech impacted science and education?
Virtual experiments, simulations, collaborative learning tools, and plagiarism checkers.
What does it mean to be computer literate?
Understanding how to use and assess computers wisely, safely, and ethically.
Who is the “Father of Computing”?
Charles Babbage – developed the concept of the Analytical Engine.
What were first-gen computers like?
Used vacuum tubes, were very large, and consumed a lot of electricity. Notable ones: Z1, ABC, Colossus, Mark I, ENIAC.
Key features of 2nd-gen computers?
Used transistors, smaller than 1st-gen, more reliable, less heat, used punch cards.
What was introduced in 3rd-gen computers?
Integrated Circuits – made computers smaller, faster, more efficient, and cheaper.
What defines 4th-gen computers?
Microprocessors – a complete CPU on a single chip. Growth follows Moore's Law: transistor counts double roughly every 2 years.
What are embedded systems?
Computers in devices (e.g., heart monitors, TVs) using firmware etched into silicon chips for specific tasks.
What is a server?
A powerful computer that provides services to multiple users over a network.
What are supercomputers used for?
High-performance tasks like weather forecasting, scientific modeling, and complex simulations.
How has the internet evolved?
Started in 1960s (research), became public in the 1990s. Over 4.5 billion users by 2020.
What are the major computing eras?
Institutional Computing (1950): Mainframes for experts.
Personal Computing (1975): PCs in homes/schools.
Interpersonal Computing (1995): Internet for public use.
Collaborative Computing (2005): Mobile & collaborative platforms (Web 2.0).
What are future trends in digital tech?
Flying cars, high-speed internet everywhere, and yet-to-be-imagined innovations.