H446 Section 4 Exchanging Data

74 Terms

1
New cards

Compression

The process of reducing the size of data to save storage space or transmission time, often by removing redundancy.

2
New cards

Lossy

Compression that permanently removes some data, resulting in a loss of quality. File formats include JPEG and MP3.

3
New cards

Lossless

Compression that reduces file size without losing any data or quality. File formats include PNG and FLAC.

4
New cards

Run Length Encoding

A lossless data compression method that replaces sequences of the same data value with a single value and a count. It is particularly effective for data with many consecutive repeated values.
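
As a rough sketch of the idea (not any particular exam-board pseudocode), runs of repeated characters can be collapsed into value/count pairs and expanded back again:

```python
# Minimal run length encoding sketch (illustrative only).
def rle_encode(data: str) -> list[tuple[str, int]]:
    """Collapse runs of repeated characters into (value, count) pairs."""
    encoded = []
    for char in data:
        if encoded and encoded[-1][0] == char:
            encoded[-1] = (char, encoded[-1][1] + 1)  # extend the current run
        else:
            encoded.append((char, 1))                 # start a new run
    return encoded

def rle_decode(pairs: list[tuple[str, int]]) -> str:
    """Expand (value, count) pairs back into the original string."""
    return "".join(char * count for char, count in pairs)

print(rle_encode("AAAABBBCCD"))  # [('A', 4), ('B', 3), ('C', 2), ('D', 1)]
```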

5
New cards

Dictionary-based compression

A lossless data compression technique that uses a dictionary of previously seen data sequences to replace repeated occurrences with shorter references, improving efficiency.
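
Python's built-in zlib module uses DEFLATE, which combines an LZ77 dictionary-style stage (replacing repeated sequences with back-references to earlier data) with Huffman coding. A quick sketch of its effect on repetitive data, with made-up input:

```python
import zlib

# Repeated byte sequences are replaced with short references into a
# sliding window of previously seen data, so repetitive input shrinks a lot.
data = b"the cat sat on the mat; the cat sat on the mat; " * 20
compressed = zlib.compress(data)

print(len(data), len(compressed))           # original vs compressed size
assert zlib.decompress(compressed) == data  # lossless: the exact data is recovered
```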

6
New cards

Encryption

The process of converting data into a coded format to prevent unauthorized access, ensuring confidentiality and security during transmission or storage.

7
New cards

Plaintext

The original, unencrypted data that is readable and understandable before any encryption is applied.

8
New cards

Ciphertext

The result of encryption, ciphertext is the encoded format of plaintext that appears as a random sequence of characters, making it unreadable without the appropriate decryption key.

9
New cards

Caesar cipher

A substitution cipher where each letter in the plaintext is shifted a fixed number of places down the alphabet, commonly used in classical encryption. Also known as a shift cipher.
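
A minimal illustration in Python, assuming uppercase letters only, with the shift wrapping around the alphabet:

```python
# Caesar (shift) cipher sketch: shift each letter a fixed number of places.
def caesar_encrypt(plaintext: str, shift: int) -> str:
    result = []
    for char in plaintext.upper():
        if char.isalpha():
            # Wrap around the alphabet using modular arithmetic.
            result.append(chr((ord(char) - ord("A") + shift) % 26 + ord("A")))
        else:
            result.append(char)  # leave spaces and punctuation unchanged
    return "".join(result)

print(caesar_encrypt("HELLO", 3))  # KHOOR
```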

10
New cards

Substitution cipher

A method of encryption where each letter in the plaintext is replaced with a letter from a fixed system, creating a coded message. This technique can involve shifting letters or replacing them with entirely different characters.

11
New cards

Vernam cipher

A type of one-time pad encryption that uses a random key that is as long as the message, ensuring perfect secrecy when used correctly. It is the only cipher proven to be unbreakable.

12
New cards

One-time pad

A type of encryption that uses a random key or pad that is as long as the message itself. Each letter from the plaintext is combined with a letter from the pad to create a ciphertext, ensuring perfect secrecy when used correctly.

13
New cards

Bitwise exclusive OR (XOR)

A binary operation used in cryptography and computer science that takes two bits and returns 1 if the bits are different and 0 if they are the same. It is often used in encryption algorithms, including the Vernam cipher.
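
A short sketch of how XOR is used in a Vernam-style cipher. The key below is hard-coded purely for illustration; a real one-time pad key must be truly random, as long as the message, and never reused:

```python
# XORing each plaintext byte with a key byte produces the ciphertext;
# XORing the ciphertext with the same key restores the plaintext.
def xor_bytes(data: bytes, key: bytes) -> bytes:
    return bytes(d ^ k for d, k in zip(data, key))

plaintext = b"HELLO"
key = b"XMCKL"                             # illustrative key, same length as the message
ciphertext = xor_bytes(plaintext, key)
recovered = xor_bytes(ciphertext, key)     # applying the same key again undoes the XOR
print(ciphertext, recovered)               # b'\x10\x08\x0f\x07\x03' b'HELLO'
```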

14
New cards

Cryptanalysis

The study of techniques for defeating cryptographic systems, focusing on finding weaknesses in encryption methods to recover the plaintext from ciphertext without the key.

15
New cards

Perfect Security

A theoretical level of security in which the ciphertext provides no information about the plaintext, making it impossible to decrypt without the key.

16
New cards

Symmetric (private key) encryption

A method of encryption where the same key is used for both encryption and decryption of data, requiring both parties to keep the key secret.

17
New cards

Asymmetric (public key) encryption

A method of encryption that uses a pair of keys: a public key for encryption and a private key for decryption, allowing secure communication without sharing the secret key.

18
New cards

Hashing

The process of converting input data of any size into a fixed-size value (the hash), which cannot feasibly be reversed to recover the original input and can be used to verify data integrity.

19
New cards

Cryptographic hash function

A mathematical algorithm that transforms input data into a fixed-size hash value, ensuring data integrity and security. It is designed to be one-way, meaning it cannot be reversed to retrieve the original input.
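
For example, using SHA-256 from Python's standard hashlib module (one of many cryptographic hash functions); the message text is made up for the example:

```python
import hashlib

message = b"Exchanging Data"
digest = hashlib.sha256(message).hexdigest()
print(digest)       # fixed-length hex digest
print(len(digest))  # 64 hex characters (256 bits), regardless of the input size

# Changing a single character produces a completely different digest, and the
# original message cannot be recovered from the digest.
print(hashlib.sha256(b"exchanging Data").hexdigest())
```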

20
New cards

Checksum or hash total

A value used to verify the integrity of data by detecting errors in transmission or storage. It is calculated from the data and compared to a previously computed checksum to ensure accuracy. Also known as a digest.
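
A simple checksum can be computed with zlib.crc32 from the standard library (CRC-32 is an error-detecting checksum rather than a cryptographic hash); the receiver recomputes the value and compares it with the one sent. The data here is invented for illustration:

```python
import zlib

data = b"Exchanging Data"
checksum = zlib.crc32(data)

# The receiver recomputes the checksum over the received bytes and compares it
# with the transmitted value to detect accidental errors.
received = b"Exchanging Dota"              # simulated transmission error
print(checksum == zlib.crc32(data))        # True  - data intact
print(checksum == zlib.crc32(received))    # False - error detected
```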

21
New cards

Digital Signature

A cryptographic mechanism that validates the authenticity and integrity of a message or document, ensuring that it has not been altered and confirming the identity of the sender.

22
New cards

Digital Certificate

An electronic document used to prove the ownership of a public key, linking it to an entity. It is issued by a trusted certificate authority and contains information about the key, the identity of its owner, and the digital signature of the authority.

23
New cards

Certificate Authorities

Organisations that issue digital certificates, verifying the identity of entities and ensuring secure communication, e.g. Symantec or VeriSign.

24
New cards

Public Key Infrastructure

A framework that manages digital certificates and public-key encryption, enabling secure data exchange and authentication over networks.

25
New cards

Entity

A category of object, person, event or thing of interest to an organisation about which data is to be recorded.

26
New cards

Attributes

Characteristics or properties that define an entity, providing specific details about it.

27
New cards

Flat file database

A type of database that stores data in a single table or file, where each record is a separate line and fields are typically separated by delimiters.

28
New cards

Relational database

A type of database that stores data in multiple related tables, allowing for complex queries and data relationships through the use of primary and foreign keys.

29
New cards

Identifier

A unique attribute used to distinguish each entity in a database, ensuring that each record can be accurately referenced and retrieved.

30
New cards

Primary Key

A unique identifier for a record in a relational database table, ensuring that each entry can be distinctly accessed and referenced.

31
New cards

Secondary Key

An attribute in a database that is used for data retrieval but does not uniquely identify a record. It allows for sorting and searching within the data.

32
New cards

One-to-One

A type of relationship in a database where a single record in one table is associated with a single record in another table, ensuring a one-to-one correspondence between the two.

33
New cards

One-to-Many

A type of relationship in a database where a single record in one table can be associated with multiple records in another table, allowing for multiple entries related to one entry.

34
New cards

Many-to-Many

A type of relationship in a database where multiple records in one table can be associated with multiple records in another table, enabling complex associations between the two.

35
New cards

Entity Relationship Diagram (ERD)

A visual representation of the relationships between entities in a database, used to model data structures and their connections.

36
New cards

Foreign Key

A field in a table that creates a link between two tables by referencing the primary key of another table, establishing a relationship between them.

37
New cards

Linking table

A table used to connect two or more tables in a many-to-many relationship, containing foreign keys that reference the primary keys of the related tables.
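
As an illustrative schema (table and column names are invented for the example), a many-to-many relationship between students and courses can be resolved with a linking table whose composite primary key is made up of two foreign keys:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")  # SQLite only enforces foreign keys when enabled

con.executescript("""
CREATE TABLE Student (
    StudentID INTEGER PRIMARY KEY,
    Name      TEXT NOT NULL
);
CREATE TABLE Course (
    CourseID INTEGER PRIMARY KEY,
    Title    TEXT NOT NULL
);
-- Linking table: resolves the many-to-many relationship between Student and Course.
CREATE TABLE Enrolment (
    StudentID INTEGER,
    CourseID  INTEGER,
    PRIMARY KEY (StudentID, CourseID),                     -- composite primary key
    FOREIGN KEY (StudentID) REFERENCES Student(StudentID), -- foreign keys reference
    FOREIGN KEY (CourseID)  REFERENCES Course(CourseID)    -- the related tables
);
""")
```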

38
New cards

Composite Primary Key

A primary key that consists of two or more columns in a table, used to uniquely identify a record when a single column is insufficient.

39
New cards

Referential Integrity

A database concept that ensures relationships between tables remain consistent, preventing actions that would leave orphaned records.

40
New cards

Normalisation

The process of organizing data in a database to reduce redundancy and improve data integrity by dividing large tables into smaller, related tables.

41
New cards

First Normal Form

A property of a relational database table that ensures each column contains atomic values, and each record is unique, eliminating duplicate rows.

42
New cards

Second Normal Form

A database normalization stage where all non-key attributes are fully functionally dependent on the primary key, eliminating partial dependencies.

43
New cards

Third Normal Form

A database design principle where every non-key attribute depends only on the primary key, eliminating transitive dependencies.

44
New cards

Data Redundancy

The unnecessary duplication of data within a database, which can lead to inconsistencies and increased storage costs.

45
New cards

Data Integrity

The accuracy and consistency of data within a database, ensuring that it remains reliable and trustworthy throughout its lifecycle.

46
New cards

SQL

Structured Query Language: a declarative language used for managing and querying relational databases, allowing users to perform operations such as data retrieval, insertion, updating, and deletion.

47
New cards

SELECT

A SQL command used to query and retrieve data from a database.

48
New cards

FROM

A SQL clause used to specify the table from which to retrieve data in a query.

49
New cards

WHERE

A clause used in SQL to specify conditions for filtering records in a query, allowing users to retrieve only those rows that meet certain criteria.

50
New cards

ORDER BY

A SQL clause used to sort the result set of a query by one or more columns in ascending or descending order.

51
New cards

JOIN

A SQL operation used to combine rows from two or more tables based on a related column between them, allowing for the retrieval of related data.
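
Putting the preceding query clauses together (SELECT, FROM, WHERE, ORDER BY, JOIN) using SQLite from Python's standard library; the tables and data are invented for the example:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE Customer (CustomerID INTEGER PRIMARY KEY, Name TEXT);
CREATE TABLE Orders   (OrderID INTEGER PRIMARY KEY, CustomerID INTEGER, Total REAL);
INSERT INTO Customer VALUES (1, 'Ada'), (2, 'Grace');
INSERT INTO Orders   VALUES (10, 1, 25.0), (11, 2, 40.0), (12, 1, 5.0);
""")

# JOIN links the two tables on CustomerID; WHERE filters rows; ORDER BY sorts the result.
rows = con.execute("""
    SELECT Customer.Name, Orders.Total
    FROM Orders
    JOIN Customer ON Orders.CustomerID = Customer.CustomerID
    WHERE Orders.Total > 10
    ORDER BY Orders.Total DESC
""").fetchall()

print(rows)  # [('Grace', 40.0), ('Ada', 25.0)]
```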

52
New cards

CREATE

A SQL statement used to create a new table or database object, defining its structure and attributes.

53
New cards

ALTER TABLE

A SQL statement used to modify an existing table structure, such as adding, deleting, or modifying columns and constraints.

54
New cards

ADD

A command used in SQL to add a new column or constraint to an existing table.

55
New cards

DROP COLUMN

A command used in SQL to remove a column from an existing table, along with its data.

56
New cards

MODIFY COLUMN

A command in SQL used to change the data type or attributes of an existing column in a table.
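
A sketch of the DDL commands above using SQLite, with an invented Employee table. Note that changing a column's type is dialect-specific: MySQL uses MODIFY COLUMN, SQL Server uses ALTER COLUMN, and SQLite does not support it directly:

```python
import sqlite3

con = sqlite3.connect(":memory:")

# CREATE defines a new table and its columns.
con.execute("CREATE TABLE Employee (EmployeeID INTEGER PRIMARY KEY, Name TEXT)")

# ALTER TABLE ... ADD adds a new column to the existing table.
con.execute("ALTER TABLE Employee ADD COLUMN Salary REAL")

# ALTER TABLE ... DROP COLUMN removes a column and its data (needs SQLite 3.35+).
con.execute("ALTER TABLE Employee DROP COLUMN Salary")

# Changing a column's type is dialect-specific, for example in MySQL:
#   ALTER TABLE Employee MODIFY COLUMN Name VARCHAR(100);
```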

57
New cards

INSERT INTO

A SQL command used to add new rows of data into a table.

58
New cards

VALUES

A keyword in SQL used to specify the data that will be inserted into a table when using the INSERT INTO command.

59
New cards

UPDATE

A SQL command used to modify existing records in a table by changing one or more columns' values.

60
New cards

SET

A SQL clause used in the UPDATE command to specify the columns that need to be updated and their new values.

61
New cards

DELETE FROM

A SQL command used to remove one or more records from a table based on a specified condition.
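
The DML commands (INSERT INTO ... VALUES, UPDATE ... SET, DELETE FROM) in one short SQLite sketch, again with an invented table and data:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE Product (ProductID INTEGER PRIMARY KEY, Name TEXT, Price REAL)")

# INSERT INTO ... VALUES adds new rows to the table.
con.execute("INSERT INTO Product VALUES (1, 'Keyboard', 20.0), (2, 'Mouse', 10.0)")

# UPDATE ... SET changes existing rows; WHERE limits which rows are affected.
con.execute("UPDATE Product SET Price = 12.5 WHERE Name = 'Mouse'")

# DELETE FROM removes rows that match the condition.
con.execute("DELETE FROM Product WHERE ProductID = 1")

print(con.execute("SELECT * FROM Product").fetchall())  # [(2, 'Mouse', 12.5)]
```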

62
New cards

MICR

A technology used to read the magnetic-ink characters printed along the bottom of cheques, allowing them to be processed electronically. It stands for Magnetic Ink Character Recognition.

63
New cards

ACID

A set of properties (Atomicity, Consistency, Isolation, Durability) that guarantee reliable processing of database transactions.
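
A small sketch of atomicity using SQLite transactions: either both statements inside the transaction take effect, or neither does. The Account table, balances, and the simulated failure are all invented for the example:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE Account (Name TEXT PRIMARY KEY, Balance REAL)")
con.execute("INSERT INTO Account VALUES ('A', 100.0), ('B', 50.0)")
con.commit()

def transfer(amount, fail=False):
    """Move money from A to B; both UPDATEs must succeed together (atomicity)."""
    try:
        con.execute("UPDATE Account SET Balance = Balance - ? WHERE Name = 'A'", (amount,))
        if fail:
            raise RuntimeError("simulated crash between the two updates")
        con.execute("UPDATE Account SET Balance = Balance + ? WHERE Name = 'B'", (amount,))
        con.commit()    # durability: once committed, the change persists
    except Exception:
        con.rollback()  # atomicity: the half-finished transfer is undone

transfer(30, fail=True)
print(con.execute("SELECT * FROM Account ORDER BY Name").fetchall())
# [('A', 100.0), ('B', 50.0)] - the failed transfer left the database unchanged
```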

64
New cards

Atomicity

A property of database transactions that ensures either all operations within a transaction are completed successfully or none of them are, maintaining data integrity.

65
New cards

Consistency

The property that ensures a transaction brings the database from one valid state to another, maintaining all predefined rules and constraints.

66
New cards

Isolation

A property that ensures transactions are executed in isolation from one another, preventing concurrent transactions from interfering with each other.

67
New cards

Durability

The property that guarantees that once a transaction has been committed, it will remain so, even in the event of a system failure, ensuring data is never lost.

68
New cards

Record Lock

A mechanism used to prevent multiple transactions from modifying a record simultaneously, ensuring data integrity and consistency during concurrent access.

69
New cards

Deadlock

A situation in which two or more transactions are unable to proceed because each is waiting for the other to release resources, effectively causing a standstill.

70
New cards

Serialisation

The process of ensuring that transactions are executed in a sequential order, maintaining database consistency by preventing concurrent transactions from interfering with each other.

71
New cards

Timestamp Ordering

A concurrency control method that uses timestamps to determine the order of transaction execution, ensuring that transactions are processed in a way that maintains database consistency.

72
New cards

Commitment Ordering

A concurrency control technique that ensures transactions are committed in a specific sequence to maintain database integrity and prevent conflicts.

73
New cards

Redundancy

The inclusion of extra copies of data in a database to ensure reliability and availability, helping to prevent data loss and maintain consistency in case of failures.
