W7-9



60 Terms

1
New cards

Normalization

A table design technique aimed at minimizing data redundancy by ensuring each table focuses on the characteristics of a single entity.

2
New cards

1NF, 2NF, and 3NF

First three normal forms, most commonly used in normalization.

3
New cards

Iterative ER process

Best practice when performing normalization: iteratively refine the ER model, defining all entities and their attributes, until all equivalent tables are in 3NF.

4
New cards

Data Redundancy

A situation in a database where data is unnecessarily repeated, leading to potential inconsistencies and inefficiencies.

5
New cards

Data Anomalies

An undesirable situation in a database where inconsistencies arise during insertion, deletion, or update operations due to data redundancy.

6
New cards

Primary Key

To fulfill 1NF, eliminate repeating groups and identify the ___.

7
New cards

2NF

Remove partial dependency to meet this normal form.

8
New cards

3NF

Remove transitive dependency to meet this normal form.
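
As a worked illustration of reaching 3NF, this sketch (using SQLite; the table and column names are made up) removes a transitive dependency by moving the dependent attribute into its own table.

```python
import sqlite3

# Hypothetical example: in a one-table design, dept_name depends on dept_id,
# which depends on emp_id -- a transitive dependency (emp_id -> dept_id -> dept_name).
con = sqlite3.connect(":memory:")
cur = con.cursor()

# 3NF design: the transitively dependent attribute gets its own table.
cur.execute("CREATE TABLE department (dept_id INTEGER PRIMARY KEY, dept_name TEXT)")
cur.execute("CREATE TABLE employee (emp_id INTEGER PRIMARY KEY, emp_name TEXT, "
            "dept_id INTEGER REFERENCES department(dept_id))")

cur.execute("INSERT INTO department VALUES (10, 'Sales')")
cur.execute("INSERT INTO employee VALUES (1, 'Avery', 10)")

# dept_name is stored once, so updating it cannot create inconsistencies.
row = cur.execute("SELECT e.emp_name, d.dept_name FROM employee e "
                  "JOIN department d ON e.dept_id = d.dept_id").fetchone()
print(row)  # ('Avery', 'Sales')
```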

9
New cards

Composite (Bridge) Entity

A table that is used to capture the relationship between two tables after a split, especially in many-to-many relationships.

10
New cards

Boyce-Codd Normal Form (BCNF)

A special case of 3NF that covers specific problems 3NF does not address; also known as 3.5NF.

11
New cards

candidate key

BCNF can be violated only when the table contains more than one ___.

12
New cards

determinant

If a table is in BCNF, every ___ in all dependencies must be a complete candidate key (CK).

13
New cards

Denormalization

The process of adding redundancy back into a normalized database to improve performance.

14
New cards

CREATE

SQL command used to create new tables, indexes, or database structures.

15
New cards

ALTER

SQL command used to modify existing database structures.

16
New cards

DROP

SQL command used to remove tables, indexes, or other database objects permanently.
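The three DDL commands above can be demonstrated together; a minimal SQLite sketch with a made-up `product` table:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# CREATE: make a new table.
cur.execute("CREATE TABLE product (id INTEGER PRIMARY KEY, name TEXT)")

# ALTER: modify the existing structure (SQLite supports ADD COLUMN).
cur.execute("ALTER TABLE product ADD COLUMN price REAL")

cols = [c[1] for c in cur.execute("PRAGMA table_info(product)")]
print(cols)  # ['id', 'name', 'price']

# DROP: remove the table permanently.
cur.execute("DROP TABLE product")
```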

17
New cards

INSERT

SQL command used to add new rows of data to a table.

18
New cards

DELETE

SQL command used to remove existing rows from a table.

19
New cards

UPDATE

SQL command used to modify existing data within a table.

20
New cards

SELECT

SQL command used to retrieve data from one or more tables.
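
A minimal SQLite sketch tying the four DML commands together (the `student` table and its data are made up):

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE student (id INTEGER PRIMARY KEY, name TEXT, mark INTEGER)")

# INSERT: add new rows of data.
cur.executemany("INSERT INTO student VALUES (?, ?, ?)",
                [(1, 'Ana', 55), (2, 'Ben', 70)])

# UPDATE: modify existing data.
cur.execute("UPDATE student SET mark = 60 WHERE id = 1")

# DELETE: remove existing rows.
cur.execute("DELETE FROM student WHERE id = 2")

# SELECT: retrieve data.
rows = cur.execute("SELECT name, mark FROM student").fetchall()
print(rows)  # [('Ana', 60)]
```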

21
New cards

COMMIT

SQL command used to save all changes made during the current transaction to the database.

22
New cards

ROLLBACK

SQL command used to revert changes made in the current transaction if an error occurs.

23
New cards

SAVEPOINT

SQL command used to create a temporary save point within a transaction, allowing partial rollbacks.
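
The three transaction-control commands can be sketched in SQLite (setting `isolation_level = None` puts the connection in manual-transaction mode; table and savepoint names are made up):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.isolation_level = None  # manage transactions manually
cur = con.cursor()
cur.execute("CREATE TABLE account (id INTEGER PRIMARY KEY, balance INTEGER)")

cur.execute("BEGIN")
cur.execute("INSERT INTO account VALUES (1, 100)")
cur.execute("SAVEPOINT before_bonus")    # SAVEPOINT: marker for partial rollback
cur.execute("UPDATE account SET balance = balance + 50 WHERE id = 1")
cur.execute("ROLLBACK TO before_bonus")  # undo only the bonus update
cur.execute("COMMIT")                    # save the surviving changes

balance = cur.execute("SELECT balance FROM account WHERE id = 1").fetchone()[0]
print(balance)  # 100
```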

24
New cards

GROUP BY

SQL clause used to group rows with the same values in one or more columns into a summary row.

25
New cards

HAVING

SQL clause used to filter the results of a GROUP BY query based on a specified condition.
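
A small SQLite sketch of GROUP BY and HAVING working together (the `sale` table is made up):

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE sale (region TEXT, amount INTEGER)")
cur.executemany("INSERT INTO sale VALUES (?, ?)",
                [('North', 100), ('North', 200), ('South', 50)])

# GROUP BY collapses rows with the same region into one summary row;
# HAVING then filters those groups (WHERE cannot, as it runs before grouping).
rows = cur.execute(
    "SELECT region, SUM(amount) FROM sale "
    "GROUP BY region HAVING SUM(amount) > 100"
).fetchall()
print(rows)  # [('North', 300)]
```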

26
New cards

Data Privacy

The rights of individuals and organisations to determine access to data about themselves.

27
New cards

Data Governance Model

A framework that outlines the roles, responsibilities, processes, and policies for managing and governing data within an organisation.

28
New cards

Ethics

Moral principles that control or influence a person’s behavior.

29
New cards

Data Privacy

Focuses on protecting personal private data and information.

30
New cards

Data Ethics

Relevant to all data use, regardless of privacy protection or the specific actions taken with the data.

31
New cards

Grant

Used to give user access privileges to a database.

32
New cards

Revoke

Used to revoke authorization, i.e., to take back permissions from the user.

33
New cards

Deny

Explicitly prevents a user from receiving a particular permission.
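
Illustrative syntax only (SQL Server-style, since DENY is not part of standard SQL; the object and user names are made up):

```sql
GRANT SELECT, INSERT ON Employees TO analyst_user;  -- give access privileges
REVOKE INSERT ON Employees FROM analyst_user;       -- take a privilege back
DENY DELETE ON Employees TO analyst_user;           -- explicitly block a privilege
```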

34
New cards

Need-to-know basis

Information security principle of granting access only to the data a user requires, to minimize the risk of unauthorized disclosure.

35
New cards

Big Data

Large and complex sets of raw data (difficult or impossible to capture in ER models).

36
New cards

Volume

Quantity of data to be stored.

37
New cards

Velocity

Speed at which data is entering the system.

38
New cards

Variety

Variations in the structure of the data to be stored.

39
New cards

Structured Data

Any data types that can be clearly defined, stored, accessed, and processed in a fixed format.

40
New cards

Unstructured Data

Anything that cannot be described as structured data.

41
New cards

Semi-Structured

Data that sits between structured and unstructured data, typically carrying tags or markers but no fixed schema.

42
New cards

First-Party Data

Data directly collected from companies’ own websites and apps.

43
New cards

Contextual Advertising

Placing ads based on the content of the website or app being viewed, rather than the user's browsing history.

44
New cards

Hadoop

Open-source framework for storing and analyzing massive amounts of distributed, unstructured data.

45
New cards

HDFS

Hadoop Distributed File System; a low-level distributed file processing system for storing files across networks.

46
New cards

MapReduce

Open-source application programming interface (API) and framework used to process large data sets across clusters.

47
New cards

Name Node

In HDFS, contains file system metadata.

48
New cards

Data Node

In HDFS, stores the actual file data.

49
New cards

Job Tracker

Central control program in MapReduce to accept, distribute, monitor, and report on jobs in a Hadoop environment.

50
New cards

Task Tracker

Program in MapReduce responsible for executing the individual map and reduce tasks assigned by the Job Tracker.
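
A toy, single-process Python sketch of the map/shuffle/reduce pattern that the Job Tracker and Task Trackers coordinate at cluster scale (word count; the function names are illustrative, not Hadoop APIs):

```python
from collections import defaultdict
from itertools import chain

def map_phase(line):
    # Map: emit (key, value) pairs -- one ('word', 1) per word.
    return [(word, 1) for word in line.split()]

def shuffle(pairs):
    # Shuffle: group all values by key.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    # Reduce: combine the values for one key.
    return key, sum(values)

lines = ["big data big ideas", "big clusters"]
mapped = chain.from_iterable(map_phase(l) for l in lines)
counts = dict(reduce_phase(k, v) for k, v in shuffle(mapped).items())
print(counts)  # {'big': 3, 'data': 1, 'ideas': 1, 'clusters': 1}
```

In Hadoop the map and reduce tasks run on different Data Nodes and the shuffle moves data across the network; the logic, however, is the same.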

51
New cards

Data Ingestion Applications

Applications such as Flume and Sqoop that gather data from existing systems and ingest it into Hadoop.

52
New cards

Hive

Sits on top of Hadoop to help create MapReduce jobs, using a SQL-like language called HiveQL.

53
New cards

Pig

Hadoop platform to write MapReduce programs using its own high-level scripting/programming language: Pig Latin.

54
New cards

HBase / Impala

Provides faster query access directly to HDFS without using MapReduce.

55
New cards

NoSQL

A non-relational database (non-SQL / not-only SQL), not modeled on the relational model, developed to address Big Data challenges.

56
New cards

Key-Value Database

NoSQL Database; Stores data as a collection of key-value pairs where keys are similar to primary keys in relational databases.

57
New cards

Column-oriented Databases

NoSQL Database; Blocks hold data from a single column across many rows, while retaining some relational logic.

58
New cards

Graph Databases

NoSQL Database; Suitable for relationship-rich data, using a collection of nodes and edges.

59
New cards

Document Databases

NoSQL Database; Stores data in key-value pairs in which the value components are tag-encoded documents (XML, JSON, or BSON).
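
A toy in-memory Python sketch contrasting the key-value and document models (the store and key names are made up):

```python
import json

# Key-value model: opaque values looked up by key, like a dict.
kv_store = {}
kv_store["session:42"] = "user=alice"

# Document model: the value is a tag-encoded document (here JSON),
# so the database can index and query fields inside it.
doc_store = {}
doc_store["user:1"] = json.dumps({"name": "alice", "roles": ["admin"]})

doc = json.loads(doc_store["user:1"])
print(doc["name"])  # alice
```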

60
New cards

Challenges Big Data technologies address

Linear scalability, High throughput, Fault tolerance, Auto recovery, High degree of parallelism, Distributed data processing.