Lecture Eight
Housekeeping:
- Next week- reading week
- Following week- Indigenous history
- The week after- another guest speaker.
- Four more weeks left!
Interpretation
- Stuart Hall, "The Whites of Their Eyes" (1995)
o Theorized how media messages are produced, disseminated and interpreted
o Communication is a negotiation between sender, text, and receiver.
o Dominant discourses exist through preferred readings, with the institutional, ideological and political order imprinted on them.
o We "encode" messages with values.
o We decode by interpreting those messages.
o Who powers the message? e.g. the company behind the social media platform.
o The people who create algorithms drive and embed powerful values.
- How do we know what we know?
1. Discover it ourselves
2. Absorb it implicitly
3. Told it explicitly- mostly we rely on expertise to inform us.
- Expertise is a social judgement
- Expertise is relative
- Experts don't always agree (finite)
- Experts aren't always right- e.g. financial analysts
- Expertise is usually focused, narrow and specific to one area.
Remember to ask who/when/where/why of the source.
- Bias, personality, race, ideology, political perspective.
- Fake news- propaganda
- Governmental propaganda during World War Two.
Politics of Knowledge Production and Circulation- Leslie Chan.
- Stuck to the podium- how technology defines our behaviour
- Recipes- a routine for how to do something (known as an algorithm)
- Academia- not made for the average person- academics trying to outsmart each other.
- Academia works on reputation- what you publish is what defines you.
- Knowledge- who gets to access knowledge, what knowledge is seen as legitimate.
- 25 years ago the internet wasn't common on their island.
- Zines- call into question what academia is and how it works.
- What are the underlying challenges within algorithms?
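The recipe-as-algorithm analogy above can be sketched in code: an algorithm, like a recipe, is a fixed sequence of steps applied to inputs. This is a minimal illustration with a hypothetical tea-making example; the function and its steps are invented, not from the lecture.

```python
# A recipe is an algorithm: a fixed sequence of steps that
# turns inputs (ingredients) into an output (a dish).
def make_tea(water_ml: int, tea_bags: int) -> str:
    steps = [
        f"boil {water_ml} ml of water",
        f"add {tea_bags} tea bag(s)",
        "steep for 3 minutes",
        "remove the tea bag(s)",
    ]
    # Following the steps in order, the way a computer executes an algorithm.
    return "; ".join(steps)

print(make_tea(250, 1))
```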
Timnit Gebru
- AI sounds like it knows something,
- but it is just generating output.
- Bias within the AI or algorithm, i.e. gender, sexuality, and race bias.
- Culture of Google- wasn't thinking about consequences- thinking of money rather than the overall good of humanity.
- Distributed AI Research Institute- founded by her.
Joy Buolamwini
- Dark-skinned people weren't recognized by facial recognition software,
- which was biased and treated them as if they weren't human.
- Using a white mask, the AI was able to read her face.
- AI- another way of producing false accusations of crimes.
- Facial recognition- highlights the bias within the database.
- Women of colour- highlights the bias within Google's databases.
- Google wants people's attention- mean and nasty content means more advertising money for them.
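A minimal sketch of the dataset-bias point above: an aggregate accuracy figure can look respectable while per-group error rates diverge sharply. The counts below are invented for illustration only, not the actual figures from Buolamwini's research.

```python
# Hypothetical recognition results per group: (correct, total).
# These numbers are made up to illustrate how overall accuracy hides bias.
results = {
    "lighter-skinned men":  (980, 1000),
    "darker-skinned women": (650, 1000),
}

overall_correct = sum(c for c, _ in results.values())
overall_total = sum(t for _, t in results.values())
# Overall accuracy looks acceptable...
print(f"overall accuracy: {overall_correct / overall_total:.1%}")

# ...but breaking it down by group exposes the disparity.
for group, (correct, total) in results.items():
    print(f"{group}: {correct / total:.1%} accuracy")
```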
What is Technology? (Ursula Franklin, 1921-2016)
- "Technology involves organization, procedures, symbols, new words, equations and, most of all, a mindset."
· Material
· Process
· Interactions
· Contexts
· Agency.
- Technology as a social practice.
Types of technology
- Holistic technology (when the doer/user is in control of the work process)
- Prescriptive technology (e.g. the assembly line, where the work is divided into specific steps, each carried out by separate individuals)- highly efficient.
- Culture of compliance, technology of control.
- Canvas- every course looks more or less the same.
- Controls the learning environment, but also the teaching environment.
o Culture of compliance.
Franklin: the study of technology is an attempt to understand how technology practices affect the advancement of justice and peace.
It is a study of power, who has control over whom, and how.
Private equity firms- Canvas- can have access to grades and plagiarism scores.
Labour market within AI
Lack of consent with AI
Lack of legislation around our rights regarding AI.
Whittaker 2023
- Exploitation of racialized people
- "Prefigured on the plantation, developed first as technologies to control enslaved people."
- "How can we program them to be as cheap and efficient as possible?"
- Redlining people
- Work not peer reviewed internationally is seen as not good enough for the UofT library's database.
- Knowledge determined by other companies' understanding of knowledge resources.
- Education should not be commodified, nor should our social relationships.
- Changes in technology throughout the years.
- Querying systems- who owns the system, who has the power to control it- highlights how we engage with the technology.