Why is AI a Social Justice Issue? | Buse Çetin
Personal Experience with the Instagram Algorithm
My journey into understanding algorithms began after a heartbreaking experience, which led me to a nail salon in Paris for what I called 'revenge nails.' The salon, on a basement floor, was painted in vibrant pink and orange hues reminiscent of a sunset and filled with the chemical smells typical of gel nail treatments. During my visit, I fell into conversation with Atanasia, the salon owner, about our professional backgrounds and the broader social implications of AI technologies. Atanasia then posed a question that immediately made me uncomfortable, given my general apprehension about discussing AI and technology: "Why doesn't the Instagram algorithm show some of my photos to most of my followers?" I tentatively suggested she try creating Instagram Reels, but neither of us found that explanation satisfying.
Researching the Instagram Algorithm
Motivated by this frustrating conversation, I decided to dig deeper into the Instagram algorithm. My online research led me to a 2019 article penned by an Instagram engineer, which shed light on the various algorithms used in recommendation systems, particularly for the 'For You' page. While I had a basic understanding of what algorithms were, I wanted a more precise and comprehensive explanation so that I could answer Atanasia's question properly.
Personal Experience with Social Media Content
My personal 'For You' page on social media platforms typically features content related to celebrities like Kim Kardashian and Bella Hadid, alongside topics such as fitness and weight loss. This content reflects not only my personal preferences but also how I project my social image online. Social media systems are adept at categorizing users based on the extensive data they collect. Every interaction—liking, clicking, sharing, and producing content—contributes significantly to this personal data, allowing platforms to build a detailed profile of each user.
Data Collection and Privacy Concerns
These systems gather information from many sources, including users' browsing behavior, which can reveal demographic details such as age group and location. The omnipresent cookie consent pop-ups are a constant reminder that user data is being tracked, with these cookies frequently sending information to major tech companies like Meta, Amazon, and Google. The collected data is also bought and sold by data brokers, underscoring how widely personal information circulates. From it, sophisticated social graphs can be built, capable of predicting highly sensitive personal attributes, including sexuality, religion, and political beliefs.
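The leap from "pages you liked" to "your politics or religion" can feel mysterious, but the underlying idea is simple correlation. The sketch below is a deliberately toy illustration, with made-up users, pages, and labels; real platforms use vastly larger graphs and statistical models, not this naive overlap vote.

```python
from collections import Counter

# Hypothetical interaction data: which pages each user has liked,
# plus a known attribute for some of them (all names invented).
liked_pages = {
    "user_a": {"yoga", "vegan_recipes", "climate_news"},
    "user_b": {"yoga", "vegan_recipes"},
    "user_c": {"football", "grill_masters"},
}
known_attribute = {"user_a": "left-leaning", "user_c": "right-leaning"}

def predict_attribute(user, liked_pages, known_attribute):
    """Guess an unlabeled user's attribute from overlap with labeled users.

    A crude nearest-neighbour vote: each labeled user lends their label
    a number of votes equal to how many liked pages they share.
    """
    votes = Counter()
    for other, label in known_attribute.items():
        overlap = len(liked_pages[user] & liked_pages[other])
        votes[label] += overlap
    label, score = votes.most_common(1)[0]
    return label if score > 0 else None

# user_b never stated their politics, yet two shared likes with
# user_a are enough for the system to attach a label.
print(predict_attribute("user_b", liked_pages, known_attribute))  # left-leaning
```

The point is not the accuracy of any single guess but that sensitive attributes a user never disclosed become statistically inferable from mundane interactions.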
Anonymity Misconceptions
Despite claims about data anonymization, its effectiveness is widely overestimated. Collected data is often specific enough to be traced back to individuals, undermining privacy assurances. Privacy has therefore become a prevalent source of anxiety: users face the dilemma of enjoying free online services at the seemingly unavoidable cost of their personal data.
Algorithmic Applications and Consequences
Algorithms extend far beyond marketing, influencing critical social outcomes. AI is increasingly used in job interviews to analyze candidates' emotions and psychological profiles. In the UK education system, AI tools have been used to predict children's academic performance, while criminal justice assessments use AI to forecast the likelihood of re-offending. This widespread integration raises important questions about efficiency versus ethics: companies adopt AI to cut costs, yet these systems shape significant life decisions and can produce biased outcomes.
Understanding Algorithms
Fundamentally, algorithms work by identifying patterns and correlations within data, yet they lack genuine understanding. Data, in this context, is merely a representation of historical trends and does not by itself reveal causality. A notable case study is Amazon's attempt to streamline candidate selection with an AI hiring tool, which, despite being developed by a well-resourced company, demonstrated bias against women: the tool systematically downgraded résumés that mentioned women's colleges or women's organizations, showing how biased historical data leads to discriminatory outcomes. A report from Harvard Business School indicates that a staggering 99% of Fortune 500 companies now use automated hiring software, leading to instances of automated injustice, such as rejections triggered by crude proxies like career gaps due to parental leave. Algorithmic bias cuts across gender, race, ethnicity, and socioeconomic class.
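How does a system "learn" to penalize the word "women's" without anyone programming that rule? A minimal sketch can make the mechanism concrete. Everything below is invented for illustration (the keywords, the hiring history, the scoring rule); it is not Amazon's actual model, just the simplest possible correlation-counter trained on a skewed history.

```python
from collections import Counter

# Hypothetical historical hiring data: résumé keywords and whether
# the candidate was hired. The skew lives in the history itself.
history = [
    (["chess", "club", "captain"], True),
    (["engineering", "lead"], True),
    (["women's", "chess", "club", "captain"], False),
    (["women's", "engineering", "society"], False),
]

def learn_weights(history):
    """Weight each keyword by how often it co-occurs with a past hire.

    Pure correlation counting: a word that appears mostly on rejected
    résumés ends up with a negative weight, regardless of whether it
    says anything about the candidate's ability.
    """
    weights = Counter()
    for words, hired in history:
        for word in words:
            weights[word] += 1 if hired else -1
    return weights

weights = learn_weights(history)

def score(words):
    return sum(weights[w] for w in words)

# Two résumés identical except for one word get different scores.
print(weights["women's"])                               # -2
print(score(["chess", "club", "captain"]))              # 0
print(score(["women's", "chess", "club", "captain"]))   # -2
```

No one told the model that gender matters; it simply encoded the pattern of past decisions, which is exactly why "the data made us do it" is not a defense.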
Accountability and Personal Reflection
The complexity of algorithms makes it challenging to ascertain who is truly responsible for automated injustices, creating difficulties in establishing clear accountability. Despite feelings of powerlessness, I found empowerment through personal reflection. I shared my own experience of moving from Turkey to France and encountering prejudice, which illuminated the significant gap between societal perceptions and personal identity. This underscores the critical importance of awareness, especially amidst rapid technological advancement, in understanding how identities and labels function within society and their far-reaching implications.
Encouraging Critical Engagement
Following this discussion, I encourage individuals to engage critically with their own 'For You' pages and reflect on what that content reveals about them. It is crucial to question one's comfort with algorithm-generated content. The ultimate aim is to cultivate algorithmic literacy, enabling us to recognize algorithmic harms when we encounter them and to pursue systemic solutions to pervasive algorithmic bias together.
Conclusion and Moving Forward
The initial conversation at the nail salon served as a catalyst, leading to a profound understanding and connection for me. I am grateful for Atanasia's encouragement, which helped me break stereotypes about who gets to discuss technology. Inspired by these insights, Nushin Yazdani and I co-founded "Dreaming Beyond AI," an initiative dedicated to exploring AI's social impact. My own Instagram algorithm, however, still struggles to grasp my evolving passions, a reminder of the continuing challenge of aspiring to be "beyond the algorithm." Thank you.