Film Screening - Humans in the Loop
This Sunday reading task was assigned by Dilip Barad Sir.
Humans in the Loop
PRE-VIEWING WORKSHEET:
Topic 1: AI Bias & Indigenous Knowledge Systems
Questions: What do you understand by AI bias?
How might indigenous ecological knowledge challenge technological framings?
AI bias refers to the systematic and unfair errors in computer algorithms that produce prejudiced outcomes. These biases typically emerge because the datasets used to train AI are often reflective of historical inequalities or lack diverse representation. When AI models are built primarily on Western, urban data, they frequently fail to recognize or respect the cultural and social nuances of marginalized groups.
Indigenous ecological knowledge challenges these narrow technological framings by offering a holistic, relational view of the environment. While modern technology often frames nature as a set of data points to be "optimized" or extracted, indigenous systems view the earth as an interconnected living entity. This ancestral wisdom prioritizes long-term sustainability and spiritual connection over raw industrial efficiency. By introducing these perspectives, we challenge the "one-size-fits-all" logic of AI. It proves that technology cannot truly understand or "solve" environmental issues without incorporating the lived experiences and traditional stewardship of the communities who have protected these lands for generations.
Topic 2: Labour & Digital Economies
Questions: What is invisible labour in digital economies?
Why is it significant to highlight such labour in narratives about AI?
In digital economies, invisible labour refers to the massive amount of human work required to make "automated" systems function, which remains hidden from the end user. This includes "ghost work" such as data labeling, content moderation, and image tagging. While AI is marketed as a product of pure machine intelligence, it actually relies on thousands of low-paid workers globally who manually sort data to "teach" the algorithms how to think.
It is significant to highlight this labour because it deconstructs the myth of "magical" automation. When narratives focus only on the software, they erase the human effort and the often exploitative conditions under which this work is performed. By bringing invisible labour to the forefront, we acknowledge that AI is a human-built product rooted in global economic inequality. Highlighting this reality ensures that discussions about the "future of tech" include essential conversations regarding workers' rights, fair wages, and the ethical responsibility of tech giants toward their hidden workforce.
Topic 3: Politics of Representation
Questions: From film publicity and reviews, how does representation—both of technology and of Adivasi culture—operate in the film?
Based on film publicity and reviews, the politics of representation often highlights a tension between modern technology and Adivasi traditions. Technology is frequently portrayed as an external, encroaching force—sometimes as a tool of state surveillance and other times as a "civilizing" influence. If representation is handled poorly, it risks framing Adivasi culture as "primitive" or "stuck in the past," creating a narrative where they are merely passive victims of technological progress or in need of external rescue.
However, more progressive representations frame Adivasi culture as a site of sophisticated traditional knowledge and active resistance. In these narratives, the community isn't just "behind" the times; they are offering a different, often more sustainable, version of the future. The way technology is shown—whether as a weapon used against them or a tool they adapt for their own empowerment—determines the film's political stance. Accurate representation avoids exoticizing Adivasi people and instead focuses on their agency, showing that their culture is a living, evolving force that interacts with the modern world on its own terms.
POINTS TO PONDER WHILE WATCHING
1. Narrative & Storytelling
Question: How does the film situate Nehma’s personal life within larger algorithmic structures? What narrative turns foreground labour, family, and knowledge systems? When Nehma “teaches” AI, what does this suggest about human-machine learning loops beyond technological jargon?
The film situates Nehma’s personal life within algorithmic structures by showing how her domestic survival depends on digital data labor. The narrative foregrounds labor through the friction between her traditional duties—such as forest foraging and family care—and the rigid, repetitive demands of her data-entry role. Her family life is effectively "datafied" as her lived experiences are translated into machine-readable code. When Nehma “teaches” the AI, it suggests that human-machine learning loops are actually forms of deep cultural extraction. Beyond the technical jargon, the process reveals that AI "intelligence" is not autonomous; it is a borrowed reflection of human intuition and heritage. This loop highlights a parasitic relationship where indigenous wisdom is harvested to build a digital future that may ultimately exclude the very people who provided the data.
2. Representation & Cultural Context
Question: How are Adivasi culture, language, tradition, and ecological knowledge represented? Does the film challenge or reinforce dominant media stereotypes about tribal communities and modern technology?
Adivasi culture and ecological knowledge are represented as sophisticated, living systems rather than "primitive" history. The film portrays the forest not just as a setting, but as a complex database of ancestral wisdom that the AI struggles to quantify. By highlighting the precision of Adivasi rituals, the film grants the community intellectual agency. This approach actively challenges dominant media stereotypes that often depict tribal communities as technologically illiterate or "stuck in the past." Instead, the film positions them as the essential, modern workers who possess the very knowledge required to make "smart" technology function. It avoids the "noble savage" trope, presenting instead a nuanced Adivasi identity that navigates both ancestral lands and digital workspaces with equal competence.
3. Cinematic Style & Meaning
Question: Use film vocabulary to note elements of mise-en-scène and cinematography—how are the forest, computer screens, workspace, and rituals framed visually? How do sound design and editing rhythms contribute to the contrast between analog life and digital labour?
The mise-en-scène and cinematography create a visual dichotomy between the organic and the digital. The forest is captured in wide, panoramic shots with natural lighting to emphasize its vastness and interconnectedness. Conversely, the workspace and computer screens are framed with tight, claustrophobic close-ups and a harsh, blue-tinted palette, reflecting how digital labor "boxes in" the human spirit. The sound design reinforces this contrast: the polyphonic, ambient sounds of the forest (birds, wind, water) are frequently interrupted by the mechanical, rhythmic clicking of keyboards and sterile digital pings. Furthermore, the editing rhythms shift from the slow, deliberate pacing of traditional rituals to the fast-paced, fragmented cuts of data processing, forcing the audience to experience the psychological toll of moving between an analog existence and a digital economy.
4. Ethical & Political Questions
Question: What ethical dilemmas are depicted when training AI with culturally specific data? How does the film’s human-in-the-loop metaphor operate beyond the technical term—politically, socially, and culturally?
The film highlights a major ethical dilemma regarding "digital colonialism"—the process where culturally specific knowledge is digitized and then owned by global corporations without the community's consent or benefit. There is a haunting question of whether Nehma is preserving her culture or accelerating its automation and eventual erasure. Politically and socially, the "human-in-the-loop" metaphor operates as a critique of modern capitalism. It suggests that humans are being reduced to "cogs" used only to correct the machine until the machine can replace them. It highlights a stark power imbalance: while the "humans-in-the-loop" (like Nehma) provide the essential cognitive labor, they remain marginalized and undercompensated by the very technological systems they helped to build.
POST-VIEWING REFLECTIVE ESSAY TASKS
TASK 1 — AI, BIAS, & EPISTEMIC REPRESENTATION
Prompt: Critically analyze how Humans in the Loop represents the relationship between technology (AI) and human knowledge. Discuss how the narrative exposes algorithmic bias as culturally situated and how the film highlights epistemic hierarchies.
Analysis:
1. Algorithmic Bias as Culturally Situated
The film moves the conversation of algorithmic bias away from "broken code" and toward "broken perspectives." In the narrative, the AI is not a neutral machine; it is a mirror of the data it consumes. When the AI fails to understand the nuances of Adivasi life, the film shows that this isn't just a technical glitch—it is a cultural erasure.
The bias is "situated" because the software is designed with urban, Western, or high-tech biases that categorize the world into binary boxes. By showing Nehma struggling to "fit" her complex reality into the AI's narrow labels, the film exposes how technology carries the ideology of its creators. The "bias" is actually a lack of empathy and cultural literacy built into the code.
2. Epistemic Hierarchies: Whose Knowledge Counts?
The film highlights a sharp epistemic hierarchy where scientific or data-driven knowledge is placed at the top, while indigenous, oral, and ecological knowledge is relegated to the bottom—or treated merely as "raw material" to be processed.
The "hierarchy" operates as follows:
The Corporation: Owns the "high" knowledge (the code and the profit).
The AI: Is the "new" knowledge (the future).
The Adivasi Community: Possesses the "original" knowledge (nature and ritual), which is only valued once it is digitized by Nehma.
The film suggests that technological systems treat indigenous wisdom as "primitive" until it can be harvested to make a machine "smarter."
3. Support with Film Studies Concepts
Representation: Use examples of how the camera frames Nehma’s face vs. the computer screen. The close-ups on her eyes during data entry represent the "human cost" of digital labor.
Mise-en-scène: Discuss the workspace. If the office is sterile and grey compared to the vibrant forest, the film is visually showing the "flattening" of culture into data.
Power Relations: Analyze the "invisible" nature of Nehma’s work. She is a "Human in the Loop," but the loop is a circle that keeps her trapped in low-wage labor while the tech company gains the intellectual property of her heritage.
TASK 2 — LABOR & THE POLITICS OF CINEMATIC VISIBILITY
Question: How does the film’s visual language represent labeling work and the emotional experience of labour?
The film’s visual language transforms "invisible" digital labor into a tangible, physical struggle. Through the use of tight close-ups on Nehma’s eyes and hands, the film emphasizes the strain of repetitive motion. The screen light often casts a cold, flickering glow on her face, contrasting with the warm, natural light of her village. This visual "trapping" of the protagonist within the frame suggests that her labor is a form of digital confinement. The emotional experience is represented through montage editing, where the rapid-fire sequence of clicking, tagging, and labeling is intercut with her domestic duties. This creates a sense of "time poverty," showing that under digital capitalism, a worker’s mind is never truly off the clock; the emotional toll is a constant state of exhaustion and alienation from her own cultural surroundings.
Question: What does this suggest about the cultural valuation of marginalized work?
The film suggests that under digital capitalism, marginalized work is highly valued for its utility but completely devalued in terms of humanity. Nehma’s indigenous knowledge is the "fuel" for the AI, yet the system treats her as a replaceable part. This reflects a cultural hierarchy where the "high-tech" developers in cities are seen as the innovators, while the Adivasi workers providing the essential data are rendered invisible. By showing Nehma performing complex cognitive tasks for meager wages, the film exposes a "digital caste system." It suggests that society values the output (the "smart" AI) while systematically ignoring the input (the marginalized worker), reinforcing the idea that certain bodies are viewed only as data-generating tools rather than intellectual contributors.
Question: Does the film invite empathy, critique, or transformation in how labour is perceived?
The film operates on all three levels to shift the viewer's perception. It invites empathy by centering the narrative on Nehma’s personal stakes—her family and her heritage—making the audience feel the weight of her keyboard strokes. However, it moves beyond simple pity into a systemic critique of global power relations. It forces the viewer to recognize that every "seamless" app or AI tool they use is built on the backs of workers like Nehma. Finally, it suggests a transformation by giving Nehma "the gaze." By showing her moments of quiet resistance or her deep connection to the forest that the AI cannot "label," the film asserts that her life has a value that the digital economy can never fully capture. It transforms the viewer from a passive consumer of technology into a witness to the human cost of progress.
TASK 3 — FILM FORM, STRUCTURE & DIGITAL CULTURE
Question: How does the interplay of natural imagery versus digital spaces communicate broader thematic concerns?
The film uses a visual dialectic—a conflict between two opposing styles—to represent the tension between heritage and technology. The natural imagery is characterized by "deep focus" and organic camera movements (handheld or slow pans), suggesting a world that is expansive, interconnected, and ancient. In contrast, the digital spaces are depicted through "shallow depth of field" and static, rigid framing. When the camera focuses on a computer screen, the background (the real world) often blurs out, symbolizing how digital culture narrows a person’s perspective and isolates them from their environment. This interplay communicates the concern that as we become more embedded in digital spaces, our connection to the physical and ecological world becomes "out of focus" or secondary to the data we produce.
Question: How do aesthetic choices shape the viewer’s experience of labour, identity, and technology?
Aesthetic choices act as a bridge that allows the viewer to feel the psychological weight of Nehma's world.
Sound Design: The film likely employs asynchronous sound—where the mechanical clicks of a keyboard or the hum of a server might bleed into a scene of Nehma walking through the forest. This creates a sensory "haunting," suggesting that digital labor has colonized her mental space even when she is physically free.
Editing & Sequencing: By using cross-cutting between the slow, rhythmic pace of Adivasi rituals and the fast, erratic pace of data labeling, the film shapes the viewer’s experience of "time." It shows technology as a force that accelerates life to an exhausting degree, stripping away the patient identity required for cultural tradition.
Cinematography: The choice to use low-key lighting in the workspace versus the high-key, vibrant light of the forest creates a moral geography. Technology is framed as a dark, extractive "underworld," while the forest represents the clarity of indigenous identity. These aesthetic choices ensure that the viewer doesn't just watch the story but experiences the friction between being a human and being a "user."
References
Alonso, D. V. (2026). Imagining AI futures in mainstream cinema: Socio-technical narratives and social imaginaries. AI & Society.
Anjum, N. (2026). Aranya Sahay’s Humans in the Loop and the politics of AI data labelling. The Federal.
Apparatus: Film, Media and Digital Cultures of Central and Eastern Europe (ongoing academic journal). (n.d.). Retrieved February 15, 2026.
Barad, D. (2026, January). Humans in the loop: Exploring AI, labour and digital culture [Blog post].
Bazin, A. (1967). What is cinema? (Vol. 1). University of California Press.
Bordwell, D., & Thompson, K. (2019). Film art: An introduction (12th ed.). McGraw-Hill Education.
Cave, S., Dihal, K., Drage, E., & McInerney, K. (2023). Shuri in the sea of dudes: The cultural construction of the AI engineer in popular film, 1920–2020. In Feminist AI: Critical perspectives on algorithms, data, and intelligent machines (pp. 65–82). Oxford University Press.
Deleuze, G. (1986). Cinema 1: The movement-image (H. Tomlinson & B. Habberjam, Trans.). University of Minnesota Press. (Original work published 1983)
Film Theory. (2025). The Year’s Work in Critical and Cultural Theory.
Frías, C. L. (2024). The paradox of artificial intelligence in cinema. Cultura Digital, 2(1), 5–25.
Göker, D. (2025). Human-like artificial intelligence in Indian cinema: Cultural narratives, ethical dimensions, and posthuman perspectives. International Journal of Cultural and Social Studies, 11(2), 1–10.
Haris, M. J., Upreti, A., Kurtaran, M., Ginter, F., & Azimi, S. (2023). Identifying gender bias in blockbuster movies through the lens of machine learning. Humanities and Social Sciences Communications, 10, 94.
Humans in the Loop (film). (n.d.). In Wikipedia. Retrieved February 15, 2026.
Indian Express Editorial. (2026). Humans in the Loop: Technology, AI and digital lives. The Indian Express.
McDonald, K. (2023). Film theory: The basics (2nd ed.). Routledge.
Mehrotra, K. (2022). Human touch [Article referenced as the film’s inspiration]. FiftyTwo.
Number Analytics. (2023). Film theory essentials: Key concepts and frameworks.
Sahay, A. (Director). (2024). Humans in the loop [Film]. India.
Shepherdson, C., Simpson, J., & Utterson, A. (Eds.). (2004). Film theory: Critical concepts in media and cultural studies (Vols. 1–4). Routledge.
Sui, Z., & Wang, S. (2025). Dogme 25: Media primitivism and new auteurism in the age of artificial intelligence. Frontiers in Communication, 10, Article 1659731.
Vighi, F. (2019). Critical theory and film: Rethinking ideology through film noir. Bloomsbury Academic India.
Wilfrid Laurier University. (n.d.). Film theory reading list.
Yu, Y. (2025). The reel deal? An experimental analysis of perception bias and AI film pitches. Journal of Cultural Economics, 49, 281–300.
THANK YOU....