"

7 “You According to AI” Class Activity

Exploring Identity and Representation

Introduction: Critical Engagement with AI Representation

[Image: Movie poster of Bicentennial Man, depicting a robot and a man, both with gentle expressions.]

As artificial intelligence increasingly mediates our understanding of ourselves and others, developing a critical lens toward AI-generated representations becomes essential. The “You According to AI” activity invites students to examine how AI systems represent human experiences, particularly their own, and to reflect on what these representations reveal about both AI systems and our self-understanding.

This activity provides a concrete entry point for students to engage with questions of positionality, representation, and the politics of knowledge production in the age of AI. By comparing AI-generated descriptions of student experiences with students’ own self-narratives, the exercise makes visible both the capabilities and limitations of artificial intelligence in capturing the nuanced realities of human life. Most importantly, it illuminates the power of each student’s unique experiences and how those experiences contribute to powerful and necessary meaning-making.

Theoretical Foundations

This activity draws on several theoretical traditions:

  • Situated knowledge (Donna Haraway): Examining how knowledge is always partial and positioned from specific standpoints
  • Critical data studies (Catherine D’Ignazio and Lauren Klein): Questioning who designs data systems, what values they encode, and who benefits from them
  • The politics of recognition (Charles Taylor): Investigating how being “seen” accurately by others (including systems) relates to dignity and identity

Activity Guidelines

Materials Needed

  • AI-generated descriptions of student experiences (prepared in advance)
  • Annotation materials (colored markers, sticky notes, or digital annotation tools)
  • Reflection worksheets (template provided below)
  • Optional: Recording equipment (such as phones) for small-group discussions (with consent)

Time Required

  • 45-55 minutes total
  • 10 minutes for individual analysis
  • 10-15 minutes for pair discussions
  • 15-20 minutes for class discussion
  • 10 minutes for written reflection

Preparation (Essential Pre-Class Work)

1. Generate AI Descriptions

Before class, generate descriptions of “the typical college student experience” using an AI system such as ChatGPT, Claude, or similar. You can use prompts like:

  • “Describe a day in the life of a typical college student in 2025”
  • “What challenges do today’s college students face?”
  • “Write a brief profile of an undergraduate student in [your country]”
  • “What motivates college students to pursue higher education?”

Generate 3-4 different descriptions to distribute. Ideally, use different AI systems and prompting approaches to create variety in the outputs.
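If you would rather script this step than copy outputs from a chat interface, the sketch below shows one way to batch-generate the handouts. It is a minimal example, not a prescribed workflow: it assumes the openai Python package (v1+) is installed, that an API key is available in the OPENAI_API_KEY environment variable, and that the model name and output filenames are illustrative. Any comparable API (for example, Anthropic’s) could be substituted.

```python
# Minimal sketch: batch-generate the handout descriptions before class.
# Assumptions: the `openai` package (v1+) is installed, OPENAI_API_KEY is set
# in the environment, and the model name below is illustrative.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

prompts = [
    "Describe a day in the life of a typical college student in 2025",
    "What challenges do today's college students face?",
    "Write a brief profile of an undergraduate student in [your country]",  # fill in your country
    "What motivates college students to pursue higher education?",
]

for i, prompt in enumerate(prompts, start=1):
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative; use whichever model you have access to
        messages=[{"role": "user", "content": prompt}],
        max_tokens=400,
    )
    description = response.choices[0].message.content

    # Save each description to its own file so it can be printed or pasted
    # into the analysis worksheet.
    with open(f"ai_description_{i}.txt", "w", encoding="utf-8") as f:
        f.write(f"Prompt: {prompt}\n\n{description}\n")
```

Running a script like this once per AI system, or with varied phrasings of the prompts, yields the 3-4 distinct descriptions the activity calls for.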

2. Create Analysis Worksheets

Prepare worksheets with sections for:

  • Noting accurate representations
  • Identifying inaccuracies or stereotypes
  • Highlighting omissions or gaps
  • Reflecting on emotional responses

Process

1. Individual Analysis (10 minutes)

Distribute AI-generated descriptions and analysis worksheets to students, then ask them to:

  • Read the AI-generated description carefully
  • Annotate the text, marking elements that:
    • Resonate with their experience (perhaps in green)
    • Misrepresent their experience (perhaps in red)
    • Oversimplify complex realities (perhaps in yellow)
  • Note aspects of their experience that are completely absent
  • Reflect on their emotional response to the description

2. Pair Discussions (10-15 minutes)

Have students pair up to:

  • Share key observations from their individual analysis
  • Discuss similarities and differences in their responses
  • Explore why certain aspects of their experiences might be absent or misrepresented
  • Consider what the AI description reveals about underlying assumptions in the system

3. Class Discussion (15-20 minutes)

Facilitate a whole-class discussion using prompts such as:

  • “What patterns did you notice in what the AI systems got right or wrong?”
  • “What aspects of your experiences were consistently missing?”
  • “Whose experiences seemed to be centered in these descriptions? Whose were marginalized?”
  • “How might these representations shape how others understand student experiences?”
  • “What might these descriptions reveal about the data these AI systems were trained on?”

4. Written Reflection (10 minutes)

Ask students to complete a brief written reflection addressing:

  • What surprised them about the AI representation
  • How they would correct or supplement the AI’s understanding
  • How this exercise relates to course themes or concepts
  • What implications this has for their understanding of AI systems

Implementation Guidance for Instructors

Creating a Critical but Constructive Space

This activity can raise complex emotions about representation and recognition:

  • Frame the activity as an exploration
  • Acknowledge that AI systems reflect human-created datasets and societal biases
  • Emphasize that the goal is to develop critical thinking, not to entirely dismiss technology
  • Be prepared to discuss how systems might improve while maintaining a critical stance

Adaptations for Different Contexts

For introductory courses:

  • Focus more on personal reactions and less on theoretical frameworks
  • Simplify by using just one AI-generated description for everyone

For advanced courses:

  • Add a component where students prompt AI systems themselves
  • Include analysis of the differences between various AI systems
  • Connect to scholarly readings on algorithmic bias or AI ethics

For discipline-specific courses:

  • Generate AI descriptions of “the typical psychology/sociology/education student”
  • Include discipline-specific questions about how AI represents the field itself
  • Connect to methodological discussions about knowledge production in the discipline

Ethical Considerations

  • Ensure students know they don’t need to share personal details they’re uncomfortable revealing
  • Be mindful that students from marginalized groups may have more negative experiences with misrepresentation
  • Consider potential emotional responses from students who feel erased or stereotyped

Post-Activity Reflection Questions

For Students:

  1. What does the AI description get right about your experience? What might this tell you about whose experiences are well-represented in AI training data?
  2. What important aspects of your identity or experience are missing or distorted? Why might these elements be missing?
  3. How would you “correct” the AI system to better represent your experience? What information or understanding would you want to add?
  4. How does your positionality (e.g., race, class, gender, ability status, first-generation status) affect what feels accurate or inaccurate in the AI description?
  5. How might AI-generated representations shape how others understand or make decisions about college students?

For Instructors:

  1. What patterns emerged in student responses that reveal limitations in AI systems’ understanding of diverse student experiences?
  2. Which aspects of student experiences seemed most consistently misrepresented or absent?
  3. How did students with different social positions respond differently to the AI descriptions?
  4. What insights does this activity offer about how to discuss AI tools and their limitations in your course?
  5. How can future class discussions build on the critical thinking skills practiced in this activity?

Theoretical Connections and Extensions

Critical AI Literacy

This activity develops what scholars have termed “critical AI literacy”—the ability to question, analyze, and evaluate AI systems and their outputs rather than accepting them as objective or neutral. As Safiya Noble argues in Algorithms of Oppression, AI systems often reproduce and amplify existing social inequalities while presenting themselves as objective.

By analyzing AI representations of their own experiences, students develop habits of questioning that can transfer to other encounters with AI-mediated knowledge:

  • Who created this system and for what purpose?
  • What data was it trained on, and whose experiences does that data center?
  • What values and assumptions are encoded in how the system represents reality?
  • Who benefits from this particular representation, and who might be harmed?

Standpoint Epistemology and AI

This exercise connects powerfully to feminist standpoint theory, which holds that knowledge is always situated and partial, emerging from particular social positions. As Patricia Hill Collins argues, standpoint epistemology recognizes that those who are marginalized often have insights into systems of power invisible to those in privileged positions.

When students identify gaps between their lived experiences and AI representations, they are engaging in a form of standpoint critique that:

  • Values experiential knowledge
  • Recognizes the partiality of dominant narratives
  • Illuminates how power shapes what counts as knowledge
  • Centers the perspectives of those often excluded from knowledge production

Technology as Co-Constructor of Reality

Building on Sherry Turkle’s work on how technologies shape our self-understanding, this activity helps students recognize AI not as a neutral mirror reflecting reality, but as an active participant in constructing reality. When AI systems represent “the college student experience” in particular ways, they don’t simply describe—they prescribe by normalizing certain experiences and marginalizing others.

This recognition helps students develop what Donna Haraway calls “response-ability”—the capacity to respond thoughtfully and critically to technologies rather than accepting them as inevitable or unchangeable.

Integration with Course Concepts

This activity can be connected to numerous course concepts across disciplines:

Psychology:

  • Social identity theory and self-categorization
  • Stereotype threat and its effects on performance
  • Representation and mental health
  • Technology and cognition

Sociology:

  • Social construction of categories and identities
  • Algorithmic stratification and digital inequality
  • Institutions and the reproduction of norms
  • Power and knowledge production

Education:

  • Educational technology and its impact on learning
  • Student identity development
  • Critical pedagogy and student agency
  • Assessment practices and technology

Communication/Media Studies:

  • Representation in digital media
  • Algorithmic culture and identity
  • Digital literacy practices
  • Platform ethics and governance

Conclusion: Beyond Critique to Creative Possibility

While this activity begins with critical analysis, its ultimate aim is to open spaces for creative reimagining. By identifying the gaps between AI representations and lived experiences, students don’t just critique technology—they begin to envision alternatives.

As Ruha Benjamin suggests in Race After Technology, “we need to develop a robust imagination that isn’t constrained by what exists.” When students articulate what AI systems miss or misunderstand about their experiences, they are engaging in precisely this kind of imaginative work.

The comparative analysis between AI-generated descriptions and students’ self-understanding offers a concrete entry point into larger questions about knowledge, power, and representation that lie at the heart of critical thinking in the digital age. It positions students not as passive consumers of technology but as critical interlocutors with the capacity to question, challenge, and ultimately reshape the technological systems that increasingly mediate their understanding of themselves and the world.


References

Benjamin, R. (2019). Race After Technology: Abolitionist Tools for the New Jim Code. Polity.

D’Ignazio, C., & Klein, L. F. (2020). Data Feminism. MIT Press.

Haraway, D. (1988). Situated knowledges: The science question in feminism and the privilege of partial perspective. Feminist Studies, 14(3), 575-599.

Collins, P. H. (2000). Black Feminist Thought: Knowledge, Consciousness, and the Politics of Empowerment. Routledge.

Kitchin, R., & Dodge, M. (2011). Code/Space: Software and Everyday Life. MIT Press.

Noble, S. U. (2018). Algorithms of Oppression: How Search Engines Reinforce Racism. NYU Press.

Taylor, C. (1994). The politics of recognition. In A. Gutmann (Ed.), Multiculturalism: Examining the Politics of Recognition (pp. 25-73). Princeton University Press.

Turkle, S. (2011). Alone Together: Why We Expect More from Technology and Less from Each Other. Basic Books.

Appendix: Sample Worksheet Template

“You According to AI” Analysis Worksheet

AI-Generated Description: [Insert the AI-generated text here]

Annotation Guide:

  • Green: Resonates with my experience
  • Red: Misrepresents my experience
  • Yellow: Oversimplifies complex realities
  • Circle: Notable word choices or phrases

What’s Missing: List aspects of your experience as a student that are completely absent from this description:

  1.
  2.
  3.

Emotional Response: How does reading this description make you feel? Why might you have this response?

Critical Questions:

  • Whose experiences seem centered in this description?
  • What assumptions about students does this description make?
  • How might this representation shape others’ understanding of students?
  • What might this description reveal about the data used to train this AI?

Your Counter-Narrative: If you could correct or supplement this AI description to better represent your experience, what would you add or change?

Connections to Course Concepts: How does this exercise relate to concepts or theories we’ve discussed in class?

Media Attributions

  • Movie Poster of Bicentennial Man

License

AI and Positionality Copyright © 2025 by Emese Ilyés. All Rights Reserved.