Should critical computing scholars engage more directly with design PRACTICES?
2026-04-08
This post is adapted from a position paper I originally wrote for the CHIdeology Workshop at CHI 2026. I have revised it here for my homepage blog, with the hope of opening up a broader conversation.
Introduction
“I understand the critique of the existing assumptions and approaches in HCI, but I wonder what a system that actually acts on these commitments would look like.”
This question stays with me. As a junior HCI researcher with a design background engaging with critical computing, I have encountered similar doubts repeatedly, sometimes posed by colleagues (particularly those outside critical HCI), sometimes arising from my own reading. Bearing this question in mind, I began browsing papers published under the Critical Computing Subcommittee at CHI over the years. Across these publications, I encountered theoretically grounded critiques and empirical studies that compellingly challenge dominant technological assumptions and imagine alternative futures. Yet I found myself returning to the same question: how, and where, does design figure in these critical commitments?
By design, I am not referring to design implications articulated in articles, nor to approaches such as speculative design [20, 26] or design fiction [5, 6], which mobilize artifacts to provoke reflection, generate knowledge, or imagine alternative futures. Rather, I am concerned with design as outcomes that enter everyday technological use: systems, interactions, or arrangements that people actually live with, and that attempt to enact critical commitments under existing technological conditions.
The question pushed me to more closely observe the landscape of critical computing, particularly through my engagement with feminist HCI [2, 3]. When I came across the CHIdeology workshop, I tried to understand the research and practices within our field through an ideological lens [9]. Drawing on the framing of worldviews, desired futures, and action, I started to reflect on what critical computing has achieved, what it has promised, and what is left out.
One of the major achievements of critical computing has been its sustained challenge to dominant technological worldviews. By questioning inherited assumptions, naming dominant epistemologies, and making visible the political dimensions of technologies, critical computing has helped establish a broad recognition that technologies are not neutral, objective, or merely technical artifacts, but are embedded in wider ideological formations and power relations [1, 13]. In response to these critiques, critical computing has also embraced rich practices for articulating alternative desired futures. Approaches such as speculative design and design fiction have played a central role in expanding what can be envisioned beyond the constraints of existing systems [7, 14, 16, 22].
However, when it comes to action, I find myself uncertain. As a second-year PhD student, I offer these reflections as a junior researcher, grounded in my research experiences in feminist HCI, and I welcome critique and open discussion.
Action in Critical Computing?
My central concern is that we have proposed numerous design implications for designers, engineers, policymakers, and institutions, with the hope of realizing more feminist, care-centered, and justice-oriented futures through technological design. Yet it is difficult to claim that such designs have flourished in HCI in any substantial or sustained way. For instance, while feminist AI and feminist datasets are frequently articulated in both academic and public discourse [17, 23], concrete implementations remain limited [8]. Much justice-oriented AI work remains concentrated at the level of principles, audits, or critiques of datasets [12, 17], and far less frequently extends into end-to-end system building or the long-term maintenance of deployed technologies.
This gap cannot be explained solely by a lack of awareness or goodwill. Rather, it reflects a set of persistent structural constraints: technology companies often lack strong incentives to adopt feminist or justice-oriented systems; capitalism rarely rewards care or approaches that resist large-scale replication [11]; and state institutions tend to prioritize maintaining technological control rather than relinquishing it [18]. Moral appeals and political correctness do play an important role in pushing the field forward. However, when action relies primarily on such appeals, it risks remaining fragile and superficial [10]. These limitations are often treated as pragmatic obstacles or external realities. Yet they can also be understood as manifestations of ideology at work: conditions produced by the very belief systems that critical computing seeks to challenge.
Action Toward Intended Design Outcomes
It is one thing to examine how contemporary technologies are ideological, and another to ask what can be done when those technologies are already deeply embedded in everyday life. In this sense, design should be understood not only as a way of articulating design implications or imagining alternatives in articles, but also as an intended outcome, one that enters real-world use. In this post, I argue that action toward such designs deserves greater attention within critical computing. What is at stake, then, is how such action can grapple with the conditions under which design outcomes become possible, durable, and accountable in people’s everyday lives.
One promising direction is what I call reappropriation, though I do not mean it to be exclusive or exhaustive. Reappropriation broadly refers to reclaiming or redirecting something toward new purposes. Here, I use the term to describe design practices that actively reappropriate “imperfect” technologies. Rather than aiming to purify or replace existing systems, such reappropriation works within and against the present, intervening in how technologies are used, governed, and experienced as they are materially and institutionally configured. We already see practices that can orient us toward such reappropriation. For example, although social media platforms have been extensively critiqued for reinforcing surveillance, harassment, and extractive attention economies, they have also been actively taken up for digital activism and feminist organizing [19, 21, 25]. More recent HCI work further shows how women strategically reappropriated hashtags, originally designed to maximize traffic and visibility, to avoid unwanted attention from male users and to maintain safer, more controlled spaces [24].
These initiatives are situated practices enacted by end users in their everyday lives, working within and against the constraints of existing technological infrastructures. Building on this recognition, critical HCI scholars should move beyond analyzing such practices after the fact, and instead engage more deliberately in designing with and for reappropriation. Consider AI and its biases as an example. While the risks and harms of AI systems, particularly for marginalized groups, are well documented [4, 15], treating AI solely as an object awaiting fairness or transparency overlooks how people are already engaging with these systems in practice. A perspective that treats design as an intended outcome invites questions about how interaction, configuration, or use might be designed to mitigate harm while leveraging existing capabilities in support of feminist and justice-oriented agendas. Ciolfi Felice and colleagues [8], for instance, presented a case of AI development in Latin America that addressed a critical data gap around gender-based violence. In their work, design functioned both as a process for generating situated knowledge about how to enact feminism in AI development, and as an outcome that supported criminal court officers in opening justice data in real-world settings. Reflecting on this practice, the authors also asked how AI techniques might be put in the service of feminist and social justice causes, what they describe as “AI for feminisms.”
Building on this discussion, I also argue that critical scholars should take greater responsibility for engaging firsthand with difficult, situated, and often constrained design conditions. Such engagement makes visible to broader communities how critical commitments can be enacted in practice. It also creates conditions under which critical commitments can extend beyond discourse, taking material form in ways that may generate tangible, if partial, effects in everyday life. HCI already offers established research approaches for such engagement. For instance, research through design [27] is a way of working with wicked problems by developing design artifacts as outcomes. To support the enactment of diverse ideologies, worldviews, and desired futures through actions such as design, additional approaches may be needed.
To be clear, I am not calling on every critical scholar to engage directly in system building or technical implementation. Rather, this is a call to encourage forms of contribution grounded in real-world practice, and to cultivate connections with design and technical communities more actively. At the same time, I remain cautious of technosolutionism and the tendency to overestimate what design or technical intervention alone can accomplish. Design as outcome does not replace critique or political struggle. Instead, it offers one situated mode of engagement that is partial, imperfect, and always at risk of co-optation. Nonetheless, it allows critical commitments to be tested, negotiated, and sometimes sustained within the realities of everyday technologies.
In closing, many of the reflections presented here emerge from my research trajectory, including my recently accepted CHI 2026 paper, which offered a critical examination of the opportunities and risks of bringing large language models into everyday feminist practices on social media. Although that work was empirical in nature, colleagues outside critical computing, as well as participants, repeatedly raised questions similar to the one quoted at the beginning of this post. I therefore offer these reflections not as settled conclusions, but as considerations intended to inform my future research directions and, hopefully, contribute to broader conversations in the field.
References
[1] Jeffrey Bardzell and Shaowen Bardzell. 2013. What Is “Critical” about Critical Design? In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, Paris, France, 3297–3306. doi:10.1145/2470654.2466451
[2] Shaowen Bardzell. 2010. Feminist HCI: Taking Stock and Outlining an Agenda for Design. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, Atlanta, Georgia, USA, 1301–1310. doi:10.1145/1753326.1753521
[3] Shaowen Bardzell and Jeffrey Bardzell. 2011. Towards a Feminist HCI Methodology: Social Science, Feminism, and HCI. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, Vancouver, BC, Canada, 675–684. doi:10.1145/1978942.1979041
[4] Marion Bartl, Abhishek Mandal, Susan Leavy, and Suzanne Little. 2025. Gender Bias in Natural Language Processing and Computer Vision: A Comparative Survey. Comput. Surveys 57, 6 (June 2025), 1–36. doi:10.1145/3700438
[5] Eric P. S. Baumer, Mark Blythe, and Theresa Jean Tanenbaum. 2020. Evaluating Design Fiction: The Right Tool for the Job. In Proceedings of the 2020 ACM Designing Interactive Systems Conference (DIS ’20). Association for Computing Machinery, New York, NY, USA, 1901–1913. doi:10.1145/3357236.3395464
[6] Mark Blythe. 2014. Research through Design Fiction: Narrative in Real and Imaginary Abstracts. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’14). Association for Computing Machinery, New York, NY, USA, 703–712. doi:10.1145/2556288.2557098
[7] Nadia Campo Woytuk, Anupriya Tuli, Joo Young Park, Laia Turmo Vidal, Deirdre Tobin, Anuradha Reddy, Beatrice Vincenzi, Jan Maslik, Marianela Ciolfi Felice, and Madeline Balaam. 2025. Toward Feminist Ways of Sensing the Menstruating Body. In Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems. ACM, Yokohama, Japan, 1–17. doi:10.1145/3706598.3713466
[8] Marianela Ciolfi Felice, Ivana Feldfeber, Carolina Glasserman Apicella, Yasmín Belén Quiroga, Julián Ansaldo, Luciano Lapenna, Santiago Bezchinsky, Raul Barriga Rubio, and Mailén García. 2025. Doing the Feminist Work in AI: Reflections from an AI Project in Latin America. In Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems. ACM, Yokohama, Japan, 1–18. doi:10.1145/3706598.3713681
[9] Robert Eccleshall, Vincent Geoghegan, Richard Jay, Michael Kenny, Ian MacKenzie, and Richard Wilford. 2014. Political Ideologies: An Introduction. Routledge.
[10] Nina Frahm and Kasper Schiølin. 2024. The Rise of Tech Ethics: Approaches, Critique, and Future Pathways. Science and Engineering Ethics 30, 5 (Oct. 2024), 45. doi:10.1007/s11948-024-00510-3
[11] Nancy Fraser. 2016. Capitalism’s Crisis of Care. Dissent 63, 4 (2016), 30–37.
[12] Isabel O. Gallegos, Ryan A. Rossi, Joe Barrow, Md Mehrab Tanjim, Sungchul Kim, Franck Dernoncourt, Tong Yu, Ruiyi Zhang, and Nesreen K. Ahmed. 2024. Bias and Fairness in Large Language Models: A Survey. Computational Linguistics 50, 3 (Sept. 2024), 1097–1179. doi:10.1162/coli_a_00524
[13] Donna Haraway. 2013. Situated Knowledges: The Science Question in Feminism and the Privilege of Partial Perspective. In Women, Science, and Technology (3rd ed.). Routledge.
[14] Ana O Henriques, Anna R. L. Carter, Beatriz Severes, Reem Talhouk, Angelika Strohmayer, Ana Cristina Pires, Colin M. Gray, Kyle Montague, and Hugo Nicolau. 2025. A Feminist Care Ethics Toolkit for Community-Based Design: Bridging Theory and Practice. In Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems. ACM, Yokohama, Japan, 1–26. doi:10.1145/3706598.3713950
[15] Jackie Kay, Atoosa Kasirzadeh, and Shakir Mohamed. 2024. Epistemic Injustice in Generative AI. Proceedings of the AAAI/ACM Conference on AI, Ethics, and Society 7 (Oct. 2024), 684–697. doi:10.1609/aies.v7i1.31671
[16] Os Keyes, Burren Peil, Rua M. Williams, and Katta Spiel. 2020. Reimagining (Women’s) Health: HCI, Gender and Essentialised Embodiment. ACM Transactions on Computer-Human Interaction 27, 4 (Aug. 2020), 1–42. doi:10.1145/3404218
[17] Lauren Klein and Catherine D’Ignazio. 2024. Data Feminism for AI. In Proceedings of the 2024 ACM Conference on Fairness, Accountability, and Transparency. ACM, Rio de Janeiro, Brazil, 100–112. doi:10.1145/3630106.3658543
[18] Donald MacKenzie and Judy Wajcman. 1999. The Social Shaping of Technology. Open University, Buckingham, UK.
[19] Aaron Mueller, Zach Wood-Doughty, Silvio Amir, Mark Dredze, and Alicia Lynn Nobles. 2021. Demographic Representation and Collective Storytelling in the Me Too Twitter Hashtag Activism Movement. Proceedings of the ACM on Human-Computer Interaction 5, CSCW1 (April 2021), 107:1–107:28. doi:10.1145/3449181
[20] Ronda Ringfort-Felner. 2025. “Otherware” and the Not-Yet: A Methodological Investigation into Speculative Design for HCI by the Example of Sophisticated Autonomous Systems. In Companion Publication of the 2025 ACM Designing Interactive Systems Conference. Association for Computing Machinery, New York, NY, USA, 77–81.
[21] Michelle Rodino-Colocino. 2014. #YesAllWomen: Intersectional Mobilization Against Sexual Assault Is Radical (Again). Feminist Media Studies 14, 6 (Nov. 2014), 1113–1115. doi:10.1080/14680777.2014.975475
[22] Sharifa Sultana, François Guimbretière, Phoebe Sengers, and Nicola Dell. 2018. Design Within a Patriarchal Society: Opportunities and Challenges in Designing for Rural Women in Bangladesh. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. ACM, Montreal, QC, Canada, 1–13. doi:10.1145/3173574.3174110
[23] Harini Suresh, Rajiv Movva, Amelia Lee Dogan, Rahul Bhargava, Isadora Cruxen, Angeles Martinez Cuba, Giulia Taurino, Wonyoung So, and Catherine D’Ignazio. 2022. Towards Intersectional Feminist and Participatory ML: A Case Study in Supporting Feminicide Counterdata Collection. In Proceedings of the 2022 ACM Conference on Fairness, Accountability, and Transparency. ACM, Seoul, Republic of Korea, 667–678. doi:10.1145/3531146.3533132
[24] Ruyuan Wan, Lingbo Tong, Tiffany Knearem, Toby Jia-Jun Li, Ting-Hao ’Kenneth’ Huang, and Qunfang Wu. 2025. Hashtag Re-Appropriation for Audience Control on Recommendation-Driven Social Media Xiaohongshu (Rednote). In Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems. ACM, Yokohama, Japan, 1–25. doi:10.1145/3706598.3713379
[25] Melinda R. Weathers, Jimmy Sanderson, Alex Neal, and Kelly Gramlich. 2016. From Silence to #WhyIStayed: Locating Our Stories and Finding Our Voices. Qualitative Research Reports in Communication 17, 1 (Jan. 2016), 60–67. doi:10.1080/17459435.2016.1143385
[26] Richmond Y. Wong and Vera Khovanskaya. 2018. Speculative Design in HCI: From Corporate Imaginations to Critical Orientations. In New Directions in Third Wave Human-Computer Interaction: Volume 2 - Methodologies, Michael Filimowicz and Veronika Tzankova (Eds.). Springer International Publishing, Cham, 175–202. doi:10.1007/978-3-319-73374-6_10
[27] John Zimmerman, Jodi Forlizzi, and Shelley Evenson. 2007. Research through Design as a Method for Interaction Design Research in HCI. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’07). Association for Computing Machinery, New York, NY, USA, 493–502. doi:10.1145/1240624.1240704