Author Archives: Meha Gupta

Class Discussion: University Infrastructures

This week, we are introduced to the complexities of public higher education and, more broadly, to the university’s role as an infrastructure. First, Michael Fabricant and Stephen Brier grapple with the persistent problems public education faces because of “austerity” policies premised on extracting capital gains from the university. They give many examples of how cost-cutting measures affect the university. Through a history of technological advancement in the university, they also discuss the effects of Open Educational Resources like MOOCs, concluding that “Technological solutions are never value neutral” (198).

Second, Stefano Harney and Fred Moten envision the concept of “fugitive planning” in the university. The text challenges conventional academic structures and proposes alternative ways in which critics might operate in the “undercommons”. In the background is a Marxist understanding that the capitalist production of education produces “students as problems”. They explore the possibilities of radical imagination in the academy through seven theses, each of which asks what boundaries to freedom exist for people within the university. Theirs is a manifesto of a kind: they argue that acts normalized in the university, like teaching for food, mark a stagnant stage in academia. For them, the Undercommons is a hidden alternative space for resisting academic conventions, its “maroons” being fugitives who “put into question the knowledge object, let us say in this case the university, not so much without touching its foundation, as without touching one’s own condition of possibility, without admitting the Undercommons and being admitted to it” (106).

Discussion Questions:

  1. Is the Open Educational Resources movement Western-dominated or not? Do Open Educational Resources best operate outside the university space, where they can’t be exploited for capital gain? Do they affect the internationalization of the American university, further promoting austerity blues (think: significantly higher tuition rates and visa costs for international students)?
  2. Moten and Harney end their chapter by pointing to the “uncanniness” of “abolition”. They write, “The uncanny that disturbs the critical going on above it, the professional going on without it, the uncanny that one can sense in prophecy, the strangely known moment, the gathering content, of a cadence, and the uncanny that one can sense in cooperation, the secret once called solidarity. The uncanny feeling we are left with is that something else is there in the Undercommons” (115). What is this “something else”? Have you ever experienced the “something else” in the Undercommons? Are we working towards it, and in what ways?
  3. Brier and Fabricant argue that austerity policies have led to the underfunding of public higher education. How is money distributed within the university itself? Who faces the first budget cuts at, say, CUNY? Can we also think of this with respect to Learning Management Systems like Blackboard?

[PS: Sorry for posting this late. My bus was supposed to have Wifi. It didn’t]

Class Discussion: Racial Infrastructures

One of the key insights common to all of this week’s arguments is that the relationship between race and technology goes beyond issues of access to computers, resources, etc. This week, we were introduced to methods of reading technological formats alongside the rise of the theorization of race.

First, Biewen and Kumanyika provide context about the origin of the concept of race: its rootedness in Western imperialism, its biological inaccuracy, and its persistent realness in society. Through a historical overview of travel chronicles from the Greek writer Herodotus, the Moroccan traveler Ibn Battuta, and the Enlightenment thinkers Linnaeus and Blumenbach, they show how, while “race” is a fairly recent term, societies have long maintained biases against “Other” communities within themselves for socioeconomic gain. Kumanyika concludes by reiterating the need to understand the underlying causes of the dehumanizing narratives perpetuated against people of color.

Second, McPherson takes up issues of computation and draws an interesting parallel between the “modular” format of operating systems like UNIX (UNiplexed Information and Computing Service) and the “lenticular logic” of post–civil rights America, both of which follow a system of disconnection and decontextualization. Framing her argument as a conceptual framework for digital scholars, McPherson writes that we need to be aware of, and somehow attempt to undo, the “fragmentary knowledges encouraged by many forms and experiences of the digital [that] neatly parallel the logics that underwrite the covert racism endemic to our times, operating in potential feedback loops, supporting each other”.

Third, Johnson introduces the idea of “markup bodies” to name the methodologies black digital scholars employ as they attend to the descendants of enslaved people and to their narratives. Johnson writes with a critique of data, arguing that “Black digital practice requires researchers to witness and remark on the marked and unmarked bodies, the ones that defy computation, and finds ways to hold the null values up to the light” (71-72).

Finally, Ghorbaninejad, Gibson, and Wrisley present a case study of the lack of programming tools, UI/UX design precision, and metadata support for right-to-left (RTL) languages like Arabic and Urdu. They argue that because RTL DH scholarship is distanced from academia in the Global North and offers little to no financial incentive to most DH scholars in the West, “Digital humanists must become advocates for broad-level RTL integration” (63).

Overall, studying the technological underpinnings of racialized knowledge infrastructures allows us to question what an interdisciplinary perspective means, i.e., how the “digital” and the “humanities” really work together. It urges us to ask how we can critique knowledge infrastructures when their seemingly “invisible” technical operations systematically influence our understanding of “race” today.
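The character-level dimension of the RTL problems Ghorbaninejad, Gibson, and Wrisley describe can be glimpsed with a few lines of Python. This is only an illustrative sketch: the standard library’s `unicodedata` module exposes each character’s Unicode bidirectional class, the raw material that rendering engines feed into the bidirectional algorithm, and precisely what naive LTR-assuming tools ignore.

```python
import unicodedata

def bidi_classes(text):
    """Return each character's Unicode bidirectional class,
    e.g. 'L' (left-to-right), 'AL' (Arabic letter), 'EN' (European number)."""
    return [(ch, unicodedata.bidirectional(ch)) for ch in text]

# "salam" in Arabic script (U+0633 U+0644 U+0627 U+0645) mixed with LTR text
mixed = "id: \u0633\u0644\u0627\u0645 42"

for ch, cls in bidi_classes(mixed):
    print(f"U+{ord(ch):04X}  {cls}")

# Latin letters come back 'L', Arabic letters 'AL', and digits 'EN';
# a renderer that ignores these classes displays the Arabic word backwards.
```

A string like this is stored in logical order but must be reordered for display, which is why every layer of the stack, from text editors to metadata fields, has to cooperate for RTL text to survive intact.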

I was deeply invested in the abundance of technical information in these texts, something I am new to as a scholar of English. So, I ask these questions:

  1. McPherson traces the similarities in form between computational systems like UNIX and relational databases and the liberal perception of “race”, using the “Rule of Modularity”, i.e., the separation of code into discrete, workable sections. She argues that computational systems and cultural imaginaries born in the 1960s “mutually infect one another”. To many scholars in the humanities aware of the Sokal affair, McPherson’s argument can seem a bit radical. Similar to the defense of “black boxing” in supply chains, digital technocrats might argue that “modularity” is necessary to keep code simple. Does our specialization in a particular academic department make us less receptive to interdisciplinary arguments like these? How might our lack of interdisciplinary knowledge (e.g., many DH scholars not being coders) affect our judgment of racial infrastructures?
  2. Kumanyika ends the podcast by saying, “You know, I’ve seen things on race, like they pull Kanye, they might pull Shaquille O’Neal, like hey why are you interviewing these people to talk about race? It’s not their thing.” Do we need “experts” to talk about race? Can’t any person of color have the right to speak critically about race or is a certain mode of knowledge necessary? In other words, are knowledge infrastructures “invisible” only to a few?
  3. Ghorbaninejad et al. argue that contemporary content in RTL languages is treated as containing less “value” than LTR content in the Global North today. To sustain RTL scholarship, they recommend internationalization followed by localization, where “software is first designed to be locale-indifferent before it is localized to meet the regional, linguistic/cultural, and technical requirements of each locale” (55). Can we think of any examples where software is internationalized but not localized, causing what the authors fear as “cultural mistranslation”? Further, should the responsibility fall on the people to whom a culture is “local” to produce such software? Do issues like funding further complicate this possibility?
  4. Most of the readings this week recommend advocacy from the DH scholar. What does advocacy look like for you?
  5. For Johnson, “data” is the “objective and independent unit of knowledge”. It also signifies “the independent and objective statistical fact as an explanatory ideal party to the devastating thingification of black women, children, and men” (58). What might Johnson say are the ethical challenges for “quantitative” digital studies like sentiment analysis or geospatial analysis?
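For readers (like me) new to the technical side, the internationalization-then-localization order that Ghorbaninejad et al. recommend can be sketched in a few lines of Python. Everything here is hypothetical (the function name, the locale table, and its entries are my own, not from any real library); the point is only the separation of a locale-indifferent core from per-locale data:

```python
from datetime import date

# The "localization" layer: per-locale conventions live in data, added
# locale by locale. (These entries are illustrative, not authoritative.)
LOCALE_DATA = {
    "en-US": {"date_fmt": "{m:02d}/{d:02d}/{y}", "dir": "ltr"},
    "ar-EG": {"date_fmt": "{d:02d}/{m:02d}/{y}", "dir": "rtl"},
}

def format_date(d: date, locale: str) -> str:
    """The "internationalized" core: locale-indifferent, with no regional
    convention (date order, text direction) hard-coded into the logic."""
    fmt = LOCALE_DATA[locale]["date_fmt"]
    return fmt.format(y=d.year, m=d.month, d=d.day)

print(format_date(date(2024, 3, 5), "en-US"))  # 03/05/2024
print(format_date(date(2024, 3, 5), "ar-EG"))  # 05/03/2024
```

Software built the other way around, with one locale’s conventions baked into the core, is what later has to be retrofitted for RTL languages, which is where the “cultural mistranslation” the authors fear tends to creep in.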

Tracing a Knowledge Infrastructure: Pedagogy

In the past, when I thought of where academic journals exist physically, I pictured a specific university space. I now believe that some academic journals mirror the very hiddenness of academia’s curriculum, making academia a tough space for the newcomer to navigate. A reason for my initial assumption is that I was introduced to the idea of journals when I heard faculty at Saint Louis University (SLU), where I finished my MA, often say that the editor of the African American Review had “brought the journal” to SLU. The phrase “brought the journal” drew the image of a department deeply linked with the workings of the journal, even though, as graduate students, we rarely heard about it. Interestingly, the name “Saint Louis University” was also nowhere to be found on African American Review’s website. I quickly realized that the journal was most likely just another data point for department rankings, and that while the editor was dedicated to its workings, the English department at SLU did little more than pay the editor for his professorship. More importantly, it was Johns Hopkins University Press that ensured the supply of the journal.

In order to understand the relationship between a journal editor, their institutional affiliation, and the journal’s distribution, I looked into one of the esteemed journals in my research field, Pedagogy: Critical Approaches to Teaching Literature, Language, Composition, and Culture. Pedagogy is published by Duke University Press, but its editorial office is in the Department of English at Calvin University. Pedagogy came out with its first issue in 2001, in which the then editors marked its inception as a necessary step to nurture pedagogical conversations in the field of English studies. While the journal “seeks to reverse the long history of the marginalization of teaching and of the scholarship produced around it,” the majority of its online issues sit behind a paywall. In their first published Editor’s Review, co-editors Jennifer L. Holberg and Marcy Taylor end by noting that “[T]he staff of Duke University Press also have been generous with their guidance and creativity,” along with “the moral and financial support of our respective institutions, Calvin College and Central Michigan University” (Holberg and Taylor 5). The words “guidance” and “creativity” give me the impression that most content-based decisions are guided by the press, while the editors answer to the missions of their institutions, which provide the “financial support”.

As for distribution, Duke University Press works like any other entrepreneurial brand today: they “use a number of strategies to attract new readers, from direct mail campaigns and social media publicity to website development” (“Support for Journal Editors”). However, these strategies don’t seem very personalized. Funnily enough, they share the same Northeast sales representative, Conor Broughan, with Columbia University Press. Other major presses like Harvard University Press are also marketed by the “Columbia University Press Sales Consortium”. In an interview about the pandemic’s effect on publishing, Broughan highlights how customer-driven individual sales can be: “One store in Halifax, Canada, found out when it reopened that a number of its web orders came from women in their 20s who wanted to support the store. It is now ordering with them in mind” (Rosen). Since I had never experienced the publishing industry’s workings, I had not realized how spread out a press that published most of my bibliography could be. In terms of access, while Duke University Press does participate in Open Access, with some of its publications accessible to all online a week before they are mailed out, Pedagogy remains a subscription-based journal. Perhaps institutional financial aid is not enough. It is still unclear to me how the subscription money is used, since authors don’t get paid for publishing.

Overall, I think everything I know about the labor involved in producing academic journals has been acquired through personal experience in academia, e.g., conversations with everyone from invisible graduate research assistants to journal editors. If not quite blackboxed, the information about academic journal publishing is peripheral: the facts are available, but they are difficult to link together. Where are the agreements between the journal and the press? Can someone like the “woman in her 20s” who might be interested in reading an issue of Pedagogy find out how the journal distributes its labor? Can we figure out the logistical details of Calvin University’s financial assistance to Pedagogy or its editors? Graduate students might submit to these journals multiple times, but are we really able to critique any aspect of Pedagogy beyond an angsty book review? Are readers to journals what peer reviewers are to submitters: valued only for the content they provide? As someone with no experience in academic publishing, I am urged to think that the process is quite hidden.

Personal Narrative: Hypothes.is

One of my go-to applications for both personal and collaborative annotation is “Hypothes.is”. Hypothesis is an open annotation tool, most popularly used through its browser extension, that allows you to annotate web pages, anything from New Yorker articles to Project Gutenberg book pages. You can make an account for private annotations or create groups for collaborative work on a webpage. I have most often used it to annotate poems on the Poetry Foundation website, since much of my research and creative work involves poems. Alongside my academic work, I also have my students annotate texts together using Hypothesis, which often allows them to engage with each other’s thought processes. The best part about Hypothesis is that many people make their annotations public, meaning that you might open a link to a Walt Whitman poem and see what both high school and graduate students have to say about the text. The company itself is big on open access, as it writes on its website: “Our mission is to enable conversations over the world’s knowledge.”

Hypothesis is an “open source” platform, so they take the word “collaboration” pretty seriously. Upon doing some research, I discovered that Dan Whaley, the founder of Hypothesis, developed the software to let climate scientists converse about climate research through web annotations on ClimateFeedback.com. As someone who had only used Hypothesis in an educational context, I found it interesting to see the political kairos of the application, since for the longest time I had assumed it was created out of an academic necessity. More importantly, its annotation workflow is not the only collaborative aspect of Hypothesis. They also invite developers to contribute to technical conversations on GitHub, on Slack [which I joined and was surprised to find had fewer than 100 members], and on a mailing list. While this deeply collaborative nature made me comfortable in my use of the application, I grew skeptical of the long-term promise of “accessibility” when I saw that Hypothesis is funded by ITHAKA, the same organization that funds JSTOR.

The most exciting part is that Hypothesis is quite open about the materiality of its processes too. An article posted on their website, titled “Beyond Borders: Why We’re Now Also Hosting Data in Canada” (2022), mentions the importance of privacy in data centers, albeit without noting their environmental costs, the very kind of issue the application was born to discuss:

“Having all social annotation data housed within their borders helps Canadian schools comply with national, regional, and institutional data storage policies. Hypothesis will work the way it always has for Canadian users but, behind the scenes, all their data can now sit within Amazon Web Services data centers located in Canada.”

While the point here seems to be more a matter of legality than privacy, I appreciate the very use of the term “data center” now that I am aware of its infrastructural role. Of course, I am reminded of the modular nature of the supply chain, and of how this seemingly ethical and accessible company relies on the notorious “Amazon” for the physical space it occupies. The truth also is that Hypothesis’s access is restricted to the “knowledge” of those with access to the internet.

Finally, I think it is this realization, as Miriam Posner discusses, of the supply chain’s inevitable reliance on alienated, unknown nodes that makes me feel okay with my use of Hypothesis. I have used other close-knit, personalizable annotation technologies, like Doccano, which must be hosted on a platform like Heroku and involves the labor of manually creating a user profile for each student, even if it comes with the perks of controlling the HTML of the annotations and exporting them however I wish. For large-scale use, however, and for the pleasure of global connectivity with other annotators, I think Hypothesis is here to stay in my work.