Brian Ballentine
West Virginia University
brian.ballentine@mail.wvu.edu

ABSTRACT

Technical communication instructors, especially those with expertise in visual rhetoric, information design, or multimedia writing, are well-suited to teach an introductory Digital Humanities (DH) course. Offering a DH course provides an opportunity to reach extrafield audiences and work with students from a variety of humanities disciplines who may not have the option of taking such a course in their home department. The article advocates for a DH course built on a methods-driven pedagogy that engages students in active learning by requiring them to research, dissect, and report on existing DH projects, as well as work with existing DH tools and with datasets and methods from prior student research projects. The sample student project reviewed here uses the data visualization software ImagePlot, and the discussion includes how the student used the tool to examine changes in brightness, hue, and color saturation, as well as calculate the total number of distinct shapes across 397 comic book covers. Ultimately, students are tasked with developing a research question and moving to an articulated methods-driven approach for exploring it. The student project, along with the tools and sample datasets that accompany it, is treated as a module that may be included in an introductory DH course syllabus or training session.

INTRODUCTION

In 1979, Carolyn Miller published her often-cited claim for the humanistic value of technical writing, a field, she argued, that needed to shake off a “positivist legacy” and (re)position the rhetorical tradition at its core (p. 611). On one hand, technical writing was guilty by association with the long traditions of Descartes and Bacon. The field inherited a legacy of science understood as an empirical endeavor in pursuit of singular, verifiable answers. Technical writing was perceived as existing solely in the service of producing a “windowpane of language” through which readers may see the unobstructed results of well-executed science (Miller, 1979, p. 613). On the other hand, Miller argues, it was the field’s own pedagogical practices that were perpetuating positivism by confining the role of writing to matters merely of form and style. Her article outlines several remedial steps, including reemphasizing the rhetorical canon of invention. Miller concludes: “Good technical writing becomes, rather than the revelation of absolute reality, a persuasive version of experience. To continue to teach as we have, to acquiesce in passing off a version as an absolute, is coercive and tyrannical; it is to wrench ideology from belief” (p. 616). To embrace technical writing as rhetorical, then, is to replace an absolute reality with one that is probable, contingent, and constructed.

Miller’s article is more than 40 years old, and technical writing, with its evolution into technical and professional communication, has blossomed as an academic field and as a profession (Kynell-Hunt & Savage, 2003, 2004; Rude, 2009). Academic researchers and instructors in the field do largely see their work as “based in rhetoric,” but they have also become comfortable with borrowing and then teaching research methods from the sciences and social sciences including “psychology, anthropology, sociology, and related approaches” (Spinuzzi, 2003, p. 5). In other words, rhetoric often serves as a methodology and is paired with research methods adapted from other fields. Spinuzzi (2003) offers a reminder that the two terms should not be used interchangeably: “A method is a way of investigating phenomena; a methodology is the theory, philosophy, heuristics, aims, and values that underlie, motivate, and guide the method” (p. 7, italics in original). Employing rhetoric as an underlying theory that informs technical communication research helps balance the tension of using methods that may appear to a humanist as overly scientific or “to an extent even positivistic” (Porter, 2013, p. 133). As a humanistic pursuit, technical communication has come to terms with an epistemology that can leverage scientific methods without succumbing to scientism.

This article begins with the premise that the burgeoning field of the Digital Humanities (DH) wrestles with similar epistemological concerns and, like technical communication, can better address those concerns by making specific adjustments to its approaches to pedagogy including overt attention to research methods and methodologies. The additional claim is that technical and professional communication instructors are uniquely positioned to offer methods-driven courses to a broader range of students than they currently do. Literature, history, art history, geography, political science, and other majors should be offered access to coursework that provides instruction on a range of computational research methods while also making room for their own humanities field-specific topics and pursuits. Even if this instruction is being offered in the home departments of these majors, technical communication has evolved from a rhetorical tradition that has successfully integrated computational research methods. The result leaves instructors with a “tactical” advantage across humanities disciplines.

Kirschenbaum originally identified DH as “tactical” as it is “unabashedly deployed to get things done” (2012a, p. 415). In the introduction to their collection, Rhetoric and the Digital Humanities, Ridolfo and Hart-Davidson seize on this tactical quality for rhetoric and writing fields including technical communication by suggesting researchers resituate their projects “under the umbrella of DH in order to leverage funding, institutional recognition, and extrafield audiences” (2015, p. 4). Technical communication instructors offering courses in digital rhetoric, visual communication, information design, and multimedia writing often provide DH instruction in all but name. See, for example, Stephens’s (2019) Narrative Information Visualization course, wherein she expands on Ferster’s (2013) “ASSERT” framework: ask a question, search for evidence, structure information, envision an answer, represent data, and tell a story. The tactical advantage of being able to appeal to “extrafield” majors is significant, especially when technical communication programs are housed within English departments or operate within what has been described as the “English as host” model (Yeats & Thompson, 2010, p. 226). At the author’s institution, the professional writing program, which includes an MA as well as an undergraduate concentration, is housed within the Department of English. Graduate students from the department’s MFA program and the MA and PhD in English programs frequently enroll in professional writing courses for not only the courses’ content but also their perceived utility. Like DH, these courses have a reputation for getting things done in the humanities and have curricula featuring a mixture of theory and practice. For more than a decade, the professional writing faculty have been offering a course titled Humanities Computing and, in an adjustment that can only be described as tactical, recently renamed the course Introduction to the Digital Humanities.

This article starts with a brief overview of issues and challenges surrounding DH pedagogy, including a claim that the field’s initial concerns over, and even resistance to, methods-driven, computational research are challenges that technical communication has overcome. DH, too, is making progress with efforts to more directly confront the anxiety surrounding the importing of scientific methods into humanistic studies such as English. Just a few years ago, arguments against the field’s embrace of technology, along with the scientific and computational methods that outline its use, went so far as to suggest that DH was part of a larger “neoliberal takeover of the university,” and that it promoted a “redefinition of technical expertise as a form (indeed, the superior form) of humanist knowledge” (Allington et al., 2016, paras. 1 & 3). Literary scholars and librarians active in DH have pushed back on these claims (Greenspan, 2019; Underwood, 2016; Varner, 2016), but suspicions and critiques of computational research conducted within the humanities, in particular literary studies, continue within the DH community (Da, 2019b; Hoover, 2016). While technical communication has its own challenges within both academia and industry, it also has “long-running ties to digital technology as both object and medium of inquiry” (Ridolfo & Hart-Davidson, 2015, p. 2). The recommendation here is that technical communication continue its own critical engagement with humanities computing practices while also working to normalize those practices by embracing a methods-driven pedagogy.

The course discussed in this article is an introduction to DH course suitable for advanced undergraduate or master’s-level students that is explicitly methods-focused and includes suggested readings and course activities. The article offers details from a student project that employed different DH tools, as well as a discussion on how students developed research questions and methods for tackling their projects. The student project demonstrates the movement from a research question to an articulated methods-driven approach for exploring the question. Also, access to sample datasets early in the semester is key to student success later in the course. It is indeed true that “any old data will not do,” so the focus here is on selecting data that students can work with to form meaningful questions and devise methods for pursuing answers (Goldstone, 2019, p. 214). This discussion is not meant to be overly prescriptive but offers more of a “lessons learned” reflection from attempting this pedagogical approach. The student project is treated as a module that readers could insert into a DH course populated with a variety of topics and tools.

DH PEDAGOGY

The DH community makes room for reflections on itself, including extensive definitional debates regarding what counts or qualifies as DH (Alvarado, 2012; Berry, 2012; Kirschenbaum, 2012b; Liu, 2013; Svensson, 2012). These debates can showcase an anxiety that “the importance of human hermeneutic interpretation potentially diminishes” as we embrace digital methods (Evans & Rees, 2012, p. 21). For academics, these discussions have helped advance the field, even if that advancement has led some to conclude that “DH’s resistance to definition may be its most unifying trait” (Ingraham, 2015, p. 11). Given this history of focusing on definitions and terms, the field has faced accusations that “teaching and learning are something of an afterthought for many DHers” (Brier, 2012, pp. 390–391). Arguably, though, there has been more recent attention to DH pedagogy (Fyfe, 2016; Goldstone, 2019; Hirsch, 2012; Jakacki & Faull, 2016; Norton, 2019) and even an overt acknowledgement by some that “DH pedagogy at the undergraduate and graduate levels is essential to the futures of our [humanities] fields” (Cordell, 2016, p. 460). The argument here is that in an introductory course, students are better served by not wading into extensive definitional debates. Instead, students first receive an open acknowledgement that teaching digital methods of any kind is challenging and, second, are provided with a transparent plan for a DH course’s pedagogical strategy for meeting those challenges (Goldstone, 2019). The objective is to maximize time for concentrated methods-driven DH instruction and training.

According to some leading scholars in the field, limited access to DH training is exactly what is slowing the field’s advancement. In “Digital Humanities as a Semi-Normal Thing,” Underwood notes: “At most universities, grad students still cannot learn how to do distant reading. So there is no chance at all that distant reading will become the ‘next big thing’—one of those fashions that sweeps departments of English, changing everyone’s writing in a way soon taken for granted. We can stop worrying about that” (2019, p. 97, italics in original). Distant reading has expanded to include a variety of methods and approaches to studying large corpora of written and visual “texts.” As defined by Moretti, it allows researchers to “focus on units that are much smaller or larger than the text: devices, themes, tropes – or genres and systems” (2000, p. 57). The patterns that can emerge from a distant reading of, say, thousands of novels can expand our analysis of the literary canon well beyond what is typically possible with traditional close readings of texts. Of course, distant readings of all kinds also require separate instruction. But, rather than spending time lamenting the absence of instructional and curricular opportunities for students to do DH work such as distant reading or a quantitative analysis of literary corpora, Underwood takes solace in DH having established even a minimal foothold within the humanities. He concludes that the absence of instructor training impedes the progress of DH more than lingering controversies over computational methods.
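Moretti’s notion of working with units larger than the single text can be made concrete with a toy sketch. The miniature “corpus” below is invented purely for illustration; a real distant reading would operate on thousands of documents with stopword filtering, length normalization, and far more careful tokenization.

```python
# A minimal sketch of one distant-reading move: instead of close-reading
# a single text, tally word frequencies across an entire (toy) corpus.
# These three "novels" are hypothetical stand-ins for a real archive.
from collections import Counter
import re

corpus = {
    "novel_a": "The sea was calm and the sea was grey.",
    "novel_b": "A storm rose over the grey sea at dusk.",
    "novel_c": "Calm waters hide the storm to come.",
}

def tokenize(text):
    """Lowercase and split on non-letter characters, dropping empties."""
    return [w for w in re.split(r"[^a-z]+", text.lower()) if w]

# Aggregate counts across all texts -- the unit of analysis is the
# corpus, not the individual work.
counts = Counter()
for title, text in corpus.items():
    counts.update(tokenize(text))

# Which words recur across the corpus as a whole?
for word, n in counts.most_common(5):
    print(word, n)
```

Even at this trivial scale, the output describes a pattern (“storm” and “grey” recur across works) that no single close reading would surface, which is the pedagogical point rather than the specific numbers.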

But if there is not yet enough DH training available, there is arguably an increasing amount of it. The more established venues for DH training have blossomed in both size and scope. For example, the archive of the Digital Humanities Summer Institute (DHSI), hosted each year at the University of Victoria, Canada, dates back to 2001. While 20 years ago the institute offered just a handful of sessions, DHSI 2021 had 47 unique sessions ranging from “Programming for Humanists” to “Introduction to Web Scraping” to pedagogy sessions such as “Critical Pedagogy and Digital Praxis in the Humanities” (“Course Archive (2001-2021),” 2021). Those who subscribe to the DHSI email listserv or follow the institute on Twitter @DHInstitute know that the organization frequently shares information about DH-related events, jobs, and calls for proposals. This fall, reminders went out on the DHSI listserv for “GIS Days 2021,” a five-day, online conference facilitated by Western University in collaboration with other universities across Ontario, Canada. This free event catered to all skill levels, and the organizers emphasized that “Everyone from anywhere in the world is welcome” (F. Berish, personal communication, November 10, 2021). Other DH organizations that formed originally with regional audiences in mind have now, out of necessity, moved to virtual formats that enable broader participation. For example, Keystone DH hosted their first conference in 2015 at the University of Pennsylvania, and their mission focuses on “advancing collaborative scholarship in digital humanities research and pedagogy across the Mid-Atlantic” (About, n.d.). Their 2020 conference hosted by Temple University was cancelled due to the pandemic but ran virtually for three days in July of 2021.

As the audiences, topics, and training opportunities expand, there are examples of the field doing more to ensure space for marginalized voices. Jim Casey and Kevin Winstead, both with the Center for Black Digital Research at Penn State University and the Colored Conventions Project, delivered their keynote at the 2021 Keystone DH conference, titled “What can Black digital humanities be? Movements, Collectives, Principles.” The presentation is available from the Temple University library and contains outstanding resources including crowd-sourced teaching materials at #BlackDigitalSyllabus (Temple University Libraries | Past Program and Event Videos, n.d.). Participants at DHSI 2021 had access to sessions such as “Race, Social Justice and DH: Applied Theories and Methods” and “Pedagogy of the Digitally Oppressed: Anti-Colonial DH Critiques & Praxis.” These sessions, the training they provide, and the scholarship they help produce and promote are essential to the field for keeping our “reliance on digital tools” in check (Noble, 2019, p. 27). As Noble explains in “Toward a Critical Black Digital Humanities,” we must critique the development and deployment of our digital tools lest our dependence on them inadvertently “exacerbates existing patterns of exploitation and at times even creates new ones” (2019, p. 27). In other words, there is a danger of replicating old, oppressive practices with and through new digital tools and methods. Risam (2018) offers a similar warning in New Digital Worlds, stating that “‘new’ methodologies, are not conjured out of thin air by digital humanities practitioners, but are built on the histories and traditions of humanities knowledge production that have been deeply implicated in both colonialism and neocolonialism” (p. 4).
Chapter four of her book is dedicated to “Postcolonial Digital Pedagogy” and discusses specific projects Risam has worked on with her students that actively demonstrate the “geographical and spatial dimensions of postcolonial writing” (p. 109). Her projects show how changes to our pedagogy may actively alter and even supplant entrenched practices. Risam contends that “postcolonial digital humanities pedagogy facilitates the development of twenty-first-century literacies and positions students to be critical readers and creators of knowledge as they learn about the politics that surround knowledge production and how they can intervene in it” (2018, p. 108).

Scholars in technical and professional communication have been making their own interventions, and there is a productive parallel with this field’s recent social justice turn. In 2016, Jones, Moore, and Walton began to shed light on what gets left behind collectively when the field is “most concerned with objective, apolitical, acultural practices, theories, and pedagogies” (2016, p. 212). The problem with the field placing a premium on such a pragmatic understanding of itself is that the resulting identity sidelines inclusion. Their article advocates for an “antenarrative” approach that promotes inclusion with “efforts to forward a more expansive vision of TPC, one that intentionally seeks marginalized perspectives, privileges these perspectives, and promotes them through action” (Jones et al., 2016, p. 214). In their subsequent book, Technical Communication After the Social Justice Turn, they boldly call out the field as “complicit in injustice” (Walton et al., 2019, p. 8). Beginning with detailed discussions on positionality, privilege, and power, the authors demonstrate how technical communicators can identify and replace oppressive practices with intersectional, coalition-building action (Walton et al., 2019, pp. 133–134). Integrating social justice with technical communication means “an understanding of the human impact of our work, scholarship, and pedagogy” (Walton et al., 2019, p. 83). Again, because technical communication instructors often already possess both the necessary technical proficiencies and familiarity with methods-driven research, the DH classroom becomes an opportunity to demonstrate to students a vast diversity of research and the ways it can and should be valued.

PROGRAM AND COURSE CONTEXT

The Department of English at West Virginia University has four graduate programs, and faculty teaching at the graduate level expect an interdisciplinary mix of students in their courses. MA and PhD literature students along with MFA (fiction, non-fiction, and poetry genres) and MA in Professional Writing and Editing students share many graduate courses. The graduate-level DH course is popular among all four programs and has historically been taught by a faculty member affiliated with the professional writing program. The relationships among DH, English and literary studies, and rhetorical studies, broadly defined, are not without their contentious points and include claims not necessarily for primacy but more so for a pioneering presence. For example: “Yet much of the territory claimed by DHers was inhabited by rhetoric and composition long before DH arrived…Indeed we have been here for decades…Yet rarely is our field’s literature cited by DHers” (Carter et al., 2015, p. 35). Be that as it may, the concerns described here deal less with evolutionary disputes or delineating historical “waves” of DH and more with the opportunity technical communication instructors have to prepare a diverse range of students, regardless of background, for methods-driven studies (A Digital Humanities Manifesto » The Digital Humanities Manifesto 2.0, 2009, para. 10). Many of the authors in Ridolfo and Hart-Davidson’s (2015) collection are academics teaching and researching in English departments or stand-alone rhetoric and technical communication programs of various names. Many also offer courses and degrees in technical and professional communication (or a similar designation such as professional writing and editing). For students in the DH course, the interdisciplinary mix of the classroom is part of their daily lives, but very few of them outside of the professional writing program have had any exposure to the methods-driven pedagogy waiting for them in the course.

Methods at the Epicenter

A week before the semester begins, I circulate Smagorinsky’s (2008) article, “The Method Section as Conceptual Epicenter in Constructing Social Science Research Reports.” Students are warned that the article, published in the journal Written Communication, has nothing overtly to do with DH. Instead, the piece serves to set priorities for the course and asks students to begin thinking in terms of a methods-driven approach to their work. Smagorinsky advises that “[t]he Method section, then, has evolved to the point where, in order for results to be credible, the methods of collection, reduction, and analysis need to be highly explicit. Further, the methods need to be clearly aligned with the framing theory and the rendering of the results” (2008, p. 392). That is, methods and methodology, as delineated by Spinuzzi (2003), should not only be transparent to the reader but also understood as rhetorical for their part in supporting a study’s integrity.

It helps, too, that Smagorinsky is a rather colorful writer. To illustrate why a poorly formed method section is a hazard for research, he asks his audience to imagine reading a vague food recipe with preparation directions that do not name specific ingredients, amounts and measurements, or cooking times and temperatures. After randomly gathering and combining the contents, he concludes, “Put them in cookware, heat, and serve” (Smagorinsky, 2008, p. 393). Students reported that the analogy resonated with them. The phrase “heat and serve” became shorthand in student conferences and peer feedback for underdeveloped methods sections.

Despite assigning Smagorinsky’s piece willfully outside of its expected context (it is a common source for scholars in writing, rhetoric, and technical communication who do any kind of qualitative research), the article pairs well with other readings scheduled early in the semester to introduce DH concerns over hermeneutics and methods. In the second edition of the collection Debates in the Digital Humanities (2016), Clement notes that “most critiques of DH point to a decoupling of methods from the theoretical perspectives that would ordinarily help situate the kind of intellectual effort being engaged” (2016, p. 158). Students receive the message early on that DH researchers must be mindful of discrepancies and disconnects between method and methodology. Not dissimilar to Smagorinsky’s warning regarding “heat and serve” methods sections, Clement asks her readers to “consider how reductive it would seem to describe the mere presence of the techniques and methods as doing digital humanities. It would be like saying that doing ethnography simply entails establishing relationships, watching people, transcribing interviews, and keeping a diary” (2016, p. 158, italics in original). While this message should be continuously reinforced for students, from the perspective of the field of technical communication, a core premise of Clement’s chapter is not new. She writes that “an articulation of methodology helps researchers reinforce the systematic nature of their chosen approach. As I will argue, this is an act that ultimately facilitates a deeper engagement with theory” (p. 155). Indeed, researchers in technical communication may find little need to argue this point. Popular texts that provide instruction on conducting qualitative studies will insist that, for example, “[Q]ualitative sampling is often decidedly theory driven” (Miles et al., 2013, p. 31, italics in original).
A bit more bluntly, Porter instructs, “You can’t do very much useful work–basically none–as a technical communicator without theory” (Porter, 2013, p. 130).

Explicitly ascribing value to theory when humanists employ research methods borrowed or derived from sciences and social sciences has been a key to success for technical communication and rhetoric and composition in general. New DH publications are in fact making explicit connections between theory and method. Chapters in the collection, Research Methods for Creating and Curating Data in the Digital Humanities (Hayler & Griffin, 2016) contend with the tension between employing computational methods and retaining a role for theory so valued by humanists. The editors assert early that “the making of any new digital artefact (which includes collections and archives) is always shot through with both theory and unrecognized assumptions that later need to be teased out and analyzed so that the items might give up at least some of its secrets” (Hayler & Griffin, 2016, p. 2).

The debates over the humanistic value of technical writing helped position the field to contend with its legacy of positivism by advancing adjustments to its pedagogy but also by publishing work to support those adjustments. Case studies, particularly studies of workplace and networked writing environments using both qualitative and quantitative methods are not uncommon for the field (Spinuzzi, 2003, 2008). Students and scholars of technical writing also have access to field-specific publications on how to begin studying human subjects and the writing they produce (Hayhoe et al., 2020; Leavy, 2017), and journals such as Technical Communication Quarterly have published reviews of new guidebooks from outside the field for doing, for example, qualitative data analysis and data coding (Hashimov, 2015). These tactics, strategies, and tools employed by writing researchers have much to offer the DH classroom.

Mature DH Project Discussion

Once students are introduced to the premise that the course will have a methods-driven approach at its core, they locate, examine, and report on successful DH work. In the first week of class, I give a sample presentation to model the core components required for a review of what constitutes a “mature” DH project. This assignment is by no means unique and has its origins in assignments found online in collections such as the Zotero Digital Humanities Group archive. This version of the assignment asks students to focus on the primary questions the investigators were exploring with their research and to distill and report on the investigators’ methods for doing so. Students are provided with links to several well-known DH archives and collections (e.g., Berkeley, South Carolina, Stanford, UCLA) that model DH work, but I do not vet or try to restrict what students decide to investigate. If inclusivity is to be a “central goal” for technical and professional communication as well as DH, course instruction should clearly demonstrate how it “intentionally seeks marginalized perspectives, privileges these perspectives, and promotes them through action” (Jones et al., 2016, pp. 213–214). As Jones, Moore, and Walton remind us, “the only stories that are heard are the stories that are (re)told” (2016, p. 214). At this point in the course, students should also receive the link to the crowd-sourced Google Doc with hundreds of links to “Black Digital Humanities Projects and Resources.” The list contains projects that are powerful examples of mature DH work (e.g., see the Digital Library on American Slavery and Digital Harlem). The last iteration of the course had a broad, interdisciplinary range of graduate students including those studying literature, history, creative writing, and professional writing.
It can be a challenge to locate sample projects that speak to such a wide range of interests, but students reported they felt the readily accessible DH archives had something for everyone.

For example, early in the semester a student located The Virtual St. Paul’s Cathedral Project hosted at North Carolina State University, specifically its first phase, called the Virtual Paul’s Cross Project. This portion of the project endeavors to replicate an “experience of hearing John Donne’s sermon for Gunpowder Day, November 5th, 1622 in Paul’s Churchyard” (Wall, n.d.). Further, the project sought to explore how an audience member’s experience might have been affected by ambient noise, the setting’s acoustics, the total number of people in attendance, where a person was standing when they heard the sermon, and even the weather conditions speculated for the day of the sermon. While this is a historical project, at its core it asks us to consider how an immersive experience of this past event may deepen our understanding of a public sermon as a social, cultural, and communal event. As is often the case, robust DH projects such as this result in academic articles authored by a principal investigator (Wall, 2014a, 2014b) and even project reviews in scholarly journals (Smith, 2014). Students are responsible for locating these sources and sharing them with the class as part of their presentation, but also for bringing these sources into conversation with their own assessment of the project. The student presentation needs to make sense in terms of the methods-driven focus of the course, and the details available on this project work well for the purposes of the course. In a section titled “epistemology of digital modelling,” Wall describes the project’s visual and acoustic models as “simulations of things created according to principles of interpretation, organisation, and display” (2014b, para. 11). The project’s interface allows users to hear a recording of a professional actor delivering Donne’s sermon, but what they hear changes based on the selection of eight different standing locations and whether the crowd has 500; 1,200; 2,500; or 5,000 members.
If the user selects an audience of 5,000, all those people both make noise and muffle sound.
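The parameterization Wall describes, in which standing position and crowd size jointly shape what a listener hears, can be caricatured in a few lines of code. To be clear, everything below (the inverse-square falloff, the noise and absorption coefficients, the function itself) is invented for illustration; the actual Virtual Paul’s Cross models are built from measured acoustics and are far more sophisticated than this toy formula.

```python
import math

def heard_level(distance_m, crowd_size,
                source_db=80.0, noise_coeff=3.0, absorb_coeff=2.0):
    """Toy estimate of a sermon's audibility at one standing spot.

    All coefficients here are hypothetical, chosen only to show how
    two parameters can jointly shape the simulated experience:
    a larger crowd both adds background noise and absorbs the signal.
    """
    # Inverse-square falloff of the preacher's voice with distance.
    signal = source_db - 20 * math.log10(max(distance_m, 1.0))
    # Crowd absorption: more bodies muffle the direct sound.
    signal -= absorb_coeff * math.log10(crowd_size)
    # Crowd noise floor rises with audience size.
    noise = noise_coeff * math.log10(crowd_size)
    return signal - noise  # signal-to-noise margin in (toy) dB

# The same standing spot "sounds" different with 500 vs. 5,000 listeners.
print(heard_level(distance_m=20, crowd_size=500))
print(heard_level(distance_m=20, crowd_size=5000))
```

The point for students is not the formula but the epistemology: a model like this produces an array of contingent, parameter-dependent experiences rather than a single definitive reconstruction.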

After engaging with the interface, students understand that what the project offers is an array of experiences, and not the definitive experience that lays claim to having replicated exactly what it must have been like to be there. Returning to Miller’s earlier quoted remarks, the project is not “passing off a version as an absolute” (Miller, 1979, p. 616). There is a wealth of data assembled and constructed to make this remarkable project possible, but the outcome remains a probable construct. Because the broader concern of the project pertains to public sermons as a cultural phenomenon, students also understand that to the extent the Virtual St. Paul’s Cathedral Project is able to answer questions, it asks just as many.

DH opens up new avenues for well-established and well-researched fields to revisit old questions and even ask new ones. This represents a great deal of DH’s appeal. In his Los Angeles Review of Books DH interview, Galloway wondered, “what more can you really say about Shakespeare today? There isn’t a whole lot…You see this frequently in very old, extremely erudite, well-established disciplines where there is very little territory left in which to do research. I think DH has opened up a new territory. It has allowed people to find a new space” (2016, para. 3). Literary and cultural studies is accustomed to trying out new critical theories or methodologies as a way of rereading and reinvigorating well-researched subject matter. Old territory is rejuvenated by applying fresh critical lenses, as with Shakespeare and Early Modern scholars reading their field through object-oriented ontology, posthumanism, and ecocriticism (Cohen & Yates, 2016). Established literary fields are less accustomed, however, to adopting new methods to approach their critical work. Doing so would require fundamental shifts in both pedagogy and practice. As Clement remarked, “That reading is universally understood as a reliable hermeneutical method in the humanities means that humanists are not typically required to argue for it as a method” (Clement, 2016, p. 163). In my own department’s graduate-level, literary research methods course, the primary concern is with how and where literary and cultural studies scholars perform research (archival work and searching databases) along with instruction on important conventions of a genre such as an abstract. For the papers that the students produce, there is no expectation of an identifiable, separate methods section. The literature review demonstrates the student’s due diligence in researching, citing, and integrating the appropriate sources for the subject matter. Reading remains the implied method.

The requirement that we shift our perspective on methods is perhaps best captured by Ramsay’s Reading Machines: Toward an Algorithmic Criticism, another core reading requirement for the course. In summary, algorithmic criticism “is simply an attitude toward the relationship between mechanism and meaning that is expansive enough to imagine building as a form of thinking…Its partisans neither worry that criticism is being naively mechanized, nor that algorithms are being pressed beyond their inability. The algorithmic critic imagines the artifacts of human culture as radically transformed, recorded, disassembled, and reassembled” (Ramsay, 2011, p. 85). Ramsay’s project takes issue with a wholesale adoption of the traditional understanding of scientific metaphor or method being applied to humanistic pursuits. He has a very honest moment early on in the book when he says, “We are not trying to solve Woolf. We are trying to ensure that discussion of The Waves continues” (2011, p. 15). That students are not charged with defending their “version as an absolute” but are instead asked to contribute to a larger conversation remains central to the course.

Working with Existing Data

While DH work such as the Virtual St. Paul’s Cathedral Project continues to foreground methods-driven research, these projects can also overwhelm students. The research often has private foundation or government funding and may be years in the making. The project teams have a wealth of expertise that cuts across disciplines. So, in an effort not to turn students off to the prospect of diving into DH and methods-driven work, students have the opportunity to work with the datasets, tools, and even the methods sections of past students who have taken the course.

Students of literature, history, and other humanistic disciplines in the course should know that the opportunity to repeat or replicate research methods outside of close or critical readings is not standard for many of our fields. The more traditional understanding has been that “few scholars in the humanities have the time – or the expertise – to backtrack through cited studies and evaluate them for correctness and replicability” (Hayles, 2012, p. 68). DH scholarship invites research replication or at least provides the opportunity for other researchers to challenge a study’s methods in ways humanities scholars have not done in the past. We expect, for example, that one scholar’s close reading of a text or critical interpretation of a literary period would be met with other scholars countering that work with their own analyses. With a study’s individual methods on display for critique, challenges to DH scholarship are different in that counterarguments focus not just on the analysis but on the formation of the study itself.

For example, Nan Z. Da offered critiques of several prominent DH research projects specifically by scholars doing computational literary studies or “the statistical representation of patterns discovered in text mining fitted to currently existing knowledge about literature, literary history, and textual production” (2019a, p. 602). In one instance she scrutinizes the methods within a study designed to use detective fiction to examine whether genres changed over time. Da notes that just because a researcher has been able to formulate a study that produces desired results doesn’t mean those results are accurate or viable. After explaining how the use of pre- and post-1941 detective fiction and a corpus of “random fiction” are misaligned in the study’s models, she concludes the author is unable to support the claim of “genres not changing every generation” (2019a, p. 607). The study falls into a category she describes as presenting a “statistical no-result finding as a finding” (2019a, p. 607). Simply put, the results don’t uphold the claims. In this case she states pointedly that when the findings don’t support a claim, it may be that the study’s “method might have too little power to capture this kind of change.” In other words, desirable results from flawed methods don’t “necessarily mean that you have found something” (Da, 2019a, p. 608). Indeed, it is important to convey to students that a study’s methods are an essential foundation of a researcher’s argument.

Similar to Da’s critique, Bode (2018) offers an assessment of the methods found in the scholarship of prominent DH researchers including Moretti’s past publications (2005, 2013) and takes issue with how he understands and utilizes the vast literary datasets he works with to produce his distant reading studies. To be fair, while Moretti’s methodological approaches to distant reading were novel at the time, DH scholars have since advanced beyond them. Bode’s own project is a good example, as she begins with her quest to study fiction published in nineteenth-century Australian newspapers and to unpack and articulate “the complex relationships between documentary record, digitization, data curation, and historical analysis” (2018, p. 3). To do so, she relies on Trove, a digital archive that brings together collections from Australian libraries, museums, galleries, media sources, and government agencies. Her issue with Moretti and other earlier digital humanists stems from their presentation of “literary data and digital collections as precritical, stable, and self-evident…In Moretti’s work on literary history, literary data are consistently presented as facts rather than interpretations” (Bode, 2018, p. 20). Texts, as they circulate through communities of readers, take on varying meaning and significance that can be studied and investigated but not presented, as Bode reminds us, as “fact.” Bode’s correction to how we treat data is reminiscent of Miller’s earlier rejection of positivism and her call for technical and professional communication to understand its work as probable, contingent, and constructed. Bode is working with a public data source, and she published her book with a Creative Commons license through University of Michigan Press. Her work therefore invites reengagement with her data and her studies. According to Piper (2021), one of the biggest challenges for the field is its “lack [of] quality training data to model important concepts for cultural study” (para. 5). Piper serves as the editor for the Journal of Cultural Analytics, where scholars publish not just their research articles but also open access datasets inviting replicated research.

My argument here is that access to data and the replicated research it allows also reveals pedagogical opportunities. In the process of replicating smaller-scale projects carried out by prior students, new students may find ways to improve the methods, data, and analysis of older work. Improving or advancing a former study isn’t a requirement, but questioning and engaging with the former work is. Developing assignments that use existing research methods and datasets is an effective means to initiate students into methods-driven studies without first overwhelming them with the daunting task of designing a start-to-finish project. A data visualization assignment introduced early in the semester of a digital humanities course can better prepare students for longer research projects in that course and beyond. The assignment recycles datasets prepared by a former student (with permission, of course) and revisits the methods used to collect and render their data. Again, the assignment is used here as a module inserted at the beginning of a course designed to cover a variety of DH topics.

The project uses the open-source tool ImagePlot, software made popular by Manovich’s (2012) article, “How to compare one million images?” The student project examines the hue, brightness, and color saturation values of Marvel comic book covers from over half a century. The student’s methods section details their process for collecting and assembling a robust dataset to complete their visualizations. The project demonstrates how replicated research can be leveraged for pedagogical value, showing new students how formulating a research question, curating a large dataset, and discussing the research results are crucial for their success in the course. But, equally important, they will also see the necessary role a well-defined methods section plays in the study. With access to the dataset, methods, and results for the project, students are also actively learning DH in the process of replication.

Data Visualization and Comic Books

The proliferation of more affordable digital devices and their networked capabilities has helped spawn more new media content as well as enable access to it. Massive amounts of new and newly available content “has created a fundamentally new cultural situation and a challenge to our normal ways of tracking and studying culture” (Manovich, 2012, p. 250). The more traditional humanistic approaches of close reading or analysis of a small sampling of content are insufficient for identifying the many stylistic nuances that can appear over time across large amounts of data. This project takes its cues from Manovich’s influential work (2012) where he demonstrates what he terms “cultural analytics” by studying 1,074,790 contemporary Japanese manga pages (p. 262). The two-part methods section in Manovich’s piece introduces students to the ImagePlot software macro that 1) enables “digital image processing” of large datasets and can calculate visual characteristics of that data including values such as hue, saturation, brightness, and number of distinct shapes, and 2) creates visualizations of the dataset characteristics to represent, for example, changes in hue over time (pp. 262–263).
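The kind of per-image measurement ImagePlot performs can be illustrated with a short sketch. The following Python snippet is an illustration of the general technique, not ImagePlot’s actual code (the function name is my own); it uses only the standard library to compute the median hue, saturation, and brightness of a set of RGB pixels, scaled to the 0–255 range ImagePlot reports:

```python
import colorsys
from statistics import median

def median_hsb(pixels):
    """Return the median hue, saturation, and brightness of a
    sequence of (R, G, B) pixel tuples, each scaled to 0-255."""
    hues, sats, brights = [], [], []
    for r, g, b in pixels:
        # colorsys works on 0-1 floats; rescale results to 0-255
        h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        hues.append(h * 255)
        sats.append(s * 255)
        brights.append(v * 255)
    return median(hues), median(sats), median(brights)

# A toy "cover" of three saturated red pixels and one white pixel
h, s, v = median_hsb([(255, 0, 0)] * 3 + [(255, 255, 255)])
```

Plotting one such value per image against publication date is, in miniature, what the visualizations discussed below do for hundreds of covers at once.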

Manovich’s Software Studies Initiative website hosts print and video tutorials for using ImagePlot as well as sample datasets for new users to get started with the application. Because learning and using any new software tool can be a challenge for both student and instructor, ImagePlot and its supporting documentation make for a good choice in terms of its ease of integration into the course. The cross-platform tool is relatively simple to use and students reported only minor setbacks installing and running it (ImagePlot is technically a macro that requires the installation of a separate, free application called ImageJ in order to run and the relationship between the two applications can cause some confusion). While these tutorials are useful for students, using ImagePlot to replicate a former student’s research project is a more effective means for new students to experience a methods-driven approach to a research question requiring the curation of a larger dataset.

In this project, the student studied the now-iconic comic book hero, Captain America. Created in 1941, the character emerged from the uniquely charged socio-political moment of the Second World War. The character disappeared after the war and then reappeared as the copyright changed hands a couple of times before being acquired by Marvel. According to the student’s research, 1986 represents a pivotal turn in the comic book industry. The once one-dimensional crime fighters, or in the case of Captain America, Nazi fighters, were now being developed as individuals and became more complex and even darker, grittier characters. It was the year comics such as Watchmen and Batman: The Dark Knight Returns debuted. The cover of the first issue of Frank Miller’s The Dark Knight Returns shows just the silhouette of Batman leaping down from the upper-left corner against a dark blue sky and a lightning bolt bisecting the page. The student wanted to know if a similar shift, reflected in the darkening of hue, saturation, and brightness values of comic book covers along with a sparseness of lines and shapes, could be seen in other titles including their favorite, Captain America. Individual analyses of specific covers would simply report the more obvious presence of red, white, and blue color values. But what about measuring and visualizing the hue, saturation, and brightness as well as the standard deviation values of hundreds of covers over time? What would such an analysis reveal? ImagePlot also allows users to detect and count the distinct number of shapes found in every image. Fewer shapes wouldn’t necessarily signal a darker turn in the comic (as with the famous Dark Knight cover) but identifying shifts to and from a “busy” or shape-filled cover to a simplified design may reveal whether or not Captain America follows comic industry trends.
The Captain America figure is frequently depicted as the morally upright representative of American ideals who is then sharply contrasted by his foes that stand in for fascism, communism, and, in his present instantiation, international terrorism or alien invaders. What might we infer, if anything, if the Captain America franchise were to take a grittier turn?

Replicating the research that explores these questions begins with the methods section of the former student’s paper. It educates current students on the process of identifying, creating, formatting, naming, and saving image data and metadata. It also discusses some of the decisions the former student needed to make in terms of including or excluding data. The student needed to contend with Captain America’s publication hiatus as well as his appearance amongst other comic heroes. When he was reintroduced in 1964, for example, Captain America appeared as a member of the now wildly popular Avengers and in another Marvel series titled Tales of Suspense where he shared lead billing with Iron Man.

After discussing the challenges of their data selection, the student’s methods section describes how to access the Marvel Comics Unlimited website in order to obtain comic book cover images of consistent size and image quality. Their dataset ultimately contained 397 image files from 1941 through 1996. Using the .png file format, each cover was saved with a “c” to designate it as a comic and then a sequential number. So, Captain America Comics #1 was saved as “c1” and all files were archived in a single image directory. After capturing and naming each image, the student managed the rest of their data in an Excel worksheet that included columns for File Name, Series Title, Issue Number, Publisher, Month Published, and Year Published. After loading their files into ImagePlot, they would export the results and update the Excel file to include median and standard deviation values for hue, brightness, and color saturation. Their data visualizations of these values over time were saved as high-resolution .tif files. Figure 1 below shows this author’s attempt to replicate the student’s work visualizing the median brightness found in all 397 Captain America comic covers over time. Figure 2 is a close-up of a small section of the data visualization to show the overlapping brightness found in the comic covers as well as a couple of outliers from the mid 1980s. Because we have copies of the former student’s Excel file, I also showed students where I made an adjustment to the project’s methods. Specifically, it was easier to import the data into ImagePlot if I created a single date column that merged the month and year. To do so, I showed the class how to convert months to decimals so they fed correctly into this DH tool. For example, January 1941 became 1941.1668, February became 1941.2502, March became 1941.3336, and so on. The demonstration provided an important lesson on how the affordances of a particular tool may be incompatible with a research objective.
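The month-to-decimal conversion is easily scripted. The sketch below is my own reconstruction (the function and field names are hypothetical); it reproduces the mapping described above, in which each month adds an increment of 0.0834 to the year, beginning with twice that increment for January:

```python
def to_decimal_date(year, month):
    """Merge a year and a month (1-12) into a single decimal
    date value, e.g., January 1941 -> 1941.1668."""
    return round(year + (month + 1) * 0.0834, 4)

def add_date_column(rows):
    """Add a merged 'Date' field to each metadata row, as read
    from a worksheet with Month Published / Year Published columns."""
    for row in rows:
        row["Date"] = to_decimal_date(int(row["Year Published"]),
                                      int(row["Month Published"]))
    return rows
```

With the merged date column in place, the worksheet can be exported to the plain-text format ImagePlot reads.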

Thumbnail images of Captain America comic book covers appear overlapping and clustered together primarily in 120 to 200 brightness value ranges of the y-axis. There are four covers that appear to be outliers in between the years 1981 and 1991 with brightness values ranging from approximately 15 to 40.
Figure 1: Median brightness values (0-255) found in Captain America comic covers displayed over time
This image shows the four covers that appear to be outliers in the dataset with significantly lower brightness values. The x-axis is visible showing date ranges of 1981 through 1991.
Figure 2: Close-up section of figure 1 data visualization focusing on time between 1981 and 1991
In PixPlot, Captain America covers appear down the left-hand side in a narrow menu as clickable thumbnail images. The thumbnails represent clusters of the Captain America comic book covers grouped together by PixPlot. A majority of the screen shows a constellation of all the comic book covers in their clusters.
Figure 3: Captain America dataset rendered with PixPlot
The thumbnails on the left-hand side of the screen that represent clusters are clickable. This figure shows PixPlot after clicking on “cluster 9.” The comic book covers that make up that cluster are now visible in the right-hand side of the screen. Details including predominate colors of the comic book covers are easier to see up close.
Figure 4: Interactive hotspot of the Captain America dataset shown as “Cluster 9” within PixPlot

What the student found was a surprising amount of consistency of hue, brightness, and saturation across the Captain America comic covers. In addition to the expected reds, whites, and blues, yellow dominates as a hue value and is pervasive across the midsection of the hue data visualizations they created. There are two moments later in the comic’s publication where darker blues and blacks appear, but there is also a small spike in white and lighter coloring. This perspective on a larger amount of data is an instance of the expansion of distant reading where a large set of images and their corresponding metadata are visualized en masse, generating new vantages on the work. It is then up to the student to decide how to analyze and discuss this new perspective and these results. With the new visualizations, the “critic seeks not facts, but patterns. And from pattern the critic may move to grander rhetorical formations that constitute critical reading” (Ramsay, 2011, p. 17).

A common concern over the deployment of DH tools like ImagePlot is the worry that “[s]ome tools encourage intellectual laziness by obscuring methodology” (Tenen, 2016, p. 84). A benefit of using past projects is that new students get to see how former students developed their methods, results, and analysis. As noted, there are also opportunities for instructors and students to make modifications. The added benefits of a tool like ImagePlot are the sample datasets, directions, and discussion provided by Manovich and his team. Again, for the purposes of the course, students come to understand that what the “results mean and why they matter is open to interpretation” (Tenen, 2016, p. 85). The data visualizations and the patterns that emerged trigger any number of new questions to explore. Do the color changes coincide with a change in artistic direction or an illustrator? Perhaps they coincide with a change in intellectual property ownership and the publisher? World War II provided a very particular socio-political context for the comic in the 1940s, but what about the 1960s through the 90s? What if we were to expand the dataset to include the 21st century and the latest iterations of the Avengers? Research from scholars such as Risam (2018) and Walton, Moore, and Jones (2016, 2019) should prompt us to expand our questioning through the cultural implications of the comic book covers selected. What might the study reveal about depictions of nation-state iconography? How do the covers promote national or even jingoistic ideals? That new tools and methods spawn new questions means DH has been a boon for literary scholars interested in, as Ramsay says, making sure discussions continue. But for those in the fields of technical and professional communication, document design, and experience architecture, we need to continue to teach our students how to approach their work with fresh perspectives, too.

Tools and Data

Any instructor approaching an introductory course such as this one will need to decide how much time is dedicated to a specific digital tool or tools. This course is set up so students are exposed to a great number of tools by way of the presentations on mature DH projects. I make myself available to students when they are troubleshooting software that they have identified for use in their research projects, but I also set the expectation that they are responsible for researching and learning the tools necessary for their work. (For a discussion on focusing on a single tool, see Litterio’s (2021) “Digital Humanities in Professional and Technical Communication: Results of a Pedagogical Pilot Study,” which centers on the open source web-publishing platform Omeka). In the world of software, ImagePlot is old technology. Even though the software is quite functional for the purposes of the course, an interesting twist on the assignment is to use the student’s research question and data as a starting point but introduce new software tools. For example, PixPlot from Yale’s DH lab (2017) renders clusters of image data that allow web-based interactivity enabling a user to pan and zoom through the data visualizations (Fig. 3). It, too, allows for the introduction of a variety of metadata to help shape the outputs and even create interactive hotspots within the data visualization (Fig. 4). There are others, too, such as Tableau, Timeline JS, and RAWGraphs that all bring differing abilities for rendering visualizations. What is important for students to understand when using these DH tools, or any software tool, is that the interpretation and analysis of the results is still a separate process and will depend on the different theories and methodologies employed to underpin and contextualize the work.

For those who don’t have existing projects to teach from, many if not most DH tools have sample datasets and methods-driven instructions for using the tools and analyzing the data. ImagePlot is no exception, and Manovich’s team assembled sample data containing 776 van Gogh paintings created by the artist between 1881 and 1890. They added metadata in a spreadsheet for each of the paintings that included the year and month the painting was created as well as a classification for it (e.g., self-portrait, landscape, still-life). They also included the title of the painting and where the artist was living at the time (Paris or Arles in the south of France). The researchers wanted to challenge some common generalizations about the artist’s work including language used to describe the art found in prominent museums that house his paintings. Specifically, van Gogh’s move to Paris is often said to come with a concurrent brightening of his palette and shorter brush strokes in his works. The distant reading of hue, brightness, and color saturation over time reveals much more overlap in the artist’s color palette than expected. Visualizing the work based on visual properties of hundreds of paintings all at once suggests that it is difficult to claim stylistic periods based solely on where he lived.

Instructors may wish to introduce this research replication assignment with the requirement that students recreate all or a portion of a dataset. As a general rule, I require students to at least recreate a portion of the sample dataset. Student researchers can begin the ImagePlot assignment by visiting the online Van Gogh Museum and searching for paintings and drawings that can also be filtered by location. They can also visit the Marvel comics website for (now limited) open access cover art or the comic book cover archive, Cover Browser. Data gathering and curation are essential to student final projects, so asking students to go through the process at this stage can help ensure success at the end of the semester. The activity is also a key link to a core understanding about data. Specifically, “the practice of identifying, classifying, and arranging objects and concepts relative to each other in other forms of information storage and representation is by its very nature a rhetorical endeavor” (Applen & McDaniel, 2009, p. 97). Or as Bode says of her work with Trove, her “curated dataset embodies an argument” (2018, p. 204).

Students who aren’t interested in working with images and image metadata may want to practice with any number of tools that collect and curate text. For example, technical communication researchers may already be familiar with the MassMine project, a DH tool designed to collect (scrape) social media data from sites like Twitter. Academic members of the technical communication community already help make up the advisory board for MassMine (MassMine Team, n.d.). MassMine runs from the command line. Users set up tasks and queries along with other filters to specify the data they will be collecting. Because social media sites like Twitter contain a robust amount of metadata, or data about a tweet, the tool is a good way to expose students to thinking about what they want out of a dataset and to begin asking these questions of their data gathering in the first place. Depending on the size and level of the course, the data curation step is an effective means to get students doing DH work.

Again, the activities described above using ImagePlot and existing data from a prior student project are treated as a module that I typically include early in the semester of an introductory DH course. Introducing students to ImagePlot, sample student data, and the sample data that comes packaged with the tools requires two weeks’ worth of class time. More time is added if other tools such as PixPlot or Tableau are part of the instruction. Teaching these tools together makes sense in a course where several modules, or perhaps the entire course, focus on data and humanities visualizations. Regardless of the modules assembled for the course, time is allotted each class period for student presentations on mature DH projects in an effort to keep the focus on methods. In addition to the Zotero archive, Carnegie Mellon University’s Digital Humanities Literacy Guidebook has a thorough collection of DH topics. The topics range from 3D Modeling to Web Archives and are useful to instructors wishing to assemble DH modules for their courses.

Formulating Questions

Because this course requires students to complete a methods-driven digital research project by the end of the semester, beginning with a focus on methods may appear to be putting the proverbial cart before the horse. That is, why focus on methods in advance of students having a defined research question? Regular communication and transparency with students about the pedagogical strategy for the course are essential. From the outset, students are assured that their research questions will be developed and refined over the first half of the semester and that it benefits them to develop their questions amidst a course staged with methods at the epicenter of the pedagogy. The strategy gets students ready to argue with and for their methods – something most of them have reported not having to contend with before. Nevertheless, in an introductory course such as this, there is often anxiety surrounding “the development of the researchable question” (Teston & McNely, 2013, p. 219). As Hoover advises, “the fact that a problem is computationally tractable does not mean that a definitive or certain solution is necessarily possible; nor does the fact that a problem is computationally intractable mean that a legitimate and effective argument cannot be made about it” (Hoover, 2016, p. 233). He gives the example of questioning whether or not William Faulkner’s sentences are longer than those of Henry James. It is a tractable problem to solve, but one whose answer does little to advance the fields of study around either author. Regardless, as students begin to draft research questions for their final projects, I do steer them toward specific questions and not broad topics about, for example, Southern literature. Where possible, the focus is on computationally tractable questions that contribute to a larger conversation.
Because I do spend more time in class with data visualization tools like ImagePlot, PixPlot, and Tableau to replicate prior student work, students often do formulate questions that naturally involve those tools. For example, in different iterations of the course there were students interested in film who devised questions using ImagePlot. One was looking for patterns across movie posters representing the films that won academy awards in different categories. The other wanted to examine the common understanding that the eight films in the Harry Potter series get visually darker as they progress. The student wanted to prove the theory but also be able to answer how dark, exactly, and how quickly its hues and brightness dim over the span of the films.

CONCLUSIONS

This course begins with a declaration of its methods-driven pedagogy, with Smagorinsky’s (2008) influential article on research methods serving as an epicenter for our learning. Students see a sample presentation on a mature or established DH project from the instructor that highlights how the project’s research questions and methods (including tool selection) are reflected in the work. Students are then provided guidelines for their own presentations as well as links to DH archives. Each week, students will see a different presentation from one of their peers. If the enrollment in the course is high, students are grouped into presentation teams. The instructor provides links to prior student projects including their datasets and papers containing their research questions, methods, results, and analyses. As noted above, if an instructor is teaching a DH course for the first time, there are sample datasets as well as research methods and discussions that accompany many tools including Manovich’s ImagePlot. According to Norton, “Part of the power of using DH methods as teaching tools derives from the active learning baked into the process” (Norton, 2019, p. 302). Whether students are replicating the research of former students or using sample datasets from other DH tools, this hands-on element is essential to the course. It is also recommended that even if a curated dataset is available from a past project, students are tasked with recreating at least a portion of the dataset. Again, the construction of the dataset itself should be understood as a rhetorical act. Curation means facing decisions about what data to include, how to capture and format it, as well as any necessary metadata needed to inform the work. This type of hands-on, active learning can be messy and, at times, frustrating. Students experience how “doing DH is not as simple as choosing a digital tool and then combining that tool and tactic with a given methodological approach” (Boyle, 2015, p. 112, italics in original). Students may have to experiment with different tools to genuinely understand the affordances of each and how well they do or do not serve their research questions and methodological frames. In several iterations of an introductory DH course that uses versions of this framework, I have found that the course is able to make room for the research interests of the diverse humanities majors who enroll and provide them with enough structure and guidance to successfully complete their own projects.

ACKNOWLEDGEMENTS

The author would like to thank the editorial team at Communication Design Quarterly and the peer reviewers for their thoughtful and thorough feedback. He would also like to thank the students in his DH classes for their smart, insightful work.

REFERENCES

A Digital Humanities Manifesto » The Digital Humanities Manifesto 2.0. (2009, May 29). http://manifesto.humanities.ucla.edu/2009/05/29/the-digital-humanities-manifesto-20/

About. (n.d.). Keystone DH 2021. Retrieved November 16, 2021, from https://keystonedh.network/2021/about

Allington, D., Brouillette, S., & Golumbia, D. (2016, May 1). Neoliberal tools (and archives): A political history of digital humanities. Los Angeles Review of Books. https://lareviewofbooks.org/article/neoliberal-tools-archives-political-history-digital-humanities

Alvarado, R. C. (2012). The digital humanities situation. In M. K. Gold (Ed.), Debates in the digital humanities (pp. 50–55). University of Minnesota Press.

Applen, J. D., & McDaniel, R. (2009). The rhetorical nature of XML: Constructing knowledge in networked environments. http://site.ebrary.com/lib/alltitles/docDetail.action?docID=10320403

Berish, F. (2021, November 10). You’re Invited: Virtual GIS Days (Nov. 15-19) [Personal communication].

Berry, D. M. (Ed.). (2012). Understanding digital humanities. Palgrave Macmillan.

Bode, K. (2018). A world of fiction: Digital collections and the future of literary history. University of Michigan Press. https://doi.org/10.3998/mpub.8784777

Boyle, C. (2015). Tactical and Strategic: Qualitative Approaches to the Digital Humanities. In J. Ridolfo & W. Hart-Davidson (Eds.), Rhetoric and the digital humanities (pp. 111–126). The University of Chicago Press.

Brier, S. (2012). Where’s the pedagogy? The role of teaching and learning in the digital humanities. In M. K. Gold (Ed.), Debates in the digital humanities (pp. 390–401). University of Minnesota Press.

Carter, S., Jones, J., & Hamcumpai, S. (2015). Beyond territorial disputes: Toward a “disciplined interdisciplinarity” in the digital humanities. In J. Ridolfo & W. Hart-Davidson (Eds.), Rhetoric and the digital humanities (pp. 33–48). The University of Chicago Press.

Clement, T. E. (2016). Where is methodology in digital humanities? In M. K. Gold & L. F. Klein (Eds.), Debates in the digital humanities: 2016 (pp. 153–175). University of Minnesota Press.

Cohen, J. J., & Yates, J. (Eds.). (2016). Object Oriented Environs. punctum books.

Cordell, R. (2016). How not to teach digital humanities. In M. K. Gold & L. F. Klein (Eds.), Debates in the digital humanities: 2016 (pp. 459–474). University of Minnesota Press.

Course Archive (2001-2021). (2021). Digital Humanities Summer Institute. https://dhsi.org/course-archive-2001-2021/

Da, N. Z. (2019a). The computational case against computational literary studies. Critical Inquiry, 45(3), 601–639. https://doi.org/10.1086/702594

Da, N. Z. (2019b, March 27). The digital humanities debacle. The Chronicle of Higher Education. https://www.chronicle.com/article/the-digital-humanities-debacle/

Evans, L., & Rees, S. (2012). An interpretation of digital humanities. In D. M. Berry (Ed.), Understanding digital humanities (pp. 21–41). Palgrave Macmillan.

Ferster, B. (2013). Interactive visualization: Insight through inquiry. MIT Press.

Fyfe, P. (2016). Mid-sized digital pedagogy. In M. K. Gold & L. F. Klein (Eds.), Debates in the digital humanities: 2016 (pp. 104–117). University of Minnesota Press.

Galloway, A. (2016, March 27). The digital in the humanities: An interview with Alexander Galloway (M. Dinsman, Interviewer) [Interview]. Los Angeles Review of Books. https://lareviewofbooks.org/article/the-digital-in-the-humanities-an-interview-with-alexander-galloway

Goldstone, A. (2019). Teaching qualitative methods: What makes it hard (in literary studies). In M. K. Gold & L. F. Klein (Eds.), Debates in the digital humanities 2019 (pp. 209–223). University of Minnesota Press.

Greenspan, B. (2019). The scandal of digital humanities. In M. K. Gold & L. F. Klein (Eds.), Debates in the digital humanities 2019 (pp. 92–95). University of Minnesota Press.

Hashimov, E. (2015). Qualitative data analysis: A methods sourcebook and The coding manual for qualitative researchers [Review of the books Qualitative data analysis: A methods sourcebook, by M. B. Miles, A. M. Huberman, & J. Saldaña, and The coding manual for qualitative researchers, by J. Saldaña]. Technical Communication Quarterly, 24(1), 109–112. https://doi.org/10.1080/10572252.2015.975966

Hayhoe, G. F., Brewer, P. E., & Hughes, M. A. (2020). A research primer for technical communication: Methods, exemplars, and analyses (Second edition). Routledge.

Hayler, M., & Griffin, G. (Eds.). (2016). Research methods for creating and curating data in the digital humanities. Edinburgh University Press.

Hayles, N. K. (2012). How we think: Digital media and contemporary technogenesis. The University of Chicago Press. http://site.ebrary.com/id/10547388

Hirsch, B. D. (Ed.). (2012). Digital humanities pedagogy: Practices, principles and politics. Open Book Publishers.

Hoover, D. (2016). Argument, evidence, and the limits of digital literary studies. In M. K. Gold & L. F. Klein (Eds.), Debates in the digital humanities: 2016 (pp. 230–250). University of Minnesota Press.

Ingraham, C. (2015). Theory in a transdisciplinary mode: The rhetoric of inquiry and digital humanities. Poroi, 11(1), 1–25. https://doi.org/10.13008/2151-2957.1188

Jakacki, D., & Faull, K. (2016). Doing DH in the classroom: Transforming the humanities curriculum through digital engagement. In C. Crompton, R. J. Lane, & R. G. Siemens (Eds.), Doing digital humanities: Practice, training, research (1st ed., pp. 358–372). Routledge.

Jones, N. N., Moore, K. R., & Walton, R. (2016). Disrupting the past to disrupt the future: An antenarrative of technical communication. Technical Communication Quarterly, 25(4), 211–229. https://doi.org/10.1080/10572252.2016.1224655

Kirschenbaum, M. (2012a). Digital humanities as/is a tactical term. In M. K. Gold (Ed.), Debates in the digital humanities (pp. 415–428). University of Minnesota Press.

Kirschenbaum, M. (2012b). What is digital humanities and what’s it doing in English departments? In M. K. Gold (Ed.), Debates in the digital humanities (pp. 3–11). University of Minnesota Press.

Kynell-Hunt, T., & Savage, G. J. (Eds.). (2003). Power and legitimacy in technical communication. Baywood Publishing Company.

Kynell-Hunt, T., & Savage, G. J. (Eds.). (2004). Power and legitimacy in technical communication: Volume II. Strategies for professional status. Baywood Publishing Company.

Leavy, P. (2017). Research design: Quantitative, qualitative, mixed methods, arts-based, and community-based participatory research approaches. Guilford Press.

Litterio, L. M. (2021). Digital humanities in professional and technical communication: Results of a pedagogical pilot study. Technical Communication Quarterly, 30(1), 77–88. https://doi.org/10.1080/10572252.2020.1789744

Liu, A. (2013). The meaning of the digital humanities. PMLA/Publications of the Modern Language Association of America, 128(2), 409–423. https://doi.org/10.1632/pmla.2013.128.2.409

Manovich, L. (2012). How to compare one million images? In D. M. Berry (Ed.), Understanding digital humanities (pp. 249–278). Palgrave Macmillan.

MassMine Team. (n.d.). Retrieved July 8, 2021, from https://www.massmine.org/about.html

Miles, M. B., Huberman, A. M., & Saldaña, J. (2013). Qualitative data analysis: A methods sourcebook. SAGE.

Miller, C. R. (1979). A humanistic rationale for technical writing. College English, 40(6), 610–617. JSTOR. https://doi.org/10.2307/375964

Moretti, F. (2000). Conjectures on world literature. New Left Review, 1, 54–68.

Moretti, F. (2005). Graphs, maps, trees: Abstract models for a literary history. Verso.

Moretti, F. (2013). Distant reading. Verso.

Noble, S. U. (2019). Toward a critical black digital humanities. In M. K. Gold & L. F. Klein (Eds.), Debates in the digital humanities 2019 (pp. 27–35). University of Minnesota Press.

Norton, D. (2019). Making time: Workflow and learning outcomes in DH assignments. In M. K. Gold & L. F. Klein (Eds.), Debates in the digital humanities 2019 (pp. 300–306). University of Minnesota Press.

Piper, A. (2021, October 29). Our data. Journal of Cultural Analytics. https://culturalanalytics.org/post/1214-our-data

Porter, J. E. (2013). How can rhetoric theory inform the practice of technical communication? In J. Johnson-Eilola & S. A. Selber (Eds.), Solving problems in technical communication (pp. 125–145). The University of Chicago Press.

Ramsay, S. (2011). Reading machines: Toward an algorithmic criticism. University of Illinois Press.

Ridolfo, J., & Hart-Davidson, W. (Eds.). (2015). Rhetoric and the digital humanities. The University of Chicago Press.

Risam, R. (2018). New digital worlds: Postcolonial digital humanities in theory, praxis, and pedagogy. Northwestern University Press. https://muse.jhu.edu/book/62714

Rude, C. D. (2009). Mapping the research questions in technical communication. Journal of Business and Technical Communication, 23(2), 174–215. https://doi.org/10.1177/1050651908329562

Smagorinsky, P. (2008). The method section as conceptual epicenter in constructing social science research reports. Written Communication, 25(3), 389–411. https://doi.org/10.1177/0741088308317815

Smith, M. J. (2014). Meeting John Donne: The Virtual Paul’s Cross Project. Spenser Review, 44(2). Retrieved June 17, 2021, from https://www.english.cam.ac.uk/spenseronline/review/volume-44/442/digital-projects/meeting-john-donne-the-virtual-pauls-cross-project/

Spinuzzi, C. (2003). Tracing genres through organizations: A sociocultural approach to information design. MIT Press.

Spinuzzi, C. (2008). Network: Theorizing knowledge work in telecommunications. Cambridge University Press.

Stephens, S. H. (2019). A narrative approach to interactive information visualization in the digital humanities classroom. Arts and Humanities in Higher Education, 18(4), 416–429. https://doi.org/10.1177/1474022218759632

Svensson, P. (2012). Beyond the big tent. In M. K. Gold (Ed.), Debates in the digital humanities (pp. 36–49). University of Minnesota Press.

Temple University Libraries. (n.d.). Past program and event videos. Retrieved November 17, 2021, from https://library.temple.edu/watchpastprograms/show?id=cd128c3c-d03d-47cc-8a52-978da44d721d

Tenen, D. (2016). Blunt instrumentalism: On tools and methods. In M. K. Gold & L. F. Klein (Eds.), Debates in the digital humanities: 2016 (pp. 83–91). University of Minnesota Press.

Teston, C., & McNely, B. (2013). Undergraduate research as collaborative knowledge work. In R. McClure & J. P. Purdy (Eds.), The new digital scholar: Exploring and enriching the research and writing practices of NextGen students (pp. 211–232). American Society for Information Science and Technology by Information Today, Inc.

Underwood, T. (2016, May 4). Versions of disciplinary history. The Stone and the Shell. https://tedunderwood.com/2016/05/04/versions-of-disciplinary-history/

Underwood, T. (2019). Digital humanities as a semi-normal thing. In M. K. Gold & L. F. Klein (Eds.), Debates in the digital humanities 2019 (pp. 96–98). University of Minnesota Press.

Varner, S. (2016, May 6). A few thoughts on the whole DH, neoliberalism, LARB thing – Stewart Varner. https://stewartvarner.net/2016/05/a-few-thoughts-on-the-whole-dh-neoliberalism-larb-thing/

Wall, J. (n.d.). Virtual Paul’s Cross Project: A Digital Re-creation of John Donne’s Gunpowder Day Sermon. https://vpcp.chass.ncsu.edu/

Wall, J. (2014a). Transforming the object of our study: The early modern sermon and the Virtual Paul’s Cross Project. Journal of Digital Humanities, 3(1). http://journalofdigitalhumanities.org/3-1/transforming-the-object-of-our-study-by-john-n-wall/

Wall, J. (2014b). Recovering lost acoustic spaces: St. Paul’s Cathedral and Paul’s Churchyard in 1622. Digital Studies/Le Champ Numérique, 0(0). https://doi.org/10.16995/dscn.58

Walton, R. W., Moore, K. R., & Jones, N. N. (2019). Technical communication after the social justice turn: Building coalitions for action (First edition). Routledge.

Yale DHLab. (n.d.). PixPlot. Retrieved March 31, 2022, from https://dhlab.yale.edu/projects/pixplot/

Yeats, D., & Thompson, I. (2010). Mapping technical and professional communication: A summary and survey of academic locations for programs. Technical Communication Quarterly, 19(3), 225–261. https://doi.org/10.1080/10572252.2010.481538

ABOUT THE AUTHOR

Brian Ballentine is a professor and the chair of the Department of English at West Virginia University where he teaches in the Professional Writing and Editing program.

Digital Humanities and Technical Communication Pedagogy: A Case and a Course for Cross-Program Opportunities