Claire Lauer
Arizona State University



The Desert Southwestern US is in the middle of a 20-year drought that has led policymakers to reassess the amount of water allocated from the Colorado River to various states in the region. On May 20, 2019, leaders from the Colorado River Basin (CRB) states, the Navajo Nation, and Mexico, all signed the historic Drought Contingency Plan (DCP) that would help mitigate, for the short term, the effects of drought that have depleted the water levels of the Colorado River and associated reservoirs to between 35–41% of their capacity. The Colorado River supplies 15 million acre-feet per year of water to the region, so ensuring an adequate and consistent supply to these states is critical to maintaining economic stability. Adding to this burden on the water supply of CRB-reliant states have been increases in population, development, and uncertainty about future drought. The passage of the Drought Contingency Plan demonstrates an understanding of the sacrifices that need to be made across all sectors to work together to develop water management solutions over the long term. Reclamation Commissioner Brenda Burman lauded the effort after the signing by saying, “These agreements represent tremendous collaboration, coordination and compromise from each basin state, American Indian tribes, and even the nation of Mexico” (Bureau of Reclamation, 2019).

Up to this point, our water managers and policy makers have done much of the difficult coordination in making sure our communities have enough water. However, as drought continues and we “max out” the ways in which technology can help us use water more efficiently (e.g., low-flow toilets and showerheads; high-efficiency agricultural irrigation systems), people will be called upon to express their opinions about the kinds of choices they are willing to make, as a community, to manage a diminished supply. Helping community members better understand their water systems—where their water comes from and who the stakeholders are that rely on a consistent supply—is thus essential to equipping individuals with the knowledge they need to contribute to the health and well-being of their communities. Equally essential, however, is understanding what people’s lived experiences with water have been, what they value, and what they feel they need to know to help them make meaningful choices.

This paper reports on the challenges that arose when a water modeling system built for experts was adapted for a public museum audience. It introduces the concept of transactional design—integrating Druschke’s “transactional” model of rhetoric and science and Kinsella’s model of “public expertise”—to demonstrate how technical communication and user experience (UX) designers and researchers can play an essential role in helping scientists cultivate meaningful relationships with members of the public toward the goal of making scientific content more accessible and actionable. It provides a checklist of steps that technical communication and UX designers and researchers—as those who best understand audiences and work directly with users—can use to set up knowledge-making partnerships that join scientists’ knowledge with users’ expertise in the co-construction of public-facing scientific communication projects.


DroughtSim1 is an interactive mathematical computer simulation model developed by scientists at a water policy center in the Desert Southwest to help experts and policy-makers study the effects of factors such as population, temperature, precipitation, and agricultural production on the region’s water supply over the next fifty years (see Figure 1).

Experts interact with DroughtSim via an interface projected across several large screens in a room called a “Decision Theater.” These large screens can accommodate the available inputs and outputs for experts to analyze in graphical form after running the model (see Figures 2 and 3).

This image is a complex and detailed flowchart that illustrates how the DroughtSim model operates, including annual inputs about available groundwater, recharge water, and river water, as well as annual outputs about how water is pumped, released, reclaimed, and delivered in the overall water supply and demand system.
Figure 1: Flowchart of the DroughtSim model

Several years ago, a National Museum2 approached the water policy center about adapting the DroughtSim for public museum audiences who live in the communities that would be hosting a traveling version of the National Museum as it makes its way around the US. This particular National Museum traveling installation is themed around people’s relationships with water, and the National Museum wanted to provide an adapted simulation experience for the public to learn how their water supply and demand system works, who the major water users are, where supplies come from, and how water affects their local economy and environment. The National Museum was scheduled to travel to rural towns all over the country, visiting a new town every two months or so.

The museum version of the DroughtSim, called “DroughtSim America” (DSA), that scientists at the center developed challenges museum visitors to successfully manage their region’s water supply-and-demand in a time of drought. It uses a complex array of regional and state-specific water data from the area where the museum is currently visiting. In addition to asking users to make choices about how much water to take from various sources to meet demand (e.g., groundwater, surface water, reclaimed water), it asks them to enforce various levels of efficient water use among community stakeholders (e.g., townspeople, industry, and farmers). Scientists hoped that visitors at the exhibit would learn “That [users] have a seat at the table and everyone needs to be acknowledged and recognized. That [water supply and demand] is a complex system with multiple benefits and multiple perspectives” (personal communication, February 15, 2019).
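The kind of annual balancing the simulation asks users to perform can be sketched in a few lines of code. The toy model below, including its variable names and figures, is an illustrative assumption of mine and is not drawn from the actual DroughtSim mathematics, which involves a far more complex array of regional data.

```python
# Hypothetical, highly simplified sketch of the annual supply-and-demand
# balance a simulation like DroughtSim America asks users to manage.
# All names and numbers here are illustrative, not the real model's.

def annual_balance(demand_af, allocations, efficiency):
    """Return the surplus (+) or shortfall (-) in acre-feet for one year.

    demand_af   -- total annual demand in acre-feet
    allocations -- dict of water drawn from each source, in acre-feet
                   (e.g., groundwater, surface water, reclaimed water)
    efficiency  -- fraction of demand avoided through the conservation
                   measures the user imposes on stakeholders (0.0-1.0)
    """
    supply = sum(allocations.values())
    effective_demand = demand_af * (1.0 - efficiency)
    return supply - effective_demand

# A user meets a 100,000 acre-foot demand by mixing sources and by
# requiring 10% more efficient use from townspeople, industry, and farms.
result = annual_balance(
    demand_af=100_000,
    allocations={"groundwater": 40_000, "surface": 45_000, "reclaimed": 10_000},
    efficiency=0.10,
)
print(result)  # prints 5000.0 (an acre-feet surplus for the year)
```

Even in this stripped-down form, the sketch hints at the tensions visitors were asked to negotiate: every acre-foot of surplus comes either from drawing down a source or from imposing efficiency requirements on some stakeholder group.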

The goals the scientists articulated were noble; however, the scientists had built the DroughtSim America (DSA) interface based on the expert interface and did so without input from any members of the rural audience who would be interacting with the DSA in the museum environment. As the simulation was developed without regard to the particular needs of its target audience and, as I will show, followed common deficit and conduit models of development, the simulation did not meet the goals the scientists had in mind nor did it connect with the needs of users. Ultimately, the complexity of the system scientists were trying to communicate did not connect with the complexity of the lived experiences of the rural audience tasked with using the system, leaving participants confused and uninformed.

When visitors to museum exhibits, parks, and other tourist sites feel incompetent or confused in the face of problems they are being asked to solve, they are demonstrating what Gianna Moscardo calls “mindlessness.” Moscardo adapted Ellen Langer’s research on mindfulness to show the ways in which museum installations, national park sites, and other tourist experiences can be improved to encourage more mindful, engaged, and satisfied visitors. In contrast to mindlessness, mindfulness occurs when a visitor

Actively processes new information, creates new categories for information, and thinks about new ways to behave. Mindfulness is associated with more learning, better decision making, increased self-esteem, and feelings of control and enjoyment. (Moscardo, 1999, p. 25)

Mindfulness is imperative to visitors’ being able to interact meaningfully with sustainable, ecological, and conservation-based science knowledge. As this paper will discuss, transactional design is essential to ensuring mindful experiences because it requires that scientists value the knowledge and experience of the people whom they are trying to reach (what I refer to as “user expertise”) as equal to their own. But to better understand that process, and how much of a challenge adopting a transactional design approach to science communication may pose, we must first examine a few flawed yet pervasive science communication models that privilege technical rationality and scientific expertise at the expense of the user’s expertise.

This image shows the expert version of the DroughtSim interface. A user would interact with this interface on a large screen with the assistance of a water scientist. They would choose percentages from a number of policy choice options, including about reclaimed wastewater, farm water use, water extracted from the Colorado River, and forecasted population growth. A scientist would then run the model and output results, as both percentages and in graphs, about a number of sustainability indicators, including groundwater levels, environmental quality, agricultural production, and others.
Figure 2: Close-up of expert DroughtSim interface.


Communicating science to non-scientific audiences has historically embodied a “sage on the stage” relationship between scientists and the public. Walsh (2013) shows how scientists throughout history have adopted a prophetic ethos when trying to convince the public of the validity of scientific information. Scientists have also historically communicated their findings by relying on logical appeals supported by scientific data (Briselli, 2013). These appeals are still common, despite the fact that “public understanding and attitudes do not develop solely in response to objective reasoning” (p. 3). Kinsella (2004), citing Fisher; Pearce and Littlejohn; and Toumey, suggests that scientists default to using logical appeals because people’s perspectives about complex scientific issues are diverse and often deeply entrenched, leading them to “converge on technical rationality as the only widely accepted form of argument.” Yet, as technical communicators are often acutely aware, communication based on assumed technical competency silences expressions of individual and community values and “constitutes a formidable practical and symbolic barrier to increased citizen participation” (p. 83).

The long-debunked but persistently employed “deficit” model of science communication is one such model that privileges technical rationality. This model describes a top-down communication structure in which scientific information flows from the scientist to the uninformed but receptive community member. It assumes that if scientists could just distribute their knowledge more widely—could just get their knowledge out there—a receptive audience of community members would happily learn from and act upon it. However, this model ignores myriad factors that influence a person’s relationship with scientific knowledge and the ways they use that knowledge to inform their opinions and actions. In a recent report by the National Academies of Sciences, Engineering, and Medicine, the authors address why this model is ineffective:

The deficit model assumes that if a message about scientific information is well crafted for one audience, it should meet the needs of other audiences as well. In fact, effective science communication is affected by the context and requires engagement with different audiences in different places at different times, taking account of what they want to know and already know, understand, and believe. (NASEM—my emphasis)

The NASEM is the latest in a long line of voices who have challenged scientists’ reliance on a deficit model (Irwin, 2014, 1995). Yet it persists, I argue, because there have not been compelling enough consequences for scientists to act differently. Funding agencies, when they require public-facing communication at all, do not yet require that scientists find ways to empirically prove that they are “taking account of what [users] want to know and already know, understand, and believe” in the communications they develop for the public.

This image shows a moderator in the Decision Theater operating the DroughtSim interface with participants in the room.
Figure 3: DroughtSim projected in the Decision Theater.

Caroline Gottschalk Druschke and Bridie McGreavy (2014) describe this kind of contextual (rather than deficit) model as one that “involves interaction and two-way communication, emphasizing the importance of building trust and offering scientific information relevant to particular public audiences” (p. 47). Better understanding audiences and their responsiveness to scientific information has drawn sustained interest from scholars in cognitive science and psychology. For instance, researchers in the cultural cognition program at Yale University study the ways in which people are receptive—or not—to information that may challenge their existing knowledge or values. These researchers have developed a grid of classifications to help explain people’s habits in resisting or accepting new scientific information (Kahan, 2010).

Similarly, scholars of rhetoric have become valued for their understanding of persuasive rhetorical appeals, especially the role that credibility (ethos) and appeals to emotion (pathos) play in complementing more common appeals to logic (Vernon, 2014). However, even if scientists are made aware of what scholars from other disciplines can help them understand about users, it may not prevent them from engaging what Goodwin (2014) calls a “conduit” model of communication, where scientists look to other disciplines, such as rhetoric, for techniques to “wrap around the science content they will provide” (p. 3).

Rather than simply enhancing the communication of scientific knowledge, however, Spoel et al. (2008) suggest that rhetoric can be used to

Engage audiences in caring about what is being explained. In other words, it is a question of engaging the whole person through complex and rich rhetorical means, weaving together ethical, logical, and emotional proofs. It is a question of telling stories…that connect the science to people’s everyday knowledge, lives, values, and concerns (p. 53)

Appealing to the “whole person” in any communication context includes being willing to empathize with people’s concerns, knowledge, and values. Rodriguez and Davis (2015) suggest that science communicators need to begin by asking, “Who are we talking to?” and in doing so, take the extra steps to develop an understanding of different audiences’ values, life experiences, cultures, disciplines, and communities, what can collectively be thought of as their “expertise.”

This approach should feel familiar to technical communication and UX specialists, who are trained in a range of research methods that they employ to get to know users’ expertise. But connecting UX specialists with scientists can be challenging if scientists aren’t aware that they need to connect at all. Druschke (2014) discusses the importance of this interdisciplinary collaboration as one that positions

Scientists and rhetoricians of science as co-constructors of engaged science that gets things done in the world… This shift calls us to turn the focus away from exchange (what science gains from rhetoric and vice versa) and towards conceiving of our work as a necessary and integral part of the engaged practice of science itself” (p. 2)

Druschke elevates the importance of the work that rhetoricians bring to science by integrating the two fields into a single shared process that transcends each separate area of study. She seeks to move beyond what she calls a “transactional” relationship between rhetoricians and scientists to a more interdisciplinary approach that will improve the practice of science itself.


However, when developing science communication experiences for the public, I advocate refocusing our attention on just such a transactional relationship, one that does not turn away from, but instead actively embraces, the idea of “exchange.” A transactional model acknowledges the obvious divides between the knowledge that scientists and the public each bring to a communication situation, but rather than positioning one as in deficit to the other, it registers the knowledge of each as equal.

A transactional design model champions what Kinsella (2005) calls “public expertise.” Kinsella cites Fisher and Forester’s work advocating for a dialogue in which “the local knowledge of ordinary citizens and the abstract knowledge of technical experts interact synergistically to provide more complete analyses and more effective decisions” (p. 89). For synergistic interaction to occur, however, scientists need to be as open to learning from and valuing the experiences of the public as they expect the public to be open to learning from the data scientists have to share. Transactional design proceeds through an iterative process that doesn’t treat audience needs as secondary to science. It emphasizes, rather, that science communication only works when researchers study audience expertise—including finding out what members of the audience want to know, already know, understand, and believe—and acknowledge that audiences have knowledge of equal value to the scientific knowledge that scientists are trying to communicate. Technical communication and UX specialists can play an essential role in helping this occur.

To illustrate the need for such a model, I will first trace the evolution of the DroughtSim America development process, which illuminates what technical communication and UX specialists need to be aware of when working with scientists who are already invested in the scientific validity of a complex system. Scientists, understandably, value their own knowledge and habits of mind, so I will also address the ways UX specialists can work with scientists who may value the accuracy of data and the complexity of relationships among data above all else, including above audience engagement. I will also discuss how UX specialists can move away from unsuccessful models of science communication toward a transactional design model that allows them to intervene in the early stages of a project to ensure positive outcomes and mindful interactions. This paper will conclude by identifying the features of a transactional design model and discussing why it is so important for UX specialists to champion this model with scientists if they hope to engage users in mindful ways.


By April 2016, the scientists had finished building their initial version of the DroughtSim America, adapted to fit on an iPad that could easily travel to the rural towns around the country where the museum would be hosted. It was not until after this initial build (and not long before the simulation was set for release to the first few towns), that the UX team was asked to consult on clarity and design issues with the user interface. It should be said that this initial consultation was only requested because the leads of both the UX team and scientists’ research center had recently been introduced to each other through a leadership organization at their university. It is only because of this introduction that the scientists even became aware that technical communication and UX research and design were an area of expertise. This is a challenge that I will discuss more later.

After the UX team conducted an initial audit of the interface and flagged basic image, color, and language issues, we discovered that the interface had not been shown to any users in the target audience. We were granted IRB approval to research users engaging with the system in a range of environments. A graduate student team member of ours was living in the southeastern United States at the time, close to where the museum would soon be visiting, so we sent an iPad with her to the region and conducted usability tests in two rural towns there. She brought the iPad to a local hardware store in one town and to a library in the other town. She recruited fifteen community members to interact with the simulation using a think-aloud protocol and asked them structured interview questions about their experience afterward. Based on that testing, we submitted a host of recommendations for improvements to the interface design, instructions, and task flow. We also developed personas of rural users for whom the scientists should design as we worked with them to make revisions to the system.

Testing the system with rural users laid bare how different their relationships with water were compared to those of the urban professionals who had been the only audience members to have interacted with the expert system up to that point. For instance, whereas urban dwellers rely a great deal on urban infrastructure to deliver water to their taps and grocery stores, every rural user we spoke to had a ready answer to “What would you do if you ran out of water?” Responses to this question included, “I would dig a well,” “I would use water I already collect in water tanks,” “I would get water from the three rivers,” and “I would filter water from the irrigation reservoir.” Our rural participants did not seem particularly surprised at the possibility that their government might not be able to provide a consistent water supply during a time of drought.

These responses show inventiveness, resilience, and fortitude. Unfortunately, these insights were put aside as the UX team became preoccupied with more basic visual, textual, and technical improvements to the system that seemed essential if users were even going to be able to interact successfully with the system in the ways that scientists had hoped. One layer of improvements was visual and involved making the graphic, font, and color scheme more visually pleasing and modern. Another was linguistic: making sure the instructions and interface verbiage used descriptive language that people understood and that they didn’t perceive as too technical. Another was experiential: making sure people understood how to use the system and what the system was intended to help them figure out. A final one was technical: making sure that the timers and other back-end controls worked in a way that encouraged people to play with the simulation through the two scenarios that were included with it. Because there were so many layers the UX team was trying to address in the name of a user’s ability to successfully interact with the system, and because the UX team had been asked to consult after the initial version of the simulation had been fully developed, we adopted more of a troubleshooting role. This felt productive at the time but would prove ultimately insufficient to meeting the users’ needs and accomplishing the scientists’ goals.

Throughout this period, the UX team felt that it was doing genuinely useful work: in response to user feedback, we developed a new layout, color scheme, and font scheme, as well as new graphics and text that would be added to the interface to explain the results of a user’s choices in more active, concrete terms (See Figure 4). We also developed a narrative game-like scenario that would set up a user’s interaction with the simulation. We recorded a screen-cast video that presented the narrative scenario and the instructions on how to use the simulation. The game scenario we devised positioned the user as traveling back from the future, where drought had ruined the community, to help the citizens of today use the DroughtSim to enact the best measures to avoid future drought. We felt that contextualizing the water management responsibilities of the simulation in more urgent, problem-solving terms would impart a sense of heroic responsibility on the user and provide a more exciting challenge.

This image shows the revised DroughtSim interface designed for the National Museum Exhibit and including a redesigned color scheme, images, layout, and text.
Figure 4: Revised DroughtSim America Interface.

However, our efforts proved to be misplaced. In October 2018, the National Museum was stationed at a site within two hours of our university, so we visited the site and observed users there for four hours. Overall, users did not choose to approach or engage with the DroughtSim (the iPad screen appeared too small to be noticed among the other colorful exhibits), and those who did showed difficulty understanding the instructions for how to use the system. We witnessed technical errors in which built-in timers were not giving users enough time to complete the simulation before kicking them off of it. We also observed that other sounds accompanying other exhibits in the museum made hearing the game scenario and instructions difficult. This was hugely consequential and made us realize that gamifying science content is not always more effective than more traditional approaches if the complexity of the gaming scenario proves more cognitively burdensome than what people are expecting to experience (Giannakos, 2013; Koenig, 2008). The enjoyment and motivation people have come to expect through a gaming experience does not always translate to positive learning outcomes (Papastergiou, 2009) and may even have the potential to significantly decrease retention and transfer when compared with more traditional methods (Adams et al., 2012). In this case, asking users to engage in a game that established a time-travel scenario was simply too complex to be conveyed using difficult-to-hear audio, even if also transcribed on the screen. We observed firsthand people growing impatient at the level of cognitive investment we were asking them to make in this casual museum environment.

In light of our findings, we replaced the game scenario and video instructions with a printed instructions board that we designed to surround the iPad. This would serve two purposes: One would be to make the area that the exhibit occupied larger and more attractive to users; the other would be to make the instructions ever-present so that users could refer back to them as they were engaging with the simulation. Throughout November 2018, we user-tested iterative versions of the instructions with 37 students from the Psychology 101 course on our campus. We asked some to interact with the simulation individually, and others in pairs, and interviewed each of them following their sessions. All sessions occurred in our program’s UX lab. We rapidly prototyped revisions to the instructions boards between days when students interacted with the simulation, adding clarifications and content where needed without overloading the display. The instructions board was sent to the next town in the traveling museum’s path and, though we were unable to travel there to see it, the museum manager in that town called to say that he found the improved exhibit more engaging and easier to interact with. Problem solved!

However, as we continued our research with users, we noticed that people were still largely confused about what the purpose of the simulation was and what they were supposed to get out of it. Because the simulation presented water as a supply-and-demand system that included all variables and results on a single screen, it remained conceptually too complex for people, and they felt ill-equipped to manage water in a meaningful way on that level. This is where we began to realize that we hadn’t done a good enough job researching what people already knew about water (e.g., “How much water do you use in a day?” “Where do you think the water you use comes from?”) and what they felt would be useful to learn. Although some of our college-aged participants seemed to know very little about water, our rural and older participants were eager to share their personal expertise with water.

We found ourselves circling back to the kinds of conversations we had had with the participants we interviewed in the rural southern towns a year earlier. We had asked those participants questions about their lives and their relationships with water because we were interested in developing personas. In that research, our rural southern participants showed substantial knowledge about local sources of water—as well as a resilience about their own ability to secure water—that should have alerted us to their potential disinterest in learning about water systems on an abstract level. In setting aside these insights, we had inadvertently pivoted toward a “conduit” approach to design, where we were developing rhetorically driven design and language improvements to “wrap around” the existing scientific interface we had started with rather than interrogating how suitable the design of this kind of interface (drawing from a complex scientific model) itself would be for a rural, public audience. This conduit approach was also encouraged by the scientists themselves, who had invested years of effort into a complex system that they had neither the time nor funding to completely redesign.

So, in the spring of 2019, we continued to test the DroughtSim system, with the additional goal of figuring out what users really wanted to talk about with regard to water. Because the museum is often set up in the community centers and libraries of rural towns, we brought two iPads and instruction boards to a rural library community gathering of senior citizens. Eighteen seniors participated in our mock-museum. We interviewed the seniors about their water knowledge and water use, asked the seniors to interact with the exhibit, then interviewed them after their interaction. Our participants’ eagerness to pivot the conversation away from discussing the system and toward their real-life experiences with water and drought reinforced just how interested users were to share what they knew and find out more about what they could do to help their communities. The DroughtSim was left behind in our conversations as confusing and irrelevant.


In this section, I summarize the challenges that prevented us from addressing the deficiencies of the DroughtSim system more successfully. These challenges are likely to arise when working with even the most well-meaning of scientists because scientists may not value users’ expertise as equal to their own. The points of disconnect apparent in the challenges I discuss below illustrate the need for UX teams to adopt a transactional design approach when working with scientists on public-facing science communication projects.

Challenge 1: Scientists undervalue users

Technical communication and UX specialists working in commercial contexts can rely heavily on the results of well-structured user testing to prove to even the most skeptical executives that changes need to be made to a system or interface. However, unlike a commercial product that relies on satisfied customers to remain profitable, scientists typically do not earn revenue from users, and thus have less incentive to make their systems usable. Notice the contrast between what the scientist quoted below said they were hoping to achieve and how the user reacted to the experience:

[the DroughtSim] is a tool that we can put out there in the universe; it’s user directed to help awaken citizens of simple concepts about water. We are hoping it’s a learning experience and that it comes with some new understandings about where their water comes from and how it’s being used

—Scientist’s response when asked about the purpose of the DroughtSim (personal communication, February 5, 2019)

It was very confusing. We just faked it. We basically guessed on what we were doing.

—Community participant’s response after using the DroughtSim (personal communication, February 27, 2019)

Any commercial website that leaves users in a state of confusion will lose revenue and credibility. However, a user’s inability to understand scientific content is more easily seen as the user’s own fault and may not be met with empathy.

In the case of DroughtSim, because so many academics and policy makers had interacted with the expert system, the scientists assumed that novice users of the new system would need little special accommodation, despite the fact that the system would be shown, unmoderated, on a drastically smaller screen, in a museum setting (instead of a "decision theater"), and interacted with by people who might not be particularly interested in, or knowledgeable about, its purpose or topic. Not attending well enough to these technological and contextual differences proved problematic.

Challenge 2: Scientists are unfamiliar with effective UX design processes

Scientists' undervaluing of the DroughtSim's museum users began with their truncated design process. Figure 5 shows a typical UX design process, which I have modified to show the development path the scientists on the DroughtSim project took to adapt DSA. With the expert version of the DroughtSim already developed, the scientists we worked with began by brainstorming communication solutions—rather than first trying to understand anything about the users who might interact with the solutions they would develop. It is unsurprising that scientists would follow this truncated process because, at this point, they have likely already developed the science that informs the public-facing version of the project. This encourages scientists to start later in the design process, rather than starting with the user.

Through this truncated process, scientists would start in the ideate phase rather than the empathize phase. If they follow the conduit model described by Goodwin, scientists wouldn’t consult a communication, design, or UX specialist until the prototype phase, when they would seek out communication techniques to “wrap around the science content they will provide” (p. 3). This approach assumes that the scientific content is objective and exists outside of any user experience or rhetorical context. In the case of the DroughtSim, the scientists already had their expert system ready to adapt, and they started at the ideate phase, brainstorming ideas about how the DroughtSim should be modified for the smaller iPad screen. They did not conduct user research in an attempt to better understand the rural visitor for whom they were designing. They also did not talk to potential museum visitors. They did not have anyone outside of the development team test the simulation at any phase.

This image shows a typical UX process, including the five stages of Empathize, Define, Ideate, Prototype, and Test. This image is annotated to show that for scientists, the process starts at Ideate, then goes to Prototype, then goes to "wrapping" content in communication techniques, then ends in Delivery, essentially skipping Empathize, Define, and Test.
Figure 5: Typical UX Design Process (Briselli, 2014), modified to show an example of the scientists’ process.

When the UX team was brought in, we immediately prioritized conducting user research to discover usability issues that prevented potential museum visitors from engaging with the simulation successfully. Technical communication and UX specialists consider this research essential to the success of any project. So why didn't the scientists make sure to do such research themselves? There are several probable reasons for this. First, it takes a lot of time to conduct user research. You have to find the right users (in this case, rural citizens not living in proximity to our urban university environment), set up authentic contexts of interaction, and then be practiced enough with qualitative observation and interview methods to glean useful data from the experiences of your users. The scientists would have needed to get IRB approval to conduct this qualitative user research, including recording interviews and observations, which may be just enough of a hindrance and take just enough time out of a person's schedule to act as a dissuading factor. The scientists would have also had to be familiar with effective UX research methods and the language to describe and design the research they would need. Our UX team, by contrast, had experience obtaining IRB permission for UX projects, which made the IRB and testing design process relatively easy by comparison.

More consequentially, the scientists were doing all of the development work on this project pro bono for the National Museum, without any funding provided beyond equipment costs. The scientists had accepted the invitation from the National Museum because they thought it would bring visibility to the research center and could be leveraged to apply for additional funding from other grants. As a result of working pro bono, even if the user testing was of little to no cost to conduct, major modifications that might be recommended from the results of the user testing would be cost-prohibitive.

Finally, researching the usability of a system to meet the needs of a non-scientific user is not typically the kind of work that would result in scientific papers or advancement of scientific knowledge. The scientific knowledge for this project had been advanced when the scientists were able to incorporate more complexity through the integration of a greater number of data sources into the model. Modifying the complexity of the system into something simpler and more accessible, by contrast, would delay such advancement. For these reasons, there was little payoff in optimizing the user experience for the museum installation. Ideally, the public would learn about the tradeoffs necessary to balance a complex water supply and demand system. But ultimately, for the scientists, the project would be deemed successful if the team were able to build a glitch-free system that conveyed accurate data and would maintain the scientific credibility of the research center.

Challenge 3: Scientists underestimate the need for UX funding

Not recognizing the extent to which user research and testing are essential to effective user engagement means that the scientists underestimated the need to seek funding for both the design and testing of the system. Because the scientists were working pro bono, there was no money to be had for design and testing of the translated system. However, during this time the scientists wrote a grant to a state philanthropy that awarded the center money to develop a classroom version of the DroughtSim and pay teachers to attend training on how to use the system and integrate it into their curricula. Because of the scientists' overall lack of experience with UX, however, they overlooked the opportunity to request money for the design, user testing, or improvement of the classroom system in their grant request. All of this development work seemed to be taken for granted, likely because the system already existed in one form (the expert form), so translating it to another form (the classroom form or the museum form) was assumed not to require much technical effort. It may also have been the case that the scientists felt that including such funding requests might suggest that their system wasn't in perfect working order, which would have compromised their ethos on the grant. It is equally likely that the scientists simply did not know the language to use to ask for and justify design and UX funding because they had never had to request such funding in the past.

Challenge 4: Scientists prematurely fixate on the mode of delivery

Scientific research is often complex in nature, requiring significant background knowledge of methods, techniques, and past research; thus, its results may only be understood by other researchers. Scientific research is typically disseminated to other scientists in the form of scholarly publications and presentations that are written at a more technical level than lay audiences would understand. Once findings have been disseminated in scholarly form, however, scientists may have difficulty translating their findings in a way that connects to a different audience altogether. They may engage in what design fields call premature fixation, where a scientist becomes resistant to change as their design—or in this case their findings and conclusions—takes on a high level of complexity or detail (Robertson, Walther, & Radcliff, 2007).

In the case of the DroughtSim, the scientists had spent years developing a sophisticated and complex model that had been used by technical experts and policy makers interested in exploring how changes to a number of environmental, economic, and policy variables might affect our vulnerability to drought. The model was what they had been awarded funding from the National Science Foundation to develop and it was what had gotten the attention of people at the National Museum, who asked them to develop a modified version for their traveling museum.

The model captured the complexity of regional water systems, including how water availability and demand would affect agriculture, industry, the economy, and the environment.

When adapting the model for non-technical users, the scientists wished to communicate this complexity to show users that, in drought scenarios, there will be no easy answers or simple solutions. The scientists recognized that the interface needed to be simplified from the original, providing a museum visitor with fewer variables to manipulate on the iPad versus the giant screens in the large room where technical users were able to interact with the original simulation. And yet, aside from simplifying the options a user could choose between, the scientists did not pursue major revisions to the experience flow nor did they interrogate their assumption that asking users to participate in a simulation and make regional, systems-level choices to balance their region’s water supply was the best way to connect with them about drought.

What our user testing showed, however, was that users didn't seem to know enough about water systems to care about whether they were able to balance their regional system. Instead, users told us about their experiences with water and what they thought other people needed to know about drought. Many were aware that drought was a distinct possibility in their region. Many predicted, though with wildly varying estimates (e.g., "in a few years?" "maybe ten years?" "fifty years from now?"), that the region would experience a water crisis. But when users were asked to interact with the simulation, many weren't interested in making system-level choices as if they were a water manager.

In our interviews, users also said that they wanted to understand how drought would affect them and what they could do to help their communities adapt. Moscardo (1999) asserts that creating mindfulness in users requires that the exhibits they interact with connect to their personal experience and tell a good story. She establishes that there is often a disconnect between what visitors want to learn from an experience and what the signs and other information about that experience are telling them. For instance, in one study of tourists to the Wet Tropics World Heritage rainforests in Australia, Moscardo found that 50 percent of users surveyed wanted to know about conservation issues and what they could do to protect rainforests. However, only 5–16 percent of the signs delivered that kind of information. In contrast, only 36 percent of visitors expressed a desire to learn about specific plants and animals, yet 83 percent of the signs contained that kind of information (1999). Similarly, the DroughtSim did very little to connect to what users already knew, or what they wanted to get from the experience.

Challenge 5: Scientists privilege technical accuracy and complexity

The scientists we worked with on this project had a depth of knowledge about their subject matter that was magnetic. They provided thoughtful answers to all our questions about water and drought, data, modeling, and how to make the best predictions possible amidst a great deal of uncertainty. But when we shifted our role from “curious questioner” to “user representative” they became a bit stubborn about making any changes to their system that would compromise fidelity to the complex picture they had painted for us. They didn’t think simplifying the complexity of the system would accurately represent the real-life situation. It also wouldn’t make people more aware of the full depth of factors that the scientists had spent so many years integrating into their model.

In simplifying a non-technical user's interaction with a complex model, a designer typically needs to sacrifice some of the complexity of what appears to determine the outcome of a system. But in addition to protesting that users wouldn't experience what they needed to if the system were overly simplified, the scientists also voiced concern that, were another scientist to interact with the simplified version at one of the museum sites, that person might question the intellectual credibility of the center for putting out a product that didn't faithfully reflect all of the variables that determined outcomes. One scientist insisted that the system be able to withstand the scrutiny of any water manager in the area who might happen to walk into the museum and interact with it. This scientist needed to make sure that his reputation as a scientist wouldn't be compromised, no matter what that meant for the rural museum visitor. Accuracy and complexity were more important than the user experience because accuracy and complexity were what the scientists felt judged on. Becoming aware of what scientists value, which may be at odds with what would facilitate an engaged process for users, will better prepare UX professionals for having productive discussions with the scientists they are collaborating with.


A transactional design approach

As these challenges reveal the extent to which scientists may believe in the supremacy of their own scientific knowledge over the experience of the user who might interact with that knowledge, technical communication and UX specialists can embrace a transactional design approach from the beginning of their involvement on projects to work with scientists more successfully.

In this section I will discuss the features of a transactional design approach and share examples of how those features have been implemented effectively.

Transactional design values the localized experiences of users and connects to those experiences with the ultimate goal of cultivating mindfulness. Mindfulness is cultivated when users experience variety, multisensory media, novelty, and interactivity, and when the content makes connections with where visitors are coming from, what they know, and what they value. See Figure 6 for the chart that Moscardo developed to illustrate the process.

This image shows Moscardo's five-step mindfulness model, including communication factors, visitor factors, cognitive state, organization of content, and consequences. The Communication Factors step includes the following attributes: Variety and change, Uses multisensory media, Novelty/conflict/surprise, Use of questions, Visitor control/interactive exhibits, Connections to visitors, and Good physical orientation. The Visitor Factors step includes: High interest in content, Low level of fatigue, and Lack of distractions. The Cognitive State step includes "Mindful" and is pointed at with arrows from the first two steps. The Organization of Content step includes: Clear structure matched to what visitors know. The Consequences step includes: 1. More learning, 2. High satisfaction, and 3. Greater understanding.
Figure 6: A model for ensuring mindfulness in the experience of museum visitors (Moscardo, 1999).

If we start with the radical notion that a user’s expertise (i.e. their knowledge gained through experience) is as valuable as scientists’ knowledge when it comes to developing effective public-facing science communication projects, what actual process can we follow to ensure mindful engagement? The following checklist can be used to help a development team of both scientists and UX specialists embrace a transactional design process:

Pre-Step 1) Proactively seek out projects in your organization or municipality by networking with scientists and looking for grants that have been funded or are in development at your organization or within your municipality. It is not likely that scientific research teams are going to prioritize the public-facing science communication components of their efforts (if there are any) early in the process, so the more visible you can make your UX and technical communication teams/labs, the more effective you will be at introducing essential interventions early in the process. I have been invited to consult on projects because of my work on other projects, but I also end up connecting with projects because I tell people what I do for a living when they ask. I met the lead scientist for the DroughtSim project through a leadership group at my university. And the latest project I consulted on was with a volcano research group in which one of the members happened to swim in the lane next to me during our morning master’s swim group.

Step 1) Use UX research methods (surveys, interviews, etc.) to discover what users have experienced (what expertise they've developed) related to the general subject you're working on, including what they already know, what they want to know, and what they think others should know. This research will form the basis of the public expertise you will start from as you bridge over to the knowledge that the scientists bring to the experience as well. As both our interviews and the results of a survey confirmed (Granias et al., 2018), users aren't always aware of what they may want to know about a topic, so it's better to start by finding out what they value and what personal connection they have to it. This research should happen very early in the process, and data should be collected in a systematic way that can be used to defend future design decisions.

Step 2) Interview scientists to discover the goals and outcomes they have for users. This shouldn't be confused with what knowledge they have to share. You will want to ask about that, obviously. But equally important are the outcomes they would like to see for users: what should users be able to do with their experience? How do scientists hope users will think, feel, and act? This will provide a holistic scope for the UX team and be useful information to help justify design choices.

Step 3) Put aside any product or interface that was developed before talking to users. You may return to it later, or it may inform the ultimate direction the project goes in, but as it is entirely the product of the scientists’ knowledge, it is, at this point, only a partial perspective.

To illustrate why this is important, when our UX team was asked to consult on a different project in which scientists were building a tool to help low-income users lower their utility costs, we found that the scientists had already begun to design the application and its features before talking to any users; they had even gone so far as to develop personas without actually talking to any people. However, when we showed them results of our user testing from the DroughtSim project, the scientists were persuaded to halt all development until they had a chance to talk to actual users from their target user groups. DroughtSim was a project they admired, yet after hearing how confused users had been, they didn’t want their project to generate the same levels of frustration.

Step 4) As you begin to brainstorm user flow and design ideas, start with what users know and have experienced (user expertise) as the very first thing they interact with in your interface, product, or exhibit. This can be done using imagery, language, and questions that they can relate to and understand. This will ensure buy-in, and a sense from the user that their perspective is valued and appreciated. For instance, one successful element of the DroughtSim was the one-minute introductory video that users watched before beginning the simulation to orient them to the wide range of ways people typically interact with water. I worked with the scientists on the script for this video to make sure that the language they used was not too technical or abstract, and that the points being made were paired with images that illustrated what was being narrated. The script for the video made direct reference to recognizable and tangible places and things that people associate with their experiences with water.

But connecting in tangible ways with users also means avoiding value propositions. For instance, important to the success of the video was our careful editing of a point in the video that discussed the environmental consequences of drought. The health of the environment has become associated with certain political values that might not be shared by all users; thus, we made a key change to the script that attempted to avoid "taking sides." The original sentence stated: "The environment is usually at the end of the water line and thus often natural wetlands and streams suffer from reduced water supplies." To paint a more human-inclusive picture of the consequences of drought on the environment, we revised the sentence to read: "Water for the environment supports fish and wildlife habitat and provides recreation opportunities." This change did what Dan Kahan (2010) recommends, which is to redirect our emphasis from an abstract (and often controversial) discussion of values (e.g., the environment as victim to harmful human action) to one that discusses activities the audience is more familiar with. In other words, people may respond skeptically to a discussion about how lower water levels cause the environment to suffer. But by framing the discussion in terms of people's ability to participate in the water-based outdoor activities they enjoy (e.g., using rivers and lakes for fishing and recreation), people who may have seen the environment as competing with their other water needs now see how drought might negatively affect their more concrete experiences—like their ability to recreate outdoors.

Virtually every user we interviewed, when asked about the video, said that they had enjoyed watching it and that it was just the right length. Users didn’t cite learning much new information from the video, but they at least were not confused by it and reported enjoying its content.

Step 5) From the initial connection point, architect the remainder of the experience using conditional or skip logic. This is a common survey design technique used to combat survey fatigue by programming a survey to adapt subsequent questions or content based on the responses that users provide (Lauer, Blythe, & McLeod, 2013). Valuing a user's expertise means starting with what they know and have experienced. But users don't all know the same things, nor do they want to learn the same things. Skip logic can accommodate a wide range of starting points and proceed toward common learning outcomes. For instance, I know from the results of our user research that people don't generally know how much water they use daily, nor are they aware of what proportion of a community's water supply is used by people vs. farmers vs. industry. Rather than asking these questions directly, you could design an interactive experience where you ask users to pour water from the well into various buckets labeled "people," "farmers," and "industry." Then, you could program the interaction to respond to their inputs using actual averages from their region, followed by explanations as to where water comes from and how water use has changed over the past twenty years. You could then focus specifically on the area where the largest disconnect seemed to occur. For instance, several participants we interviewed suggested that local housing development was responsible for most of the stress on our water system. They were surprised to see that, in their area, farming was actually responsible for 75% of all water use.
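The branching behavior described above can be sketched in a few lines of code. This is a hypothetical illustration of skip logic, not code from the DroughtSim project: the sector labels, the regional averages, and the screen names are all invented for the example.

```python
# Illustrative skip logic for the bucket-pouring interaction.
# The sectors and percentages below are invented for this sketch,
# not real regional data.

REGIONAL_AVERAGES = {"people": 20, "farmers": 75, "industry": 5}  # % of supply

def compare_guesses(guesses):
    """Given a user's guessed percentages per sector, return the sector
    with the largest gap from the regional average and the size of that gap,
    so the experience can branch to the biggest misconception first."""
    gaps = {sector: abs(guesses.get(sector, 0) - actual)
            for sector, actual in REGIONAL_AVERAGES.items()}
    largest = max(gaps, key=gaps.get)
    return largest, gaps[largest]

def next_screen(guesses):
    """Skip logic: route the user to the content panel for the sector
    they misjudged most, instead of a fixed linear sequence."""
    sector, gap = compare_guesses(guesses)
    if gap == 0:
        return "summary"          # guesses matched; skip remediation
    return f"explain_{sector}"    # branch to that sector's explanation

# A visitor who blames housing development might guess that "people"
# use most of the water; the logic routes them to the farming panel.
print(next_screen({"people": 70, "farmers": 20, "industry": 10}))
# → explain_farmers
```

The design choice here mirrors the survey technique: every user reaches the same learning outcome, but the path adapts to the gap between their expertise and the data.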

Step 6) Test with users early and often, and in context. Testing in context can make you aware of all kinds of environmental factors that may interfere with the ability of users to interact with your installation, from volume, to power supply, to internet reliability, to other issues. Had our team not visited the museum itself, we would have missed several obstructions that we never anticipated would prohibit users from interacting successfully with the exhibit.

Step 7) Acknowledge the limitations of scientific data when communicating to non-scientific audiences about scientific issues of global importance. Scientists typically value accuracy and credibility above all else. It can thus be challenging to adapt science so that it connects to what an audience might value from the experience. Scientists may see compromising on the complexity of content in science communication as compromising on the science itself. A transactional design process begins with a user’s expertise and bridges the gap between how users want to be able to act and what scientists have to teach to make that action meaningful. Users bring their own knowledge, experiences, and desires to the data scientists have to share which, together, can engender meaningful change.

Step 8) Make the case to funding agencies that this work is valuable, and establish your UX work as the science that it is. User testing takes time and making changes to existing systems takes money, and not all scientists will write this kind of work into a grant proposal. That said, there are areas of funding in grants that can be shifted to accommodate new work, so you can ask about those. Our team initially worked pro bono, but after the scientists recognized the value we were bringing to the process, they found money in their existing grant to fund our work.

The bigger concern is that until funding agencies require that scientists build effective, public-facing science communication projects into their grants, scientists are not professionally rewarded for this kind of work, nor might they consider it even to be "science." For example, I was sitting in a presentation with the scientists from the center and a representative from the NSF, to whom we were reporting about the activity that the center had completed on the grant in the year since the NSF had last visited. The director of the center gave me the opportunity to present the museum installation and classroom interfaces we had developed and to discuss the user research we had conducted and the recommendations we were making to improve the interface. However, when the director transitioned from my discussion of the interface to another research area that the grant was funding, he said, "and now getting back to the actual science…" (personal communication, September 17, 2018).

I was struck by this phrase because I knew the director was impressed by what the UX team had accomplished. But he was talking to a funding agency that had not given money to support this “softer” activity and I assume he felt compelled to reinforce that the money the center had been awarded to complete scientific research was funding hard science. Our UX design and research work had not been part of the grant and it was not how the center would be assessed on their delivery of what the NSF had given them money to research. It was not considered actual science. This misconception is not likely to change until technical communication and UX designers successfully lobby funding agencies to require public facing communications on all projects and evaluate the effectiveness of those communications using valid UX methods.

It is thus essential to establish your UX work as a science that speaks to systematic data collection and validity. Employing valid methods to study the effectiveness of science communication shows UX research to be the science that it is. Our team demonstrated our value not just through our initial recommendations and design experience, but by conducting our UX research in a systematic manner. We designed valid user studies; we collected rich, meaningful data. We presented our findings accurately and professionally. Eventually, we were encouraged to get involved in writing grants and articulating how UX should be included in the grants scientists wanted to write. Knowing that scientists are not practiced in making the case for design and UX makes it especially imperative for UX professionals to be proactive in articulating how much of a grant should be spent on the design and testing of a system and why. But this advocacy can start at the organization level as well.

We are currently working with our technical communication and UX master’s programs to develop a slide deck and pamphlet themed around what UX can bring to science that we plan to distribute to departments across our campus. In it, we show our understanding of science and the values of scientists we’ve worked with. We also communicate how important it is to understand the user. We highlight the particular expertise we have in doing so and discuss our research methods to make UX seem like the science that it is, instead of some friendly gesture or cosmetic exercise.


Conclusion

Public-facing science communication projects carry tremendous potential to shape the way members of the public think and act about issues of local and global consequence. The transactional design checklist I present in this paper provides a way forward through some of the common challenges that may arise in working with scientists on public-facing communication projects. Valuing users' expertise, as a combination of their lived knowledge and experience, is essential to the success of any science communication project. Because this expertise is not typically recognized by scientists, the transactional design checklist will help technical communication and UX specialists be stronger advocates for users.

The future of our very planet rests upon the policies that our global leaders enact in response to the pressure applied by their constituents. As the most vulnerable communities around the globe will be hit the hardest by climate change, infectious disease, and other global issues, ensuring mindful public communication of science through a transactional design model is nothing less than an issue of social justice. Connecting global publics to scientific knowledge will ensure that people have equal access to the tools and information they will need to advocate for positive and meaningful change in their communities. Myriad outlets exist that have the potential to provide people with engaging access to scientific knowledge, but they must be designed in an engaging way that connects to a person’s existing experience. Technical communication and UX specialists must work with scientists to design experiences that can achieve this.


Acknowledgments

I would like to thank the scientists and administrators who initiated the DroughtSim project for giving me the opportunity to work with them. I would like to thank Madhura S. for her generous contributions to the interface design and instructions board. I would also like to thank Robin C. for her research with visitors in the southeastern U.S. and her development of personas from that research. Finally, I would like to thank the many other graduate and undergraduate students who participated in user research on the DroughtSim.


Notes

1. A pseudonym

2. A pseudonym


References

Adams, D. M., Mayer, R. E., MacNamara, A., Koenig, A., & Wainess, R. (2012). Narrative games for learning: Testing the discovery and narrative hypotheses. Journal of Educational Psychology, 104(1), 235–249.

Briselli, J. (2013). Demanufacturing doubt: A design strategy for science communication (Doctoral dissertation, figshare).

Briselli, J. (2014, August 26). Designing science communication. Retrieved from

Bureau of Reclamation. (2019, May 20). Interior and states sign historic drought agreements to protect Colorado River. Retrieved from

Druschke, C. G. (2014). With whom do we speak? Building transdisciplinary collaborations in rhetoric of science. Poroi, 10(1), 10.

Druschke, C. G., & McGreavy, B. (2016). Why rhetoric matters for ecology. Frontiers in Ecology and the Environment, 14(1), 46–52.

Giannakos, M. N. (2013). Enjoy and learn with educational games: Examining factors affecting learning performance. Computers & Education, 68(1), 429–439.

Goodwin, J. (2014). Introduction: Collaborations between scientists and rhetoricians of science/technology/medicine. Poroi, 10(1), 5.

Granias, A., Martin Rogers, N., Absar, K. & Helmstetter, C. (2018). How Americans relate to water: A qualitative study for the Water Main. [Whitepaper]. APM Research Lab and Wilder Research.

Irwin, A. (2014). From deficit to democracy (re-visited). Public Understanding of Science, 23(1), 71–76.

Irwin, A. (1995). Citizen science: A study of people, expertise, and sustainable development. Routledge.

Kahan, D. (2010). Fixing the communications failure. Nature, 463(7279), 296.

Kinsella, W. J. (2004). Public expertise: A foundation for citizen participation in energy and environmental decisions. In Communication and public participation in environmental decision making (pp. 83–95).

Koenig, A. D. (2008). Exploring effective educational video game design: The interplay between narrative and game-schema construction (Unpublished doctoral dissertation). Arizona State University.

Lauer, C., McCloud, M., & Blythe, S. (2013). Online survey design and development: A Janus-faced approach. Written Communication, 30(3), 330–357.

Moscardo, G. (1999). Making visitors mindful: Principles for creating quality sustainable visitor experiences through effective communication. Sagamore Publishing.

National Academies of Sciences, Engineering, and Medicine. (2017). Communicating science effectively: A research agenda. National Academies Press.

Papastergiou, M. (2009). Digital game-based learning in high school computer science education: Impact on educational effectiveness and student motivation. Computers & Education, 52(1), 1–12.

Robertson, B. F., Walther, J., & Radcliffe, D. F. (2007). Creativity and the use of CAD tools: Lessons for engineering design education from industry. Journal of Mechanical Design, 129(7), 753–760.

Rodríguez Estrada, F. C., & Davis, L. S. (2015). Improving visual communication of science through the incorporation of graphic design theories and practices into science communication. Science Communication, 37(1), 140–148.

Spoel, P., Goforth, D., Cheu, H., & Pearson, D. (2008). Public communication of climate change science: Engaging citizens through apocalyptic narrative explanation. Technical Communication Quarterly, 18(1), 49–81.

Vernon, J. L. (2014). Leveraging rhetoric for improved communication of science: A scientist's perspective. Poroi, 10(1), 1–6.

Walsh, L. (2013). Scientists as prophets: A rhetorical genealogy. Oxford University Press.


About the Author

Claire Lauer is an associate professor of technical communication and user experience at Arizona State University. She researches how to communicate scientific information to public audiences, how people read data visualizations, and how to effectively design data-driven interfaces for researchers. She has also published on how we use language to describe changes in text production, how technology impacts creative thinking and design, and how the work of technical communicators has adapted in response to technological innovation. She is past chair of ACM's Special Interest Group for the Design of Communication (SIGDOC) and past vice chair of operations for the SGB Executive Council of the Association for Computing Machinery.

Implementing a transactional design model to ensure the mindful development of public-facing science communication projects