Monsters versus Mobs: A Call to Unleash the Scientist/Narrator Hybrids
Written by Knigel Holmes
Technology cannot speak for science without frightening the townsfolk.
The Modern Prometheus tries speaking, tries explaining, but his patchwork throat provokes fear from a crowd on the brink of becoming a lynch mob. If scientists want to avoid inciting irrational fear of and misunderstanding towards science and technology, and if these professionals value a conscientious community, their secret laboratory must now unleash Prometheus 3.0, the scientist/narrator hybrid. These re-engineered titans will crush unsupported assumptions of science communication while communicating facts through a creative arsenal, including weapons-grade storytelling aimed at personal and cultural values. The scientific community does want the public to understand scientific research and its expert consensus, but scientific evidence does not support several traditional assumptions of science communication. The scientific community should therefore take sides on public scientific issues and employ science communicators with a wide range of creative techniques, ones which use facts as well as compassion to connect with audience values.
If only the good Victor Frankenstein had called Bill Bryson to work out a book deal for A Short History of Gnarly Frankenstein, or dug up Carl Sagan for a sequel to The Demon-Haunted World, the mob would perhaps smash battering rams against bookstore doors rather than castle gates. Awe and amazement would replace shock and horror. Franken-pup pre-orders would replace disgust and repulsion. No matter how hard scientists work or how beneficial the technologies, disturbing narratives of science within public consciousness hinder efforts to introduce ideas into society. Narratives, accounts of connected events and the art of framing them, are powerfully influential, sometimes speaking louder than facts. Technologists, therefore, cannot expect their mute creations, their new technologies, to defend their own intentions.
Furthermore, how scientists and technologists speak to audiences is as important as their facts when introducing new ideas and changing old ones. Contemporary research challenges the assumption that the public will change its views on scientific matters if simply given the same facts which convince scientists. Kahan et al. (2012), for example, found that individuals with the highest degrees of science literacy and technical reasoning capacity were the most prone to cultural polarisation. Science communicators, then, should not treat the public as irrational or stupid, but rather re-evaluate their own assumptions and use compelling, creative communication techniques such as storytelling — maybe even biology-themed Monster Mashes.
Curiosity and empathy drive many people to become scientists. Contrary to many narratives, scientists are typically pro-social and compassionate. On professionalism and responsibility, Conrad Brunk argues that professionalism is not in and of itself sufficient for ethical justification; professionals must therefore consider how their role functions within, and impacts, the greater system. If scientists want to be responsible and help society, each should embrace conscientious professionalism. All conscientious professionals try to better the world, and all those seeking to better the world must consider their actions in a larger context. Using science, for instance, to breathe life into a hodgepodge of corpses sewn together is amazing; however, ignoring how such a being will function in society means shirking the responsibility of inquiring into potential consequences. All conscientious professionals, then, must consider broader ramifications.
Professionalism, according to Brunk, should instil a greater sense of responsibility for the social good, and should not be a cloak for self-serving, monopolistic practices. Following his rationale, the scientific community not only has the responsibility to work on projects which are socially beneficial, or at least non-harmful, but also has the responsibility to reach out to the public. If a technology or science affects society in a major way, then society should have the means to have rational dialogue with the scientific community. Those professionals and the rest of society should have a reciprocal relationship rather than a one-way discourse of technologists stating after the fact that which has been introduced, and with which society must cope.
“Reciprocity”, says metallurgist Ursula Franklin, “is some manner of interactive give-and-take, a genuine communication among interacting parties. For example, a face-to-face discussion or a transaction between people needs to be started, carried out, and terminated with a certain amount of reciprocity” (Franklin, 2011). Such reciprocity is essential, and research shows scientists want this two-way communication (Peters, 2013); however, with increased specialisation, fewer people can understand, except in the vaguest terms, what experts do. Contemporary society lives in a world where breakthroughs in technology and science involve uncertain facts, disputed values, high stakes, and urgent decisions (Funtowicz, 1993; Scheufele, 2013). Further, scientific research is arguably not value-neutral (Stevenson, 1989; Derry, 1999; Goldstein, 1984), and science communication environments are often polluted (Kahan, 2012). Similarly, there is no easy answer or agreement about who is and is not an expert in society. Some readers will accept Conrad Brunk, Ursula Franklin, and Dan Kahan as experts in this reading, yet others will not accept these names as justified authority. A conscientious scientist, then, must step back from their specialised work and find ways to explain it in terms lay audiences may better understand, but not in a way that treats lay audiences as scientifically illiterate (Bucchi, 2008).
Since scientists are generally burdened with doing science and do not always have time to communicate with the public on all forums, including various social media, one option is to employ science communicators (Monoyios, 2012; Peters, 2013). While working scientists should try to work directly with the public when possible, they can fulfill part of their responsibility as conscientious professionals by working with communicators who have training in bridging the complexities of scientific inquiry and lay audiences. Professional and lay scholar communicators can do outreach and mediate between scientists and the public. Further, scientists can now use the science of science communication to explain controversies over risk as well as predict, manage, and avoid the conditions likely to trigger such controversy (Kahan, 2012).
To reach audiences more effectively, according to researcher Arthur Lupia, science communicators can build and use a scientific knowledge base which will help them manage realistic expectations about when audiences will pay attention to and believe the message. Such a knowledge base can help communicators avoid common presentation mistakes. Further, science communicators need not engage in “spin” and manipulation, nor “dumb down” presentations (Lupia, 2013), and can challenge assumptions such as the reduction of science communication to a mere gap between expert and lay knowledge (Bucchi, 2008). Instead, communicators can work to make their messages compelling and vivid while also demonstrating potential practical uses (Bucchi, 2008). Biologist Maddalena Bearzi (2013) gives five strategies:
- Be simple and straightforward.
- Don’t be condescending or pedantic.
- Tell compelling stories.
- (When and if you can) use illustrations.
- Be apolitical.
Being simple and straightforward is essential because although many in the scientific community try to convey information to the public and policymakers, much of the message is ignored or misinterpreted (Lupia, 2013). Often, audiences pay less attention than communicators anticipate. Non-scientists, for example, find scientific lexicon difficult and presentations unnecessarily abstract as well as disconnected from their own lives. Further, politicised environments influence whom audiences will be more likely to believe.
Conscientious scientists must communicate with compassion and tact because even if the communicator is factually correct, the target audience will likely dismiss the message if it perceives the speaker as condescending or pedantic. Displaying an attitude of superiority, intentionally or unintentionally, undermines communication efforts not only in the short term but also in the long term. Airs of condescension bring to mind narratives such as Nurse Ratched’s snide oppression of McMurphy — or the Captain’s shout of “What we’ve got here is failure to communicate” after beating Cool Hand Luke. Individuals who narrate themselves as rebels fighting against authority and oppression become more resistant to any facts which may come from the perceived enemy (Diethelm & McKee, 2009).
While each person may arguably have the responsibility to educate themselves about science, perhaps the biggest obligation falls upon the science community and its outreach. Placing the blame for science conflict within society entirely on a scientifically-illiterate public is not supported by the evidence (Scheufele, 2013; Kahan, 2012), and is not a constructive attitude for conscientious scientists. “Over time,” reports Scheufele, “different researchers found that levels of knowledge can lead to more positive public attitudes toward science or undermine support for science, depending on the particular scientific issue people were debating. In fact, for controversial science topics the relationship between literacy and attitudes approaches zero” (2013). Put simply, people with the highest degrees of science literacy and technical reasoning may end up on the scientifically unsupported side of a polarised science debate such as climate-change denialism (Kahan et al., 2012). The human brain is prone to cognitive biases such as motivated reasoning and motivated skepticism which interfere with reasoned discussion of scientific facts (Bastian, 2013; Kunda, 1990). Instead of too quickly assuming people are irrational or scientifically illiterate, science communicators should apply the principle of charity and strive for a patient discourse to determine the reasons and values behind people’s beliefs. Connect to compassion, override fears, rewrite the narrative, and Frankenstein’s monster ceases to be a monster.
Also countering common intuitions, Nisbet and Scheufele (2009) reviewed social science research on how the public comprehends and participates in societal decisions about science and technology, and argue that many science communication initiatives rest on the false premise that public knowledge deficits are a central culprit in societal conflict over science. Science literacy, they contend, plays a limited role in shaping public perception and choices. The researchers highlight the role of the media and public communication in the process, challenging the dominant assumption that science literacy is both the main problem in, and main solution to, societal conflicts over science.
The philosophy behind the challenged assumption suggests that facts speak for themselves and that people will change their beliefs in line with scientists. The view assumes people comprehend facts in similar ways, and if the public does not accept the facts, then the blame falls on journalists for failing to transmit the message, on the public for its “irrational beliefs”, or both. Importantly, using this “deficit” model increases tension between science communicators and the public because condescending criticisms of public ignorance alienate audiences. Moreover, by defining every debate as a “crisis” or “politicisation”, science communicators increase polarisation further. Similarly, discussing policy debate in terms of “sound science” reduces scientific knowledge to a commodity on which interest groups can draw in political battles, undermining the perceived integrity of science, what Michael Mann calls the “scientisation of politics” (Frank, 2013).
If science communicators want to follow Bearzi’s advice of being simple and straightforward without being condescending or pedantic, then they should use Nisbet and Scheufele’s recommendations, which emphasise the premise that science communication needs a foundation of systematic empirical understanding of the target audience’s values, knowledge, and attitudes; interpersonal and social contexts; and preferred media sources and communication channels. Dan Kahan adds more support to this perspective:
People acquire their scientific knowledge by consulting others who share their values and whom they therefore trust and understand. Usually, this strategy works just fine. We live in a science-communication environment richly stocked with accessible, consequential facts. As a result, groups with different values routinely converge on the best evidence for, say, the value of adding fluoride to water, or the harmlessness of mobile-phone radiation. The trouble starts when this communication environment fills up with toxic partisan meanings — ones that effectively announce that ‘if you are one of us, believe this; otherwise, we’ll know you are one of them’. In that situation, ordinary individuals’ lives will go better if their perceptions of societal risk conform with those of their group. (2012)
Following Bearzi’s suggestions and Kahan’s explanations, science communicators should connect to people’s values and personal narratives rather than simply overloading them with facts and becoming frustrated when people seem to wilfully ignore evidence. If science is based on facts, then communicators do not need to manipulate audiences with “spin”, but can instead draw upon such facts, for example, to tell honest stories. Science communicators can connect scientific findings to visceral, personal events to connect with people’s emotions and values. Science communicators can talk to people through shared, empathetic experience instead of talking down to people through complex, abstract ideas and jargon which often comes across as condescending. The rise of literary journalism or creative non-fiction exemplifies how science becomes accessible to wider audiences. To hype up some public interest, Dr. Frankenstein could have handed out a few copies of something such as Mary Roach’s Stiff: The Curious Lives of Human Cadavers or Rebecca Skloot’s The Immortal Life of Henrietta Lacks.
Similarly, following Bearzi’s advice of using illustration, science communicators should take full advantage of the contemporary audio-visual environment. For instance, Elise Andrew’s Facebook page I Fucking Love Science converts scientific research into more easily digestible image memes. Neil deGrasse Tyson hosts StarTalk, a radio show which discusses engaging topics such as zombies. There are also graphic novels such as Logicomix: An Epic Search for Truth, written by Apostolos Doxiadis and Christos Papadimitriou with artwork by Alecos Papadatos and Annie Di Donna. While much of the audio-visual media on the Internet is not of the same intellectual calibre as academic papers, and while there is a risk of turning science into mere “fun facts” without the essential critical thinking and scientific reasoning, these media reach wide audiences and spark interest (Peters, 2013). Another problem arises when others use these media, intentionally or unintentionally, to propagate misinformation, degrading the science communication environment (Diethelm & McKee, 2009). If scientific outreach ignores these various media, there will be a vacuum in which other, less qualified voices speak on scientific issues. A conscientious professional should take an active role in promoting reason, science, and scientific skepticism to the public; therefore, scientific outreach should not ignore these multimedia tools.
Bearzi suggests communicators remain apolitical, and this point has merit, though not without stipulation. According to Bearzi, communicators should avoid bringing their own political affiliations into their message, be they Communist or Conservative; yet while communicators should take a nonpartisan stance with regard to political parties, they still have to take a political stand. Scientists must understand that they cannot escape taking sides through either action or inaction (O’Brien, 1993) because inaction is still action. If scientists do not speak up on issues which may negatively affect public interests, then such scientists are not remaining neutral, but in fact picking a side. By trying to stay neutral on matters where there is scientific consensus, such as the importance of vaccines, the safety of genetically-modified foods, or the dangers of anthropogenic global warming, and where ignorance or denial of the consensus leads to societal harm, scientists who refrain from putting forth their expertise allow others to speak without challenge (Diethelm & McKee, 2009).
Furthermore, if people do not understand how the scientific consensus works, then they will not clearly understand how to distinguish quality science from noise in a polluted science communication environment. The scientific consensus is one of the best tools the public can use to guard against those who would manipulate society on scientific issues (Janabi, 2013; Lee, 2000). While not perfect, the scientific consensus is a means by which laypeople can evaluate the claims of other proclaimed authorities and experts (Lee, 2000). For example, laypeople can weigh the scientific consensus against individual scientists making extraordinary claims about alternative medicines. Since no one can be an expert in everything, and since our world is becoming more specialised and complex, individuals must rely more heavily on expert authorities. Knowing the difference between a legitimate argument from authority and a fallacious appeal to authority can be difficult for more than only laypeople; however, the scientific consensus generally provides a justified argument from authority on scientific issues.
An appeal to authority can be either valid or invalid (Rational Wiki, n.d.). Sometimes an argument needs input from a qualified or expert source; often, however, the appeal becomes fallacious, such as when the topic falls outside the authority’s expertise or when the authority has an interfering bias. Sometimes a person even fallaciously asserts themselves as an expert, a move known as ipse dixit, or “he himself said it” (Ipse dixit, n.d.). Since distinguishing a valid from an invalid argument from authority takes thorough analysis, using the scientific consensus needs justification. From a layperson’s perspective, the scientific consensus is just another authority, and many in the public who are unfamiliar with the assumption of uncertainty in science perceive these authorities as having often been wrong before. Janabi (2013), however, gives three conditions for a legitimate appeal to authority, which he argues the scientific consensus satisfies:
- The authority referenced is an expert on the exact subject involved.
- There is consensus among experts from the field in question.
- The consensus is based on high-quality, replicated data.
While discussions on the philosophy of science can argue against some of these assumptions, these criteria provide at least some workable demarcation. Since most science communicators want people to understand scientific matters, communicators must be able to persuade the public about the importance of the scientific consensus and its function.
Currently, however, there is no easy means by which a layperson can determine and understand the scientific consensus on many issues. Trying to determine the scientific consensus on the safety of genetically-modified foods, for instance, is likely to lead laypeople to Google, where several of the top-ranked links lead to numerous credible-sounding institutions which claim there is no consensus and link to lists of studies seemingly showing detriments to health. Without an academic science background or thorough self-education in science, laypeople will likely be unable to distinguish legitimate scientific institutions and research from those which merely mimic credibility through superficial features such as similar names. Ted Goertzel describes the situation thus:
How can we distinguish between the amusing eccentrics, the honestly misguided, the avaricious litigants and the serious sceptics questioning a premature consensus? No private individual has the time or the expertise to examine the original research literature on each topic, so it is important to have some guidelines for deciding which theories are plausible enough to merit serious examination. (2010)
Science communicators, therefore, compete with those disseminating false information and promoting false science controversies, making simple and straightforward messages difficult. For example, with the increase of science literature, there is also an increase in noise, especially after the introduction of open access (Stone, 2013). The decline of traditional media has eroded the role of trained science journalists in newsrooms, and as science moves online, journalists who are not trained in science might miscommunicate scientific matters (Scheufele, 2013). False experts further compromise the science communication environment by leaving lay audiences more confused about the scientific consensus (Diethelm & McKee, 2009). Dixon and Clarke (2013), for example, investigated how balanced presentations of the autism-vaccine controversy influence vaccine risk judgements, finding that in false-balance conditions, participants were more uncertain that vaccines do not cause autism and perceived less consensus among experts. Exploiting this weakness, false experts on the Internet create fear, uncertainty, and doubt about actual expert opinion (Diethelm & McKee, 2009). Conscientious science communicators, then, must take an active role in challenging distributors of misinformation that increases societal confusion about science.
If science outreach expects the public to know what the scientific consensus is and why it is important, then the scientific consensus needs to be clearly stated and easily accessed. Unless science communicators effectively communicate the importance of the scientific consensus, members of the public are not culpable for ignorance of such consensus. Goertzel explains:
Decision-makers and the general public are best served when scientists specializing on an issue can reach a reasonable degree of consensus, making clear the limits to their knowledge. If scientists cannot do this, surely it is too much to expect politicians or journalists to do it. But efforts to define a consensus are vulnerable to attacks by conspiracy theorists that portray them as mechanisms for suppressing dissent and debate. There are always dissenters and arguing with them can be time-wasting and frustrating. (2010)
Scathing criticisms should be reserved for those who should and do know better. Those with scientific knowledge should also be empathetic to those who may not have the same learning opportunities. By connecting to people’s values, communicators help show people how scientific facts and reasoning are needed for informed decisions. While the wider public has the responsibility to inquire into science and not merely leave the responsibility to the authorities, science communicators must connect to people’s values in order for people to understand their responsibility in scientific matters. Both laypeople and professionals need to work together politically in preventing other individuals and organisations from preying on the vulnerable and creating false conflict. Dan Kahan states:
Decades of study show that the sources of public controversy over decision-relevant science are numerous and diverse. There is, however, a single factor that connects them: The failure of democratic societies to use scientific knowledge to protect the science communication environment from influences that prevent citizens from recognizing that decision-relevant science contributes to their well-being. (2013)
The science community needs to work with the public to protect against those influences which hinder science communication and deteriorate the science communication environment. Experts and science communicators need to explain clearly that lacking expertise does not exempt a person from the responsibility to learn about the scientific topics on which they speak. Non-professionals and the general public ought not only to challenge authority, but also to challenge their own ignorance lest they cause unintentional detriment to society. If science communicators, however, do not deliver this message, many will not have the necessary tools or understanding with which to undertake this responsibility.
Nancy Williams describes the idea of “affected ignorance” which to her is “the phenomenon of people choosing not to investigate whether some practice in which they participate might be immoral or rife with controversy” (2008). Williams believes that affected ignorance is culpable because one chooses to ignore something that is morally important. While generally involving the refusal to consider if a practice in which one participates is immoral, affected ignorance has a variety of forms which sometimes overlap. She provides four forms of affected ignorance:
- When people refuse to acknowledge the connection between their actions and consequent suffering of their victims.
- When people ask not to be informed of the nature of the practice in question.
- When people simply do not ask questions.
- When people adhere dogmatically to conventional rationalisations and are unwilling to accept the possibility that the majority opinions and widespread practices can be mistaken or cruel.
Before reprimanding people for science denial, quackery, or lack of skepticism, conscientious science communicators must make every effort to uphold their side by communicating the essential information which would combat the four types of affected ignorance in the public. If people are to acknowledge the connection between, for example, their anti-vaccination promotion, reduced herd immunity, and the heightened risk to the most vulnerable, science communicators must make sure that the scientific literature is accessible and understandable. If the information is accessible, but the anti-vaccination promoter asks not to be informed, does not ask questions, or clings to an anti-vaccination ideology, then the promoter can be deemed culpable. One challenge, of course, is ensuring the science-based message speaks loudest through the noise of those unintentionally or intentionally spreading anti-science disinformation.
In sum, the scientific community and its genetic engineers need to incubate more scientist/narrator hybrids. Scientists need public support, and before the public will support various science projects, the scientific community has to embrace storytelling and creative communication based on enthusiasm, understanding, and compassion. If the scientific community wants public support, scientists need to take sides on public science issues while employing charismatic communicators able to connect with people’s values. Scientists cannot remain isolated while expecting the public to embrace new research. Doing good work is not enough. Conversely, if scientists want to undermine scientific endeavours in society, procrastinating on reciprocal communication about potentially controversial research is a reliable form of sabotage. Since most scientists do care whether their research succeeds, public discourse should be viewed as part of the project, not as an afterthought or obstacle. Scientists must also add their expertise to public debates, emphasising the scientific consensus and reasoning against manufactured controversies. Educating the public during heated controversies takes more than supplying facts. As Charlie Gordon puts it in Flowers for Algernon: “Intelligence alone doesn’t mean a damned thing. Here in your university, intelligence, education, knowledge, have all become great idols. But I know now there’s one thing you’ve all overlooked: intelligence and education that hasn’t been tempered by human affection isn’t worth a damn” (Keyes, 1966). Charlie’s judgement may not be fair towards the scientific community, but it matters that he holds such a perspective. The scientific community may already be full of compassionate individuals concerned with bettering the world; however, doing pro-social work is only part of the equation. Another part is proactively breaking down the negative narratives.
A part of breaking down those narratives means breaking down assumptions, both in the public as well as the science community. In the end, rewriting the scripts for public narratives of monsters cannot be done while narrating the public as a mindless, irrational mob.
References
- “Argumentum ad verecundiam”. (n.d.). Rational Wiki.
- Bastian, H. (2013). Motivated Reasoning: Fuel for Controversies, Conspiracy Theories and Science Denialism Alike. Scientific American.
- Bearzi, Maddalena. (2013). 5 Simple Tips for Communicating Science. National Geographic.
- Brunk, C. G. (1985). Professionalism and responsibility in the technological society. Conrad Grebel College.
- Bucchi, M., & Trench, B. (Eds.). (2008). Handbook of Public Communication of Science and Technology. New York: Routledge.
- Derry, G. N. (1999). What Science Is and How It Works. Princeton, NJ: Princeton University Press.
- Diethelm, P. and M. McKee (2009). “Denialism: what is it and how should scientists respond?” The European Journal of Public Health 19(1): 2-4.
- Dixon, G., & Clarke, C. (2013). Heightening Uncertainty Around Certain Science: Media Coverage, False Balance, and the Autism-Vaccine Controversy. Science Communication, 35(3), 358-382. doi:10.1177/1075547012458290
- Frank, Adam. (2013). Welcome to the Age of Denial. The New York Times. Retrieved from http://www.nytimes.com/2013/08/22/opinion/welcome-to-the-age-of-denial.html
- Franklin, U. M. (2011). The real world of technology. House of Anansi.
- Funtowicz, S. O., & Ravetz, J. R. (1993). Science for the post-normal age. Futures, 25(7), 739-755.
- Goertzel, T. (2010). “Conspiracy theories in science.” EMBO Report 11(7): 493-499.
- Goldstein, M. & Goldstein, I. (1984). The Experience of Science: An Interdisciplinary Approach. New York: Plenum.
- “Ipse dixit”. (n.d.). Rational Wiki.
- Janabi, Fourat. (2013). From Anti-GMO to pro-science: ‘A Layman’s Guide to GMOs’. Genetic Literacy Project.
- Kahan, D.M. (2013). “A Risky Science Communication Environment for Vaccines.” Science 342(6154): 53-54.
- Kahan, D. M. (2013). Making climate-science communication evidence-based—all the way down. Culture, Politics and Climate Change. London: Routledge.
- Kahan, D. M., et al. (2012). “The polarizing impact of science literacy and numeracy on perceived climate change risks.” Nature Climate Change 2(10): 732-735.
- Kahan, D. M. (2012). Why We Are Poles Apart on Climate Change. Nature.
- Keyes, D. (1966). Flowers for Algernon (1st ed.). New York: Harcourt, Brace & World.
- Kunda, Z. (1990). The case for motivated reasoning. Psychological bulletin, 108(3), 480.
- Lalasz, Bob. (2013). Why Everything You Know About Science Communications is Wrong, and More Science is the Answer. Cool Green Science.
- Lee, J. A. (2000). The Scientific Endeavor: A Primer on Scientific Principles and Practice. San Francisco: Addison Wesley Longman.
- Lupia, A. (2013). Communicating science in politicized environments. Proceedings of the National Academy of Sciences, 110(Supplement 3), 14048-14054.
- Monoyios, Kalliopi. (2012). “Communicating Science: What’s Your Problem?” Scientific American.
- Nisbet, M. C. and D. A. Scheufele (2009). “What’s next for science communication? Promising directions and lingering distractions.” American Journal of Botany 96(10): 1767-1778.
- O’Brien, M. H. (1993). Being a scientist means taking sides. BioScience, 43(10), 706-708.
- Peters, H. P. (2013). “Gap between science and media revisited: Scientists as public communicators.” Proceedings of the National Academy of Sciences 110(Supplement 3): 14102-14109.
- Scheufele, D. A. (2013). “Communicating science in social settings.” Proceedings of the National Academy of Sciences 110(Supplement 3): 14040-14047.
- Stevenson, Leslie. (1989). “Is Scientific Research Value-Neutral?” Inquiry 32, no. 2: 213-222.
- Stone, R. and B. Jasny (2013). “Scientific Discourse: Buckling at the Seams.” Science 342(6154): 56-57.
- Williams, Nancy M. “Affected Ignorance and Animal Suffering: Why Our Failure to Debate Factory Farming Puts Us at Moral Risk.” Journal of Agricultural and Environmental Ethics 21 (2008): 371–384.