Addressing Uncertainty: When Information is Not Enough
The experience of uncertainty, typically attributed to a lack of knowledge, is central to the study of information behaviour. Uncertainty also arises, however, from real-world indeterminacy that no amount of information can resolve. This paper explores the multiple sources of psychological uncertainty and examines the implications for information behaviour research and practice.
information behaviour, uncertainty, risk, cognitive bias
Introduction
Uncertainty is a pivotal concept in library and information science (LIS), particularly in the area of information behaviour, and is in some sense the basis of both research and practice. Within the domain of information behaviour, uncertainty is viewed as a psychological condition resolved by access to appropriate information. Under this paradigm, uncertainty is a condition to be reduced if not eliminated, and better information services, increased accessibility of information, better-designed information systems, and improved search skills are—implicitly or explicitly—identified as the means to that end.
But this approach to uncertainty—viewing it as something to be "cured" by information—represents only one aspect of the concept. In fact, much information seeking is occasioned by circumstances in which uncertainty is an inescapable characteristic of the world or the total state of knowledge. These are situations in which uncertainty is not resolvable by available information, either because the outcome is yet to be determined and can therefore be described only probabilistically (e.g., the lottery) or because probabilities and even possible outcomes are not only unknown but potentially unidentifiable (e.g., the health implications of wind turbines). Information seekers and decision makers operating under these conditions of uncertainty can undoubtedly be assisted by appropriate information to better understand the complex nature of the situation they face, and information professionals are well positioned to provide this support. Yet such uncertainty—particularly in a decision-making scenario—increases our reliance on the cognitive heuristics that govern information seeking and processing, leading to biased decision making under a fabricated sense of certainty. Information professionals would be well advised to understand the nature and consequences of this biased information processing rather than tacitly legitimizing certainty as the appropriate outcome of information consumption. This understanding requires that we examine the mechanisms of certainty formation somewhat independently from the simple presence, absence, inconsistency, or coherence of information.
This paper draws on literature in psychology, epistemology, and risk to present a more nuanced exploration of the different sources of psychological uncertainty. We also discuss the ways in which information professionals can support information seekers in contexts where information does not resolve, and may even exacerbate, the experience of uncertainty.
Uncertainty in LIS
According to the Encyclopedia of Library and Information Science, "uncertainty is an unavoidable human condition" and a "persistent characteristic in information seeking" (Anderson 2010, 5,285). It is therefore somewhat surprising how little research has been done on the concept within LIS. Attention to the notion of uncertainty emerged within the past 30 years, coincident with the shift in LIS away from system-centric to user-centric perspectives exemplified in Dervin and Nilan's review, "Information Needs and Uses" (1986). The new paradigm described by these authors focused on the user's subjective experience in the examination of information needs and uses, encompassing the motivations, situations, and perspectives of users. It was in the context of this user-centred approach to information needs that uncertainty started to become a significant concept for study. Thus, Dervin and Nilan conceptualize uncertainty as one of the conditions that precipitate information needs (17) and note "uncertainties" as one of the challenges facing information seekers (22).
Within this new paradigm, uncertainty quickly became a crucial concept for understanding information behaviour. Ingwersen (1995), for example, called uncertainty a "fundamental" yet "underrated" dimension of information science (148). Wilson (1999) wrote that uncertainty is "the ghost at the feast . . . [W]e may assume that much (perhaps most?) information seeking and retrieval are occasioned by uncertainty. . . . [F]rom the perspective of the user, it is always there" (265). Uncertainty, described as an uncomfortable cognitive state (Kuhlthau 2004), was identified as a primary motivation for information seeking, and information was cast as the resolution to states of uncertainty. Although researchers recognized that there was not a simple inverse relationship between information and uncertainty, they retained the notion that uncertainty is a cognitive state ultimately and necessarily addressed by the right kind or right amount of information. Kuhlthau (2004), for example, observed that the experience of uncertainty increases in the early stages of seeking information only to diminish as users become more familiar with materials and the topic. Within the problem-solving model of Wilson et al. (2002), uncertainties are expected to rise and fall for information seekers over the course of their activity, but their progress is nevertheless marked by successive resolutions of these uncertainties. According to Anderson (2010), such research illustrates that the field of information behaviour has grown to recognize uncertainty as a "natural part of a searcher's experience" (5,291). But despite this acceptance of uncertainty as natural within these frameworks, the end point of information seeking remains the reduction or elimination of uncertainty.
Varieties of uncertainty
Scholars outside of LIS who work in the area of uncertainty research recognize that the psychological experience of uncertainty has its roots in at least two (perhaps not entirely separable) conditions: internal uncertainty (loosely equivalent to epistemic uncertainty or lack of knowledge) and external uncertainty (comparable to aleatory uncertainty or chance; see, for example, Dequech 2004; Kahneman and Tversky 1982a; Howell and Burnett 1978). Internal uncertainty can be thought of as uncertainty arising from a lack of knowledge; external uncertainty, by contrast, is irreducible and world based. Scholars differ on the exact nature of these types of uncertainty (e.g., Dequech 2004), but we take as our definitions those offered by Kahneman and Tversky (1982a), and their examples make the intended contrast quite clear. Someone who states "I think New York is north of Rome, but I am not sure" is expressing internal uncertainty, since the relative latitude of the two cities is a knowable fact that the speaker does not possess (it turns out that Rome is actually slightly north of New York). By contrast, a speaker who says "chances are that you will find John at home if you call tomorrow morning" is expressing external uncertainty: Tomorrow morning, John may or may not be home, and no one—John included—is in a position to say which holds until the event itself has come to pass.
External uncertainty is uncertainty about complex/inexplicable/unfamiliar phenomena or (more commonly) future events. Some questions are relatively straightforward, such as "Will I win the lottery?" or "Will John be at home tomorrow morning?" Others are much more complex: "Will limiting dietary cholesterol reduce my chances of a heart attack?" or "If I live under a wind turbine, will it have a negative impact on my health?" In the simplest cases of external uncertainty, possible outcomes are known and probabilities can be reliably attached to each, which is sometimes called a condition of risk (Luce and Raiffa 1957). For example, the result of a fair coin toss will be either heads or tails (i.e., known outcomes), and each is equally likely to occur (known probabilities). In some cases of external uncertainty, potential outcomes are known but probabilities are not; Ellsberg (1961) called this state of uncertainty ambiguous, while Luce and Raiffa (1957) refer to it as decision making under uncertainty (see also Einhorn and Hogarth 1986). I can, for example, be quite certain that one of two things will happen when I arrive at the notoriously overcrowded parking lot near my office: I either will or will not find a parking spot. It is difficult, however, to determine a reliable probability for each alternative. In the most challenging cases, sometimes called ignorance (Hogarth and Kunreuther 1995), we lack full information about even potential outcomes, let alone the probabilities attached to each outcome. Examples abound in the real world: If I accept the reality of climate change, I might want to determine what the consequences are likely to be. In this case, however, I am in no position to list even the possible outcomes, let alone their relative odds of occurring.
The experience of uncertainty—internal or external—is in every case driven by a lack of knowledge. The important difference lies both in what is unknown and in whether it is knowable. In the case of internal uncertainty, the information seeker lacks information, and once they have accessed the appropriate information, their uncertainty can be resolved. In the case of external uncertainty, information is also relevant, but the best and most thorough information will still leave some degree of uncertainty: In these cases the uncertainty is not fully remediable since it is inherent in the world. In the context of external uncertainty, optimal decision support will assist information seekers to address one of three challenges: (1) to accurately understand the relevant probabilities (under conditions of risk); (2) to estimate probabilities when potential outcomes are known but probabilities are not (under conditions of ambiguity); or (3) to understand both the range of possible outcomes and the probabilities attached to each (under conditions of ignorance). Each of these tasks presents cognitive difficulties, and decision makers often recruit strategies and heuristics to assist. These strategies and heuristics, however, can lead to biased information selection and processing and, as a result, biased decision making. LIS professionals will be better able to assist information seekers to the extent that they (1) recognize that external uncertainty will not be resolved by information; (2) recognize that information seekers face challenging tasks in understanding the decision context in the face of external uncertainty; and (3) understand that information seekers will be prone to using strategies and heuristics to address these challenging tasks, potentially resulting in a biased understanding.
Although they do not use the specific term "internal uncertainty," LIS scholars who study information behaviour are quite familiar with the concept, since the psychological uncertainty typically studied in information behaviour is the direct result of a lack of information rather than being an inherent condition of the world. Dervin's (1992) concept of an "information gap," for example, is essentially an expression of internal uncertainty; for Belkin (1980), uncertainty is an "anomalous state of knowledge," a phrase that clearly refers to an internal cause. At the same time, however, there is some acknowledgement of other sources or types of uncertainty. Anderson (2010), for example, acknowledges that "in real world conditions, a degree of irreducible uncertainty will always remain in a situation" (5,288), and she calls for "further research into the character of uncertainty and variability that characterize 'real world' existence" (5,294). We fully support Anderson's call for research into the character of uncertainty and further assert that a more complex understanding of uncertainty is also critical to effective practice within LIS. Part of this complexity arises from cases of external uncertainty where resolving uncertainty through information is difficult or even impossible, and premature certainty is therefore inappropriate. It is important, therefore, that we explore how people react in such situations.
Effects of uncertainty
Within LIS, as in other disciplines, uncertainty is often constructed as a noxious condition, something to be avoided or redressed. Kuhlthau (2004) notes that uncertainty is a "cognitive state that commonly causes symptoms of anxiety and lack of confidence" (92). John Dewey offers a potential explanation for this tendency: "It is not uncertainty per se which men dislike but the fact that uncertainty involves us in peril of evils" (Dewey 1960, 8). Within the psychological domain, uncertainty reduction theory (Berger and Calabrese 1975) proposes that individuals are motivated to reduce uncertainty in interpersonal interactions through the exchange of information. Some psychological theories allow for individual differences in response to uncertainty. Sorrentino's theory of uncertainty orientation, for example, differentiates between uncertainty-oriented individuals, who seek information in response to uncertainty, and certainty-oriented individuals, who value clarity and stick to what is already known when confronted by an uncertain situation (see Sorrentino and Roney 2000). The difference between these two types of individuals, however, is in the strategies they use to reduce uncertainty, rather than their motivation to achieve that end. Although uncertainty can sometimes be a positive experience (consider, for example, the practice of gambling, where uncertainty is part of the attraction; see also Anderson 2006), decisions are most comfortably made in the context of full knowledge of certain (rather than probabilistic) outcomes. For example, when considering a choice between a meaningfully large certain gain and a gamble that could lead either to a larger gain or a smaller (often 0 or negative) gain, decision makers avoid uncertainty by choosing the "sure thing" (Kahneman and Tversky 1982b).
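The "sure thing" preference can be made concrete with a small worked example. The payoffs below are hypothetical, chosen only for illustration: a strict expected-value maximizer would take the gamble, yet most decision makers choose the certain gain.

```python
# Illustrative sketch (hypothetical payoffs) of the "sure thing" choice
# described by Kahneman and Tversky: the gamble has the higher expected
# value, but people typically prefer the certain outcome.

def expected_value(outcomes):
    """Expected value of a prospect given as (payoff, probability) pairs."""
    return sum(payoff * prob for payoff, prob in outcomes)

sure_thing = expected_value([(500, 1.0)])          # certain $500
gamble = expected_value([(1100, 0.5), (0, 0.5)])   # 50% chance of $1100, else $0

print(sure_thing)  # 500.0
print(gamble)      # 550.0 -> higher expected value, yet typically rejected
```

The point of the sketch is that uncertainty avoidance is not an arithmetic error: the numbers are fully known, and the sure thing is chosen anyway.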
In the context of decision making, uncertainty is identified as something with which to "cope" (Lipshitz and Strauss 1997) since uncertainty constitutes a "major obstacle to decision making" (149).
The study of decision making under various conditions of external uncertainty (risk, ambiguity, and ignorance) has spawned programs of research examining decision-making processes, influences on those processes, and the strategies and heuristics used by decision makers in these challenging circumstances. Prominent researchers in the area include Kahneman and Tversky (e.g., 1982a; 1982b; 1984), Slovic (e.g., 2000), Camerer (e.g., Camerer and Weber 1992), and Hogarth (e.g., Einhorn and Hogarth 1986; Hogarth and Kunreuther 1995), to name but a few. A full review of this literature is beyond the scope of this paper; instead, our goal is to draw on a few prominent and important examples to demonstrate how information processing in these contexts can be biased, and how information design, selection, and presentation can influence decision making in these uncertain contexts.
Much of the research on uncertainty in decision making has taken place in a controlled experimental context, and the majority has focused on the specific context of risk (Hogarth and Kunreuther 1995). This work provides valuable insight into reasoning and decision making, but research that examines the more complex environment of naturalistic decision making provides even more insight into information use in decision making. This research identifies the strategies that decision makers in real-world contexts use to address or cope with uncertainty. Lipshitz and Strauss (1997) note that "suppression" is one of the strategies used to address uncertainty; it involves "tactics of denial (ignoring or distorting undesirable information) and tactics of rationalization" (154). Another strategy decision makers use in responding to uncertainty is to construct a simplified model of reality, which provides the appearance of environmental certainty (Michael 1973). Schwenk (1984) terms this "cognitive simplification."
Suppression and cognitive simplification are ways of "manufacturing" certainty or reducing complexity by applying biases in information processing that allow us to feel more certain when real-world uncertainty is unavoidable. Three biases that have obvious application in the reduction of uncertainty are (1) probability neglect (Sunstein 2002), which leads to ignoring probabilities and effectively under- or overweighting some outcomes in decision making; (2) confirmation bias (Nickerson 1998), which leads to selectively attending to information consistent with a prior or developing decision; and (3) accessibility or availability bias (Iyengar 1990; Tversky and Kahneman 1973), which leads to focusing selectively on the most salient alternatives or outcomes (e.g., those that have been most widely covered in the media). These and other biases may lead to good decisions in the face of uncertainty; however, they also represent departures from fully rational decision making and can in at least some circumstances lead to suboptimal outcomes. In the following sections, we demonstrate how these biases could affect information seeking and information processing in situations of uncertainty, and we suggest some ways in which LIS professionals can mitigate these effects.
The particular tasks faced by information seekers or decision makers dealing with uncertainty depend on the type of uncertainty. In the context of internal uncertainty, the information seeker has the relatively straightforward task of understanding the factual information that addresses their information need. In the context of external uncertainty, different cognitive work is required for optimal decision making. In the simplest cases, when outcomes and probabilities are both known, the challenge is to understand the probability information and to take it into account when evaluating alternatives. When probabilities are not known (conditions of ambiguity), the goal is to develop reasonable estimates of outcome likelihood in order to support optimal decision making. When neither probabilities nor outcomes are known, the information seeker or decision maker must construct both an understanding of the set of alternative outcomes and the likelihood that each will occur.
Responding to uncertainty: the role of LIS professionals
Providing information is obviously a first-line response to uncertainty, and LIS professionals have well-developed strategies for identifying accurate, timely, and comprehensive information in response to user queries. We have time-tested methods for identifying appropriate materials to meet user information needs. Users motivated by uncertainty to seek information will benefit from the best possible resources, but information professionals should understand that when uncertainty has an external component, different considerations come into play in identifying optimal resources. One particular challenge is selecting resources optimized to meet information needs with respect to risk, that is, in situations of external uncertainty in which both outcomes and probabilities are known. Information about risk tends to be difficult to understand (e.g., Bogardus et al. 1999), in part because this information is usually presented numerically and quantitative literacy levels are relatively low (see Darcovich et al. 2000). The manner in which risk information is presented can have a significant impact on understanding. Based on a review of the literature, Burkell (2004) recommends the use of frequency pictographs for the presentation of risk information (see also Fagerlin et al. 2007; Vahabi 2007).
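The idea behind the frequency-format recommendation can be sketched in code. The following is a minimal, text-only illustration (the function name, the "X"/"." icons, and the 10×10 grid are our own choices, not a prescription from the literature): a probability is restated as a natural frequency ("12 out of 100") and laid out as a grid of icons, which is easier for low-numeracy readers to grasp than "p = 0.12."

```python
# Minimal text-based sketch of a frequency pictograph: restate a probability
# as a natural frequency and draw a simple icon grid. Real pictographs are
# graphical; this is illustrative only.

def frequency_pictograph(probability, denominator=100, per_row=10):
    affected = round(probability * denominator)
    caption = f"{affected} out of {denominator} people affected"
    icons = "X" * affected + "." * (denominator - affected)
    rows = [icons[i:i + per_row] for i in range(0, denominator, per_row)]
    return caption, rows

caption, rows = frequency_pictograph(0.12)
print(caption)          # 12 out of 100 people affected
for row in rows:        # ten rows of ten icons: 12 "X"s, 88 "."s
    print(row)
```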
Even when probabilities are effectively communicated, decision makers show a bias to interpret very low probabilities in surprising ways, treating them as having a higher (or sometimes lower) likelihood than the probabilities suggest (see Kahneman and Tversky 1984). In practice, individuals appear to underweight (or even ignore) low-probability events if they do not view them as important or salient (e.g., a low likelihood of rain), while they overweight low-probability events when the outcome is salient or personally relevant (e.g., a low probability of death due to anaesthetic; see Camerer and Kunreuther 1989). The result is that for salient low-probability events, such as winning the lottery or the possibility of a nuclear accident (see also Mumpower 1988), decision makers tend to focus on outcomes to the exclusion of probabilities, making what at times appear to be suboptimal decisions at an individual (e.g., Slovic et al. 1977) and societal (e.g., Sunstein 2003) level. Peters et al. (2006) demonstrate that individuals with better numeracy skills are better able to reason with respect to numeric (and probabilistic) information, and these skills are therefore likely to provide some protection against underweighting or overweighting of low-probability events. LIS professionals could assist users dealing with risk information by providing quantitative literacy training and support. This can be viewed as an extension of the information literacy instruction that we already undertake to provide clients with the suite of abilities to "recognize when information is needed and have the ability to locate, evaluate, and use effectively the needed information" (American Library Association 1989). Adding quantitative literacy to the suite of skills supported by instruction would assist clients to effectively interpret the risk information characteristic of the most specific instances of external uncertainty.
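The under- and overweighting of probabilities described above has been formalized in later work on prospect theory that this paper does not itself cite; one commonly used probability-weighting function, from Tversky and Kahneman's (1992) cumulative prospect theory, is sketched below. The parameter value 0.61 is their published estimate for gains; the point is simply that the decision weight w(p) sits well above p for small probabilities and below p for large ones.

```python
# One-parameter probability-weighting function from cumulative prospect
# theory (Tversky and Kahneman 1992):
#   w(p) = p^g / (p^g + (1 - p)^g)^(1/g)
# With g < 1, small probabilities are overweighted and large probabilities
# underweighted, matching the pattern described in the text.

def decision_weight(p, gamma=0.61):
    num = p ** gamma
    return num / (num + (1 - p) ** gamma) ** (1 / gamma)

print(decision_weight(0.01))  # noticeably greater than 0.01 (overweighting)
print(decision_weight(0.90))  # noticeably less than 0.90 (underweighting)
```

This is a descriptive model of how probabilities are actually felt, not a recommendation; its relevance here is that it predicts exactly the lottery and rare-hazard behaviour the text describes.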
Commonly, information seekers and decision makers are faced with situations in which outcomes are known but the probabilities of those outcomes are not (or cannot be) specified. Decision makers prefer to avoid this kind of ambiguity, and one strategy they employ is constructing subjective probability estimates based on available information. Through this strategy, which Tversky and Kahneman (1973) term the availability heuristic, the probability of an event is judged by the ease with which relevant cases can be recalled. Use of the availability heuristic leads to predictable biases, such as the judgement that air travel is more dangerous than motor vehicle travel when in fact the reverse is true. This false impression is based on the greater memorability of airplane accidents, which are widely reported in the press, compared to motor vehicle accidents. LIS professionals should anticipate this bias and recognize that information seekers will use this strategy to create subjective probability estimates when objective probabilities are not available. To minimize this bias, information professionals should ensure that information about potential alternatives (e.g., potential outcomes of a medical treatment) receives appropriate coverage in the information provided and that all alternatives are presented in equally salient ways (since stories are more salient than statistics, for example, it would be inappropriate to have scenarios illustrating one alternative and statistics describing the other). In an attempt to achieve "objective" reporting, journalists sometimes give equal weight to both sides of a story, even when one side is objectively more likely or more accurate (see Mindich 1998). This type of journalistic coverage can contribute to biased probability estimates. For example, false balance in reporting on climate change has led to a divergence of scientific and public discourse on the issue (Boykoff and Boykoff 2004); public discourse (e.g., newspaper coverage) might devote equal time to climate change sceptics and proponents, giving vastly more attention to sceptics than their presence in the scientific literature suggests they deserve. LIS professionals should be aware of the false balance approach sometimes used in journalistic reporting and counter it with information resources that more accurately represent the balance of opinion in the relevant literature.
In the most challenging state of uncertainty, information seekers and decision makers are operating under conditions of ignorance, where neither outcomes nor (by extension) probabilities can be specified. Often in these cases there is significant debate and disagreement (e.g., about the consequences of climate change), and information seekers are faced with the challenge of developing a coherent understanding and perhaps a plan of action in the face of a complex and multifaceted information environment. Hogarth and Kunreuther (1995) suggest that in conditions of ignorance, when perhaps they should be thinking harder about decisions, decision makers tend to favour "simple arguments that serve to resolve the conflicts of choice" (32). Thus, there is a tendency to resort to heuristics that limit the complexity of the information environment. When outcomes are unknown, it is up to the decision maker to establish what they might be, and availability bias will influence the set that is generated. As a result, there will be a tendency to think about those outcomes that are salient (because they are more easily recalled), and outcomes that are less obvious will tend not to be considered. One simplifying strategy in these circumstances is to selectively attend to "confirmatory" evidence—evidence that is consistent with a developed or developing view. Thus, individuals seeking information about the safety of nuclear power will tend selectively to attend to information that supports their initial beliefs: If they think that nuclear power is safe, they will notice and process information consistent with safety, while if they fear that nuclear power is unsafe, their tendency will be the opposite. Hogarth and Kunreuther (1995) note that "one-sided arguments or justificatory processes may be more likely to occur in situations of ignorance as opposed to risk. Under risk, explicit tradeoffs are salient. Under ignorance, decision makers are free to recruit arguments to support their intuitions and to ignore conflicting arguments" (33). This and other research suggest that there is a tendency to resort to oversimplified and perhaps one-sided information in situations of ignorance. LIS professionals who are assisting patrons with information needs that fall in this realm should be aware of these tendencies and ensure that the patrons are provided with an appropriately broad range of information resources that accurately reflect the range of opinion or evidence on the topic at hand. Encouraging patrons to actively consider all sides of an argument and to generate evidence in support of each will help to mitigate the effects of these biases and support a more considered perspective.
Examples
Consider the following four questions.
A. What is "rent seeking"?
B. What's the weather going to be like today?
C. What treatment would have the best chance of preventing my migraines?
D. If I install a wind turbine, will it affect my health?
Question A is an example of internal uncertainty. It demonstrates a lack of knowledge that requires information as a remedy. Finding the definition of "rent seeking" can be a straightforward matter of finding an article in the Concise Encyclopedia of Economics or some other resource. Provided that the resource is accurate, current, and written at a level appropriate for the client, this information will address the uncertainty underlying the information need.
Question B is an example (given the accuracy of our weather forecasts) of the most basic version of external uncertainty: risk, where outcomes and probabilities are known. In response to this question, an information professional would consult an information resource (perhaps an online weather service, such as the Weather Underground website). The information available from that resource, however, would almost certainly include a probabilistic component (e.g., "Today there is a 40% chance of precipitation."). While the information obtained is certainly relevant to the information need and may very well reduce the uncertainty of the information seeker (who, before receiving this information, may have had no idea of the likelihood of rain), the information does not eliminate uncertainty—the seeker still has no definitive answer regarding today's weather. Moreover, in these and other situations of risk, information seekers are faced with the challenge of interpreting the information that is provided—in other words, of answering for themselves the question of what it means for there to be a 40% probability of precipitation and of making a reasonable decision on the basis of that information. Here the role of the information professional is expanded. Simply providing information is not enough: We should also select and present information in forms that optimally support the challenging task of interpreting this type of quantitative information. Furthermore, we should be prepared to provide quantitative literacy training to help information seekers interpret the probability information they encounter. In the absence of such support, the information seeker might instead ignore or neglect the probabilities and focus solely on the potential outcomes. Although the consequences are unlikely to be of great import with respect to an inquiry about the weather (perhaps an umbrella left at home and a consequent soaking), in other circumstances the impact of probability neglect can be significant, as it may result in suboptimal personal decisions (e.g., failing to insure against low-probability, high-cost events) or suboptimal allocations of public resources (e.g., paying an inordinately high amount to remove any danger of blood-borne infections from the blood supply).
In the case of Question C (preventing migraines), the information seeker is likely to draw on a wider breadth and diversity of information. Documents consulted could include official outlets (government health websites, such as PubMed Health), medical websites (WebMD), primary research (PubMed), or blogs by migraine sufferers. Information seekers are also likely to approach personal contacts, such as friends, acquaintances, family members, and health care personnel. There are many potential resources, each offering different or multiple treatment approaches (e.g., medication, changing one's prescription glasses, Botox injections, exercise, acupuncture, or sunbathing). One of the challenges in interpreting this information is determining which treatment is most likely to be effective, yet it is unlikely that directly comparative probabilistic information will be available. Instead, the information seeker will have to develop subjective probabilities to identify the best treatment for their migraines. Because of people's susceptibility to the availability bias, the treatments judged most effective will be those for which evidence of positive effect is most easily recalled, whether these be the treatments recommended by family and friends, those popularly reported in the press, or those advertised on television. The role of the information professional in this context is to provide information about effectiveness that will act as an adjunct to this heuristic processing, and perhaps to balance the salience of the various alternatives under consideration to mitigate the effects of the availability bias.
Finally, a search for information to answer Question D (health effects of wind turbines) would likely yield conflicting information. News outlets report on residents suffering from a constellation of health problems due to nearby turbines, while official reports state that there is no scientific basis for such a connection. Anecdotal reports identify a wide range of effects, including body vibrations, headaches, queasiness, dizziness, and sleep disturbances; few of [End Page 393] these effects, however, are supported in the scientific literature. Importantly, the reports of negative effects are often first-person accounts reported in news stories. These tend to be much more memorable, and much more meaningful, than the relatively dry scientific and statistical information that does not support these effects. Moreover, news stories may tend to depict a false balance on the issue, offering equal time and space to both sides of the debate—and perhaps even more attention to the accounts of negative impacts—despite the fact that a relatively small number of people report these effects. If information seekers have an intuition or prior belief that wind turbines have negative health effects, they will tend to select and attend to information resources supporting that view, thereby reinforcing their prior belief. In this situation, the information professional should ensure that information provided in response to queries presents an appropriately balanced representation of the alternatives, so that the information seeker encounters information in support of both positions (negative health effects, no negative health effects) in proportions that reflect the best evidence on the issue. At the same time, decision makers should be encouraged to articulate both sides of the issue rather than assemble a single argument in support of their original intuition or position.
As wind turbines are a relatively recent technology, the full measure of their direct or indirect health effects may not yet be known, whether to experts or the general public. In such a state of ambiguous uncertainty, these strategies will help to minimize the negative impact of decision heuristics that might otherwise lead to a biased opinion.
Conclusion
Historically, research in LIS has dealt with internal rather than external uncertainty—lack of knowledge rather than indeterminacy. Under this perspective, uncertainty is effectively addressed by information. However, information cannot resolve external uncertainty, which is world-based and irreducible. Nevertheless, information seekers confronting external uncertainty can still benefit from the support of information professionals. In the context of external uncertainty, information seekers continue to require access to high-quality information relevant to their information needs. Traditional skills that support the identification of high-quality information retain their relevance in this context, as does information literacy training. To optimally support information seekers facing external uncertainty, these interventions should be informed by an awareness, on the part of information professionals, of the heuristics and biases that operate in situations of external uncertainty. The role of information professionals in these situations should be to reduce the potentially negative impact of heuristics and biases through three avenues: (1) by creating information environments that mitigate (or at least do not exacerbate) natural biases; (2) by providing training and support to optimize information-processing skills relevant to uncertainty; and (3) by making information seekers aware of the biases in reasoning they are likely to enact. Paradoxically, the effect of these strategies is likely to be an increase in the psychological uncertainty experienced by these information seekers. We argue, however, that in the face of an uncertain world, persistent psychological uncertainty is an appropriate response. [End Page 394]