Volume 7, No. 1, Art. 19 – January 2006

Risk Perception of an Emergent Technology: The Case of Hydrogen Energy

Rob Flynn, Paul Bellaby & Miriam Ricci

Abstract: Although hydrogen has been used in industry for many years as a chemical commodity, its use as a fuel or energy carrier is relatively new and expert knowledge about its associated risks is neither complete nor consensual. Public awareness of hydrogen energy and attitudes towards a future hydrogen economy are yet to be systematically investigated. This paper opens by discussing alternative conceptualisations of risk, then focuses on issues surrounding the use of emerging technologies based on hydrogen energy. It summarises expert assessments of risks associated with hydrogen. It goes on to review debates about public perceptions of risk, and in doing so makes comparisons with public perceptions of other emergent technologies—Carbon Capture and Storage (CCS), Genetically Modified Organisms and Food (GM) and Nanotechnology (NT)—for which there is considerable scientific uncertainty and relatively little public awareness. The paper finally examines arguments about public engagement and "upstream" consultation in the development of new technologies. It is argued that scientific and technological uncertainties are perceived in varying ways and different stakeholders and different publics focus on different aspects or types of risk. Attempting to move public consultation further "upstream" may not avoid this, because the framing of risks and benefits is necessarily embedded in a cultural and ideological context, and is subject to change as experience of the emergent technology unfolds.

Key words: risk perception, hydrogen energy, emergent technology, public engagement, public attitudes, trust, uncertainty

Table of Contents

1. Introduction

2. The Semantics of "Risk"

2.1 Types of risk

2.2 Ideology and culture in perceptions of risk

2.3 How decisions are made

2.4 Limitations of "information"

3. Public Perceptions of Risk

4. Experts' Assessments of Risks Associated With Hydrogen

4.1 The range of applications

4.2 Safety

4.3 Public health

4.4 The environment

4.5 Summary

5. Public Awareness of and Attitudes to Other Emergent Technologies

5.1 Carbon capture and storage

5.2 Genetically modified organisms and food

5.3 Nanotechnology

6. Issues in Public Engagement in New Technologies

6.1 The shock of the new?

6.2 The issue of "upstream" consultation

7. Conclusions

Acknowledgements

References

Authors

Citation

 

1. Introduction

Concerns over global warming and climate change and the depletion of fossil fuels have intensified interest by scientists, industries and governments in the feasibility of alternative energy sources. Hydrogen can be an energy-carrier and may become an effective substitute for hydrocarbons, especially in transport. It has to be generated from other energy sources. The benefits of hydrogen lie in any savings it may make in carbon dioxide and polluting emissions or in use of scarce fossil fuels. The gains are greatest where renewable primary sources—such as wind, wave, tide or solar—are used in generating hydrogen, and also where nuclear power is the source. Various alternative scenarios, "visions" or hydrogen futures have been identified (HODSON & MARVIN 2004; MCDOWALL & EAMES 2004; WATSON et al. 2004) using different assumptions about the economy and factors affecting technological innovation and diffusion and different timescales. Currently hydrogen energy systems are an emergent technology about which there is considerable scientific uncertainty and relatively little public awareness. [1]

One way of gauging likely public perception of risks, benefits and costs of a potential hydrogen economy is to consider public perceptions of similarly new, uncertain and largely unknown technologies as studied to date. We have selected studies of carbon capture and storage (CCS), genetically modified organisms and food (GM), and nanotechnology (NT). The purpose of this analysis is to examine whether lessons learned from such studies—empirical insights, concepts and methods—can be used in conducting work on public awareness of, and attitudes towards, hydrogen energy and a hydrogen economy. We have to use parallel cases because we do not know how a hydrogen future might progress and how people might perceive it as it unfolds. Alternative parallels would be historical ones, that is, innovations that have been diffused already, even if they did not fulfil their original promise (GEELS & SMIT 2000), or else never took off (LATOUR 1996). We might get a more rounded view of these than of contemporary parallels—but then we probably would not have the public consultations to examine that are in these recent case studies. [2]

First, we outline the concepts of "risk" that inform our approach to public perceptions of hydrogen and also the use we make of the three parallel case studies. This is followed by a summary of expert assessments of risks associated with hydrogen (based on reviews of the relevant scientific literatures). The case studies of public perceptions of the risks and benefits of CCS, GM and NT are then considered. Finally, the paper discusses implications for the communication of risks between experts and lay people, and broader ("upstream") questions about so-called public engagement in any emergent technology. [3]

2. The Semantics of "Risk"

2.1 Types of risk

"Risk" is a term of several meanings, which all too easily tend to slip into each other but need to be kept separate. They fall into three broad types. Type 1 is associated with the practice of risk assessment. A risk is assessed here as the effect of a hazard (e.g. so many casualties per 100,000 at risk) multiplied by the probability of its occurrence. While there will be a margin of uncertainty attached to an assessed risk, that margin is statistical and reflects the sample size and the variance in the two key variables. The assessment itself is grounded in evidence already available and furnishes a "rational expectation" that applies so long as the conditions on which our current knowledge rests remain unaltered. [4]

Type 2 is associated with taking decisions, when the consequences lie in the future and may turn out to be different from what was expected. Here the uncertainty is not merely statistical and current knowledge is not a sufficient guide: we are on the verge of the unknown. For instance, a business faces the dilemma that current market demand may alter should fashion change or should failure of supply raise prices. In this context, organisations employ risk management in order to have contingency plans. [5]

Type 3 is about neither rational expectations nor contingency plans to deal with the unexpected, but about what people perceive to be assured threats. Many governments today are claiming that "global terrorism" is an assured threat. Regardless of the probability of a hazardous act of terrorism and whether we have contingency plans for dealing with the unexpected, terrorism will, they claim, inevitably occur at some time and in some place that could be unknown. The obverse of "assured threat" is the Panglossian view that "all is for the best in the best of possible worlds" (VOLTAIRE 1947). [6]

In Type 1 terms, exposure to hazards is never a zero probability and, in Type 2 terms, the unexpected may always happen, but, in Type 3 terms, assured threats or their converse, assured safety, seem certain to those who believe in them. [7]

All three meanings of "risk" have relevance to the pathway to a hydrogen economy. The first is typically deployed in "science", the second is characteristic of "experience" and the third applies to "representation" of risk, for instance in the media, influenced by culture and ideology. Of course, scientists "represent" hydrogen as energy to stakeholders and the public, often in Panglossian terms (CHERRY 2004). The public—that is, those with a less direct interest in the future of hydrogen than stakeholders have—may acquire such an interest as fossil fuel depletes, and so switch their mode of understanding of risk. [8]

2.2 Ideology and culture in perceptions of risk

The terms "ideology" and "culture" both refer to the symbolic, as opposed to the material world, and to beliefs, values, feelings and social norms that shape how individuals think and act. In our usage here, "ideology" is rooted in interests that differentiate groups or classes, while "culture" reflects history and socialization shared by the groups that make up the wider society and informs each of their ideologies. Neither ideology nor culture is easily recognised by those who are embedded in it, because both condition their sense of what is natural and so appear to be common sense. Thus, risk assessment by scientists and engineers (Type 1) typically involves framing. It is normative for science to focus on the readily observable and measurable and to set aside the more speculative and "soft" evidence. WYNNE (1996) has argued that this tendency closed off debate about the long-term environmental impacts of nuclear power as compared with the short-term risk of melt-down in power stations. [9]

Ideology also intervenes by causing risk to be evaluated in relation to a familiar benchmark, such as petrol or natural gas in the case of hydrogen. This has distorting effects. It invites us to compare, on similar dimensions, things that are not altogether alike. Thus it is often claimed that hydrogen is as safe as or safer than petrol, whereas there are critical differences in what is required to handle each of these very different materials safely. [10]

In managing Type 2 risk, stakeholders too restrict their vision of the future—by whatever interests they are allied with. In the case of hydrogen as energy, there is a division among proponents between those whose main interest is security of energy supply and the continuation of global capitalism as it is, and those who see hydrogen as the foundation for a new economy and polity in which control of energy is distributed, not centralised or dominated by big business (RIFKIN 2002). [11]

Where there is Type 3 risk—assured threat or safety—the shared culture that underpins many varied perceptions lies close to the surface and permits its examination. Concepts of "dirty", "unhealthy" and "unsafe" and their opposites have their roots in the largely tacit ways in which culture orders our world. It is a familiar dictum that dirt is "matter out of place". Similarly "safe" may mean everything contained in its proper place, and "healthy" the exclusion of what is unwholesome (DOUGLAS 1964). The fact that they carry dense symbolic and emotive connotations and are experienced viscerally tends to inhibit challenges to the application of the terms. However, the terms are contested from time to time, when widespread assumptions about the safety or health or cleanliness of something familiar are severely tested, as they have been by such one-off incidents in nuclear power plants as Three Mile Island and Chernobyl. From then on the old framing of risk assessment of nuclear power was unlikely to be acceptable, and established risk management strategies came to seem inadequate. The development of new nuclear power installations ceased in several countries. [12]

Deconstructing the deep element of ideology and the even deeper element of culture in how risk is perceived may help us to remedy distortions of understanding and communication about hydrogen as energy. [13]

2.3 How decisions are made

A rational choice model is consistent with both Type 1 and Type 2 risk and can carry us at least some of the way in discussing how sections of the public might choose or reject hydrogen as energy. Rational choice is a normative, not a descriptive, model. We have to move on to incorporate "real world" elements into the model. Some will enable us to modify but preserve the assumption of "rationality"; others may cause us to abandon it. Risk is only one of the factors that a rational actor would weigh in the balance when deciding for or against hydrogen as energy. Other factors would be perceived benefits and perceived costs. Like risks, at least implicitly, these would be judged relative to alternatives. In a fully rational process of decision-making, every conceivable alternative should be considered. In taking a rational choice approach, no one element in the choice, including risk perceptions, can be adequately understood without attending to the others. Cost might be an inhibitor, even if benefits were judged to outweigh risks. Similarly, risks might inhibit choice even when benefits were high and costs low. [14]
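One schematic way of expressing this weighing of factors is as a net-value comparison between hydrogen and a familiar alternative. The formulation below is our illustrative shorthand, not a model drawn from the literature, and it assumes, purely for the sake of exposition, that perceived benefits, costs and risks can be placed on a common scale:

```latex
% Illustrative net-value comparison for a rational actor
V(\text{option}) = B(\text{option}) - C(\text{option}) - R(\text{option})
% where B = perceived benefits, C = perceived costs and
% R = perceived risk (expected harm, in Type 1 terms)

% A fully rational actor would adopt hydrogen only if
V(\text{hydrogen}) > V(\text{familiar fuel})
```

As the following subsections argue, bounded rationality, values, affect and norms mean that actual judgements rarely reduce to such a calculation.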

Uncertainty, on the one hand, and values, on the other, tend to "bound" the scope to act rationally (SIMON 1976). One way in which uncertainty is made more tractable is to avoid judging between several relatively unknown options and focus on a comparison of hydrogen with one that is familiar. Values also enter in. Thus, if the benefits of hydrogen seemed marginal, even though it was competitive on costs and carried an acceptable risk, people might not part with the familiar option. Also, the new option might be attractive or unattractive in itself, depending on the person's values. We assume that the public is most likely to have been exposed to representations of hydrogen and that these refer primarily to whether hydrogen is safe in use. Benefits for environment and health are more often referred to in the representation of hydrogen as energy than are risks. Costs are relatively invisible at this early stage of development. Even if the public know that they are high, they probably expect them to fall in the future. [15]

2.4 Limitations of "information"

The rational choice model rests on the assumption that knowledge is the basis of choice and that actors make choices that suit their interests, for example as producers or consumers. Its proponents often assume that subjects have only to be adequately informed to make appropriate choices. But this is a flawed view (AJZEN & FISHBEIN 1980). [16]

First, greater knowledge might make subjects more sceptical, less inclined to decide for or against an option. They might say they "don't know", even though the basis for that judgement is knowing more than they did when they felt they could make up their mind previously! Second, subjects might entertain beliefs that combine elements of knowledge and ideology and/or culture. Typically where knowledge is thin, it is patched with ideology. For instance, the lack of evidence to support a connection between global warming and climate change has long been patched up with a widely shared belief that extreme events are becoming or will become more common. Beliefs may be based on authority, including "the evidence" as scientists accept it. They may also be built on one's own experience or on rumour, for instance having encountered a hydrogen demonstration. Certain beliefs symbolise their object: an example would be the beliefs that hydrogen was/was not responsible for the fire that consumed the Hindenburg (BAIN & VAN DER VORST 1999). [17]

Third, values as well as facts are usually involved in making judgements. Values are ends-in-themselves. Consumers might, for example, choose a gas-guzzling SUV or a green option, such as a heat pump or fuel cell for central heating, in spite of the costs of each, because they value one or the other for their own particular reasons. [18]

Fourth, affect often plays a part alongside cognition and value in making decisions. For instance, choice of green energy might be made "for the sake of my children/grandchildren", as might choice of an SUV that seems to offer security on the school run. It has also been observed that affect tends to distort judgements about the risks associated with benefits. It is rational to perceive a combination of benefits and risks in an option, but affect (including fear) can cause people to see only risk and no benefit or only benefit and no risk (FINUCANE, ALHAKAMI, SLOVIC & JOHNSON 2000). [19]

Fifth, norms influence judgements. They are the rules that subjects are constrained by, which might be law and regulation or informal expectations. Regulation has indirect effects on consumers' options in the energy field—notably on price—and they are probably unaware of how this happens. A relevant informal expectation that they might be aware of would be that one should not pollute the air, or that one should recycle scarce or toxic materials when the product that contains them is finished with. [20]

3. Public Perceptions of Risk

SLOVIC's pathbreaking work (SLOVIC 2000a) on perceptions of risk has shown some recurrent patterns in the social and psychological contexts of assessing risks and benefits. The most important findings include the consensus that: perceived risk is influenced by the "imaginability" and memorability of a hazard; experts and laypeople tend to have different perceptions of how risky certain technologies are; disagreements about risk do not necessarily reduce or disappear in the face of "evidence"; fear and dread are the major axes of preference—and for any given level of benefit, higher risks may be tolerated by the public if those risks are controllable, familiar, immediate, known precisely and are voluntary (FISCHHOFF, SLOVIC, LICHTENSTEIN, READ & COMBS 2000). SLOVIC (2000b) has also shown that people's beliefs and attitudes about risk vary along the dimensions of "dread" and degree of knowledge. The extent to which risks are known or unknown is a crucial variable: people's perceptions vary according to whether the risk is observed or observable, whether it is known to those exposed to it, whether the effect of the hazard is immediate or delayed, whether it is a new risk, and whether it is known or unknown to science. Thus, for example, according to SLOVIC (2000b), nuclear power (and nuclear weapons) have the highest "dread" risk, but chemical technologies score the highest "unknown" risk. [21]

JOHNSON and SLOVIC (1995) examined public reactions to (Type 1) scientific estimates of risk uncertainty in relation to radiological and toxicological hazards. They found that people were unfamiliar with (and uncomfortable with) uncertainty in risk assessment. Low ratings of risk were treated cautiously and sceptically. How much trust the public had in government was an important mediating factor. They concluded that it should not be assumed that the lay public cannot understand uncertainty, but it should also not be assumed that explaining such uncertainties would increase people's trust. JOHNSON and SLOVIC (1995) showed in another study of public reactions to information about environmental and health risks that it was very difficult to convey uncertainty in risk estimates. Organisations communicating information about uncertainty were seen as either honest or incompetent. Where "low" risk levels were presented, these were regarded by laypeople as preliminary to higher estimates in future, or simply distrusted. JOHNSON (2003) has noted that uncertainty in environmental risk estimates raises questions in the public's mind about honesty and trustworthiness. Disagreement among experts is often ascribed to their self-interest rather than the inherent uncertainty of science itself. [22]

In such situations, numerous researchers have identified public trust as a crucial factor. SIEGRIST and CVETKOVICH (2000) investigated the role of social trust and knowledge in perceptions of hazards. They argued that in the case of technologies that people are familiar with, trust is not such an important factor. But where the technologies are relatively new or unknown, and/or in the absence of knowledge, social trust is important as an influence on people's assessments of risks and benefits. SJOBERG (2001) examined the extent of trust in experts in Sweden in connection with nuclear technology. He found that laypeople had significant concerns about unknown effects, and that the public was more sceptical about the completeness of experts' knowledge than the experts were themselves. Risk perception of the nuclear industry was also strongly affected by the view that unknown effects were likely to be negative: "the most important predictor of perceived risk turned out to be beliefs about the likelihood that there might be effects that are as yet unknown" (SJOBERG 2001, p.197). In addition, in this case the experts' credibility was being judged by the public in relation to how far they accepted the limits of scientific knowledge. LION, MEERTENS and BOT (2002) examined people's priorities for information about unknown risks (Type 2), and found that among those who wanted information about risks, most wanted to know exactly what the risks were; what the consequences were; whether the risks and possible effects were controllable; and when, where and how they might be exposed to the risk. Perhaps unsurprisingly, LION et al.'s (2002) results showed that the personal relevance of a risk was most important in determining how they might respond to and deal with such a risk: "Is the risk relevant to me and, if so, what can be done about it?" (LION et al., 2002, p.774). By inference, if experts and scientists are unable to provide information to relate to these questions, perhaps because the unexpected is (by definition) unknown, their credibility and expertise may be doubted. [23]

Conversely, FREWER et al. (2002) found that, in connection with public reactions to scientific uncertainty over food safety, people were more tolerant of uncertainty if it was seen as part of the research process, than if it were seen to be affected by government inaction. FREWER et al. (2002, p.370) observe that failure to communicate about uncertainty may further damage public confidence, but conclude, nevertheless:

"People are more familiar with the role of uncertainty in risk assessment than has previously been thought. Consumers find such uncertainty acceptable and want to be told about it…the public would support greater transparency in risk communication processes under conditions of uncertainty than has traditionally been available". [24]

KUNREUTHER (2002) has addressed questions about risk assessment of events where there is ambiguity about both the likelihood of their occurrence and their possible effects (as in the case of "extreme events"). Where there is an imbalance between the low probability of the events and the scale of the hazard they present, the extreme event may seem to the public to be an "assured threat" (Type 3). He notes that in situations of such indeterminacy risk communication and risk management are highly problematic. [25]

Communication of uncertainty is, to say the least, problematic, especially when there is an elision of what we previously referred to as Type 1 risk (risk assessment based on incomplete evidence) and Type 2 risk (relating to decision-making in a context of uncertainty). When, as often in the case of evaluation of extreme events, Type 3 risk (seemingly assured threat) enters the scene, there is yet further scope for miscommunication. Consultation with the public about benefits, risks and costs of emergent technologies has to introduce information about the science into the dialogue at some point and then gauge reaction. What then is the state of scientific assessment of risk (Type 1) in the case of hydrogen? [26]

4. Experts' Assessments of Risks Associated With Hydrogen

The main reason for interest in hydrogen in future energy scenarios (RIFKIN 2002; DUNN 2002) is related to a potentially clean way of producing electricity. When molecular hydrogen (H2) is used in a fuel cell, a device in which it is chemically combined with oxygen, electricity can be produced and the only by-product is water. This theoretically simple principle has long been known to scientists. However, practical applications of hydrogen as a fuel have struggled to emerge, mainly because of technical difficulties in devising cost-effective ways of producing and storing hydrogen (HARRIS, BOOK, ANDERSON & EDWARDS 2004). [27]
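The underlying electrochemistry can be stated simply. The half-reactions below are those of a generic acid-electrolyte (for example PEM) fuel cell and are given only to illustrate the principle referred to above:

```latex
% Anode: hydrogen is split into protons and electrons
\mathrm{2H_2 \rightarrow 4H^+ + 4e^-}

% Cathode: oxygen combines with protons and electrons to form water
\mathrm{O_2 + 4H^+ + 4e^- \rightarrow 2H_2O}

% Overall: electricity and heat are produced, and water is the only chemical by-product
\mathrm{2H_2 + O_2 \rightarrow 2H_2O}
```

The practical difficulties lie not in this chemistry but, as noted, in producing and storing hydrogen cost-effectively.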

As a benign substitute for hydrocarbon fuels, hydrogen is advocated by a wide array of lobby groups and other stakeholders, driven by quite different motivations: security of energy supply in a world where fossil fuels, especially oil, are gradually running out and currently are concentrated mainly in politically unstable areas; global warming concerns prompting a decisive reduction of man-generated greenhouse gases; improved use of renewable sources of energy, as hydrogen would allow engineers to cope with both fluctuation in production and distributed feed from renewable resources; and revival of nuclear energy as a possible emission-free way of producing hydrogen. [28]

4.1 The range of applications

Before natural gas was introduced as a household commodity, hydrogen had been used for many years in some European countries, such as the U.K. and Norway, as the largest single component of "town gas". According to the U.S. Department of Energy (DOE 2004), hydrogen has been used for the past 50 years in large quantities as a feedstock for a wide variety of industrial applications. Ammonia production for fertiliser accounts for about two thirds of total commercial use of hydrogen as an industrial gas. Other examples include petroleum upgrading (hydrocracking, hydrodealkylation and hydrodesulphurisation) for such products as reformulated gasoline; food processing, such as hydrogenation of fats and oils, in which vegetable oils are changed from liquids to solids; semiconductor processing; the glass and steel manufacturing industries; and cooling systems for large turbine generators. Liquid hydrogen is also used in the cryogenics industry and within the study of superconductivity. The only large-scale use of hydrogen as a fuel is that of NASA. Today, hydrogen production in the U.S. amounts to 9 million tons per year and is mainly achieved through methane steam reforming and electrolysis, and as a by-product of other processes. Almost all of the hydrogen used is captive, that is, consumed at the refinery or chemical plant where it is produced. Only a limited distribution network, consisting of liquid hydrogen delivery trucks, gaseous hydrogen tube trailers and dedicated hydrogen pipelines, has therefore been developed so far. [29]

The growing scientific and popular literature envisages quite new applications of hydrogen, centred on its use as an energy carrier, at the core of a yet-to-be-developed energy system. The energy stored in hydrogen would be used within different technological systems for a multiplicity of end-uses—mobile, stationary and portable. Transport uses are perhaps the most likely to attract consumers' attention, with the development of hydrogen-powered fuel cell or internal combustion engine (ICE) vehicles. Stationary applications include combined heat and power (CHP) systems for providing electricity and heat to homes, offices and larger facilities. Prospective hydrogen-based portable technologies comprise durable power for laptops, mobile phones and other high-tech electronic consumer products. They also include portable power devices to be used in remote areas, where connection to the electricity grid would be difficult or impracticable. Apart from portable applications, where hydrogen would substitute for batteries, mobile and stationary hydrogen applications would entail a gradual and more or less complete displacement of hydrocarbon fuels, such as natural gas, petrol and liquefied petroleum gas (LPG). Additionally, the present infrastructure built around fossil fuels would need to be adapted or replaced by new, hydrogen-dedicated storage and delivery systems comprising liquid, gaseous and solid storage technologies, pipelines, ground distribution fleets and fuelling facilities. [30]

Currently, hydrogen-based energy technologies exist only in the form of prototypes or are still at the laboratory stage. The complex technological system that would sustain hydrogen production, storage, delivery and end-uses is the subject of numerous, often contradictory, conjectures. It is challenging for experts, let alone the public, to assess potential benefits, costs and risks of this emergent system as compared with those of the present fossil fuel economy. [31]

Like any other technological system, one based on hydrogen will involve risks associated with possible hazardous situations posing threats to safety, public health and the environment. Risks of a future hydrogen-based economy would arise from each phase of the hydrogen life-cycle, from production, through storage and distribution, to its final use. The nature, severity and mitigation of such risks will depend upon the technical configuration of any hydrogen system, and the development alongside it of socio-technical knowledge and routines, such as standards and regulations. No comprehensive technological risk assessment of hydrogen-based futures has been attempted to date. [32]

4.2 Safety

With respect to safety (BELLABY, FLYNN & RICCI 2004), current knowledge is largely limited to specific industrial practices that may have little or no relevance to future applications of hydrogen as an energy carrier, in particular in the transportation sector. A report issued by the U.S. Department of Energy (DOE 2003) notes that "hydrogen is well known as a chemical, but its use as an energy carrier on a large-scale commercial basis is largely untested and undeveloped". This is also confirmed by several documents published within the European Hydrogen Integrated Project II, which addressed the development of comprehensive safety standards and regulations for hydrogen. A general remark emerging from these reports is that "the current knowledge about hydrogen safety is less thorough than the knowledge of safety of conventional fuels", compounded by a "general lack of data on frequency and size of hydrogen release" (EHIP II 2002). [33]

All the documents we have reviewed agree on some fundamental technical issues. As regards risks to safety, unintentional hydrogen leaks are considered serious hazards. In the presence of ignition sources, such as electric sparks, flames or high heat, hydrogen leaks can cause combustion in air. This in turn may generate an explosion in specific circumstances. In fact, most of the technical reports agree that the greatest potential risk to the public appears to be a slow leak in a confined space, such as a home garage, where accumulation of hydrogen may lead to fire and explosion if no detection systems or vents are in place. Hydrogen has no odour. Its flames are almost invisible in daylight and emit less heat than other fuels, so that human senses alone are less able to detect them. [34]

Hydrogen embrittlement of metallic and non-metallic materials, such as steel and plastics, is also a potential hazard. This involves the ability of hydrogen to penetrate the molecular structure of certain materials, where it can cause a severe loss of strength and catastrophic ruptures of hydrogen containment systems. Liquid hydrogen entails other types of hazards. Hydrogen can be stored as a liquid only at very low, or cryogenic, temperature (-253 °C). If spilled, it can cause severe frostbite. Hydrogen gas can also act as an asphyxiant if released in large amounts, as it can displace oxygen. [35]

4.3 Public health

As far as direct and immediate risks to public health are concerned (BELLABY 2003), all sources agree that hydrogen is non-toxic and non-carcinogenic, and does not in itself present any concern for medium- or long-term health (HSE 2004). On the contrary, if hydrogen were substituted for hydrocarbon fuels in the energy and transportation sectors—which currently are responsible for much air pollution—no noxious gases and fumes would be emitted at the point of use, thus improving air quality and consequently public health. However, focusing only on end-of-pipe emissions gives but one part of the whole picture, especially if risks to the environment are also accounted for. In fact, as hydrogen needs to be produced by using an energy source, its potential beneficial environmental effects at the point of use may be cancelled by harmful emissions at the production stage. Sustainable hydrogen production and effective measures for reducing or eliminating greenhouse gas emissions (for example carbon sequestration) should therefore be put into practice. The public health consequences of not following this course are most conveniently addressed along with risks to the environment. [36]

4.4 The environment

Comprehensive assessments of health and environmental risks should take into account the whole technological system of which hydrogen will be part, as well as the entire life-cycle of such a system. A wide array of established and new technologies will contribute to the production, storage, distribution and use of hydrogen. Materials such as metal hydrides, carbon nanotubes and various catalysts will be variably deployed across the hydrogen energy chain, in amounts which will depend upon the scale of hydrogen penetration in the economy and the relative adoption rates of different hydrogen technologies. Increased production, diffusion and disposal of such materials, some of which may be totally newly engineered, may have risk implications for public health and the environment. [37]

CHERRY (2004) outlines the possible consequences of a widespread use of hydrogen-based technologies, such as fuel cells and hydrogen storage systems. Catalysts are essential components of fuel cells, accelerating the rate of chemical reactions involving hydrogen. They are usually made of mixtures of exotic metals, whose side effects in the event of unintentional fires or during their disposal may raise safety, health and environmental concerns. Various metal alloys, such as lithium hydrides, are also being evaluated as a possible storage medium in hydrogen cars. According to current technological knowledge, about 50-100 kg of metal would be required in a single car, thus posing significant challenges to safety, health and the environment. CHERRY also addresses possible negative impacts on the environment caused by an increased usage of private transport, as a clean fuel option would possibly relax public commitment to energy saving and hinder institutional efforts to reduce energy consumption and traffic congestion. [38]

Safety and health hazards from components of fuel cells, such as the electrolyte and the membrane, are also mentioned by GASTON, CHELHAOUI and JOLY (2001). A common electrolyte used in alkaline fuel cells is potassium hydroxide, which is harmful to all human tissue as it causes serious chemical burns. Sulphuric acid is corrosive and can oxidise certain materials; when burning, it emits toxic fumes. The membrane used in polymer electrolyte membrane (PEM) fuel cells contains fluorine, a substance that produces corrosive, toxic compounds when accidentally heated or set on fire. Lithium salts, present in molten carbonate (MC) fuel cells, do not pose toxicity dangers unless involved in a fire, when they produce toxic fumes. [39]

Recently, the environmental implications of a fully-fledged hydrogen economy have been at the centre of an interesting dispute hosted by the journal Science. A research group in atmospheric science at the California Institute of Technology published a paper (TROMP, SHIA, ALLEN, EILER & YUNG 2003) in which they predict dramatic consequences of unintentional leaks of hydrogen for the stratosphere, a layer of the upper atmosphere situated between 10 and 50 km above the earth's surface. In summary, based on a computer simulation of atmospheric chemistry, TROMP et al. (2003) argue that unintended emissions of molecular hydrogen can have deleterious effects on the climate, including enhancing global warming and jeopardising the ozone layer. [40]

The issue is not new to climate change experts. The Intergovernmental Panel on Climate Change Third Assessment Report (IPCC 2001) points out that hydrogen can negatively interfere with the atmospheric chemistry responsible for abating methane and other major greenhouse gases, although it does not consider molecular hydrogen a direct greenhouse gas. It clearly states that "in a possible fuel-cell economy, future (hydrogen) emissions may need to be considered as a potential climate perturbation". [41]

The paper by TROMP et al. (2003) received strong criticism, mostly directed at the assumed hydrogen leakage rates of 10-20%, which in the authors' opinion "should be expected". More conservative values (0.1-3%) have been suggested instead (KAMMEN & LIPMAN 2003; LEHMAN 2003; LOVINS 2003; SCHULTZ, DIEHL, BRASSEUR & ZITTEL 2003). The predictions made by TROMP et al. (2003) are dependent upon a number of assumptions based on uncertain scientific knowledge. In fact, describing their results, they speak about "unknown environmental impacts" rather than assured threats. Other computer simulations based on different technical premises (DERWENT 2004) conclude that a hydrogen economy would produce 0.6% of the climate impact of the present fossil-fuel economy. Such remarkable discrepancies in results are partly due to the difficulty of anticipating the technical details and pervasiveness of prospective hydrogen-based technologies. In addition, significant statistical and structural uncertainties embedded in atmospheric science make predictions of this kind a rather difficult enterprise. [42]

4.5 Summary

To summarise, regarding safety there is a knowledge gap in that risk assessments hitherto have been based on the use of hydrogen as an industrial gas, whereas future hydrogen technologies entail a dramatic shift in applications to personal and household uses, about which little is yet known. In relation to health issues, the scientific literature is sparse: the impact of hydrogen-related technologies is under-researched and yet generally represented as benign. The long-term environmental risks associated with hydrogen are disputed, but this reflects, amongst other factors, an inherent uncertainty in climate change science. [43]

5. Public Awareness of and Attitudes to Other Emergent Technologies

The uncertainty surrounding risk assessment for hydrogen, especially in the range of new applications envisaged, is not unique to this technology. It is typical of other emergent technologies. What might studies involving consultation with the public about some of these other emergent technologies tell us that might be of value in studying public risk perceptions of hydrogen technology? [44]

5.1 Carbon capture and storage

In a recent study SHACKLEY, MCLACHLAN and GOUGH (2004) found investigating public perceptions of CCS "challenging" precisely because it is a highly technical issue, is remote from most people's concerns, and is at a very early stage of investigation. Carbon capture and storage has been proposed as one means of dealing with global warming and climate change, through storing carbon dioxide gas in underground (geological) or offshore (oceanic) "reservoirs". It has been suggested that carbon dioxide (CO2) could be pumped into aquifers or strata previously containing oil/gas. SHACKLEY and colleagues carried out a study of public perceptions of CCS by conducting a face-to-face questionnaire survey with 212 people (an opportunistic sample at Liverpool airport) and a series of "citizen panel" meetings with purposively sampled groups in two cities. These panels were initially asked to discuss various aspects of quality of life issues and environmental concerns and then were given increasingly more detailed information by experts about CCS, and asked to consider alternative options and indicate preferences regarding risks and benefits of CCS. [45]

Respondents were—perhaps unsurprisingly—unable or unwilling to express an opinion about CCS without receiving detailed explanation of why it was being proposed and what the possible risks might be. Once more information was provided, about half of the respondents indicated more positive attitudes to CCS. Combining findings from the citizen panels and survey work, SHACKLEY et al. (2004) found that, on first being made aware of CCS, most people were opposed to it, or were neither for nor against it, or simply replied "Don't Know". After more information was supplied, this pattern shifted slightly towards more positive attitudes but only in conjunction with comparisons with other methods of carbon mitigation (wind, solar, wave, nuclear, energy efficiency). Support for CCS was described as "moderate or lukewarm", and that support was also conditional on being informed about the reasons for CO2 mitigation. The citizen panels in particular revealed that the acceptability of CCS depended on it being part of a wider set of environmental and energy policies to combat global warming. Even then, CCS was not preferred when compared with improvements from wind, wave, solar and tidal power and other energy measures. CCS was, however, supported more strongly than nuclear power (which most disliked), and in one panel was seen as a valuable means of moving towards a hydrogen-energy-based economy. Among the other findings from the panels was public concern about the unknown long-term effects of CCS, distrust of industry representatives promoting CCS, and mistrust of the media. The panel in one city were more concerned about possible safety hazards and risks than the other panel. Uncertainties about risks (and effects) of leakages, and the impact of accidents on ecosystems and possible effects on human health were raised as issues of concern. Attitudes towards information supplied by experts varied, but even among the more sceptical citizens, it was believed that the general public would be persuaded by scientific experts. [46]

5.2 Genetically modified organisms and food

The second example of public perceptions of an emergent technology is taken from some recent studies of attitudes towards Genetically Modified Organisms and Food (GM). There is an extensive literature about this but we have focused on some examples of research that has addressed the ways in which people respond to "unknown" technological innovations. [47]

GROVE-WHITE and colleagues (GROVE-WHITE, MACNAGHTEN & WYNNE 2000) carried out a qualitative study by interviewing twenty experts and stakeholders, and undertaking six focus groups with members of the public about the introduction of GM crops and food. This was done in the context of considerable public debate about GM during the late-1990s in Britain. First, it was observed that the professionals and specialists who were interviewed saw providing information to the public as conveying "facts", not indicating areas of ignorance or uncertainty. Experts assumed that "consumers" made judgements on the basis of what was positively known. Second, the focus groups with members of the public revealed "widespread suspicion" of GM foods (and the motives of those promoting them) and a feeling that they had little influence over these new technologies. However, attitudes did vary between technologies—more favourable views were displayed towards Information Technology than towards GM, for example. Trust in information about GM supplied by business and government officials was limited and conditional. [48]

Members of the public interpreted information in relation to their own experience as consumers and their trust in the information source. There was great public concern about the uncertainties surrounding the impact of GM, but this was not mirrored in the experts'/specialists' approach to communication of information. GROVE-WHITE and colleagues noted a "deep cultural dislocation" between the expert framing of relevant knowledge and typical public perceptions. Whereas the experts tended to ask "What are the risks?" (Type 1 risk), the public extended that question to probe "What might be the unanticipated effects?" (Type 2 risk), and also "Who will be responsible?", "Can they be trusted?" As GROVE-WHITE et al. (2000) concluded, the public expected greater acknowledgement of scientific uncertainty, but: "[a]gain and again, public demands for 'the facts' or 'fuller information' about particular controversial products or processes have been patronised by official scientific advisors and spokesmen as misguided pleas for 'absolute certainty' that 'no risks exist'" (GROVE-WHITE et al. 2000, p.29). [49]

Genetically modified food was one of the five major "risks" studied by POORTINGA and PIDGEON (2003) using a large-scale, nationally-representative (face-to-face interview) survey. Within a generally supportive position towards science, 39% of the respondents said that people put too much trust in science; 51% thought that scientists often try out new things without thinking about the consequences; 67% believed that scientists should listen more to what ordinary people think; and 69% replied that there is so much conflicting information that it is difficult to know what to believe. Among the five risk issues—climate change, radiation from mobile phones, radioactive waste, genetic testing and GM food—the least interest was in GM food and radiation from mobile phones. Forty-one percent considered GM food to be of importance. When asked which of the five risks posed the most risk to themselves individually (rather than to society as a whole), respondents saw climate change and radioactive waste as posing the greatest threat, genetic testing and mobile phone radiation as posing the least, and GM food as in the middle. When asked about different dimensions of risk, GM food (and climate change) scored most highly in terms of unknown consequences (Type 2 risk). Generally, the least trusted sources of information were national government, business and industry, and there was scepticism about the capacity of government to manage and regulate risks. Overall, the survey showed that "people appeared to be less concerned about GM food than the other risk cases, with perceived risks and benefits compared to the other risk cases judged as intermediate" (POORTINGA & PIDGEON 2003, p.54). [50]

Interesting comparisons with that survey can be made with SHAW's small-scale qualitative study of public understandings of GM food (SHAW 2002). This study comprised interviews with 17 purposively sampled experts (from the food and biotechnology industries, government agencies, academic scientists and public interest groups) and interviews with a sample of 32 members of the "lay" public. Generally, there was considerable unease among the public about GM food; contrary to the experts' expectations, those members of the public who were most knowledgeable about GM were also the most opposed to GM food. Among the public interviewees, one dominant concern was the complexity and uncertainty of the scientific knowledge, although this coincided with a widespread belief that ordinary people had the potential to understand the issues. [51]

The trustworthiness of experts was also problematic, and this connected with the majority opinion that the public lacked personal control over risks, that threats were perceived to be hidden (due to the unknown nature of genetic changes), and that there was mistrust of business and government. Some interviewees, noticing conflicting information from different scientists, argued that decisions about risks were matters for self-judgement. This study again emphasised that the communication of risk in conditions of scientific uncertainty is extremely problematic, and argued that the "deficit model" of public understanding is inadequate. As SHAW (2002, p.279) noted:

"Across the array of interviewees, concerns about GM food often centred on the perceived uncertainty of 'expert' scientific knowledge. A recurring criticism was of the short-term perspective held by scientists, industry and government who were seen as failing to consider the long-term environmental and health impact of genetic modification". [52]

5.3 Nanotechnology

The third example of an emergent technology to be considered is nanotechnology (NT), as featured in the main findings and commentary from the Royal Society and Royal Academy of Engineering report on nanoscience and nanotechnologies (ROYAL SOCIETY & RAE 2004). Briefly, nanotechnology involves working with materials at the nanometre scale (one nanometre is one-millionth of a millimetre), and encompasses numerous disciplines and subdisciplines in science, engineering and medicine. It is widely believed that the development of nanotechnology will revolutionise the production of materials, electronics, biotechnology and many other medical and industrial applications. The major concerns have been that nanoparticles may have as yet unknown environmental and human health effects. The Royal Society/RAE report (ROYAL SOCIETY & RAE 2004, p.5) noted that research into hazards associated with nanoparticles, nanotubes and their pathways is necessary to "reduce the uncertainties related to their potential impacts on health, safety and the environment". It also observed that public awareness of nanotechnology in Britain is low—in their survey of public opinion, only 29% said that they had heard of nanotechnology, and only 19% could offer any form of definition. [53]

The Royal Society study commissioned a large-scale representative sample survey (face-to-face interviews were carried out with 1,005 people) and qualitative workshops with samples of the public in Birmingham and London (carried out by BMRB) at which experts and scientists were introduced to provide information to assist the focus-group discussions. In the workshops, public awareness of NT was low, but after participants had been given more detailed information, there were some signs of positive interest in, and support for, some applications of NT. The technical report (BMRB, 2004) on these workshops demonstrated that members of the public were generally positive towards new technologies, except that GM food, embryo selection and human cloning were viewed negatively. [54]

Even with technologies where there was public support, however, laypeople identified negative features. They went through a mental "weighing-up" process, "trading-off" positive and negative effects of new technologies. People concluded that no technology was intrinsically good or bad: much depended on the uses to which it was put. Indeed, participants in the workshops "found it difficult to react to nanotechnology as a concept without seeing some of the ways in which it could be used" (BMRB, 2004, p.35). However, respondents found some of the information provided by scientists difficult to react to—some found it "very confusing and difficult to understand" (BMRB, 2004, p.33). In general, participants strongly favoured control and regulation over the development of NT but were unsure how this was to be put into practice. They were nonetheless certain that the public should be involved in future regulation: it was expressed that government and scientists did not have the right to make decisions about NT without effective public consultation. Some argued that, given that scientific knowledge was so advanced and scientists disagreed among themselves, there was insufficient accountability to the public (BMRB, 2004, p.17). In the main report section on stakeholder and public dialogue, it was acknowledged that the public was seeking reassurance over long-term uncertainties about the potential impact of NT—and negative comparisons were made by the public with nuclear power and genetic modification. Concern was also registered about whom to trust among the institutions that might control and regulate NT. The final report recommended that public dialogue and a "constructive and proactive debate" about NT was necessary "at a stage when it can inform key decisions about their development and before deeply entrenched or polarised positions appear" (ROYAL SOCIETY & RAE 2004, p.6). [55]

6. Issues in Public Engagement in New Technologies

6.1 The shock of the new?

It is commonly assumed that the newness of emergent technologies might predispose citizens to caution, although there are sufficient examples from the consumer market—especially the introduction of mobile telephones—where take-up seems not to have been affected by perceived risk. PURCELL, CLARKE and RENZULLI (2000) noted that the novelty of a technology influences the cultural shaping of choices about risk. A new process or product may be regarded as potentially dangerous but as it becomes embedded in routine behaviour, it becomes taken for granted and not perceived as unduly risky (for example the motor car, air transport, microwave cookers). Popular choices and acceptance of a technology are mediated by cultural understandings of acceptable risks and benefits. However, as PURCELL et al. (2000) argue, the "menu of choice" available to citizens/consumers is usually—and predominantly—constructed by experts, scientists, industries and governments. [56]

Special difficulties emerge when scientific discovery and technological innovation deal with materials or processes which were previously unknown—such as carbon capture and storage, GM organisms and food, nanotechnologies or hydrogen energy applications. LIDSKOG (2000) has argued cogently that environmental problems and risks are becoming more complex as well as more diffuse and remote from people's comprehension. It is not merely unfamiliarity but indeterminacy that creates difficulties for public awareness and acceptability. As LIDSKOG (2000, p.203) suggests, if risks are incomprehensible to non-experts and are "only made visible and understandable through researchers' assessments and scenarios, then laypeople will have nothing to add to the process of knowledge production concerning those risks". Evidently, risk assessments comprise conjectures, and where there is no direct personal experience of them, people's judgements are highly contingent and problematic. [57]

Probably the most acute difficulties arise in contexts where public acceptability of controversial technologies is in dispute. WOLFE, BJORNSTAD, RUSSELL and KERCHNER (2002) have analysed a case of controversial technology—hazardous waste remediation using genetically engineered micro-organisms. They point out that mere information provision (or technical risk assessment) is insufficient to gain public acceptability. They stress that public participation on this issue necessitated addressing questions of legitimacy (who were the legitimate participants in a debate), representation (the degree of representativeness of appropriate and affected groups), exclusion (whether certain interests or groups were excluded from consultation), and power and authority. WOLFE et al. (2002) also distinguished between technological feasibility and social acceptability: a technology can be feasible and meet some expert assessments of risk thresholds, but will nevertheless be unacceptable socially. They further emphasise that social acceptability is a continuum and one that is subject to change. Predictability and certainty and greater familiarity with the technology might increase the likelihood and willingness of the public to negotiate acceptability of controversial technologies. However, WOLFE et al. (2002) propose that the means to engage the public in such negotiation is through a deliberative process of dialogue between experts and laypeople. [58]

6.2 The issue of "upstream" consultation

This requirement to move beyond conventional, limited, forms of public consultation—simply providing information or options for the public to approve—has gradually become more widely endorsed. RENN (1998) argued forcefully for a "deliberative" risk management process in which experts, managers and members of the public could participate fully. He noted that risk management involves reducing risks to levels regarded as tolerable by the public to assure control and monitoring. But as risk refers to the potential for real consequences, it is both socially constructed and also a representation of reality. This "dual" nature of risk, RENN argued, demands a dual strategy for risk management. The magnitude of risks must reflect technical expertise in assessments, but public concerns and values influence the topics for which risk assessments are deemed desirable and the degree to which those assessments are accepted. To identify public values and integrate them in decision-making about risk management, RENN strongly advocated a communication process based on intensive dialogue and "mutual social learning" or "co-operative discourse" between members of the public, the scientific community and risk managers. [59]

Similar proposals have been made in Britain in the aftermath of the public consultation about GM food. GROVE-WHITE and colleagues (2000) identified the need to move from a "deficit-model" of public understanding of science and simplistic notions of communicating factual information to more sophisticated approaches to "interactive understanding". This was seen to be most important because of "the immediate practical need to incorporate more socially sensitive antennae into the very processes of technological innovation before irrevocable commitments are made" (GROVE-WHITE et al. 2000, p.39). HORLICK-JONES and colleagues (2004) took this further in their evaluation of the "GM Nation" public consultation exercise in the UK. They observed the widespread cynicism among the general public and those who actively participated in the consultation about whether their views would have any influence on government policy. HORLICK-JONES et al. (2004) commented on the fact that the official organisers of the debate found it difficult to secure a "balance" between objective factual knowledge and value-based ethical or political views. HORLICK-JONES et al. (2004) argued that this "problem" reflected a failure to appreciate that decision-making on risk and risk management is neither wholly technical, nor social, nor political, but combines all of these. Accordingly they recommended more "deliberative" processes of public engagement. In particular, where there were issues which were completely new and still "raw" in evidence (for example nanotechnology) deliberative juries were more appropriate, so that citizens could question a range of different experts, and mutual learning could occur. [60]

Many other commentators have championed a new model of public engagement in technological innovation, one which broadens the scope of the discussion and involves the public at the earliest possible stage of development. HUNT, LITTLEWOOD and THOMPSON (2003) examined mechanisms for developing transparency and greater public participation in Radioactive Waste Management (RWM). They reported on various experimental dialogue processes involving "communicative interaction" between official stakeholders and the lay public. They concluded that it was essential to integrate ethical and social considerations into decision-making, and they rejected the conventional "top-down" approach of merely informing or re-assuring the public. Instead they argued that early or "front-end" consultation is necessary, using deliberative processes, and that this "upstream" process was more likely to lead to wider public acceptability of risky technology. The term "upstream", HUNT et al. (2003, p.6) note, "designates the idea of conducting participatory consultation early and before the 'waters have been muddied' by institutional commitments to particular course of action". They found from dialogue workshops that laypeople consistently demonstrated a wish to step back from the immediate technical questions about RWM, in order to "frame" them within much broader debates about ethical, environmental and social contexts. Broadening the frame was a pre-requisite for addressing specific questions. Any specific risk was interpreted in relation to other knowledge and understanding of current concerns—for example BSE, rail safety, or terrorist threats. If officials and experts wish to know what people think about a technology and its risks, HUNT et al. (2003) argue, they can neither assume public ignorance nor disregard these other contextual issues. [61]

In a similar vein, the Royal Society & Royal Academy of Engineering report on nanoscience and nanotechnologies (ROYAL SOCIETY & RAE 2004) endorses the "upstream" dialogue and deliberation model as a way to enhance public understanding and acceptance of new technology. The report recommended that dialogue and engagement should occur early, "before critical decisions about the technology become irreversible or 'locked in'" (ROYAL SOCIETY & RAE 2004, p.65). Clarity about the objectives of the deliberation was also necessary, together with commitment from the sponsor and key stakeholders to take account of the results of the dialogue. A more elaborate and general endorsement of upstream engagement has also come from the independent think tank "Demos" (WILSDON & WILLIS 2004) in their discussion of "see-through science". Their focus is on making science and technology debates truly transparent, scrutinising the assumptions and values that underpin technological innovation. Developing ideas associated with Brian WYNNE and others in the Lancaster group, WILSDON and WILLIS (2004) stress the necessity of upstream involvement of laypeople. The upstream questions which citizens are likely to insist on asking include: [62]

"Why this technology? Why not another? Who needs it? Who is controlling it? Who benefits from it? Can they be trusted? What will it mean for me and my family? Will it improve the environment?" (WILSDON & WILLIS 2004, p.28) [63]

They point out that, conventionally, when introducing a new technology policy-makers avoid or omit questions about whether it is seen as desirable, or what priorities or goals it is directed at, and skip to "the next layer of questions about how to deal with the risks, benefits and consequences of its exploitation" (WILSDON & WILLIS 2004, p.30). Particularly for new technologies, attempts to engage the public in dialogue tend to occur (if they occur at all) long after the major business decisions have been made. The "Demos" report authors reject this approach in favour of a deliberative and upstream model. Decisions about technology and its social acceptability are inherently political, and entail interactions and intermediation between different (sometimes conflicting) interests and values. Consequently, WILSDON and WILLIS (2004) argue, public engagement is not just about "informing" people about new technology and policy, or about risk assessments, but must involve them in shaping the substantive decisions. [64]

7. Conclusions

Thus we can see that when considering the links between risk assessment and risk perception in emergent or new technologies, several factors stand out. First, the evidence from studies of public consultation over carbon capture and storage suggests that people generally had little or no knowledge of CCS; even when given more information they expressed only a low level of support, and this support was conditional upon placing the risks in the context of other environmental issues and policies. In the case of GM food, the predominant finding was again public caution and suspicion of GM and its long-term effects; the trustworthiness of experts, business and government was also problematic. With respect to nanoscience and nanotechnologies, there is a very low level of public awareness; where there was limited support, it was based on "trade-offs" against other choices, and citizens demanded more detail about specific applications and uses in order to judge the benefits and risks. [65]

Public awareness of hydrogen energy and a potential hydrogen economy is yet to be investigated systematically. For communicating information about these issues to the lay public or engaging the public in dialogue, the (Type 1) risk assessment evidence is somewhat inconclusive; there is only early-stage provisional planning for Type 2 risks that involve the unexpected; and there is room for concern that perceived Type 3 risk—the sense of assured threat from a combustible and explosive gas—may be amplified, should a serious accident involving hydrogen occur as current niche development is rolled out to the consumer market. [66]

As we have seen from other emergent technologies such as CCS, GM and NT, the uncertainties of science are perceived in varying ways. Different stakeholders and different publics may focus on the different types of risk. Even attempting to move public consultation further "upstream" does not avoid this, as the framing of risks and benefits is necessarily embedded in a cultural and ideological context, and is subject to change as experience of the emergent technology unfolds. [67]

Acknowledgements

This paper is based on research funded by the EPSRC as part of the SUPERGEN programme, UK Sustainable Hydrogen Energy Consortium. The authors are working collaboratively at Salford University with Mike HODSON and Simon MARVIN, and with other social science colleagues at the Policy Studies Institute, London. However, the views expressed here are those of the authors alone.

The first version of this paper was presented to the ESRC Social Contexts and Responses to Risk (SCARR) Network Conference, "Learning about Risk", University of Kent, Canterbury, 28-29 January 2005.

References

Ajzen, Icek & Fishbein, Martin (1980). Understanding attitudes and predicting social behaviour. Englewood Cliffs: Prentice-Hall.

Bain, Addison & Van Vorst, William D. (1999). The Hindenburg tragedy revisited: the fatal flaw exposed. International Journal of Hydrogen Energy, 24(5), 399-403.

Bellaby, Paul (2003). H2 for C: Might substituting hydrogen for fossil fuel reduce inequalities in health? Paper presented at the British Sociological Association Medical Sociology Group Annual Conference, York University.

Bellaby, Paul, Flynn, Rob & Ricci, Miriam (2004). Is hydrogen safe? Approaches to the study of perceptions of risk among those who have a stake in a future hydrogen economy. Paper presented at the British Sociological Association Risk and Society Group Annual Conference, Nottingham University.

BMRB (2004). Nanotechnology: views of the general public. Report prepared for the Royal Society and Royal Academy of Engineering Nanotechnology Working Group. London: BMRB Social Research.

Cherry, Robert S. (2004). A hydrogen utopia? International Journal of Hydrogen Energy, 29, 125-129.

Derwent, Dick (2004). Environmental impact of H2. H2NET Summer Meeting, CCLRC Rutherford Appleton Laboratory, 14 July 2004, UK. Retrieved July 13, 2005, from http://www.h2net.org.uk/PDFs/AnnualMeeting_2004/Presentation_DickDerwent.pdf.

DOE (2004). Regulators' guide to permitting hydrogen technologies. U.S. Department of Energy. Energy Efficiency and Renewable Energy. Retrieved July 13, 2005, from http://www.pnl.gov/fuelcells/docs/permit-guides/overview_final.pdf.

Douglas, Mary (1964). Purity and danger. Harmondsworth: Penguin.

Dunn, Seth (2002). Hydrogen futures: towards a sustainable energy system. International Journal of Hydrogen Energy, 27(3), 235-264.

Finucane, Melissa L., Alhakami, Ali, Slovic, Paul & Johnson, Stephen M. (2000). The affect heuristic in judgments of risks and benefits. Journal of Behavioral Decision Making, 13(1), 1-17.

Fischhoff, Baruch, Slovic, Paul, Lichtenstein, Sarah, Read, Stephen & Combs, Barbara (2000). How safe is safe enough? In Paul Slovic (Ed.), The perception of risk (pp. 80-103). London: Earthscan Publications.

Frewer, Lynn, Miles, Susan, Brennan, Mary, Kuznesof, Sharon, Ness, Mitchell & Ritson, Christopher (2002). Public preferences for informed choice under conditions of risk uncertainty. Public Understanding of Science, 11, 363-372.

Gaston, Didier, Chelhaoui, Samira & Joly, Claire (2001). Safety problems related to fuel cells. World Congress Safety of modern technical Systems, 12-14 September 2001, Saarbruecken. Retrieved July 13, 2005, from http://www-old.ineris.fr/connaitre/domaines/accidentels/pdf/saarbruecken.pdf.

Geels, Frank & Smit, Wim (2000). Failed technology futures: pitfalls and lessons from a historical survey. Futures, 32, 867-885.

Grove-White, Robin, Macnaghten, Phil & Wynne, Brian (2000). Wising up: the public and new technologies. Centre for the Study of Environmental Change, Lancaster University. Retrieved July 13, 2005, from http://www.lancs.ac.uk/fss/ieppp/staff/docs/wising_upmacnaghten.pdf.

Harris, Rex, Book, David, Anderson, Paul & Edwards, Peter (2004, June/July). Hydrogen storage: the grand challenge. The Fuel Cell Review, 17-23.

Hodson, Mike & Marvin, Simon (2004). Opening the "black box" of the hydrogen economy. Working Paper 2, SURF, University of Salford.

Horlick-Jones, Tom, Walls, John, Rowe, Gene, Pidgeon, Nick, Poortinga, Wouter & O'Riordan, Tim (Understanding Risk Team) (2004). A deliberative future? Understanding Risk Working Paper 04-02, Cardiff University. Retrieved July 13, 2005, from http://www.uea.ac.uk/env/pur/gm_future_top_copy_12_feb_04.pdf.

HSE (2004). Fuel cells. Understand the hazards, control the risks. Norwich: Health and Safety Executive Books & HMSO.

Hunt, Jane, Littlewood, Dave & Thompson, Bill (2003). Developing participatory consultation. IEPP, Lancaster University. Retrieved July 13, 2005, from http://www.lancs.ac.uk/fss/ieppp/research/docs/final%20deliverable%204.10.doc.

IPCC (2001). Climate change 2001: The scientific basis. IPCC Working Group I, Third Assessment Report. Retrieved July 13, 2005, from http://www.grida.no/climate/ipcc_tar/wg1/index.htm.

Johnson, Branden (2003). Further notes on public response to uncertainty in risks and science. Risk Analysis, 23(4), 781-789.

Johnson, Branden & Slovic, Paul (1995). Presenting uncertainty in health risk assessment: initial studies of its effects on risk perception and trust. Risk Analysis, 15(4), 485-494.

Johnson, Branden & Slovic, Paul (1998). Lay views on uncertainty in environmental health risk assessment. Journal of Risk Research, 1(4), 261-279.

Kammen, Daniel M. & Lipman, Timothy E. (2003, October 10). Assessing the future hydrogen economy. Letter to the Editor. Science, 302, 226.

Kunreuther, Howard (2002). Risk analysis and risk management in an uncertain world. Risk Analysis, 22(4), 655-664.

Latour, Bruno (1996). Aramis or the love of technology (C. Porter, Trans.). Cambridge, MA: Harvard University Press.

Lehman, Peter A. (2003, October 10). Assessing the future hydrogen economy. Letter to the Editor. Science, 302, 227.

Lidskog, Rolf (2000). Scientific evidence or lay people's experience? On risk and trust with regard to modern environmental threats. In Maurie J. Cohen (Ed.), Risk in the modern age (pp.196-224). Houndmills: Macmillan.

Lion, René, Meertens, Ree & Bot, Ilja (2002). Priorities in information desire about unknown risks. Risk Analysis, 22(4), 765-776.

Lovins, Amory (2003, October 10). Assessing the future hydrogen economy. Letter to the Editor. Science, 302, 226-227.

McDowall, Will & Eames, Malcolm (2004). Forecasts, scenarios, visions, backcasts and roadmaps to the Hydrogen economy. Policy Studies Institute, London.

NASA (1997). Safety standard for hydrogen and hydrogen systems. Office of Safety and Mission Assurance, Washington DC. Retrieved July 13, 2005, from http://www.hq.nasa.gov/office/codeq/doctree/871916.pdf.

Poortinga, Wouter & Pidgeon, Nick (2003). Public perceptions of risk, science and governance. Centre for Environmental Risk, University of East Anglia.

Purcell, Kristen, Clarke, Lee & Renzulli, Linda (2000). Menus of choice: the social embeddedness of decisions. In Maurie J. Cohen (Ed.), Risk in the modern age (pp.62-79). Houndmills: Macmillan.

Rifkin, Jeremy (2002). The hydrogen economy. New York: Tarcher/Putnam.

Royal Society and Royal Academy of Engineering (2004). Nanoscience and nanotechnologies: opportunities and uncertainties. London: The Royal Society and Royal Academy of Engineering.

Schultz, Martin G., Diehl, Thomas, Brasseur, Guy P. & Zittel, Werner (2003, October 24). Air pollution and climate-forcing impacts of a global hydrogen economy. Science, 302, 624-627.

Shackley, Simon, McLachlan, Carly & Gough, Clair (2004). The public perceptions of carbon capture and storage. Tyndall Centre Working Paper 44, UMIST.

Shaw, Alison (2002). "It goes against the grain". Public understandings of genetically modified (GM) food in the UK. Public Understanding of Science, 11, 273-291.

Siegrist, Michael & Cvetkovich, George (2000). Perception of hazards: the role of social trust and knowledge. Risk Analysis, 20(5), 713-719.

Simon, Herbert (1976). Administrative behavior (3rd edition). New York: Free Press.

Sjoberg, Lennart (2001). Limits of knowledge and the limited importance of trust. Risk Analysis, 21(1), 189-198.

Slovic, Paul (Ed.) (2000a). The perception of risk. London: Earthscan Publications.

Slovic, Paul (2000b). The perception of risk. In Paul Slovic (Ed.), The perception of risk (pp.220-231). London: Earthscan Publications.

Swain, Michael R., Filoso, Patrick, Grilliot, Eric S. & Swain, Matthew N. (2003). Hydrogen leakage into simple geometric enclosures. International Journal of Hydrogen Energy, 28, 229-248.

Tromp, Tracey K., Shia, Run-Lie, Allen, Mark, Eiler, John M. & Yung, Yuk L. (2003, June 13). Potential environmental impact of a hydrogen economy on the stratosphere. Science, 300, 1740-1742.

Voltaire, Francois M. (1947). Candide—or Optimism (J. Butt, Trans.). London: Penguin Classics.

Watson, Jim, Tettech, Alison, Dutton, Geoff, Bristow, Abigail, Kelly, Charlotte & Page, Matthew (2004). Hydrogen futures to 2050. Tyndall Centre Working Paper 46, SPRU, University of Sussex.

Wilsdon, James & Willis, Rebecca (2004). See-through science, Demos, London. Retrieved July 13, 2005, from http://www.demos.co.uk/catalogue/paddlingupstream/.

Wolfe, Amy K., Bjornstad David J., Russell, Milton & Kerchner, Nichole D. (2002). A framework for analyzing dialogues over the acceptability of controversial technologies. Science, Technology and Human Values, 27(1), 134-159.

Wood, Stephen, Jones, Richard & Geldart, Alison (2003). The social and economic challenges of nanotechnology. Swindon: ESRC.

Wynne, Brian (1996). SSK's identity parade: Signing-up, off-and-on. Social Studies of Science, 26(2), 357-391.

Authors

Rob FLYNN is Professor of Sociology at the Institute for Social, Cultural and Policy Research, University of Salford. He has extensive experience in applied policy-oriented social research in urban and housing policy; local government studies; health policy and health services management. His current interest is in the relationship between expertise, risk and trust, and the regulation of professionals.

Contact:

Rob Flynn

Institute for Social, Cultural and Policy Research
Humphrey Booth House
University of Salford
Salford M5 4WT, UK

E-mail: r.flynn@salford.ac.uk
URL: http://www.iscpr.salford.ac.uk/

 

Paul BELLABY is Professor of Sociology of Health at the Institute for Social, Cultural and Policy Research, University of Salford. His central interests currently concern lay encounters with risk to health and safety, how lay assessments might diverge from expert assessments of risk, and how to obtain agreement on what constitutes risk.

Contact:

Paul Bellaby

Institute for Social, Cultural and Policy Research
Humphrey Booth House
University of Salford
Salford M5 4WT, UK

E-mail: p.bellaby@salford.ac.uk
URL: http://www.iscpr.salford.ac.uk/

 

Dr Miriam RICCI is Research Fellow at the Institute for Social, Cultural and Policy Research, University of Salford. Having initially trained as a physicist, she is currently interested in social and economic aspects of scientific and technological development.

Contact:

Miriam Ricci

Institute for Social, Cultural and Policy Research
Humphrey Booth House
University of Salford
Salford M5 4WT, UK

E-mail: m.ricci@salford.ac.uk
URL: http://www.iscpr.salford.ac.uk/

Citation

Flynn, Rob, Bellaby, Paul & Ricci, Miriam (2006). Risk Perception of an Emergent Technology: The Case of Hydrogen Energy [67 paragraphs]. Forum Qualitative Sozialforschung / Forum: Qualitative Social Research, 7(1), Art. 19, http://nbn-resolving.de/urn:nbn:de:0114-fqs0601194.

Forum Qualitative Sozialforschung / Forum: Qualitative Social Research (FQS)

ISSN 1438-5627

Creative Commons Attribution 4.0 International License