
Reliability Engineering and System Safety 71 (2001) 201–208

www.elsevier.com/locate/ress

On the ALARP approach to risk management


R.E. Melchers*
Department of Civil, Surveying and Environmental Engineering, The University of Newcastle, Newcastle, NSW 2308, Australia

Accepted 8 September 2000

Abstract

There is an increasing trend by regulatory authorities for the introduction of the as low as reasonably practicable (ALARP) approach in dealing with risk management of proposed or existing complex hazardous systems. For these, decisions about acceptability or tolerability of risks and consequences can have very significant financial, economic and other consequences for the proponents. Conversely, there may be very significant social and socio-economic implications. ALARP as a guide to achieving a satisfactory outcome has a certain intuitive appeal for the practical management of industrial and other risks. However, as suggested herein, there are a number of areas of concern about the validity of this approach. These include representativeness, morality, philosophy, political reality and practicality. An important, and in some respects fundamental, difficulty is that the risk acceptance criteria are not fully open to public scrutiny and can appear to be settled by negotiation. © 2001 Elsevier Science Ltd. All rights reserved.
Keywords: ALARP; Risk; Probability; Safety; Hazard; Regulations; Morality; Decision-making

1. Introduction

The management of risks associated with potentially hazardous activities in society remains a matter of profound public and technical interest. There has been and continues to be considerable development in the range and extent of regulatory activity, and many new regulatory frameworks have been established. Except for public input to risk assessments for very specific and contentious projects, there appears to have been remarkably little public debate about (and perhaps even understanding of) the more general and philosophical issues involved. This is despite the rather spectacular failure in recent years of electricity, gas and other services over large regional areas and the occurrence of several major industrial accidents.

One issue which might have been expected to receive some public discussion is how decisions about hazardous facilities and activities are to be regulated. Should it be through regulatory or consent authorities, and if so, what form and allegiances should such bodies have? Alternatively, should it be through 'self-regulation', or should there be some other mechanism(s)? These options have been explored in an interesting discussion paper [1]. However, it appears largely to have been ignored in practice.
* Tel.: +61-4921-6058; fax: +61-4921-6991. E-mail address: cerem@cc.newcastle.edu.au (R.E. Melchers).

Perhaps by default, the regulatory approach is the most common route in attempting to exert control over potentially hazardous activities, and this trend is being followed in a number of countries. It is appropriate, therefore, to review some aspects of these directions. In particular, the present paper will focus on the use of the so-called as low as reasonably practicable (ALARP) approach [also sometimes known as the as low as reasonably attainable/achievable (ALARA) approach]. It will be viewed primarily from the perspective of so-called 'Common Law' countries, that is, those with a legal system parallel to that of the USA or the UK. For countries such as Norway, where ALARP is also very extensively used, some of the comments to follow may not be completely applicable. However, it is considered that the bulk of the discussion is sufficiently general.

The ALARP approach grew out of the so-called safety case concept first developed formally in the UK [2]. It was a major innovation in the management of risks for potentially hazardous industries. It requires operators and intending operators of a potentially hazardous facility to demonstrate that (i) the facility is fit for its intended purposes, (ii) the risks associated with its functioning are sufficiently low and (iii) sufficient safety and emergency measures have been instituted (or are proposed). Since in practice there are economic and practical limits to which these actions can be applied, the actual implementation has relied on the concept of 'goal setting' regulations.



Fig. 1. Levels of risk and ALARP, based on UK experience [3].

The ALARP approach is the most well known of these. It is claimed by some to be a more 'fundamental' approach to the setting of tolerable risk levels [3,4].

Conceptually the ALARP approach can be illustrated as in Fig. 1. This shows an upper limit of risk that can be tolerated in any circumstances and a lower limit below which risk is of no practical interest. Indicative numbers for risks are shown only for illustration; the precise values are not central to the discussion herein but can be found in relevant country-specific documentation. The ALARP approach requires that risks between these two limits be reduced to a level 'as low as reasonably practicable'. In relevant regulations it is usually required that a detailed justification be given for what is considered by the applicant to satisfy this 'criterion'.

As a guide to regulatory decision-making the ALARP concept suggests both 'reason' and 'practicality'. It conveys the suggestion of bridging the gap between technological and social views of risk, and also that society has a role in the decision-making process. In addition, it has a degree of intuitive appeal, conveying feelings of reasonableness amongst human beings. As will be argued in more detail below, these impressions are somewhat misleading. There are also considerable philosophical and moral shortcomings in the ALARP approach.

Perhaps rather obliquely, the discussion will suggest what should be done to improve the viability of ALARP or what characteristics need to be embodied in alternatives. However, it is acknowledged that this is not a paper offering 'solutions' but rather one which, it is hoped, will focus more attention on the issues and stimulate discussion in order to bring about solutions.

To allow attention to be focussed more clearly on the difficulties with the philosophy of ALARP, it is necessary first to review some matters fundamental to the interpretation and management of risk in society.

These issues include: (i) risk definition and perception, (ii) risk tolerance, (iii) the decision-making framework, and (iv) its implementation in practice.
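Before turning to these issues, the three-band structure of Fig. 1 can be made concrete with a minimal sketch. The numerical thresholds below are assumptions for illustration only (loosely patterned on figures often quoted in UK guidance); actual limits are set in country-specific regulatory documents, not in this paper.

```python
# Illustrative sketch of the three-band structure of Fig. 1.
# The thresholds are assumed values for illustration only.

INTOLERABLE_LIMIT = 1e-3   # assumed upper bound (annual probability of death)
NEGLIGIBLE_LIMIT = 1e-6    # assumed lower bound of practical interest

def alarp_region(annual_risk: float) -> str:
    """Classify an individual annual risk estimate against Fig. 1."""
    if annual_risk >= INTOLERABLE_LIMIT:
        return "intolerable: cannot be justified in any circumstances"
    if annual_risk <= NEGLIGIBLE_LIMIT:
        return "broadly acceptable: of no practical interest"
    return "ALARP region: must be reduced as low as reasonably practicable"

for r in (5e-3, 1e-4, 1e-7):
    print(f"{r:.0e}: {alarp_region(r)}")
```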

2. Risk perception

2.1. Risk understanding and definition

Increased levels of education, awareness of environmental and development issues, and greater political maturity on the part of society generally have led to a much keener interest in industrial risk management practices, policies and effectiveness. Apart from hazardous industries, public interest derives also from notable public policy conflicts over the siting of facilities perceived to be hazardous or environmentally unfriendly. Despite this, 'risk' as a concept perceived by the general public appears to be rather poorly defined, with confusion between probability, something involving both probability and consequences, and something implying monetary or other loss. Vlek and Stallen [5] gave some ten different definitions of 'risk' or riskiness, using various ways of 'mixing' all or parts of the two main component ideas.

Traditional decision analysis, of course, simply multiplies the chance estimate by the consequence estimate. This is only a 'first-order' approach, with both the chance estimate and the consequence estimate being mean values. It is possible, at the expense of greater complexity in analysis, but perhaps reflecting more accurately personal and societal perception, to invoke measures of uncertainty, such as the standard deviation of each estimate [6]. Nevertheless, there is likely to remain some disagreement over a core definition of risk (as there appears to be in most sociological and psychological works about any term) depending on one's viewpoint and stake in the eventual outcome [1].
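As a sketch of the distinction just drawn, write the chance estimate as a random variable $P$ and the consequence estimate as $C$. Assuming, purely for illustration, that the two are independent (an assumption not asserted in the text), the 'first-order' risk measure and one possible second-moment refinement are

$$R = \mathrm{E}[P]\,\mathrm{E}[C], \qquad \mathrm{Var}(PC) = \mathrm{E}[P]^2\,\mathrm{Var}(C) + \mathrm{E}[C]^2\,\mathrm{Var}(P) + \mathrm{Var}(P)\,\mathrm{Var}(C),$$

so that the standard deviation $[\mathrm{Var}(PC)]^{1/2}$ can be reported alongside the mean as a measure of uncertainty, in the spirit of [6].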


In the mathematical/statistical literature and in most engineering-oriented probability discussions, risk is simply taken as another word for probability of occurrence or 'chance', with consequences, however they might be measured, kept quite separate. Herein 'risk' will be used as a generic term, implying both probabilities and consequences without specifying how these are to be combined.

2.2. Risk as an objective matter

It has become increasingly clear that 'risk' is not an objective matter. Thus all risk assessment involves both 'objective' and 'subjective' information. Matters generally considered to be capable of 'objective' representation, such as physical consequences, are seldom completely so, since in their formulation certain (subjective, even if well accepted) decisions have had to be made regarding data categorization, its representation, etc. This also applies to areas of science once considered to be 'objective', a matter which is now considered briefly.

In the development of mathematical and numerical models in science, model 'verification' is the proof that the model is a true representation. It may be possible to do this for so-called 'closed' systems. These are completely defined systems for which all the components of the system are established independently and are known to be correct. But this is not the general case, nor the case for natural systems. For these, 'verification' is considered to be impossible [7]. Model 'validation', on the other hand, is the establishment of legitimacy of a model, typically achieved through contracts, arguments and methods. Thus models can be confirmed by the demonstration of agreement between observation and prediction, but this is inherently partial. Complete confirmation is logically precluded by the fallacy of affirming the consequent and by incomplete access to natural phenomena. Models can only be evaluated in relative terms [7]. Philosophical arguments also point to the impossibility of proving that a theory is correct; it is only possible to disprove it [8,9]. Moreover, in developing scientific work, models are routinely modified to fit new or recalcitrant data. This suggests that models can never be 'perfect' [10].

It follows that for theories and models to be accepted, there is necessarily a high degree of consensus-forming and personal interplay in their development and in the scientific understanding underpinning them [11]. Some of this can be brought about by 'peer' reviews of risk assessments and procedures, as widely practiced in the nuclear industry. These concepts carry over directly to risk estimation, since risk estimates are nothing but models of expectation of outcomes of uncertain systems (i.e. 'open' systems), couched in terms of the theory of probability. Thus, in the context of probabilistic safety analysis (PSA), the probabilities are often seen as physical properties of the installation and how it is operated; while this view is useful for making comparative statements about riskiness or for comparison to standards, it is inconsistent with all standard philosophical theories of probability [12].

2.3. Factors in risk perception

There are many factors involved in risk perception [1]. These include: (i) the likely consequences should an accident occur; (ii) the uncertainty in that consequence estimate; (iii) the perceived possibilities of obviating the consequences or reducing the probability of the consequences occurring, or both; (iv) familiarity with the 'risk'; (v) level of knowledge and understanding of the 'risk' or consequences or both; and (vi) the interplay between political, social and personal influences in forming perceptions. The last two items in particular deserve some comment.

Knowledge and understanding of risk issues on the part of individuals and society generally implies that (risk) communication exists, that it is utilized to convey meaningful information, and that the capacity exists to understand the information being conveyed and to question it. Perhaps the most critical issue is the actual availability of relevant and accurate information. For a variety of reasons, there has been an increasing requirement placed on governments and industry to inform society about the hazards to which its members might be exposed. There has also developed greater access to government and government agency files under 'Freedom of Information'-type legislation. Whether these developments have been helpful in creating a better informed public is not entirely clear, as this involves also issues such as truthfulness in communications and the trust which society is willing to place in the available information.

That there will be an interplay between individual and societal perceptions of risk follows from individuals being social beings. Their very existence is socially and psychologically intertwined with that of others. Formal and informal relationships and institutions set constraints and obligations upon people's behavior, provide broad frameworks for the shaping of their attitudes and beliefs, and are also closely tied to questions both of morality and of what is to be valued and what is not. There is no reason to suppose that beliefs and values relating to hazards are any different from other more general beliefs and values [1].

3. Decision frameworks

3.1. New technology

Society as a whole is constantly faced with the need to make decisions about existing hazardous or potentially hazardous projects.


Usually these decisions are delegated to organizations with recognized expertise in the area. For existing technology, that expertise will rely on past experience, including accident statistics and 'incident' (or 'near-miss') statistics for hazardous facilities. In many cases hazard scenario and contingency planning also will be carried out. It is in this area that the techniques of probabilistic risk analysis are recognized to have validity in the sense of Section 2.2 [6].

For the potential risks associated with new technologies, however, the problem of management is more acute. This is because the basis for making decisions, that is, a base of accumulated knowledge and experience, is not available. The dilemma can be seen clearly in the earlier writings related to nuclear risks, prior to the occurrence of the accidents at Three Mile Island, Chernobyl and the like. For example, Stallen [13], in reviewing the works of Hafele and Groenewold, notes that the only solutions for the control of risks caused by new technology tend to involve extensive use of other (and older) forms of technology.

History suggests that a new technology will only survive if it has no major catastrophes early in its development. Thereafter, the risks are apparently small because: (i) the operating experience base is small; (ii) particular care tends to be taken; and (iii) there has not been enough time for in-service problems to become sufficiently evident. This may lead to the false sense that the actual risks involved are small. Further, for new technologies it is generally the case that the scientific understanding of the total socio-technical system, its limitations and assumptions, is rather incomplete, adding further to the difficulties of satisfactory risk estimation. The 'trial-and-error' underpinning much of the understanding of conventional and well-developed technology is missing. In connection with the development of science, Popper [8,9] has argued that only falsifications (i.e. failures) lead to new developments; verifications of existing ideas merely add to our apparent confidence in them, but they could be wrong. The inferences for risk analysis are not difficult to make [14].

3.2. A wider perspective

Under these circumstances, how can society deal with the evaluation of risks imposed by new technology? It is suggested that some light may be thrown on this question by an examination of the parallel issue of the rationality of science. The noted philosopher Habermas [15] has argued that the rationality of science stems not from any objective, external measures such as 'truth' but from agreed formalisms (see also Section 2.2). This involves transactions between knowledgeable human beings and agreement between them about what can be considered to be 'rational', given the base of available knowledge and experience. It presupposes a democratic and free society with equal opportunities for contributing to the discussion, for discourse and for criticism.

It also requires truthfulness of viewpoint and the absence of power inequalities. Although these might seem like tall orders indeed, Habermas argues that there are very few situations where these conditions are not met or cannot be met eventually, since open and free discourse will uncover the limitations which might exist.

The implication for risk analysis and evaluation is that the rationality of the criteria and the degree to which risk might be accepted should be based, ultimately, on the agreed position of society obtained through internal and open transactions between knowledgeable and free human beings. Such a position has been put in different, but essentially analogous, ways by others [1]. The importance of giving consideration to public opinion underlies much writing on risk criteria. However, the practical difficulties of arriving at consensus decisions over the question of acceptable risk in society are considerable. According to Layfield [16], commenting on Britain's Sizewell B reactor: 'The opinions of the public should underlie the evaluation of risk. There appears to be no method at present for ascertaining the opinions of the public in such a way that they can be reliably used as the basis for risk evaluation. More research on the subject is needed.'

Moreover, society is a complex mix of sub-groups with differing aims, ambitions, views, opinions and allegiances. It is not surprising, then, that when faced with most matters about which profound decisions need to be made, society responds with a variety of viewpoints and courses of action. Although there are always interplays between short-term and longer-term self-interests and morally 'high-ground' views, it appears in many cases that the diversity of views, and the conviction with which they are held, is inversely related to the knowledge that sub-groups of society have about the matter being considered. Layfield [16] noted: 'As in other complex aspects of public policy where there are benefits and detriments to different groups, Parliament is best placed to represent the public's attitude to risks.' In practice, of course, such a course of action might be taken only for major policy decisions, such as whether the nation should have nuclear power or not.

However, Wynne [17] and others have argued that Parliament is ill-equipped, both in time and expertise, to fully appreciate the implications and changes likely to be brought about by the introduction or further development of new technologies. In his view, particularly for major new technology issues, the political process can only be considered to be defective. A historical review of the introduction of any really new technology shows, however, just how ill-informed and ill-equipped parliaments tend to be, mostly being even unaware of the changes taking place around them. For most major technological innovations (irrespective of their hazard potential), parliamentary interest tends to follow well after the technologies have been introduced.


There are many examples of this in the developing Industrial Revolution [18]; more recent examples include IVF technology, gene technology, internet technology, etc. Moreover, even within society more generally there is seldom much awareness of potential problems and hence little or no debate or detailed consideration of them. Usually only after the technology has been established, and some of its problems have become evident, does public perception become active. This suggests that risk assessment in general, and approaches such as ALARP, can deal only with the control of the further development of already established technology.

3.3. Practical decisions

Whatever the idealized situation ought to be, the need to make day-to-day decisions about lesser hazards in society has invariably led to regulatory approaches as more convenient substitutes for public or parliamentary debate. One reason sometimes given for leaving the decisions to public servants is that the public is uneducated, ill-informed and irrational in dealing with complex issues; arguments which can hardly be sustained in a modern society. However, to invoke public debate and discussion ideally requires time and, for many individuals, much background education when the discussion is about complex issues. None of these conditions tends to be met in practice, for a variety of reasons (see also Section 2.3). Often regulators will facilitate some form of public participation, such as through making documents available and through providing background briefings. Unfortunately, in advancing along this line, there is a danger that there may no longer be much left of Habermas's vision of transactions between knowledgeable and free individuals in coming to a consensus.

The methods which have evolved for the solution of acceptable or tolerable risk problems in a bureaucratic setting may be categorized broadly to include (see [1, Chapter 5]):

1. professional judgement as embodied in institutionally agreed standards (such as engineering codes of practice) or as in commonly accepted professional skills;
2. formal analysis tools such as cost-benefit analysis or decision analysis, with or without public discussion opportunities; and
3. so-called 'boot-strapping' approaches employing techniques such as 'revealed preferences' as used in social psychology, or using extrapolations from available statistical data about risks currently accepted in other areas of endeavor.

Aspects of all three are commonly in use. As will be seen, the ALARP approach falls essentially in the third category.

4. Risk tolerability

The levels of risk associated with a given facility or project that might be acceptable to, or tolerated by, an individual, society or sub-groups is an extremely complex issue, about which much has been written. It is not possible to deal fully with this matter here, but see Reid [19] for a useful summary and critique. Of course, 'tolerability' and 'acceptability' are not necessarily the same, although it has been common in risk analysis to loosely interchange the words. According to the HSE [3], 'tolerability' refers to a willingness to live with a risk so as to secure certain benefits, and in the confidence that it is being properly controlled. To tolerate a risk means that we do not regard it as negligible or something we might ignore, but rather as something we need to keep under review and reduce still further if and when we can. 'Acceptability', on the other hand, implies a more relaxed attitude to risk and hence a lower level of the associated risk criterion. According to Layfield [16], in terms of the nuclear power debate, the term 'acceptable' fails to convey the reluctance that individuals commonly show towards being exposed to certain hazardous activities.

Although the distinction between the terms 'acceptability' and 'tolerability' is important, it is also the case that the term 'acceptable' has been used in relation to consent or acceptance of a proposed risk situation on the part of regulatory authorities. This suggests, by implication, that the decisions of the regulatory authorities in some manner reflect 'tolerability' on the part of society.

5. ALARP

5.1. Definition of terms

As noted, the ALARP approach has been advocated as a more fundamental approach to the setting of tolerable risk levels, particularly suitable for regulatory purposes [20]. Fig. 1 summarizes the approach, in which the region of real interest lies between the upper and lower limits. This is the region in which risks must be reduced to a level ALARP. Since this objective is central to the approach, a very careful discussion and explanation of terms might be expected. However, apart from appeals to sensible discussion and reasonableness, and the suggestion that there are legal interpretations, there is little in print which really attempts to come to terms with the critical issues and which can help industry focus on what might be acceptable [3].

The critical words in ALARP are 'low', 'reasonably' and 'practicable'. Unfortunately, these are all relative terms; standards for them are not defined. 'Reasonably' is also an emotive word, implying goodness, care, consideration, etc. However, as will be discussed below, what may be reasonable in some situations can be seen as inappropriate in others. Regarding 'practicable', the Oxford Dictionary refers to 'that can be done, feasible', i.e. what can be put into practice. Of course, many actions can be implemented, provided the financial rewards and resources are sufficient.


There are thus very clear financial/economic implications. 'Reasonable practicability' is not defined in legislation but has been interpreted in legal cases to mean that the degree of risk can be balanced against the time, trouble, cost and physical difficulty of risk reduction measures. Risks have to be reduced to the level at which the benefits arising from further risk reduction are disproportionate to the time, trouble, cost and physical difficulty of implementing further risk reduction measures [3]. It is therefore clear that financial implications are recognized: in pursuing any safety improvement to demonstrate ALARP, account can be taken of cost. It is possible, in principle, to apply formal cost-benefit techniques to assist in making judgements of this kind [3].

This assumes that all factors involved can be converted to monetary values. Unfortunately, it is well known that there are not inconsiderable difficulties, and hence implied value judgements, in evaluating or imputing monetary values for both benefits and costs. This problem is particularly acute for the analysis of hazardous facilities, where the value of human life and the (imputed) cost of suffering and deterioration of the quality of life may play a major role in the analysis. Further, an approach based on cost analysis implicitly assumes equal weighting for each monetary unit, a proposition known to cause difficulties with cost-benefit analysis when applied to issues with social implications. It is considered that the selection of tolerable risk is of this type. Value judgements which society might make are subsumed in the valuations required for cost analysis. In addition, there is also the problem that the optimum obtained in cost-benefit analyses is seldom very sensitive to the variables involved. This means that cost-benefit analysis alone is unlikely to provide a clear guide to the selection of appropriate policy. Finally, it is unclear how value judgements such as 'low', 'reasonably' and 'practicable' correlate with a minimum total cost outcome. The value judgements required involve issues well beyond conventional cost-benefit analysis, a matter well recognized in dealing with environmental issues [21].
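The legal balancing just described is often operationalized in cost-benefit terms as a 'gross disproportion' test: a candidate risk-reduction measure may be declined only if its cost exceeds its monetized benefit by some factor greater than one. The sketch below is a minimal illustration of that logic; the value of preventing a fatality and the disproportion factor are assumed numbers, not figures given in this paper, and the monetization step inherits all the value-judgement difficulties discussed above.

```python
# Minimal sketch of a cost-benefit ("gross disproportion") reading of ALARP.
# All numbers are assumptions for illustration; monetizing life and choosing
# the disproportion factor embody exactly the value judgements discussed above.

VPF = 2.0e6                 # assumed value of preventing a fatality (monetary units)
DISPROPORTION_FACTOR = 3.0  # assumed: cost may exceed benefit by up to this ratio

def measure_required(delta_risk_per_year: float, exposed_persons: int,
                     life_years: float, measure_cost: float) -> bool:
    """Return True if, under this sketch's test, the measure cannot be
    declined (its cost is not grossly disproportionate to its benefit)."""
    benefit = delta_risk_per_year * exposed_persons * life_years * VPF
    return measure_cost <= DISPROPORTION_FACTOR * benefit

# A measure costing 0.5M that removes 1e-4/yr of individual risk for
# 50 workers over a 25-year facility life: benefit = 0.25M, so the
# measure is required (0.5M <= 3 x 0.25M).
print(measure_required(1e-4, 50, 25.0, 0.5e6))  # True
```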

5.2. Openness

In the expositions of the ALARP approach it appears that the specific tolerable probability levels which would qualify for acceptance by a regulatory authority are not always in the public domain. The tolerable risk criterion may not be known to the applicant, and some process of negotiation between the regulatory authority and the applicant is needed. Societal groups concerned about openness in government might well view this type of approach with concern.

A related problem with implementation of the ALARP approach can arise in the evaluation of two similar projects assessed at different times, possibly involving different personnel within the regulatory body and different proponents. How is consistency between the 'approvals' or 'consents' to be attained? Irrespective of the care and effort expended by the regulatory authority, there is a real danger that an applicant whose proposal needs to be further refined, or is rejected, will cry 'foul'. Without openness and without explicit criteria, such dangers are not easily avoided. Is there not also a danger of corruption?

5.3. Morality and economics

The issue of morality, and how it is addressed by the ALARP approach, can be brought most clearly into focus by a discussion based around the nuclear power industry. That industry took a major blow in the USA with the Three Mile Island and other incidents. Currently there are no new facilities planned or under construction. This is possible in the USA because there are alternative sources of electric power with perhaps lower perceived risks, including political risks. Opposition to nuclear power and the potential consequences associated with it are clearly in evidence. Such open opposition may not always be tolerated in some other countries, nor may there be viable alternative power sources. Thus there may be pressures for public opposition to be ignored and discredited, and for access to information to be made less easy to obtain. For example, there have been claims of 'cover-ups', such as over UK nuclear accidents. Whatever the precise reasons, it is clear that in some countries the nuclear industry remains viable. Comparison to the US situation suggests that what might be considered 'reasonable and practical' in some countries is not so considered in the US, even though the technology, the human stock and intellect, and the fear of nuclear power appear to be much the same. The only matters which appear to be different are: (i) the economic and political necessities of the provision of electrical power; and perhaps (ii) acquiescence to a cultural system, as reflected in the political authority and legal systems, which precludes or curtails the possibility of protracted legal battles apparently only possible in Common Law countries. Do these matters then ultimately drive what is 'reasonable and practical'? And if they do, is the value of human life the same?

The dichotomy between socio-economic matters and morality issues has other implications also. It is known that in some countries the nuclear power system is of variable quality, with some installations known to have a considerable degree of radiation leakage, far in excess of levels permitted under international standards. Even if, as is likely, the costs to bring the facilities to acceptable standards are too high, there will be economic pressures to keep the facilities in operation, despite the possibility that some plant workers will be exposed to excessive radiation. It is known that in some cases maintenance work in high radiation areas has been carried out by hiring, on a daily basis, members of the lowest socio-economic classes. Because the remuneration was good by local standards there was no shortage of willing workers, even though it has come to be known that many develop radiation sickness and serious tumors within weeks of being exposed.


Although put somewhat starkly, this illustrates that the criteria of 'reasonableness' and 'practicability' so essential in the ALARP approach are ultimately issues of morality. While for projects having the potential for only minor or rather limited individual or social consequences there is probably no need to be concerned, for other, more significant projects the question must be asked whether it is acceptable for decisions about such issues to be left to private discussion between a regulatory authority and project proposers.

5.4. Public participation

As noted earlier, for many systems in common usage there is a long and established base of experience (both good and bad) upon which to draw. This is not necessarily the case for all facilities and projects, particularly those subject to risk assessment requirements. It would seem to be precisely these projects for which risk analysis should be open to public scrutiny and debate, so that the issue of their rationality in respect to society can be considered. As noted, the ALARP approach would appear to permit a small group of people to make decisions about a potentially hazardous project, away from public scrutiny, and in consultation with the proponents of the project. According to the Royal Society report [1, p. 93], 'The (ALARP) approach has been criticised on the grounds that it does not relate benefits clearly enough to tolerability. More importantly, however, it does not address the critical issue of how public input to tolerability decisions might be achieved, beyond an implicit appeal to the restricted, and now much criticised revealed-preferences criterion', and 'The question of how future public input to tolerability decisions might be best achieved is also closely related to recent work on risk communication'.

It is acknowledged that public debate and participation at a level leading to worthwhile input is not always practical. As noted earlier, only some participants will have the time, energy and capability to become fully acquainted with the technical intricacies involved in significant projects. There are also the dangers of politicizing the debate and perhaps trivializing it through excessive emotional input. Nevertheless, there are strong grounds for not ignoring non-superficial public participation and involvement in risk-based decisions [1].

5.5. Political reality

Risk tolerability cannot be divorced from wider issues in the community. It is intertwined with matters such as risk perception, fear of consequences and their uncertainty, as well as various other factors which influence and change society with time. Societal risk tolerability would be expected to change also. Change can occur very quickly when there is a discontinuity in the normal pattern of events in society: a major industrial accident is one such event.

The implication for the ALARP approach might well be as follows. What would have been considered sufficiently 'low' for a particular type of facility prior to an 'accident' might not be considered sufficient for other, generally similar, facilities after an accident. Yet there will be very considerable societal and political pressures for changing the acceptance criteria. Is it appropriate to do so?

Following an accident there is, usually, a call for an investigation, better safety measures, more conservative design approaches, better emergency procedures, etc. However, some accidents must be expected. The fact that it is admitted at the consent, approval or design stage of a project that there is a finite probability of failure associated with the project implies that an accident is likely to occur sooner or later. The fact that the probability might have been shown to be extremely low does not alter this. Perhaps unfortunately, probability theory usually cannot suggest when an event might occur. Rationality demands that 'knee-jerk' political and regulatory responses be recognized as potentially inappropriate; yet such responsiveness is implicit in the 'reasonable' and 'practical' aspects of ALARP.
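The inevitability argument can be made quantitative with assumed numbers (not taken from the paper). For a constant annual failure probability $p$, the chance of at least one failure over an $n$-year life is

$$\Pr(\text{at least one failure in } n \text{ years}) = 1 - (1 - p)^n,$$

so an apparently very low $p = 10^{-4}$ per year gives $1 - (1 - 10^{-4})^{50} \approx 5 \times 10^{-3}$ over a 50-year life, and across, say, 100 comparable facilities the chance of at least one failure somewhere rises to about 40%. Low probabilities defer accidents; they do not preclude them, and they say nothing about timing.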


6. Discussion and possibilities

In science, it is recognized that progress comes in relatively slow steps, learning by trial and error and modifying the body of theory and understanding in the light of apparent contradictions. Similarly, in the more practical arts such as engineering, progress comes about through a slow progression, carefully learning from past mistakes. Major problems in engineering are likely when past observations and understanding appear to have been forgotten or ignored [22,23]. It may be that an appropriate strategy for risk management lies along these lines also. Moreover, it is increasingly being recognized that such matters are best treated using risk analysis, and that risk analysis is best performed using probabilistic methods [24]. Even then, probability-based risk management faces an added problem when it has to deal with low-probability, high-consequence events. These, morally and practically, do not allow the luxury of a trial-and-error learning process; there may be just too much at stake, hence the advocates of the 'precautionary principle'.

Nevertheless, it is generally the case that the technology involved is not totally new, but rather is a development of existing technology for which there is already some, or perhaps extensive, experience. Associated with that existing technology are degrees of risk acceptance or tolerance reflected in the behavior of society towards it. It is then possible, in principle, to 'back-calculate' [25,26] the associated underlying tolerance levels, even if the analysis used for this purpose is recognized to be imperfect. The new technology should then be assessed employing, as much as possible, the information used to analyze the existing technology, and using a risk analysis methodology similar in style and simplifications to that used to determine the previous tolerance levels.

The process sketched above is one which elsewhere has been termed 'calibration' [25,26], i.e. the assessment of one project against another, minimizing as much as possible the differences in risk analysis and data bases, and not necessarily attempting to closely anchor the assessment in societal tolerable risk levels. The risk levels employed are derived from previously accepted technology only, using admittedly simplified models, and are of a nominal nature, having no strong validity outside the framework in which they have been employed. A somewhat similar approach is already implicit in the nuclear industry, with professionally agreed or accepted models being used for probability and other representations, and with a strong culture of independent ('peer') reviews of risk analyses. The resulting probability estimates are likely to be internally consistent and to have a high degree of professional acceptance, even though they may not relate very closely to underlying (but perhaps unknowable) probabilities of occurrence.
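A minimal sketch of the 'calibration' idea [25,26] described above follows. The data and the simple counting model are invented for illustration: a nominal tolerance level is back-calculated from an accepted existing technology and then applied, using the same style of analysis, as the target for the new technology.

```python
# Sketch of probabilistic "calibration" against existing practice [25,26].
# The data and the counting model are invented for illustration. The
# back-calculated level is nominal only: it has no strong validity outside
# the comparative framework in which it is used.

def back_calculated_rate(failures: int, facility_years: float) -> float:
    """Nominal failure rate implied by an accepted existing technology."""
    return failures / facility_years

def calibrated_check(new_nominal_rate: float, failures: int,
                     facility_years: float) -> str:
    """Compare a new technology's nominal rate, computed with the same
    modelling style and simplifications, against the calibrated target."""
    target = back_calculated_rate(failures, facility_years)
    verdict = "within" if new_nominal_rate <= target else "exceeds"
    return f"{verdict} calibrated target of {target:.1e} per facility-year"

# Assumed record: 2 failures in 20,000 facility-years of accepted technology;
# the new design's nominal estimate is 5e-5 per facility-year.
print(calibrated_check(5e-5, 2, 20_000.0))
```

The design choice mirrors the text: the target is derived from previously accepted technology only, so consistency of analysis style between the old and new assessments matters more than the absolute accuracy of the probabilities.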

7. Conclusions

Risk management should embody fundamental principles such as societal participation in decision-making. It is recognized that this may be difficult for a variety of reasons and that alternative decision-making procedures are required. The current trend appears to be one of increasing involvement of regulatory authorities, with acceptance criteria not always open to the public or the applicants and in some cases settled by negotiation. This is also the case with the ALARP approach. It is suggested that there are a number of areas of concern about the validity of this approach. These include representativeness, morality, philosophy, political reality and practicality. It is suggested that risk assessments recognize peer review and the incremental nature of technological risks.

Acknowledgements

The support of the Australian Research Council under grant A89918007 is gratefully acknowledged. Some parts of this paper appeared in an earlier conference contribution. The author appreciates the valuable comments on a number of issues made by the reviewers. Where possible their comments have been addressed.

References

[1] Royal Society Study Group. Risk: analysis, perception and management. London: Royal Society, 1992.
[2] Cullen, The Hon Lord. The public inquiry into the Piper Alpha disaster. London: HMSO, 1990.
[3] HSE. The tolerability of risk from nuclear power stations. London: Health and Safety Executive, 1992.
[4] Kam JCP, Birkinshaw M, Sharp JV. Review of the applications of structural reliability technologies in offshore structural safety. Proceedings of the 1993 OMAE, vol. 2, 1993. p. 289–96.
[5] Vlek CJH, Stallen PJM. Rational and personal aspects of risk. Acta Psychologica 1980;45:273–300.
[6] Stewart MG, Melchers RE. Probabilistic risk assessment of engineering systems. London: Chapman and Hall, 1997.
[7] Oreskes N, Shrader-Frechette K, Belitz K. Verification, validation, and confirmation of numerical models in the earth sciences. Science 1994;263(4):641–6.
[8] Popper K. The logic of scientific discovery. New York: Basic Books, 1959.
[9] Popper K. The growth of scientific knowledge. New York: Basic Books, 1963 (see also Magee B. Popper. Fontana Modern Masters, 1987).
[10] Kuhn TS. The structure of scientific revolutions. Chicago, IL: University of Chicago Press, 1970.
[11] Ravetz JR. Scientific knowledge and its social problems. Oxford: Clarendon Press, 1971.
[12] Watson SR. The meaning of probability in probabilistic safety analysis. Reliability Engineering and System Safety 1994;45:261–9.
[13] Stallen PJM. Risk of science or science of risk? In: Conrad J, editor. Society, technology and risk assessment. London: Academic Press, 1980. p. 131–48.
[14] Blockley DI, editor. Engineering safety. London: McGraw-Hill, 1990.
[15] Pusey M. Jurgen Habermas. Chichester, UK: Ellis Horwood/Tavistock, 1987.
[16] Layfield F. Sizewell B public inquiry: summary of conclusions and recommendations. London: HMSO, 1987.
[17] Wynne B. Society and risk assessment: an attempt at interpretation. In: Conrad J, editor. Society, technology and risk assessment. London: Academic Press, 1980. p. 281–7.
[18] Lischka JR. Ludwig Mond and the British alkali industry. New York: Garland, 1985.
[19] Reid SG. Acceptable risk. In: Blockley DI, editor. Engineering safety. London: McGraw-Hill, 1992. p. 138–66.
[20] Sharp JV, Kam JC, Birkinshaw M. Review of criteria for inspection and maintenance of North Sea structures. Proceedings of the 1993 OMAE, vol. 2, 1993. p. 363–8.
[21] Layard PRG. Cost-benefit analysis: selected readings. Harmondsworth: Penguin, 1972.
[22] Pugsley AC. The prediction of proneness to structural accidents. The Structural Engineer 1973;51(6):195–6.
[23] Sibley PG, Walker AC. Structural accidents and their causes. Proceedings of the Institution of Civil Engineers, Part 1, 1977. p. 191–208.
[24] Kirchsteiger C. On the use of probabilistic and deterministic methods in risk analysis. Journal of Loss Prevention in the Process Industries 1999;12:399–419.
[25] Melchers RE. Structural reliability analysis and prediction. 2nd ed. Chichester, UK: Wiley, 1999.
[26] Melchers RE. Probabilistic calibration against existing practice as a tool for risk acceptability assessment. In: Melchers RE, Stewart MG, editors. Integrated risk assessment. Rotterdam: Balkema, 1995. p. 51–56.
