Available at http://www.ijcsonline.com/
Abstract
Computer systems have become an integral part of our lives. They are used in everything from basic utility
services to complex scientific research and defense applications. Any system presents some risk to its owners, users and
environment. Some present more risk than others, and those that present the most risk are what we call safety-critical
systems: systems whose failure could result in loss of life, loss of revenue, significant property damage or damage to
the environment. This paper reviews nine system failures, drawn from various domains, that were caused by software
bugs. The paper discusses some regulatory standards and guidelines for the proper testing of such software. We also
recommend guidelines that should be followed during testing to reduce the instances of software failure in
safety-critical systems.
Keywords: Safety Critical Systems, SIL, Safety Standards
I. INTRODUCTION

II. SOFTWARE FAILURES
[Fragment of a failures overview table: examples such as "Therac-25 (1986)"; consequence categories such as "public inconvenience"]
A. Space
1. Disintegration of the Mars Orbiter [19]
NASA launched a mission to carry out a study of the Mars environment. The orbiter, launched in 1998 amid much
fanfare, ended in disaster. Investigations attributed the root cause of the mission's failure to a software error
involving unit calculations. A report issued by NASA states that the root cause was the failure to use metric units
in the coding of a ground software file, "Small Forces", used in trajectory models. The investigation revealed that
the navigation team was calculating in metric units while the ground calculations were in Imperial units. The computer
systems in the craft were unable to reconcile the difference, resulting in a navigation error.
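The mishap illustrates how easily a unit mismatch slips through when quantities cross an interface as bare numbers. A minimal sketch (hypothetical names, not NASA's actual code) of how tagging a value with its unit turns a silent metric/Imperial mix-up into an explicit conversion or a detectable error:

```python
from dataclasses import dataclass

LBF_S_TO_N_S = 4.448222  # 1 pound-force-second expressed in newton-seconds

@dataclass(frozen=True)
class Impulse:
    """A thruster impulse carrying its unit alongside its magnitude."""
    value: float
    unit: str  # "N*s" (metric) or "lbf*s" (Imperial)

    def to_metric(self) -> "Impulse":
        # Convert to newton-seconds; reject units we do not recognise.
        if self.unit == "N*s":
            return self
        if self.unit == "lbf*s":
            return Impulse(self.value * LBF_S_TO_N_S, "N*s")
        raise ValueError(f"unknown unit: {self.unit}")

def update_trajectory(impulse: Impulse) -> float:
    # The trajectory model insists on metric input; a bare float could not.
    return impulse.to_metric().value

# Ground software reporting in Imperial units is converted, not misread:
ground = Impulse(1.0, "lbf*s")
print(update_trajectory(ground))  # 4.448222, not 1.0
```

The point of the sketch is the interface: had the file exchanged between the ground software and the navigation team carried units explicitly, the 4.45x discrepancy could not have propagated unnoticed into the trajectory models.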
2. The Mariner 1 spacecraft [12]
In the era of punch cards, a program would be transferred to punch cards by a programmer or operator using a
keypunch. Mistakes, whether typographical errors or incorrect commands, could not easily be caught by looking at
the punch card. Often, verification was carried out by re-punching the code onto a second punch card and then
comparing it with the first card using a card verifier. An unverified bug introduced by a punch card is generally
regarded as one of the most expensive software errors in history.
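The card verifier implements classic double-entry checking: the same source is keyed twice and the two transcriptions are compared, so a keying slip must occur identically in both passes to survive. A minimal sketch of the idea (hypothetical function, not period hardware behaviour):

```python
def verify(first_entry: str, second_entry: str):
    """Compare two independent transcriptions of the same source line.

    Returns a list of (position, first_char, second_char) mismatches;
    an empty list means the two entries agree.
    """
    mismatches = [
        (i, a, b)
        for i, (a, b) in enumerate(zip(first_entry, second_entry))
        if a != b
    ]
    if len(first_entry) != len(second_entry):
        # A dropped or extra character shows up as a length disagreement.
        mismatches.append(
            (min(len(first_entry), len(second_entry)), "<len>", "<len>")
        )
    return mismatches

# A single-character keying slip ('O' keyed as '0') is flagged immediately:
print(verify("GOTO 40", "GOT0 40"))  # [(3, 'O', '0')]
```

The weakness the Mariner 1 story exposes is procedural, not technical: double-entry catches transcription errors only when the verification pass is actually performed, which is why an unverified card could carry its bug into flight.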
108 | International Journal of Computer Systems, ISSN-(2394-1065), Vol. 03, Issue 02, February, 2016
III. DISCUSSION ON SIL
| Category    | SIL 4                                    | SIL 3                                    | SIL 2                                   | SIL 1                                    |
|             | 100000000                                | 100000                                   | 100                                     | 1                                        |
| Humans      | Many people killed                       | Human lives in danger                    | Damage to physical objects, risk of personal injury | Insignificant damage to things; no risk to people |
| Economy     | Financial catastrophes                   | Great financial loss                     | Significant financial loss              | Insignificant financial loss             |
| Security    | Destruction/disclosure of strategic data | Destruction/disclosure of critical data  | Faults in data                          | No risk for data                         |
| Environment | Extensive and irreparable damage to the environment and services | Reparable, but comprehensive damage to the environment and services | Local damage to the environment | No environmental risk |
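The table above can be read as a lookup: the SIL a system must meet is driven by the most severe credible consequence across all categories. A minimal sketch of that reading (hypothetical names; the consequence strings follow the table):

```python
# Map each worst-case consequence from the table to its SIL column.
CONSEQUENCE_TO_SIL = {
    # Humans
    "many people killed": 4,
    "human lives in danger": 3,
    "damage to physical objects, risk of personal injury": 2,
    "insignificant damage to things, no risk to people": 1,
    # Economy
    "financial catastrophes": 4,
    "great financial loss": 3,
    "significant financial loss": 2,
    "insignificant financial loss": 1,
    # Security
    "destruction/disclosure of strategic data": 4,
    "destruction/disclosure of critical data": 3,
    "faults in data": 2,
    "no risk for data": 1,
    # Environment
    "extensive and irreparable damage to the environment": 4,
    "reparable, but comprehensive damage to the environment": 3,
    "local damage to the environment": 2,
    "no environmental risk": 1,
}

def required_sil(consequences):
    # The system must meet the most demanding (highest) SIL implied by
    # any of its credible worst-case consequences.
    return max(CONSEQUENCE_TO_SIL[c] for c in consequences)

print(required_sil(["significant financial loss", "human lives in danger"]))  # 3
```

A system with only minor economic exposure but a credible threat to human life is still a SIL 3 system: the human-consequence row dominates the financial one.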
REFERENCES
[1]
[2]
[3]
[4]
[5]
[6]
[7]
[8]
[9]
[10]
[11]
[12]
[13]
[14]
[15]
[16]
[17]
[18]
[19]
[20]