Examples[1] of safety-critical systems. From left to right, top to bottom: the glass cockpit of a C-141, a pacemaker, the Space Shuttle and the control room of a nuclear power plant.

A safety-critical system (SCS)[2] or life-critical system is a system whose failure or malfunction may result in one (or more) of the following outcomes:[3][4]

- death or serious injury to people
- loss or severe damage to equipment or property
- environmental harm

A safety-related system (or sometimes safety-involved system) comprises everything (hardware, software, and human aspects) needed to perform one or more safety functions, in which failure would cause a significant increase in the safety risk for the people or environment involved.[5] Safety-related systems are those that do not have full responsibility for controlling hazards such as loss of life, severe injury or severe environmental damage. The malfunction of a safety-involved system would only be that hazardous in conjunction with the failure of other systems or human error. Some safety organizations provide guidance on safety-related systems, for example the Health and Safety Executive (HSE) in the United Kingdom.[6]

Risks of this sort are usually managed with the methods and tools of safety engineering. A safety-critical system is designed to lose less than one life per billion (10⁹) hours of operation.[7][8] Typical design methods include probabilistic risk assessment, a method that combines failure mode and effects analysis (FMEA) with fault tree analysis. Safety-critical systems are increasingly computer-based.
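The fault tree analysis mentioned above can be sketched numerically. The snippet below is a minimal illustration, not a full analysis tool: the failure probabilities and the system topology are hypothetical, and the components are assumed to fail independently.

```python
# Minimal fault-tree sketch. An AND gate requires every input event to
# occur; an OR gate fires if any input event occurs. Probabilities and
# topology below are hypothetical, and independence is assumed.

def and_gate(*probs):
    """Probability that all independent input events occur."""
    p = 1.0
    for x in probs:
        p *= x
    return p

def or_gate(*probs):
    """Probability that at least one independent input event occurs."""
    q = 1.0
    for x in probs:
        q *= (1.0 - x)
    return 1.0 - q

# Hypothetical per-hour failure probabilities: a redundant sensor pair
# fed by a single (non-redundant) power supply.
sensor_a = 1e-5
sensor_b = 1e-5
power = 1e-7

# Top event: both sensors fail (AND), or the shared supply fails (OR).
p_top = or_gate(and_gate(sensor_a, sensor_b), power)
print(p_top)
```

The result (about 1.0e-7 per hour) shows why the single shared power supply dominates: redundancy has pushed the sensor pair's contribution (1e-10) three orders of magnitude below the supply's.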

Reliability regimes

Several reliability regimes for safety-critical systems exist:

- Fail-operational systems continue to operate after a failure.
- Fail-soft systems continue to operate at reduced capability after a failure (graceful degradation).
- Fail-safe systems revert to a safe state when they cannot operate.
- Fail-secure systems maintain maximum security when they cannot operate.
- Fail-passive systems do not take unsafe action on failure; the operator retains control.
- Fault-tolerant systems continue to provide correct service in the presence of faults, typically through redundancy.
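One of these regimes, fail-safe behavior, can be illustrated with a short sketch. The controller, sensor range, and setpoint below are hypothetical; the point is only that invalid input drives the output to a predefined safe state rather than a guessed one.

```python
# Illustrative fail-safe sketch (all names and values hypothetical):
# when the temperature reading is missing or implausible, the
# controller commands the safe state (heater off) instead of guessing.

SAFE_OFF = "OFF"
VALID_RANGE = (-40.0, 125.0)  # plausible sensor range, degrees Celsius

def heater_command(reading, setpoint=60.0):
    """Return 'ON' or 'OFF'; fall back to the safe state on bad input."""
    if reading is None or not (VALID_RANGE[0] <= reading <= VALID_RANGE[1]):
        return SAFE_OFF  # fail-safe: invalid data -> safe state
    return "ON" if reading < setpoint else "OFF"

print(heater_command(20.0))   # normal operation -> ON
print(heater_command(999.0))  # out-of-range sensor -> OFF
print(heater_command(None))   # lost sensor -> OFF
```

Note the asymmetry: a fail-secure door lock under the same fault would do the opposite (stay locked), because its safe state is defined by security rather than by de-energizing.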

Software engineering for safety-critical systems

Software engineering for safety-critical systems is particularly difficult. Three aspects can be applied to aid the engineering of software for life-critical systems. First is process engineering and management. Second is selecting the appropriate tools and environment for the system, which allows the system developer to test the system effectively by emulation and observe its effectiveness. Third is addressing any legal and regulatory requirements, such as FAA requirements for aviation. Setting a standard to which a system must be developed forces the designers to stick to the requirements. The avionics industry has succeeded in producing standard methods for producing life-critical avionics software. Similar standards exist for industry in general (IEC 61508) and for the automotive (ISO 26262), medical (IEC 62304) and nuclear (IEC 61513) industries specifically. The standard approach is to carefully code, inspect, document, test, verify and analyze the system. Another approach is to certify a production system, such as a compiler, and then generate the system's code from specifications. Another approach uses formal methods to generate proofs that the code meets requirements.[11] All of these approaches improve the software quality in safety-critical systems by testing or eliminating manual steps in the development process, because people make mistakes, and these mistakes are the most common cause of potential life-threatening errors.
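The formal-methods idea above can be illustrated in miniature: instead of testing a few cases, exhaustively explore every reachable state of a small model and check a safety property in each one. The door-interlock state machine and its property below are invented for illustration; real formal verification uses dedicated tools (model checkers, proof assistants) on far larger state spaces.

```python
# Miniature exhaustive check of a safety property, in the spirit of
# model checking. The machine and property are hypothetical examples.
#
# Model: a door interlock. State is (door_open, machine_running).
# Safety property: the machine must never run while the door is open.

def step(state, event):
    """Transition function for the interlock state machine."""
    door_open, running = state
    if event == "open_door":
        return (True, False)   # opening the door stops the machine
    if event == "close_door":
        return (False, running)
    if event == "start" and not door_open:
        return (door_open, True)
    if event == "stop":
        return (door_open, False)
    return state               # ignored events leave state unchanged

def check_all(depth=6):
    """Explore every state reachable within `depth` events and assert
    the safety property in each one."""
    events = ["open_door", "close_door", "start", "stop"]
    frontier = {(False, False)}  # initial state: door closed, stopped
    seen = set()
    for _ in range(depth):
        nxt = set()
        for s in frontier:
            if s in seen:
                continue
            seen.add(s)
            assert not (s[0] and s[1]), f"unsafe state reached: {s}"
            nxt.update(step(s, e) for e in events)
        frontier = nxt
    return True

print(check_all())  # True: no reachable state violates the property
```

Because the unsafe state (door open, machine running) is unreachable under every event sequence, the check passes; deleting the guard `not door_open` from the `start` transition would make the assertion fail, which is exactly the kind of defect this style of analysis is meant to surface before deployment.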

Examples of safety-critical systems



The technology requirements can go beyond avoidance of failure: they can also facilitate medical intensive care (which deals with healing patients) and life support (which is for stabilizing patients).

Nuclear engineering







See also


  1. ^ J.C. Knight (2002). "Safety critical systems: challenges and directions". IEEE: 547–550.
  2. ^ "Safety-critical system". Retrieved 15 April 2017.
  3. ^ Sommerville, Ian (2015). Software Engineering (PDF). Pearson India. ISBN 978-9332582699.
  4. ^ Sommerville, Ian (2014-07-24). "Critical systems". Ian Sommerville's book website. Retrieved 18 April 2018.
  5. ^ "FAQ – Edition 2.0: E) Key concepts". IEC 61508 – Functional Safety. International Electrotechnical Commission. Retrieved 23 October 2016.
  6. ^ "Part 1: Key guidance" (PDF). Managing competence for safety-related systems. UK: Health and Safety Executive. 2007. Retrieved 23 October 2016.
  7. ^ FAA AC 25.1309-1A – System Design and Analysis
  8. ^ Bowen, Jonathan P. (April 2000). "The Ethics of Safety-Critical Systems". Communications of the ACM. 43 (4): 91–97. doi:10.1145/332051.332078. S2CID 15979368.
  9. ^ Thompson, Nicholas (2009-09-21). "Inside the Apocalyptic Soviet Doomsday Machine". WIRED.
  10. ^ "Definition fail-soft".
  11. ^ Bowen, Jonathan P.; Stavridou, Victoria (July 1993). "Safety-critical systems, formal methods and standards". Software Engineering Journal. IEE/BCS. 8 (4): 189–209. doi:10.1049/sej.1993.0025. S2CID 9756364.
  12. ^ "Medical Device Safety System Design: A Systematic Approach". 2012-01-24.
  13. ^ Anderson, RJ; Smith, MF, eds. (September–December 1998). "Special Issue: Confidentiality, Privacy and Safety of Healthcare Systems". Health Informatics Journal. 4 (3–4).
  14. ^ "Safety of Nuclear Reactors".
  15. ^ "Safety-Critical Systems in Rail Transportation" (PDF). Archived from the original (PDF) on 2013-12-19. Retrieved 2016-10-23.
  16. ^ Wayback Machine
  17. ^ "Safety-Critical Automotive Systems".
  18. ^ Leanna Rierson (2013-01-07). Developing Safety-Critical Software: A Practical Guide for Aviation Software and DO-178C Compliance. ISBN 978-1-4398-1368-3.
  19. ^ "Human-Rating Requirements and Guidelines for Space Flight Systems" (PDF). NASA Procedures and Guidelines. June 19, 2003. NPG: 8705.2. Retrieved 2016-10-23.