Computer System Impact / Risk Assessment


Purpose of Risk Assessment
The purpose of the risk assessment process is to ensure that the validation (quality) effort is directed at the systems that have the potential to impact product quality, efficacy and data integrity (referred to throughout this article as Product Quality).

Initial Risk Assessments


The approach of an initial impact assessment was first introduced in the ISPE Commissioning and Qualification Guide. It is designed to ensure that only systems that have the potential to impact product quality require formal validation.

The impact assessment process has three possible outcomes:

• Direct Impact – A system that is expected to have a direct impact on product quality.
• Indirect Impact – A system that does not have a direct impact on product quality but is linked to, or supports, a direct impact system.
• No Impact – A system that does not have the potential to impact Product Quality and is not linked to, and does not support, a direct impact system.

Only systems that have a Direct Impact on Product Quality require full formal validation. Where systems are identified as Indirect Impact, the interfaces to the Direct Impact system should be considered. No Impact systems do not require formal verification and should be installed following Good Engineering Practice (GEP).
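This decision rule can be captured as a simple lookup. The sketch below (in Python) is illustrative only; the wording of each approach is my own shorthand rather than text from the ISPE guide.

# Illustrative sketch of the impact classification decision rule described above.
# The approach descriptions are shorthand assumptions, not a defined standard.
IMPACT_APPROACH = {
    "Direct": "Full formal validation",
    "Indirect": "GEP, plus consideration of interfaces to direct impact systems",
    "No Impact": "Good Engineering Practice (GEP) only, no formal verification",
}

def validation_approach(impact_classification):
    return IMPACT_APPROACH[impact_classification]

print(validation_approach("Indirect"))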
While this impact assessment provides guidance on which systems require validation, it does not indicate the level of validation required. At this early stage of the process a risk assessment can support the approach to, and level of, software validation required.

Computer System Risk Assessment


GAMP 5 supports the approach of performing an initial risk assessment to determine:
• The risks to the business
• The GxP determination for the system
• The overall impact of the system
• Whether further risk assessments are required
This provides the impact assessment and supports initial planning. However, even at this early stage of the project a simple risk assessment can provide guidance on the Computer Systems Validation requirements, based on:

• Product Quality / GMP Impact
• Complexity of the Process
• Stability of the Software (GAMP Category)
From the combination of these items a number of recommendations can be made. These include:

• Supplier Audit requirements
• Software development controls (e.g. Source Code Review)
• Design documents (Functional Design Specification, User Manuals, etc.)
• Further risk assessments (Component Assessments, FMEA, etc.)
• Test documentation

These recommendations as to the validation requirements can be automated in a spreadsheet, based on a combination of the Impact, Complexity and GAMP Category (a simple sketch of this logic is given below).
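As an illustration of how such spreadsheet logic could be automated, here is a minimal Python sketch. The decision rules and the exact wording of the recommendations are assumptions for demonstration; a regulated company would define its own rules in a procedure.

# Hypothetical sketch: derive validation recommendations from the GMP impact,
# the process complexity and the GAMP software category.
# The rules below are illustrative assumptions, not a defined standard.

def recommend_deliverables(gmp_impact, complexity, gamp_category):
    recommendations = []

    # Supplier audit requirements
    if gmp_impact == "High" and gamp_category >= 4:
        recommendations.append("Supplier audit (on-site)")
    else:
        recommendations.append("Supplier assessment (postal questionnaire)")

    # Software development controls
    if gamp_category == 5:
        recommendations.append("Source code review")

    # Design documents
    if gamp_category >= 4:
        recommendations.append("Functional Design Specification")

    # Further risk assessments
    if gmp_impact == "High" or complexity == "High":
        recommendations.append("Further risk assessment (e.g. component assessment / FMEA)")

    # Test documentation is always produced; its extent varies with risk
    recommendations.append("Test documentation (extent based on impact and complexity)")
    return recommendations

print(recommend_deliverables("High", "Medium", 4))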
In future posts I will provide examples of this Impact / Risk Assessment process for computerised systems. If there
are any specific examples that you would find useful then please post them within the comments.

GxP Assessment
The first stage in determining whether a system requires validation is to identify whether the system has a GxP impact.

The following are examples of questions that can be used in making the validation determination for a computerised system:

• The system is used to monitor, control or supervise a GxP manufacturing or packaging process and has the potential to affect Product Quality, Safety, Identity or Efficacy.
• The system is used for GxP analytical quality control.
• The system manipulates data, or produces reports, used in GxP quality-related decision, authorisation or approval processes, where the data supports the decision process or the electronic record constitutes the master record.
• The system is used to maintain compliance with a GxP requirement.
• The system is used to monitor, control or supervise warehouse or distribution activities within a GxP requirement.
• The system is used for GxP batch sentencing or batch records.

If any of the questions are answered “Yes” then the system has a GxP impact and requires a level of validation and control throughout the Computer System Lifecycle. If all of the questions are answered “No” then the system may be deemed not to have a GxP impact. This should be documented to support the decision not to perform formal validation.
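A minimal sketch of this checklist in Python follows; the abbreviated question wording and the data structure are illustrative assumptions, not part of any guide.

# Hypothetical GxP determination checklist: if any question is answered
# "Yes" the system is GxP impacting and requires a level of validation.
GXP_QUESTIONS = [
    "Monitors, controls or supervises a GxP manufacturing/packaging process?",
    "Used for GxP analytical quality control?",
    "Manipulates data or produces reports used in GxP quality decisions?",
    "Used to maintain compliance with a GxP requirement?",
    "Monitors, controls or supervises GxP warehouse or distribution activities?",
    "Used for GxP batch sentencing or batch records?",
]

def is_gxp_impacting(answers):
    # answers maps each question to True ("Yes") or False ("No")
    return any(answers.get(question, False) for question in GXP_QUESTIONS)

answers = {question: False for question in GXP_QUESTIONS}
answers["Used for GxP batch sentencing or batch records?"] = True
print("GxP impacting:", is_gxp_impacting(answers))  # GxP impacting: True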

Computer System Risk Assessment


It is useful to have a high-level determination of the computerised system's GxP impact to help support the decision process for the level of validation and controls to be applied throughout the computer system's lifecycle.

This initial risk assessment should consider the impact on GMP and the complexity (GAMP category) of the system. The following provides guidance for determining the GxP impact of the computerised system:

High - The computerised system has a direct impact on Product Quality, Patient and Consumer Safety or high-impact GxP data integrity.

Medium - The computerised system has a Direct Impact on GxP or health-related regulations (including the
integrity of supporting systems data) but no direct impact on Product Quality, Patient Safety or GxP data integrity.

Low - Indirect impact to GxP or other health-related activities with no direct impact on Product Quality, Patient
Safety or Data Integrity.

Based on the ISPE Guide for a Risk-Based Approach to Electronic Records and Signatures, high-impact electronic records and GxP data can be defined as data that directly supports batch release, providing assurance of product quality, efficacy, safety, purity and / or identity.

The complexity of the system (the level of coding and configuration) can also be used to support the risk assessment, using the GAMP 5 categories as detailed below:

Cat 5 – Complexity High – Bespoke software applications.

Cat 4 – Complexity Medium – Configured software applications (for complex configuration you may wish to consider the system as Category 5).

Cat 3 – Complexity Low – Non-configurable software (systems should only be considered Low Complexity if they are established market products with a proven track record within the industry, for example a standard HPLC).

The overall Risk Rating is determined using the traditional 9-box grid of GxP impact against complexity (a simple sketch of the grid in code is given below).
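The grid can be expressed as a simple lookup table. The ratings in the cells below are illustrative assumptions; the actual values should come from the regulated company's risk management procedure.

# Illustrative 9-box grid: GxP impact (rows) against complexity (columns).
# The ratings in each cell are assumptions for demonstration only.
RISK_GRID = {
    #  GxP impact   complexity: Low        Medium      High
    "High":   {"Low": "Medium", "Medium": "High",   "High": "High"},
    "Medium": {"Low": "Low",    "Medium": "Medium", "High": "High"},
    "Low":    {"Low": "Low",    "Medium": "Low",    "High": "Medium"},
}

def overall_risk(gxp_impact, complexity):
    return RISK_GRID[gxp_impact][complexity]

print(overall_risk("High", "Medium"))  # High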
Using the Risk Rating
Many decisions can be made from the initial risk ranking, including the approach to and extent of the validation.

• Supplier Auditing – High risk items only; Medium and Low risk computerised systems receive only a postal questionnaire.
• Risk Assessments – High and Medium risk systems will be subject to more detailed risk assessments; Low risk computerised systems will not.
• Level of verification activities – High and Medium risk systems have detailed, formalised testing; Low risk computerised systems have reduced testing, either commissioning or supplier verification.
• Level of security – Low risk computerised systems have minimal security controls; High and Medium risk computerised systems have full security controls applied.
• Frequency of periodic reviews.

The above list is not exhaustive; the regulated company can use the Risk Rating to determine the level of validation, deliverables and controls throughout the lifecycle of the computerised system (a simple mapping is sketched below).
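A minimal Python sketch of such a mapping follows. The control values, and in particular the periodic review intervals, are illustrative assumptions rather than requirements from any guide.

# Hypothetical mapping of the overall risk rating to lifecycle controls,
# following the examples in the list above; all values are illustrative.
CONTROLS_BY_RATING = {
    "High": {
        "supplier_assessment": "On-site audit",
        "risk_assessment": "Detailed risk assessment (e.g. FMEA)",
        "verification": "Detailed, formalised testing (e.g. IQ/OQ)",
        "security": "Full security controls",
        "periodic_review_years": 1,   # assumed interval
    },
    "Medium": {
        "supplier_assessment": "Postal questionnaire",
        "risk_assessment": "Detailed risk assessment (e.g. FMEA)",
        "verification": "Detailed, formalised testing (e.g. IQ/OQ)",
        "security": "Full security controls",
        "periodic_review_years": 2,   # assumed interval
    },
    "Low": {
        "supplier_assessment": "Postal questionnaire",
        "risk_assessment": "Initial risk assessment only",
        "verification": "Reduced testing (commissioning / supplier verification)",
        "security": "Minimal security controls",
        "periodic_review_years": 3,   # assumed interval
    },
}

print(CONTROLS_BY_RATING["Medium"]["verification"])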
Process Control Systems GAMP 5 Software Categories
In the article Validation Determination the use of software categorisation was discussed, along with how it can support the approach to validation. In this post we look at the types of software that fall into these categories for Process Control Systems / Automation Systems.

Categorising software supports the approach to validation based on the complexity and novelty of the computerised system.

The categories detailed within this post are based on GAMP 5 Software Categories.

GAMP Software Category 1 – Infrastructure Software


Unless the control system is very simple (a PLC and HMI only), there are likely to be some elements of infrastructure software.

Infrastructure software in its simplest form is the operating system on which the application software resides.

Additional software for managing the infrastructure of the process control system includes:

• Operating Systems
• Anti-virus Software
• Active Directory / Domain Controller
• Database Software (SQL / Oracle)
• Server and Network Hardware
• Virtual Environments
• Firewalls, including configuration
• Server and Network Monitoring Tools
• Backup Systems
Note: Infrastructure should be built, configured and deployed in accordance with a defined process / procedure, with critical aspects and / or configuration verified. Infrastructure is qualified but not validated; the validation is performed on the hosted application, not on the infrastructure.

GAMP Category 3 – Non Configurable Software


Configuration relates to adding functionality to standard software applications through standard modules and library items in order to meet the business requirements.

In a process control system a DCS would be configured from standard modules to control a specific process and would fall under GAMP Category 4. An electronic chart recorder, which is also configured with input ranges, alarm setpoints, etc., would fall under GAMP Category 3 because, while it has parameters entered as configuration, these do not define functionality or a process flow.

It is important to understand the distinction between true configuration and parameterisation when assigning the
category.

Other examples that fit under GAMP Category 3 are systems provided with computerised controllers, including Programmable Logic Controllers (PLCs), where the application is not modified (although it may be parameterised) to meet the business need. Within the pharmaceutical industry there are many examples of these, including labelling and packaging equipment.

There is no fixed rule as to the validation approach for GAMP Category 3 systems. The category should be considered alongside the impact or criticality of the process that the system is monitoring and / or controlling. It can support decisions on lifecycle steps that may not need to be performed, for example Source Code Reviews, as well as reduced verification activities and greater reliance on vendor test documentation.
As with any supplier, you should ensure that the software has been developed in accordance with an appropriate quality management system. However, the GAMP Category can support the decision as to the level of supplier assessment that needs to be performed (a postal questionnaire rather than a full site audit).

EU Annex 11 states that the need for an audit should be based on a risk assessment (refer to the Validation Determination post).

GAMP Software Category 4 – Configured Software


Configured software for a process control system comprises software applications that are configured to meet specific business needs (see GAMP Category 3 above). GAMP Category 4 – Configured Software ranges in complexity from simple configuration of SCADA system graphics to complex process control within a DCS or PLC (linking standard library objects to control the process).

Examples of configurable software for a Process Control System include:

• DCS / SCADA Mimics
• DCS / SCADA Databases (Alarms, Tags, History)
• PLC / DCS programs configured from a standard function library / IEC 61131-3
For GAMP Category 4 software the approach to computer systems validation may be to use the supplier's documentation and verification to demonstrate the suitability of the standard modules, and to limit the regulated company's verification to the critical functions of the business process and the functions that support regulatory compliance (security, electronic records, etc.).

GAMP Software Category 5 – Bespoke Software


Bespoke software is software that is generally written from scratch to fulfil the business need. As this software goes through the full development lifecycle, there is a higher risk of errors within the application code.

In terms of a Process Control System, GAMP Category 5 software may range from PLC logic (Ladder, Sequential Function Chart, C++, etc.) to custom scripts written within the SCADA / DCS system.

For GAMP Software Category 5 the level of verification through software testing (FAT, SAT, IQ, OQ, etc.) will be increased. The level and formality of performing and documenting this testing will be determined by the GMP impact (Product Quality, Patient Safety, Data Integrity and GMP regulatory requirements).

Summary
The Validation Determination can be used to identify each component of the system and the associated software category (or categories).

The GAMP Software Category may be used to support Computer Systems Validation decisions which may be
documented within the Validation Determination Statement or within the Validation Plan.

The GAMP Category can also be used to support further risk assessments, for example by considering the software category controlling / monitoring each function. The likelihood of failure, or of a failure going undetected, may be lower for less complex / novel software (a simple sketch of using the category in this way is given below).
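As an illustration, the software category could be used as a proxy for the probability of failure in a function-level risk assessment. The 1–3 scoring scales and the category-to-probability mapping below are assumptions for demonstration, not values from GAMP 5.

# Illustrative sketch: using the GAMP software category as a proxy for the
# probability of failure when prioritising functions for further assessment.
# The mapping and 1-3 scales are assumptions, not a defined standard.
CATEGORY_TO_PROBABILITY = {3: 1, 4: 2, 5: 3}  # less complex / novel -> lower likelihood

def risk_priority(severity, gamp_category, detectability):
    # severity: 1 (low) to 3 (high impact on the patient / product)
    # detectability: 1 (failure likely to be detected) to 3 (likely to go undetected)
    probability = CATEGORY_TO_PROBABILITY[gamp_category]
    return severity * probability * detectability

# A bespoke (Category 5) function with high severity and poor detectability
print(risk_priority(severity=3, gamp_category=5, detectability=3))  # 27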
GAMP5 Software Categories
The GAMP categories for hardware and software have been retained in ISPE GAMP 5, albeit in a modified format from GAMP 4.

The software categories identified in GAMP 5 do not in themselves determine the risk to product quality, efficacy or data integrity, and they no longer play an integral part in determining that a computer system is fit for purpose.

The complexity and the maturity of the software can be used to support and mitigate identified risk but should not
be used to determine the validation / verification deliverables.

GAMP Categories
The GAMP categories were originally introduced to provide an initial assessment as to the validation requirements /
deliverables.

In GAMP 4 there were five software categories. These have been revised in GAMP 5 to four categories, as detailed below:

Category 1 – Infrastructure software including operating systems, Database Managers, etc.

Category 3 – Non-configurable software, including commercial off-the-shelf (COTS) software and laboratory instruments / software.

Category 4 – Configured software, including LIMS, SCADA, DCS, CDS, etc.

Category 5 – Bespoke software

Category 2 from GAMP 4 has been removed. This related to firmware. At the time GAMP 4 was issued, firmware was considered to be used only in simple instruments. However, as technology has advanced it has been recognised that complex software can be embedded (as firmware) within systems.

Why Categorise?

In GAMP 3 and GAMP 4 the GAMP categories had a clear purpose: identifying which validation deliverables were not required. Categories 1–3 were considered to be standard systems and the System Life Cycle Design (SLCD) documentation was not required; this included:

• Supplier Audits
• Functional Specifications
• Source Code Reviews

GAMP 5 still includes these categories; however, these benefits are not integrated within a Science and Risk Based Approach to validation and the ASTM approach.

The ASTM E2500-07 standard states that:
“Vendor documentation, including test documents may be used as part of the verification documentation, providing the regulated company
has assessed the vendor”

This implies that a level of governance should be applied over suppliers, independent of the maturity or complexity of the software.

While GAMP 5 provides guidance on the approach based on the categories, there are better rationales that can be put in place than the complexity of the software alone. For example, a laboratory instrument (Category 3 – COTS) which is calibrated pre-use and post-use, or which runs standards alongside the test, needs less verification than a system where only its results are relied on. This can be documented within the validation plan or the risk assessments.


2 comments:
1. Anonymous, 14 March 2011 at 09:26 (sent by Ian)

As you said, “GAMP5 still includes these categories however the benefits are not integrated within a Science and Risk Based Approach to validation and the ASTM approach”.

What are the benefits? What I understand is: we may apply a very simple piece of customised code (Cat 5) to configured software (Cat 4), which is decided and documented by the risk assessment. Is that right?

2. B. Tedstone, 8 April 2011 at 09:07


Ian,
Yes, the categorisation of software supports the risk process. The novelty and complexity of the software should support the assumptions made within the risk assessments.

In previous versions of GAMP (GAMP3 and GAMP4) the software categories were used to directly relate to
the validation activities (e.g. supplier audit).

However, rather than just considering the complexity and novelty, the need to perform an audit should be based on a documented risk assessment considering patient safety and data integrity, along with the category of software. This is supported in the new release of EU Annex 11, which states:

“The competence and reliability of a supplier are key factors when selecting a product or service provider. The need for an audit
should be based on a risk assessment.”
