By John Lafferty
Software Validation just got a whole lot easier.
The effort required to comply with the software validation requirements of ISO 13485:2016 and 21 CFR Part 820 has been greatly reduced by recent changes in regulations and guidance. Under the latest guidance, software validation can be completed faster, at lower cost and with less documentation, while resulting in improved software performance.
In this blog, John Lafferty, our Life Sciences Programme Director, explains how to use the latest guidance to validate software used in the manufacturing and testing of medical devices and in Medical Device Quality Management Systems.
Note: This article does not apply to Software in a Medical Device (SiMD) or Software as a Medical Device (SaMD).
Topics covered in this blog:
There are two reasons software validation became easier:
Note: The FDA Guidance – General Principles of Software Validation 2002 remains current and only Section 6 of that guidance has been superseded by the latest (2025) CSA guidance.
The development of these two guidance documents involved extensive collaboration between the FDA and ISPE, which means that they are mutually compatible. Both promote a focus on software quality assurance over software testing and on good software engineering over documentation, and both give detail on how validation activities can be carried out in proportion to risk.
For the first time, we have real clarity on the practical application of the concept of software validation in proportion to risk. The FDA CSA Guidance divides applications into two types:
The word ‘directly’ is very important when distinguishing between the two types of software application. It should be noted that the thinking outlined in both the above guidance documents has been available for some time in ISO TR 80002-2:2017 “Medical device software — Part 2: Validation of software for medical device quality systems”. This guidance document can be used to support the validation approach for devices sold in regions outside of the US. In the experience of the author, this ISO guidance has not been widely used up to now; the reasons for this are not entirely clear.
The FDA CSA Guidance states ‘To determine whether the requirement for validation applies, manufacturers must determine whether the software is or will be used as part of production or the quality system (whether directly or to support production or the quality system)’. Software that is used directly is regarded as High-Risk software, and software that is not used directly is regarded as Not High-Risk software. Not High-Risk software may be further classified as Moderate (Medium) or Low Risk.
High-Risk software includes ‘Software intended for automating production processes, inspection, testing, or the collection and processing of production data; and software intended for automating quality system processes, collection and processing of quality system data, or maintaining a quality record established under applicable quality system regulations’.
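The direct/indirect distinction drawn above is essentially a decision rule, and can be sketched in a few lines of Python. This is an illustrative sketch only, not part of the FDA CSA Guidance; the function and enum names are the author's assumptions.

```python
from enum import Enum

class RiskLevel(Enum):
    HIGH = "High-Risk"
    NOT_HIGH = "Not High-Risk"

def classify_software(used_directly: bool) -> RiskLevel:
    """Sketch of the CSA distinction: software used *directly* as part of
    production or the quality system is High-Risk; software that merely
    *supports* those activities is Not High-Risk."""
    return RiskLevel.HIGH if used_directly else RiskLevel.NOT_HIGH

# A system that automates in-process inspection acts directly:
print(classify_software(used_directly=True).value)   # High-Risk
# A CAPA tracking system supports the quality system indirectly:
print(classify_software(used_directly=False).value)  # Not High-Risk
```

In practice this single question is only the first step; Not High-Risk software is then sub-classified as Moderate or Low Risk, as described below.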
Examples of High, Moderate and Low risk software applications, given in the FDA CSA Guidance 2025, are shown in Figure 1 below.
Figure 1: Examples of High, Moderate and Low Risk software applications
We can see from the above examples that the FDA CSA Guidance provides real clarity on software systems that should be considered as High-Risk and those that should be considered as Not High-Risk. In fact, the recommended approach in the guidance represents a significant departure from past thinking, where systems such as CAPA and Document Control systems may have been regarded as High-Risk.
The new approach, which considers those systems to be Not High-Risk, is based on the fact that any risk to product safety is an indirect risk. For a system to be considered indirect means that there must be other systems and/or human interventions in place which have the potential to mitigate any risks. The guidance given in GAMP 5 Ed. 2: 2022 is in line with the FDA CSA Guidance. In the opinion of the author, the FDA CSA Guidance gives even greater clarity than GAMP 5 in this respect.
It is very important, at this point, to note two things:
Examples of risk-based approaches to documentation (Figure 2) and testing (Figure 3) are shown below.
Figure 2: Example of a Risk-based approach to Software System Requirements Documentation
Figure 3: Example of a Risk-based approach to Software System Testing
Extract from ISO 13485:2016
ISO 13485:2016 Section 4.1.6 “Quality management system, General requirements” and 7.5.6 “Validation of processes for production and service provision” state the following “The organisation shall document procedures for the validation of the application of computer software used in the quality management system. Such software applications shall be validated prior to initial use and, as appropriate, after changes to such software or its application. The specific approach and activities associated with the software validation and revalidation shall be proportionate to the risk associated with the use of the software. Records of such activities shall be maintained”.
In a nutshell, what does the industry need to do?
The author, John Lafferty, has broken down the requirements into three elements, as shown in Figure 4 below:
Figure 4: Software Validation – 3 Key Elements
In Figure 5 below, the author offers a suggested layout for documenting risk within the Master Validation Plan or Master register.
Figure 5: Suggested layout for documenting risk within the Master Validation Plan or Master register
The Risk Rating shown above could be binary (High-Risk or Not High-Risk) or three-level (High, Medium or Low Risk).
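A register entry of the kind suggested above can be represented as a simple record. The field names below are illustrative assumptions, not the actual column headings of the suggested layout.

```python
from dataclasses import dataclass

@dataclass
class RegisterEntry:
    """One row of a Master Validation Plan / Master register (field
    names are hypothetical, for illustration only)."""
    system_name: str
    intended_use: str
    direct_use: bool          # used directly in production or the QMS?
    risk_rating: str          # e.g. "High-Risk" / "Not High-Risk", or High/Medium/Low
    validation_reference: str # reference number of the validation record

entry = RegisterEntry(
    system_name="Document Control System",
    intended_use="Maintain controlled QMS documents",
    direct_use=False,
    risk_rating="Not High-Risk",
    validation_reference="VAL-2025-014",  # hypothetical reference number
)
print(entry.system_name, "->", entry.risk_rating)
```

Keeping the register in a structured form like this makes it straightforward to demonstrate, at audit, that every system has a documented risk rating and a traceable validation reference.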
According to both the FDA CSA Guidance 2025 and GAMP 5 Ed. 2, developing a validation strategy begins with ‘Critical Thinking’.
What is Critical Thinking?
Critical Thinking involves taking a step back and asking fundamental questions such as:
The output of critical thinking will determine if the overall system and specific elements of the system are High-Risk or Not High-Risk. Critical thinking should be performed using a cross-functional team. Clinical input may be required if clinical matters are being discussed.
Record Keeping of the Critical Thinking
It is advisable to keep records of Critical Thinking outcomes; these will provide part of the rationale for the chosen validation approach.
Perform a Software Risk Assessment
Critical Thinking should be followed by performing a more detailed risk assessment of the system. Auditing bodies often regard the software risk assessment as the most important element of the validation. Another valuable input to determining the validation strategy is the GAMP Category of the system.
The GAMP 5 guideline is the easiest model to follow to categorise the software. The tables shown in Figure 6 below outline the GAMP 5 classification of software and the associated validation effort required:
Figure 6: GAMP 5 classification of software and validation effort required
A combination of the risk level and the GAMP Category can be used to develop a validation strategy for the software, as shown in the table below. It should be noted that the validation options shown below are an example of a possible validation strategy only, and should not be considered definitive.
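The combination of risk level and GAMP category described above is, in effect, a lookup table. The sketch below shows one way such a table might be encoded; the specific approaches listed are illustrative assumptions in the spirit of an example strategy, not the definitive contents of Figure 7.

```python
# Hypothetical mapping: (risk level, GAMP category) -> validation approach.
# GAMP 5 Ed. 2 categories: 3 = non-configured, 4 = configured, 5 = custom.
STRATEGY = {
    ("High", 3): "Scripted testing of critical functions; supplier assessment",
    ("High", 4): "Scripted testing of configuration and critical functions",
    ("High", 5): "Full lifecycle validation with design-level documentation",
    ("Not High", 3): "Unscripted/ad-hoc testing with a summary record",
    ("Not High", 4): "Risk-based unscripted testing of configured workflows",
    ("Not High", 5): "Targeted testing of custom code plus code review",
}

def validation_approach(risk: str, gamp_category: int) -> str:
    """Return the example validation approach for a given combination,
    falling back to an individual assessment where none is defined."""
    return STRATEGY.get(
        (risk, gamp_category),
        "Assess individually; no standard approach defined",
    )

print(validation_approach("Not High", 4))
```

As with the table in Figure 7, any such mapping is a starting point: the output of critical thinking and the software risk assessment should always take precedence over a mechanical lookup.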
Figure 7: Example of a Validation Strategy
Another aspect to consider, in relation to the software validation requirements of ISO 13485:2016, is software validation of outsourced processes. It has been noted at regulatory audits that auditors are more frequently requesting the reference numbers of the software validations of any critical processes that are outsourced by the organisation.
For example, if an organisation chooses to outsource a process, e.g. sterilisation, auditors may require the device manufacturer to maintain a record of the reference numbers of the software validation (if applicable) of that outsourced sterilisation process.
This requirement is tied in with section 4.1.5 of ISO 13485:2016 as follows: “When the organization chooses to outsource any process that affects product conformity to requirements, it shall monitor and ensure control over such processes. The organization shall retain responsibility of conformity to this International Standard and to customer and applicable regulatory requirements for outsourced processes. The controls shall be proportionate to the risk involved and the ability of the external party to meet the requirements in accordance with 7.4. The controls shall include written quality agreements.”
We deliver training courses in Software Validation, you can learn more and book dates here. We also provide company-specific training tailored to your individual needs – contact us for a quote anytime.
Explanation of Abbreviations Used in this Article:
AFAP: As-far-as-possible
Annex 11: EU GMPs Vol 4 Annex 11 Computerised Systems
ALARP: As-low-as-reasonably-practicable
CFR: Code of Federal Regulations (US Federal Law)
COTS: Commercial-off-the-shelf (systems)
CSA: Computer Software Assurance
CSV: Computerised Systems Validation
dFMEA: Design Failure Modes and Effects Analysis
DQ: Design Qualification
DS: Design Specification
ERES: Electronic Records and Electronic Signatures
FAT: Factory Acceptance Testing
FDS: Functional Design Specification
FMEA: Failure Modes and Effects Analysis
FMECA: Failure Modes Effects and Criticality Analysis
FRS: Functional Requirements Specification
FS: Functional Specification
GAMP: Good Automated Manufacturing Practice
HDS: Hardware Design Specification
IQ: Installation Qualification
LTPD: Lot Tolerance Percent Defective
OQ: Operational Qualification
Part 11: 21 CFR Part 11 Electronic Records and Electronic Signatures
PQ: Performance Qualification
pFMEA: Process Failure Modes and Effects Analysis
PHA: Preliminary Hazard Analysis
RTM: Requirements Traceability Matrix
SAT: Site Acceptance Testing
SDS: Software Design Specification
SV: Software Validation
UAT: User Acceptance Testing
URS: User Requirements Specification