Software Validation

Validation is a critical tool for assuring the quality of computer system performance. Computer system software validation increases the reliability of systems, resulting in fewer errors and less risk to process and data integrity.
Computer system validation also reduces long-term system and project costs by minimizing the cost of maintenance and rework.

Software validation commences with a user requirements specification (URS). The URS describes the critical functionality required for the intended analysis. It is essential that the document is properly scoped so that the procurement, installation, commissioning, validation, user training, maintenance, calibration and cleaning tasks are all investigated and adequately defined.

To scope and define an adequate validation procedure, the URS must be sufficiently detailed for various assessments to be made. The main assessment concerning qualification documentation is the risk assessment, which ensures that the proposed degree of validation complies with regulatory requirements.

At this early stage, it is therefore necessary to execute a Validation Risk Assessment (VRA) protocol against the end user's requirements. This step ensures that the more obscure pieces of ancillary equipment and support services are fully understood, and that their requirements are investigated, priced and included in the final issue of the URS, which is sent out with the Request to Tender. This stage is essential if the URS is to accurately define the depth and scope of validation appropriate for verifying that the software will deliver all of the requirements detailed in the URS.

The outcome of the Validation Risk Assessment (VRA) drives a split in the scope of the software validation documentation. If the VRA categorizes the software as requiring Full Life Cycle Validation (FLCV), then a considerable part of the validation effort goes into establishing how the software originated and how it was designed and developed, in order to establish that its basic concept and development are robust, sound and in accordance with best practices.

The original development plans, code reviews, methods reviews and testing plans must be available for this software validation documentation to be executed successfully. Once this proof of quality build is established, validation follows a more conventional path of inspections and verifications.

Software that is not classified as requiring FLCV treatment does not require this depth of verification into its quality build history, and is validated mainly through the more conventional path of inspections and verifications.

Dynamic Testing

Dynamic testing verifies the execution flow of software, including decision paths, inputs, and outputs. It involves creating test cases, test vectors and oracles, and executing the software against these tests. The results are then compared with the expected or known correct behavior of the software. Because the number of execution paths and conditions grows exponentially with the number of lines of code, testing every possible execution trace and condition is impossible.
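The pattern of test vectors paired with an oracle can be sketched as follows. This is a minimal illustration, not a validation-grade harness; the function under test (`scale_dose`) and its inputs are hypothetical:

```python
def scale_dose(weight_kg: float, mg_per_kg: float) -> float:
    """Software under test (hypothetical): compute a dose from body weight."""
    return weight_kg * mg_per_kg

# Test vectors paired with oracle values (the expected correct outputs).
TEST_VECTORS = [
    ((70.0, 0.5), 35.0),   # nominal case
    ((0.0, 0.5), 0.0),     # boundary: zero weight
    ((80.0, 0.0), 0.0),    # boundary: zero rate
]

def run_dynamic_tests():
    """Execute the software against each vector and compare with the oracle."""
    results = []
    for (weight, rate), expected in TEST_VECTORS:
        actual = scale_dose(weight, rate)
        results.append((weight, rate, expected, actual, actual == expected))
    return results

for w, r, expected, actual, passed in run_dynamic_tests():
    print(f"scale_dose({w}, {r}) = {actual}, "
          f"expected {expected}: {'PASS' if passed else 'FAIL'}")
```

Even this tiny function has an unbounded input space, which is the point of the closing sentence above: the vectors sample the behavior, they cannot exhaust it.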

Static Analysis

Code inspections and testing can reduce coding errors; however, experience has shown that the process needs to be complemented with other methods. One such method is static analysis, a relatively recent technique that largely automates parts of the software qualification process by examining source code without executing it. Static analysis is used to identify potential and actual defects in source code, although it does not necessarily prove their absence.
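As a small sketch of the idea, the standard-library `ast` module can walk a parse tree and flag a code pattern without ever running the program. The checked-for pattern here (a bare `except:` clause) is just one illustrative defect class:

```python
import ast

# Hypothetical source fragment to be analyzed (note the bare except).
SOURCE = """
def read_config(path):
    try:
        return open(path).read()
    except:
        return None
"""

def find_bare_excepts(source: str) -> list[int]:
    """Statically flag `except:` handlers with no exception type.

    These are a potential defect: they silently swallow every error,
    including ones the program should never suppress.
    """
    findings = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.ExceptHandler) and node.type is None:
            findings.append(node.lineno)
    return findings

print(find_bare_excepts(SOURCE))  # line numbers of bare except handlers
```

Note that the analysis reports where the suspect pattern occurs, but a clean report does not prove the code is defect-free, which matches the limitation described above.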

Abstract Interpretation Verification

A code verification solution that includes abstract interpretation can be instrumental in assuring software safety and a sound quality process. Abstract interpretation is a sound verification technique that enables the achievement of high integrity in embedded devices. Regulatory bodies such as the FDA, and some segments of industry, recognize the value of sound verification principles and use tools based on them.
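The core idea can be illustrated with the classic interval domain: instead of running a program on one concrete input, each variable is replaced by a range covering every possible input, and the arithmetic is redefined to propagate those ranges. This toy sketch is not a production analyzer, and the computation being analyzed is hypothetical:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    """Abstract value: stands for every real number in [lo, hi]."""
    lo: float
    hi: float

    def __add__(self, other: "Interval") -> "Interval":
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other: "Interval") -> "Interval":
        # The product of two intervals is bounded by the endpoint products.
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))

# Abstractly execute y = x*x + x for ANY input x in [-2, 3] at once.
x = Interval(-2.0, 3.0)
y = x * x + x
print(y)  # Interval(lo=-8.0, hi=12.0)
```

The result `[-8, 12]` soundly contains every concrete outcome of `x*x + x` on `[-2, 3]` (whose true range is `[-0.25, 12]`): the analysis may over-approximate, but it never misses a possible value, which is what makes it usable as evidence that unsafe states are unreachable.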
