Computer systems innovation is an ongoing fact of life. Businesses are more productive than ever, yet face ever-higher cybersecurity risks. Further, computer systems validation activities are often still conducted on paper and include inefficient, wasteful practices. This paper discusses cybersecurity and validation, green “paperless” validation, and lean validation strategies; applying them will help eliminate waste and streamline the overall validation process. Validation engineers must develop testing strategies that provide objective evidence that computer systems protect networks, computers, programs, and data from attack, damage, or unauthorized access. ISPE and the FDA have issued recommendations addressing cybersecurity. Validation strategies are also demonstrating increased sensitivity to environmental, social, and economic considerations: replacing wasteful paper-based processes with secure electronic ones is a business imperative, and the costs of paper validation and associated activities are significant. Green validation software incorporates electronic document and process workflows, electronic signatures, requirements tracking, and test protocol integration to help facilitate validation, and includes integrated risk assessment to ensure that all validation efforts are conducted according to their defined risk. Lean manufacturing practices may likewise be applied to computer systems validation; topics briefly discussed include manual testing, execution, and traceability; undocumented risks and requirements; over-testing; over-customization; and poor defect management. Current system innovations can automate and streamline validation processes while minimizing or eliminating costly paper-based processes to maximize efficiency and minimize waste.
For the past 30 years, computer systems innovation has moved at light speed. From mainframes to client/server to cloud and mobile computing, highly regulated enterprises have rapidly adopted these technologies to accelerate business and to facilitate compliance with global mandates. In life sciences, software validation is a fact of life for highly regulated, mission-critical business processes. The upside of this trend is that businesses are now more productive than ever. The downside of this innovation and productivity is the ever-increasing security risk and vulnerability of highly sensitive regulatory data. Cybersecurity is a real issue in today’s business environment, yet most validation practices have not changed fast enough to keep pace with today’s security challenges. Further, many companies still conduct computer systems validation activities on paper.
For years, life sciences companies have struggled with the documentation burden associated with software validation. Over the past two decades, there has been much discussion about “paperless” validation. Paperless validation, like other paperless processes, has largely remained an elusive goal for enterprise technology. During the early days of document management system deployment, a key driver of these systems was the promise of the “paperless” office. After millions of dollars spent on enterprise technology, paper lives on!
This article will explore the new challenges and new strategies associated with computer systems validation in the 21st century including: (1) cybersecurity and validation, (2) green “paperless” validation strategies, and (3) lean validation strategies. Implementing these approaches will help eliminate waste and streamline the overall validation process.
CYBERSECURITY AND VALIDATION
Computer systems used in highly regulated companies contain sensitive and valuable information, including electronic submissions, clinical information, medical device design control records, legal information, and other such records. Recent events have highlighted sophisticated cybersecurity attacks everywhere, including the White House! Corporations have been attacked at an alarming rate. What does this mean for computer systems validation? Validation engineers need to reconsider their testing strategies so that they confirm, with objective evidence, that computer systems have the requisite technologies, processes, and practices designed to protect networks, computers, programs, and data from attack, damage, or unauthorized access.
Many validation engineers today already create security test plans to document system security. Sadly, many regulated computer systems rely heavily on antiquated technology processes, such as password protections, to protect computer systems. In today’s challenging environments, new strategies are needed to address the formidable challenges associated with cybersecurity attacks.
Cybersecurity vulnerabilities create risks for validated computer systems and for the mobile devices, computers, smartphones, and tablets used to manage mission-critical data. ISPE GAMP 5®, published in 2008, recommended a risk-based approach to validation. The loss or theft of sensitive corporate, legal, and regulatory information places companies at heightened risk. Global companies face the challenge of addressing cybersecurity threats and mitigating risks while embracing the latest innovative technologies. Cybersecurity issues have garnered the attention of the U.S. Food and Drug Administration (U.S. FDA). The Agency has released two guidance documents related to cybersecurity:
- Content of Premarket Submissions for Management of Cybersecurity in Medical Devices - Guidance for Industry and Food and Drug Administration Staff, issued on October 2, 2014
- Cybersecurity for Networked Medical Devices Containing Off-the-Shelf (OTS) Software, issued on January 14, 2005.
The guidance documents urge manufacturers of medical devices to consider cybersecurity risks when designing and developing their devices. Several key recommendations from the guidance documents that could be applied to any validated computer system project include the following:
- Cybersecurity Controls versus Risks. The new FDA guidance advises companies to include a traceability matrix that demonstrates the relationship between the cybersecurity controls implemented and the cybersecurity risks that were contemplated. This is an excellent strategy to ensure that companies have objective, documented evidence for testing cybersecurity controls.
- Validate Software Changes Made to Address Cybersecurity Vulnerabilities. The Cybersecurity for Networked Medical Devices Containing Off-the-Shelf (OTS) Software guidance document recommends “…You should validate all software design changes, including computer software changes to address cybersecurity vulnerabilities, according to an established protocol before approval and issuance….”
- Fuzz Testing. The FDA announced a new cybersecurity laboratory into which a fuzz testing capability is to be integrated. According to the FDA, fuzz testing is a negative software testing method that feeds a program, device, or system with malformed and unexpected input data in order to find defects. Malformed, anomalous inputs such as overflow, underflow, or repetition trigger vulnerabilities in software, causing, for example, crashes, denial of service (DoS), security exposures, or performance degradation. The test target is carefully monitored during testing to detect failures. When software is fuzz tested proactively, vulnerabilities can be found and fixed before deployment, resulting in more secure, robust, high-quality software. A fuzz-tested product has fewer critical vulnerabilities needing to be patched, which means lower costs from patch development, release, and product recalls. See “Codenomicon Defensics – Fuzz Testing Software,” Solicitation Number: FDA-13-1120705 (July 21, 2013).
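To make the idea behind fuzz testing concrete, the core loop can be sketched in a few lines. This is an illustrative sketch only, not the FDA's or Codenomicon's tooling; `parse_record` is a hypothetical target function, and the input generators mirror the malformed-input categories (underflow, overflow/repetition, random noise) described above.

```python
import random
import string

def parse_record(data: str) -> dict:
    """Hypothetical input parser under test: expects a 'key=value' string."""
    key, value = data.split("=", 1)
    return {key: value}

def fuzz(target, iterations=1000, seed=42):
    """Feed malformed, unexpected inputs to the target and collect failures."""
    rng = random.Random(seed)
    failures = []
    for _ in range(iterations):
        # Typical fuzz inputs: empty string (underflow), oversized repetition
        # (overflow), and random printable noise.
        candidate = rng.choice([
            "",
            "A" * rng.randint(1000, 100000),
            "".join(rng.choice(string.printable)
                    for _ in range(rng.randint(1, 50))),
        ])
        try:
            target(candidate)
        except Exception as exc:
            failures.append((candidate[:40], type(exc).__name__))
    return failures

defects = fuzz(parse_record)
# Inputs lacking '=' raise ValueError -- a defect class the fuzzer surfaces
# automatically, without anyone having written an explicit negative test case.
```

The monitoring here is trivial (catching exceptions); a real fuzzing platform also watches for crashes, hangs, and resource exhaustion in the target system.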
Validation engineers need to be more vigilant in today’s sometimes-hostile systems environment in order to detect cybersecurity issues before they become problems. Regulated companies rely more and more on computer systems and networks to drive business decisions and operations. Validation testing is all about finding issues before they get to production. A good cybersecurity strategy for validation testing should include the following:
- Validation testing of data risks and vulnerabilities
- Rigorous validation testing of network and computer systems to reveal security flaws
- A comprehensive disaster recovery plan that includes plans for cybersecurity attacks
- Cybersecurity risks and controls in ISPE GAMP 5® risk assessments
- “Fuzz” testing in your validation testing strategy.
Validation engineers should recognize that security-specific validation testing must go beyond addressing software defects that emerge during ordinary usage. Cybersecurity threats demand more sophisticated validation testing designed to anticipate the unknown threats common to cybersecurity breaches and hacks, ensuring that companies stay one step ahead of the hackers. Fuzz testing should be employed based on risk. These risks may be business or regulatory risks as well as risks to public health and safety or to marketed product.
The U.S. FDA selected a fuzzing tool called “Codenomicon Defensics.” According to Codenomicon’s website “…Defensics is a powerful testing platform that enables developers and asset owners to proactively discover and remediate unknown vulnerabilities in software and devices…” (http://www.codenomicon.com/products/). Validation engineers need to recognize that cybersecurity issues that may affect the capabilities and availability of validated computer systems are often non-functional. Thus, traditional functional validation testing is not sufficient.
How much testing is enough? Validation engineers have asked this question many times over the last three decades. Companies such as Codenomicon have developed suites of tools that address various system security requirements. A common complaint is the expense of validation testing. This expense must be weighed against the monetary risks of a cyberattack. It is not a question of “if” a cyberattack will occur; experts believe it is a question of “when.”
There are now cybersecurity standards that may be referenced for validation initiatives. The most rapidly emerging is the IEC 62443 series, which includes 12 standards intended to serve as requirements for building secure and robust systems and devices. The U.S. FDA formally recognized the IEC 62443 standards as a foundation for medical device cybersecurity in its latest guidance document.
The leading organization in developing certification requirements that align with the IEC 62443 series is the ISA Security Compliance Institute (ISCI), through its comprehensive certification program known as ISASecure. This program consists of the following:
- EDSA—Embedded Device Security Assurance
- SSA—System Security Assurance
- SDLA—Security Development Lifecycle Assurance.
Cybersecurity threats are changing the way validation engineers conduct computer systems validation. It is important not only to understand the risks and vulnerabilities but also to have clear strategies to address the ever-increasing, ever-changing cybersecurity threats to computer systems and thereby ensure their integrity, reliability, and ultimate security.
GREEN VALIDATION STRATEGIES: PRACTICAL STRATEGIES FOR 21ST CENTURY VALIDATION
Life sciences companies today face mounting pressures to deliver systems with enhanced security, traceability, and control to ensure sustained compliance. In addition to these concerns, today's life sciences companies must operate within an environmental, social, and economic context where environmental sustainability is a key element of overall business operations. More and more, companies are independently seeking ways to become better stewards of natural resources, taking into account the needs of future generations.
The overall premise of the “green” economy is to reconfigure business processes and infrastructure to deliver better returns on investments, while at the same time reducing greenhouse gas emissions, extracting and using fewer natural resources, and creating less waste. Given this imperative, any improvement in operational efficiency, cost and risk are compelling business drivers to replace inefficient, wasteful paper-based processes with secure, electronic ones. In support of green initiatives, I have been promoting the concept of “green validation.” The term “green validation” refers to a new, more responsible approach to software validation that leverages ISPE GAMP® methodologies and advanced technologies to promote a paperless validation environment. Now, more than ever, is the time to awaken the vision of green validation and move this strategy from vision to reality.
The Problem with Paper
There is an old adage in the life sciences community: “if it’s not documented, it didn't happen.” This could explain, in part, the love affair with paper. Paper is convenient. Most people still prefer to read printed documents in spite of all of the technology we have deployed. Paper is used to maintain audit trails of records required by current global regulations. Organizations spend millions on content management systems with 21 CFR Part 11 electronic signatures, only to print the documents out and sign them by hand. Yet what many people don't realize is that paper is expensive. Studies have repeatedly shown that in most corporate environments, knowledge workers spend up to 40 percent of their time trying to find paper documents; the time spent searching for and retrieving documents costs billions of dollars of wasted effort each year in the US alone. The cost of paper itself is about $0.003 per sheet, so a typical life sciences company purchasing 10 million sheets annually spends approximately $30,000 on paper alone. Some 95% of this paper will eventually have to be disposed of, and most of it will first end up in filing cabinets. When you add the photocopying, printing, faxing, mailing, storage, and disposal costs associated with paper, studies reveal that the total can skyrocket to nearly $500,000 per year just to manage paper.
Validation is very paper-intensive. All initial validation documents and their subsequent changes must be tracked and managed in a controlled manner. In addition, software validation documentation must be comprehensive to support the “intended use” principle. Given the broad range of systems on the market today, one of the main challenges of validation is applying a consistent methodology across multiple systems. For commercial off-the-shelf (COTS) software, this is particularly important. Most COTS vendors offer “validation test scripts” with their solutions, but given the vendors' varying levels of understanding of validation, companies must deal with the inconsistencies of vendor-developed test scripts.
Consider the statement in the FDA's guidance on software validation: “... computer systems used to create, modify, and maintain electronic records and to manage electronic signatures are also subject to the validation requirements....”
Also, 21 CFR §11.10(a) requires “…Validation of systems to ensure accuracy, reliability, consistent intended performance, and the ability to discern invalid or altered records….”
Beyond compliance, validation can offer significant business value and can be considered something of a “legal best practice” for providing traceability and accountability within business processes. FDA regulations are very clear in their expectation that organizations adopt and follow specific processes and procedures to ensure compliance. It is important to realize that the regulations provide guidance and do not advocate the use of any specific technology to meet regulatory requirements. It is also important to understand that the U.S. FDA, as well as other global regulators, has embraced and even suggested the use of advanced technology to drive regulated business processes. Many processes, including validation, are more effective when driven by technology. Paperless systems greatly facilitate compliance audits and help reduce regulatory risk. From a validation perspective, it is time to move into the 21st century and promote “green validation.”
Green Validation: Paperless or Less Paper?
Validation lends itself nicely to automation. When it comes to validation, however, there is a clear distinction between paperless validation and validation with less paper. Green validation does not necessarily mean “no paper.” Paper is here to stay. It is a reality and a fact of life. There are clear business and environmental imperatives, however, for green validation. How is this achieved? The good news is that there are organizations developing validation toolkits and online electronic validation systems allowing you to track and manage requirements and validation protocols online to effectively produce validation document deliverables using an automated approach.
The concept of green validation includes integrated systems with built-in validation best practices such as ISPE GAMP 5® to drive validation efforts. Through the use of green validation software, you have the ability to apply validation principles and best practices to any validation project in a consistent, electronic manner. At the heart of the conceptual green validation system is a “requirements engine” that provides the ability to define validation user requirements and automate the traceability of the validation requirements to the validation test scripts. The automation of validation traceability will go a long way to save time and expense. Green validation software incorporates electronic document and process workflows, electronic signatures, as well as requirements tracking and test protocol integration that all help to facilitate validation. Most importantly, green computer systems validation includes integrated risk assessment to ensure that all validation efforts are conducted according to their defined risk.
Defect management, or incident management, is critical for validation projects. No green validation strategy would be complete without the ability to manage defects found during validation testing. In a “green” validation world, defects should be automatically captured, managed, and reported in a closed-loop fashion, and validation engineers should have full visibility into defect status throughout the validation process.
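As a sketch of what closed-loop defect management implies, consider a defect record whose status can only move through defined transitions, so that a defect cannot be closed until its fix has been verified by retest. The states and transition rules below are illustrative assumptions, not a description of any specific product.

```python
from dataclasses import dataclass, field

# Allowed closed-loop transitions: a defect cannot be closed until the fix
# has been verified, and a failed retest sends it back to "open".
TRANSITIONS = {
    "open": {"fixed"},
    "fixed": {"verified", "open"},
    "verified": {"closed"},
    "closed": set(),
}

@dataclass
class Defect:
    defect_id: str
    description: str
    status: str = "open"
    history: list = field(default_factory=list)   # full audit trail of moves

    def transition(self, new_status: str) -> None:
        if new_status not in TRANSITIONS[self.status]:
            raise ValueError(
                f"{self.defect_id}: cannot go {self.status} -> {new_status}")
        self.history.append((self.status, new_status))
        self.status = new_status

d = Defect("DEF-001", "OQ step 12: audit trail entry missing")
d.transition("fixed")
d.transition("verified")
d.transition("closed")   # permitted only because the fix was verified first
```

The `history` list gives the full visibility described above: every status change is recorded, so the state of any defect is auditable at any point in the validation project.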
With respect to validation project management, green validation systems should have the ability to allow validation project managers to track and manage multiple validation projects easily and provide “validation intelligence” or business intelligence dashboard functionality. This allows project managers to quickly identify bottlenecks and address them to keep validation projects on time and within budget.
And finally, no green validation system would be complete without the ability to produce, manage, and track validation deliverables in a secure, compliant manner. The system would include helpful reminders of key validation due dates and assess the impact of changes to help maintain the validated state.
ValidationMaster™ is the first enterprise validation management system that includes the above capabilities with a fully integrated custom portal based on Microsoft SharePoint® that delivers these capabilities and more.
Taking Validation To The Next Level
Some may ask, “Why even consider the environment when discussing validation?” As we all try to do more with less, it is about working smarter as well as more efficiently. Validation is a process that cries out for automation. Companies spend excessively on validating COTS software applications, creating reams of paper in the process; the same goes for custom solutions. In addressing these challenges of inefficiency and expense, it is prudent also to consider the impact on our environment, because it is good business. Green validation systems take software validation to the next level. By leveraging advanced technology, companies can use less-paper or paperless processes to produce validation documentation deliverables and maintain their systems in a validated state. Validation 2.0 is taking shape. The technology is here, and the time for green validation is now.
LEAN VALIDATION: STREAMLINING THE VALIDATION PROCESS
Lean manufacturing practices were designed by Toyota to make value flow at the pull of the customer (just-in-time) and to prevent and eliminate waste in processes. The so-called seven manufacturing wastes were categorized as:
- Transportation
- Inventory
- Motion
- Waiting
- Overproduction
- Over-processing
- Defects
The theory was that all of the wastes highlighted above have a direct impact on costs and are non-value-adding operations. Since the inception of validation and verification processes, there has been much waste and many non-value-added activities in the conduct of validation. What if lean principles were applied to computer systems validation?
What Is Lean Validation?
Lean validation is defined as the execution of validation and verification processes with as little waste as possible. Lean validation applies the principles and best practices of lean manufacturing to derive the best balance between quality, security, efficiency, and compliance. In today’s economy, doing more with less is the name of the game. Thus, lean validation is a strategy whose time has come.
In lean manufacturing, two Japanese terms are often used:
- muda (non-value-adding work; waste)
- muri (overburden; overdoing)
If you were to examine validation initiatives across the globe, you would find a lot of “muda” and “muri,” perhaps in the form of:
- Conducting unit testing on COTS software rather than relying on supplier documentation (muri)
- Requiring more signatures than needed on validation documentation (muda)
- Non-risk-based validation testing (muri)
- Over-testing low risk applications (muri/muda)
- Manually capturing screenshots during computer validation testing (muda)
- Manual validation test execution (muri).
There are many other examples too numerous to list. It is clear that there is much room for improvement of the validation process to eliminate wasteful activities. Given today’s level of innovation, validation still remains one of the most wasteful manual processes in the information technology world.
Translating these wastes to a validation perspective, the principal ones would be:
- Manual test capture and execution
- Manual requirements traceability
- Undocumented risks
- Undocumented requirements
- Poor defect management.
Careful examination of each of the above will help eliminate the waste in your validation processes.
Manual Test Capture and Execution
For years, validation testing has primarily been conducted manually and on paper. In the early days, automated tools to support validation test script capture and execution were not readily available. Today, many tools are available on the market, but they are not tightly integrated. Thus, validation engineers find themselves doing wasteful work such as transferring data from one application or system to another to complete validation activities, or worse, cutting and pasting screenshots of expected and actual results using old technologies such as “print screen.” The waste is compounded during the regression testing required to maintain the validated state. In the manual world, test scripts are not reusable, so when applications or systems are updated, validation engineers spend more wasted time generating new test scripts. This causes many validation teams to cut corners to get systems back into a validated state quickly. In the worst case, some validation engineers skip testing altogether for routine updates due to business pressure to get validated systems back online quickly.
All of the manual test capture and execution processes cost businesses considerable time and money as they seek to establish and maintain the validated state.
Automated test capture and execution is the key. Technologies such as ValidationMaster™ deliver the ability to create and execute fully documented IQ/OQ/PQ and unit test scripts in a fully automated or semi-automated manner, saving both time and money. For regression testing, the benefits of automated test scripts cannot be overstated. Instead of writing test scripts over again and managing reams of paper documents, automated test scripts may be triggered for execution after updates, patches, or enhancements are applied to validated systems. The automated test scripts quickly reveal where system vulnerabilities are, allowing the validation engineer to focus on critical issues while thoroughly testing the application after a change, as recommended by the FDA. In practice, this approach may yield 40% to 60% cost savings during the validation testing phase.
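The reusability argument above can be sketched simply: stored test scripts are re-executed after every system change, with expected and actual results compared automatically instead of being re-transcribed on paper. The system function under test (`apply_discount`) and the stored expectations are hypothetical examples, not part of any specific product.

```python
# A minimal sketch of automated regression execution: scripts are written once,
# stored, and re-run after each update or patch to the validated system.
def apply_discount(price: float, percent: float) -> float:
    """System function under test (illustrative)."""
    return round(price * (1 - percent / 100), 2)

# Reusable test scripts: (test id, inputs, expected result).
TEST_SCRIPTS = [
    ("OQ-001", (100.0, 10.0), 90.0),
    ("OQ-002", (59.99, 0.0), 59.99),
    ("OQ-003", (10.0, 100.0), 0.0),
]

def run_regression(target, scripts):
    """Execute every stored script, recording pass/fail and actual results."""
    results = []
    for test_id, args, expected in scripts:
        actual = target(*args)
        results.append((test_id, expected, actual, actual == expected))
    return results

results = run_regression(apply_discount, TEST_SCRIPTS)
failures = [r for r in results if not r[3]]   # an empty list means the run passed
```

Because the scripts and expected results live in data rather than on paper, the same suite can be triggered unchanged after the next patch, which is exactly where the regression-testing savings come from.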
Innovations exist today to streamline this very wasteful area and allow validation engineers to enjoy time savings as well as cost savings across the validation testing process.
Manual Requirements Traceability
Large enterprise systems such as enterprise resource planning (ERP) systems may have a significant number of requirements. Tracing these requirements manually without error takes considerable time and effort; it is often very wasteful and adds little value for the effort expended. Solutions on the market such as ValidationMaster™ provide automatic traceability between user requirements and test scripts, eliminating the need for manual traceability. When requirements are created, ValidationMaster™ allows test scripts to be created directly from them, ensuring automatic traceability. Where multiple requirements apply to a specific test script, systems like ValidationMaster™ allow the user to assign multiple requirements to one test script with the click of a mouse. The system generates forward and backward traceability automatically, eliminating the wasteful activity of a validation engineer tracing each requirement to a test script one by one.
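The mechanics behind automatic traceability are simple to sketch: from a single set of requirement-to-script assignments, both the forward matrix (requirement to scripts), the backward matrix (script to requirements), and any coverage gaps fall out with no manual tracing. The requirement and script identifiers below are made up for illustration.

```python
# Requirement -> test script assignments, as a green validation tool might
# store them when scripts are created from requirements.
assignments = [
    ("URS-001", "OQ-010"),
    ("URS-002", "OQ-010"),   # several requirements may map to one script
    ("URS-003", "OQ-020"),
]
requirements = ["URS-001", "URS-002", "URS-003", "URS-004"]

def trace(assignments, requirements):
    """Build forward and backward traceability and flag untested requirements."""
    forward = {r: [] for r in requirements}   # requirement -> scripts
    backward = {}                             # script -> requirements
    for req, script in assignments:
        forward[req].append(script)
        backward.setdefault(script, []).append(req)
    untested = [r for r, scripts in forward.items() if not scripts]
    return forward, backward, untested

forward, backward, untested = trace(assignments, requirements)
# untested == ["URS-004"] -- a coverage gap the matrix surfaces automatically
```

The same data answers both audit questions at once: "which scripts test this requirement?" (forward) and "which requirements does this script cover?" (backward).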
Today’s innovative technology for automated validation provides an easy, practical way to eliminate waste in this key area. Traceability between requirements and test scripts is mandatory for FDA-regulated validated systems. Although this process is necessary to ensure quality and compliance, there are tools on the market that allow you to meet the challenge in the most efficient manner.
Undocumented Risks
ISPE GAMP 5® and FDA regulatory guidance (Pharmaceutical cGMPs For The 21st Century—A Risk-Based Approach, Final Report, September 2004) promote the concept of a risk-based approach to validation. Part of this initiative was to encourage the early adoption of new technological advances by the pharmaceutical industry, facilitate industry application of modern quality management techniques, encourage risk-based approaches that focus attention on critical areas, and ensure consistency across processes. The guiding principles of this approach were:
- Risk-based orientation
- Science-based policies and standards
- Integrated quality systems orientation
- International cooperation
- Strong public health protection.
Although most of the recommendations in the FDA’s report focused on manufacturing, many of the principles can be applied to validation projects. The most common question asked regarding validation is: how much validation is required? Between the FDA guidance and ISPE GAMP 5®, the recommended approach is clear: conduct a risk assessment to guide how much validation due diligence is required. The risk assessment should indicate to the validation engineer how much validation is required based on risk. In the early days of validation, many engineers overdid it out of fear of regulators; the mantra was to validate everything so that all bases were covered. That approach is very wasteful and often does not achieve the desired results. A risk-based approach allows organizations to focus their validation activities on the riskiest aspects of the system, streamlining the validation process. Integrated testing is a common element of this strategy.
Using this approach, it is recommended that validation engineers define the critical process parameters (CPPs) that affect quality and identify the critical quality attributes (CQAs) of the product during the risk assessment process. The risk assessment should identify proper controls and mitigations that are clearly documented. It should determine not only which systems should be validated (impact, criticality, risk potential) but also the boundaries of the system to be validated. Defining risk in this manner can save significant time and money by avoiding over-testing as well as over-customization of highly regulated systems.
Undocumented Requirements
User requirements and functional requirements are essential parts of the validation documentation strategy; requirements define the intended use of the system. One of the most wasteful practices in the validation process is leaving requirements undocumented. The failure to document requirements often results in the need to retest the system to cover aspects not previously documented, as well as regulatory issues during an audit for failure to define the system's intended use.
The management and control of requirements can also be very wasteful during the validation process. Validation requirements are typically captured in Microsoft Word or Microsoft Excel documents. When requirements change, validation engineers sometimes struggle to find the latest version among the thousands of email messages the documents may have been attached to, or to search among uncontrolled documents and non-validated document management systems where requirements are sometimes housed. One industry study claimed that knowledge workers spend 60 to 80 percent of their time searching for information; that is a significant amount of process waste spent just looking for documents.
A better way to manage user requirements is to automate the process of collecting and managing them. Through systems such as ValidationMaster™, users can manage user requirements in an automated fashion. Instead of routing user requirements through inefficient email and document management processes, requirements are controlled and traced to their respective test scripts automatically, eliminating the waste of tracing requirements over time. For each software release, requirements can be traced to that specific release and managed under version control, eliminating the wasteful activity of validation engineers manually assigning versions to user requirements documents. Further, in automated systems such as ValidationMaster™, users can track who changed a requirement and why; such metadata are rarely associated with paper documents, where it is often unclear who changed a requirement and for what reason.
Tracking and managing user requirements and functional requirements in an automated manner is a 21st-century approach to a basic validation process that can save considerable time and money and avoid the errors inherent in manual requirements management.
Over-Testing
As previously mentioned, the question of how much validation testing is required is one of the most frequently asked questions about validation. As stated earlier, risk-based approaches minimize over-testing. Another strategy to minimize over-testing is planning. Effective upfront validation planning ensures you are focusing your testing efforts on critical areas that may impact the quality, safety, or efficacy of the product. Most validation planning is done through Gantt charts or other such tools that help validation engineers map out activities through the validation process.
Using lean validation techniques, validation Kanban boards combined with agile methodologies are essential to helping organizations minimize waste, providing a visual representation of validation projects that was not available before. ValidationMaster™ features a validation Kanban board that uses the stages in the software development lifecycle (SDLC) to represent the parallel stages in the validation process. The principle behind the validation Kanban is that it allows the validation process to be incremental: by limiting the number of validation tasks active at any one time, validation engineers may approach the overall project incrementally, which allows agile principles to be applied to the validation process. Only as tasks are completed are new tasks pulled into the cycle. Using this method, requirements, tasks, test cases, and outstanding incidents (bugs, issues) can be scheduled against the different iterations/sprints in a specific release, and the system calculates the available effort as requirements are added. The validation Kanban board allows validation teams to view Kanban requirement cards by iteration, status, or person for a given release or iteration. It also allows you to manage the number of backlog items at each stage of the validation process and to load-balance the members of the project team, saving critical time and money. This process helps manage testing and eliminate over-testing of validated systems.
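The pull mechanism described above, where new tasks enter the active stage only as completed work frees capacity, can be sketched as a small work-in-progress (WIP) limited queue. This is an illustrative model of the Kanban principle, not a representation of any particular tool; the task names and the WIP limit are assumptions.

```python
from collections import deque

class ValidationKanban:
    """Minimal Kanban pull model: tasks become active only when WIP capacity frees up."""
    def __init__(self, wip_limit: int):
        self.wip_limit = wip_limit
        self.backlog: deque[str] = deque()
        self.active: list[str] = []
        self.done: list[str] = []

    def add(self, task: str) -> None:
        """New work always enters the backlog first, then is pulled if capacity allows."""
        self.backlog.append(task)
        self._pull()

    def complete(self, task: str) -> None:
        """Finishing a task frees capacity and pulls the next backlog item in."""
        self.active.remove(task)
        self.done.append(task)
        self._pull()

    def _pull(self) -> None:
        # The pull rule: never exceed the WIP limit, and take tasks in backlog order.
        while self.backlog and len(self.active) < self.wip_limit:
            self.active.append(self.backlog.popleft())

# Hypothetical usage with a WIP limit of two concurrent validation tasks.
board = ValidationKanban(wip_limit=2)
for t in ["IQ protocol", "OQ script A", "OQ script B", "PQ script"]:
    board.add(t)
# Only two tasks are active; the rest wait in the backlog.
board.complete("IQ protocol")  # frees capacity, so "OQ script B" is pulled in
```

The WIP limit is what makes the process incremental: the team's attention stays on a small number of in-flight validation tasks, and the backlog ordering makes the next priority explicit.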
A common area of waste during validation is over-customization of the software application or business system to be validated. Often companies, in their pursuit of implementing and validating systems that meet their needs, overlook basic features inherent in the out-of-the-box software and add customizations that are not necessary and may add considerable cost to the validation effort. As a general rule of thumb, the more customizations you make, the more validation due diligence is required, and more validation means more expense for the project. In my 30 years of practice, I have seen over-customization be a great source of waste during the validation process. Customizations that merely mimic basic features already present, or nearly present, in the out-of-the-box COTS solution are wasteful and should be avoided.
One way to avoid over-customization is to understand business requirements up front and take time to review COTS systems thoroughly before deciding to customize. There are many instances where customization adds value; the converse is also true. Here, planning is essential. When conducting a requirements review, it is important to understand what is mandatory and what is merely nice to have. It is also critical to understand the limitations as well as the extensibility of the solutions selected for your organization. Minimize customizations to avoid waste, and work with vendors to ensure that the solution will meet the needs of the business over time. This will eliminate a great source of waste and streamline the validation process, making it easier to maintain the validated state.
Poor Defect Management
An essential requirement for any validation project is to manage the defects or incidents that may occur during validation test preparation and execution. Defect tracking and management is one of the main areas of waste in the validation process. For the past 30 years, many validation engineers have used forms, either attached to or separate from specific validation test scripts, to manage defects. Some organizations have established systems specifically to manage defects, incidents, corrective actions, and preventive actions. Wasteful activities associated with defect management include tracking and managing the many defect forms that may be associated with validation test scripts; copying defect information from one system onto a form, or into yet another system designed specifically to manage defects; and tracking and managing defects across the various releases of software, during regression testing, and elsewhere. The essential requirement for defect management is that defects be associated with the specific validation test(s) conducted.
One way to eliminate waste in this area is to use an automated incident management/defect tracking system such as the one included in ValidationMaster™, which allows you to create, edit, assign, track, manage, and close incidents (i.e., bugs, issues, risks, defects, enhancements) raised during test script preparation and execution. Incidents may be categorized as bugs, enhancements, issues, training items, limitations, change requests, or risks, and each type has its own specific workflow and business rules. Incidents can be logged in the system either through the web interface or by sending an email to a specific email address, avoiding duplication of information between systems. As different users collaborate on the identification, verification, and resolution of a particular incident, their comments and resolutions are tracked as a discussion so that the case history is always available. Unlike a standalone defect management tool, incidents/defects may be traced back to the test case and the underlying requirement that generated them, giving the validation engineer unprecedented ability to analyze the "in-process" quality of a system during its development lifecycle. Using automated technologies such as this reduces the overall cost of validation, eliminates waste, and helps ensure greater compliance.
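The key ideas in this kind of system, per-type workflows, a preserved discussion thread, and traceability back to the test case and requirement, can be sketched in a few lines. This is a hypothetical model for illustration only; the workflow states, incident types, and IDs are assumptions, not ValidationMaster™'s actual schema.

```python
from dataclasses import dataclass, field

# Hypothetical per-type workflows: each incident type has its own ordered states,
# and an incident may only advance one state at a time.
WORKFLOWS = {
    "bug":         ["open", "verified", "fixed", "retested", "closed"],
    "enhancement": ["open", "approved", "implemented", "closed"],
    "risk":        ["open", "assessed", "mitigated", "closed"],
}

@dataclass
class Incident:
    incident_id: str
    kind: str                 # "bug", "enhancement", "risk", ...
    summary: str
    test_case_id: str         # traceability back to the test case...
    requirement_id: str       # ...and to the requirement that generated it
    state: str = "open"
    discussion: list = field(default_factory=list)

    def comment(self, user: str, text: str) -> None:
        """Collaborator comments accumulate, so the case history is always available."""
        self.discussion.append((user, text))

    def advance(self, new_state: str) -> None:
        """Enforce the type-specific workflow: no skipping or reversing states."""
        flow = WORKFLOWS[self.kind]
        if flow.index(new_state) != flow.index(self.state) + 1:
            raise ValueError(f"{self.kind} cannot go {self.state} -> {new_state}")
        self.state = new_state

# Hypothetical usage: a bug raised during test execution, traced to its origin.
inc = Incident("INC-042", "bug", "Audit trail timestamp off by one hour",
               test_case_id="TS-014", requirement_id="URS-001")
inc.comment("jdoe", "Reproduced on build 3.2.1")
inc.advance("verified")
```

Because the incident carries its `test_case_id` and `requirement_id`, a defect report can be rolled up by requirement, which is what enables the "in-process" quality analysis described above.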
SUMMARY AND CONCLUSION
Today’s system environments are rapidly changing and are more vulnerable than ever. To keep pace with these changes, validation engineers must rethink and change the way their systems are validated. Manual processes may readily be converted to green, automated processes, saving time and money. Validation is often viewed as a necessary evil, and many companies lack the will to spend the time and effort to change. Cybersecurity threats should give validation engineers pause and cause them to reconsider antiquated processes that do not address today’s vulnerabilities. It is possible with current innovations to automate and streamline validation processes while moving to minimize or eliminate costly paper-based processes. In today’s cost-conscious environment, can you afford not to?