Traditionally, after building a study, builders have had to enter test data by hand, a very time-consuming process. Auto-validation is a powerful feature unique to Clinical Studio that allows study builders to enter massive amounts of test data and have the system automatically run that data against the edit checks (validation conditional actions) built on the form being validated. It then records where the edit checks fired and where they did not. The system keeps a history of these validations, also referred to as test scripts, for future reference.
The Auto-validation tool is accessed in the Form Builder.
Validating a CRF is an automated three-step process culminating in a thoroughly detailed report, complete with a validation status for each field saved to the database. Once a CRF has been validated, a validated tag can be attached to it. This tag stays with the CRF as long as the CRF is not changed. The user can export a validated form to the Form Library and import it into other studies. Once a form is changed, the validated tag is cleared and the form must be re-validated. Clinical Studio clearly identifies which forms are validated and which are not in both the Form Builder and the Import Form Library.
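One plausible way to model the "tag is cleared when the form changes" rule is a checksum stored with the tag: the tag remains valid only while the form definition still produces the same checksum. This is only an illustrative sketch under that assumption; all names are hypothetical, not Clinical Studio's actual implementation.

```python
import hashlib
import json


def form_checksum(form_definition: dict) -> str:
    """Checksum of a form definition; any change produces a new value."""
    canonical = json.dumps(form_definition, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()


class ValidatedTag:
    """Tracks whether a form is still in its validated state."""

    def __init__(self, form_definition: dict):
        self.checksum = form_checksum(form_definition)

    def is_valid_for(self, form_definition: dict) -> bool:
        # The tag only holds while the form is unchanged.
        return form_checksum(form_definition) == self.checksum


form = {"fields": ["age", "weight"], "version": 1}
tag = ValidatedTag(form)
assert tag.is_valid_for(form)      # unchanged: still validated
form["fields"].append("height")    # any edit...
assert not tag.is_valid_for(form)  # ...clears the validated status
```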
Prior to running validation data through a CRF for testing purposes, test data must be created. Test data can be automatically created by Clinical Studio. To begin the process of CRF validation by creating Test Data Sets, select the "Form Builder..." link from the Study menu. When the Form Builder application is open, the user will see a link named "Auto-Validate Subject CRF" in the right column. The figures below show how to open the "Manage Test Data" page.
When the "Auto-Validate Subject CRF" link is selected, the Manage Test Data application opens as shown in the figure below.
Follow these steps to create test data and prepare to run the test data through the form:
1. Select the Form and Version to be tested in the Change Form section. Once the form and protocol version are selected, click the Go button shown highlighted in the figure below. The Go button will load the form field names into data table(s) located above the form.
2. In the Add Data section, select the number of test records to generate and the starting date for the test data. Clicking the "Go" button generates the selected number of test records. The figure below shows the parameters that must be completed to generate test data. If there are any Sub Forms, test data will be generated for each of them as well. Enter the information in these fields to generate the test data.
After clicking the “Go” button, the test data loads into the Test Data – Main Form data table as shown below. If there are any Sub Forms, the test data will also populate in a table below the Test Data – Main Form data table.
3. Review the Test Data Set in the Test Data table for the Main Form (and for any Sub Forms, if present). The user can manually alter the test data to test a specific case. To edit test data manually, click the Record ID in the first column of the data table; this opens the test data in the form, where the user can modify and save it. If there are any Sub Forms, the user can review their generated test data in the corresponding data tables as shown below. The generated test data contains the desired combinations of values entered into the fields, including cases where fields are left blank.
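To make the idea of generated combinations concrete, here is a minimal sketch of a generator that mixes fully populated records with records where one field is left blank. The field names, value ranges, and blanking pattern are invented for illustration; Clinical Studio's actual generator is not documented here.

```python
import itertools
import random
from datetime import date, timedelta

# Hypothetical field definitions for the form under test.
FIELDS = {
    "weight_kg": lambda: random.randint(40, 150),
    "visit_date": lambda: date(2024, 1, 1) + timedelta(days=random.randint(0, 30)),
    "smoker": lambda: random.choice(["Yes", "No"]),
}


def generate_test_records(n: int):
    """Generate n records, cycling through which field is left blank."""
    # First pattern leaves nothing blank, then one pattern per field.
    blank_patterns = itertools.cycle([set()] + [{name} for name in FIELDS])
    records = []
    for record_id, blanks in zip(range(1, n + 1), blank_patterns):
        row = {"record_id": record_id}
        for name, make_value in FIELDS.items():
            row[name] = None if name in blanks else make_value()
        records.append(row)
    return records


for row in generate_test_records(4):
    print(row)
```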
Once the test data is generated, the user can run the Test Data Set through the form by clicking the link in the Run Validation box in the right column, as shown in the figure below.
When the “Run Validation” link on the Manage Test Data page is selected, the CRF is displayed in "Auto-Validate Mode" as shown in the figure below. To run the validation, click the "Auto-Validate This Form" button. When the validation is complete, a message stating that the validation was successful is displayed.
The following steps take place during the running of the validation for each record:
1. For each field, the data is read from the test data set and the corresponding field control is populated with the value.
2. Once the entire record has been read, the Save button is clicked. In this case the form is saved in Insert Mode.
3. When the form is saved, the exact same process occurs for the auto validation as would occur for the user manually entering data.
4. The system goes through the edit check process. Each time an edit check fires an error, the system logs that information.
5. Once the form has been saved and checked for errors, the system clicks the Save button again. At this stage the sub forms are populated with values and go through the exact same process as the main form.
6. This second save forces the database to perform an update, which is required for true CRF validation.
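The six steps above can be sketched as a loop over the test records. Everything here is a stand-in: the `StubForm` class, its single required-field edit check, and the log tuple layout are hypothetical, not Clinical Studio's API.

```python
class StubForm:
    """Minimal stand-in for a CRF; all behavior here is illustrative."""

    def __init__(self, required):
        self.required = required
        self.values = {}
        self.saves = []

    def set_field(self, field, value):
        self.values[field] = value

    def save(self, mode):
        self.saves.append(mode)
        # A single illustrative edit check: required fields must not be blank.
        return [(f, "Required field is blank")
                for f in self.required if self.values.get(f) is None]


def auto_validate(form, test_records, log):
    """The six per-record steps above, as a loop."""
    for record in test_records:
        for field, value in record.items():          # step 1: populate each field control
            form.set_field(field, value)
        for field, msg in form.save(mode="insert"):  # steps 2-4: first save fires edit checks
            log.append(("Error", field, msg))        # each fired check is logged
        form.save(mode="update")                     # steps 5-6: second save forces a DB update


log = []
form = StubForm(required=["age"])
auto_validate(form, [{"age": 42}, {"age": None}], log)
print(log)  # one Error entry, for the record where "age" was left blank
```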
Every step of the process is posted to the validation log table that is stored in the database for each form. When the process is done, the final step is to review the validation log and declare the form validated.
The user can initiate viewing of the validation results from this page using the "View Validation Results" link located in the right column (as shown in the figure below) or from the "View Validation Results" link located in the right column of the Manage Test Data page.
After the "View Validation Results" link is clicked, the Validation Result Sets page is displayed. This page allows the user to view the Validation Log, declare the form Validated, and undo a form's validated status. The user must understand that each form, for each version, has a form tag that stores whether or not that form is in a validated state. If the form is in a validated state, the date and the user who validated the form are also stored. The Validation Results page is shown in the figure below.
Reviewing the Validation Results
The Result Sets data table contains every item logged during the form validation process including any sub form data that may exist. Below the Result Sets data table is a form that allows the user to filter the results.
There are six columns in the Result Sets data table, which are explained below.
The Record column displays the current record and allows the user to drill into the form and see it in its completed state. All edit check messages are displayed just as if the data had been entered into the form manually.
The Message column describes exactly what the auto validation process was doing when the event was logged for the current record.
The Type column assigns a log type to each event. This allows the user to search the results by type. For example, if the user wants to see all errors for a given field in a given record, they can select “Error” in the Message Type field in the Filter By form and then select the desired Record and Field. Each log Type is described below:
This type of message indicates normal status during the validation process, alerting the user to what the system is doing at any given time. This is the most common type of event.
This type of message indicates the field in question has passed validation: the values stored in the database are exactly the same as the values in the test script.
This type of message indicates the field did not pass validation: the data in the test script differs from the data in the database. The user will rarely see this type, as it is logged only when a field fails to validate as described above.
This type of message indicates an edit check fired and produced an error. These entries are helpful for confirming that edit checks and conditional actions fired properly and produced the desired result.
The Field column displays the field for which the event was logged. The user can filter by any given field and see all the events for that field using the Filter By form.
The Row column displays the row in the sub form for which the event was logged, if the form contains Sub Forms.
The User column displays the name of the user who ran the validation check.
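The Filter By behavior described above amounts to a simple filter over log rows by any combination of Type, Record, and Field. In this sketch the dictionary keys mirror the six columns, but the type labels other than "Error" ("Status", "Pass") and all sample values are assumptions, not Clinical Studio's actual labels.

```python
def filter_results(results, log_type=None, record=None, field=None):
    """Return the log rows matching every criterion that was supplied."""
    return [r for r in results
            if (log_type is None or r["type"] == log_type)
            and (record is None or r["record"] == record)
            and (field is None or r["field"] == field)]


# Hypothetical log entries mirroring the six Result Sets columns.
results = [
    {"record": 1, "message": "Saving form", "type": "Status",
     "field": None, "row": None, "user": "jsmith"},
    {"record": 1, "message": "Value matches test script", "type": "Pass",
     "field": "weight_kg", "row": None, "user": "jsmith"},
    {"record": 2, "message": "Range check fired", "type": "Error",
     "field": "weight_kg", "row": None, "user": "jsmith"},
]

# All errors for a given field, as in the example in the Type column text.
errors = filter_results(results, log_type="Error", field="weight_kg")
print(len(errors))  # 1
```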
Saving Validation Results
Validation results can be saved outside the system by exporting them to a spreadsheet: click the "Export to Excel" link located in the data table header.
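The built-in "Export to Excel" link handles this for you. For results held outside the system, an equivalent CSV export (a format Excel opens directly) might look like the following sketch; the column names and sample row are hypothetical.

```python
import csv


def export_results(results, path):
    """Write validation results to a CSV file that Excel can open."""
    columns = ["record", "message", "type", "field", "row", "user"]
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=columns)
        writer.writeheader()   # header row with the six column names
        writer.writerows(results)  # None values are written as empty cells


export_results(
    [{"record": 1, "message": "Form saved", "type": "Status",
      "field": None, "row": None, "user": "jsmith"}],
    "validation_results.csv")
```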
Once the user has reviewed the results and determined the form is validated, the user can declare the form "validated" by attaching a validated tag to it. The validated tag is displayed in both the Form Builder and the Form Library. The first figure below shows the window that allows the user to attach a validated tag to a form: selecting the "This Form Has Been Validated" checkbox and then clicking the "Attach Form Validated Tag" button places the tag on the form. After the button is clicked, note the changes in the Form Validation Status window in the second figure below. The second figure also shows the "This Form Has NOT Been Validated" checkbox selected and the "Remove Form Validated Tag" button, which removes a validated tag from a form.
Once the validated tag has been attached, the form will appear as "Validated" in both the Form Builder and the Form Library. In the Form Builder table, the version number appears in green text for validated forms and red text for non-validated forms, as shown in the figure below. In addition, the green text “Validated” appears in the Version Selection Window. If the form is not validated, the text is displayed in red and reads “Not Validated”.
The figure below shows the Form Library page, which is displayed when the Import Form link is selected from the Links box on the Form Builder page. The validation status for each form in the Form Library is displayed in the Validated column of the Existing Forms data table, as shown in the figure below.