In other words, as an example, we validate that edit checks function the way they are designed to, while you validate that a specific edit check built in your study works the way you need it to, ensuring there was no error in setting it up.
We can provide documentation for the validation of the software on a per-client basis. It's important to understand that, because users can configure their own databases, each specific study also needs to be validated separately at that level.
Administrators/Study Builders may want to map out validation of their own studies, and we provide tools to make that relatively simple. For example, you can download the data dictionaries of all the forms in your study, complete with the edit checks and logic that exist on them. These can be used to verify, line by line, that each check functions as intended - which is a matter of entering test data. That test data can later be produced in an audit as proof that the field was validated as functioning appropriately.
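To illustrate, the line-by-line approach can be sketched as a simple script that turns a downloaded data dictionary into a validation checklist. This is only a minimal sketch: the column names (`Form`, `Field`, `Edit Check`, `Logic`) and the sample rows are hypothetical placeholders, since the actual export format depends on your system.

```python
import csv
import io

# Hypothetical data-dictionary export; the real column names and
# content come from the dictionary you download for your study.
SAMPLE_DICTIONARY = """\
Form,Field,Edit Check,Logic
Demographics,Date of Birth,Required,Value must not be blank
Demographics,Age,Range,Value must be between 18 and 89
Vitals,Systolic BP,Range,Value must be between 60 and 250
"""

def build_checklist(dictionary_csv: str):
    """Turn a data-dictionary export into a line-by-line validation checklist."""
    rows = csv.DictReader(io.StringIO(dictionary_csv))
    return [
        {
            "form": row["Form"],
            "field": row["Field"],
            "check": row["Edit Check"],
            "expected": row["Logic"],
            "test_data_entered": False,   # flip to True once test data is entered
            "behaved_as_expected": None,  # record the observed result for the audit trail
        }
        for row in rows
    ]

checklist = build_checklist(SAMPLE_DICTIONARY)
for item in checklist:
    print(f"[ ] {item['form']} / {item['field']}: {item['check']} - {item['expected']}")
```

Each unchecked item corresponds to one round of test data entry; the recorded results become the evidence you would present in an audit.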