Validation Scripts - Python

The Validation Script tab helps you define, manage, and test the data validation logic for your project. It is divided into two key sections: Checks and Records.

Checks

The Checks page displays all the validation checks generated by the AI, grouped by question.

  • Each check is shown in a human-readable format, so you can easily understand what the validation is doing.

  • You can click Edit with AI to modify the check if it doesn’t meet your expectations or needs tweaking.

  • Clicking View Code reveals the actual Python script being executed under the hood for that validation check; a hypothetical sketch of what such a script might look like appears after this list.

  • Use Run to execute the check for that specific question — you’ll get real-time feedback on whether the data passes or fails the check.

  • You can also Add Check(s) manually for any custom validation not already captured.
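
For reference, the sketch below shows roughly what the Python behind a single check could look like. It is a minimal, hypothetical example rather than Metaforms' actual API: it assumes the survey data is available as a pandas DataFrame, and the function name, column name (Q5_AGE), and return convention (returning the failing rows) are illustrative only.

```python
import pandas as pd

# Hypothetical example of a generated validation check.
# Assumptions (not Metaforms' actual API): the survey data arrives as a
# pandas DataFrame and a check returns the rows that fail it.
def check_q5_age_in_range(df: pd.DataFrame) -> pd.DataFrame:
    """Q5: respondent age must be between 18 and 99."""
    failed = df[(df["Q5_AGE"] < 18) | (df["Q5_AGE"] > 99)]
    return failed

if __name__ == "__main__":
    # Small in-memory sample standing in for uploaded survey data.
    sample = pd.DataFrame({"Q5_AGE": [25, 17, 42, 130]})
    failures = check_q5_age_in_range(sample)
    print(f"{len(failures)} records failed this check")
    print(failures)
```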

If a check fails for some records, Metaforms will display a note like:

"X records failed this check."

Clicking this note takes you directly to the Records tab, filtered to show only the failed records for that specific check.

If a check has a syntax error, you'll see a warning, and you can click "Fix with AI" to automatically attempt to correct the code.

Records

The Records section is where you can review which data rows failed validation.

  • To use this tab, you'll need to upload your .SAV file by clicking "Upload File".

  • Once the checks are run, you'll see a filtered list of the failed records, making it easy to identify and act on data issues before proceeding (a hypothetical sketch of this workflow follows).
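
For context, the sketch below shows one hypothetical way this workflow could be reproduced locally: read a .SAV file with the pyreadstat library, run each check against the resulting DataFrame, and collect the failing rows. The run_checks helper, the file name, and the use of pyreadstat are assumptions for illustration, not Metaforms' internal implementation.

```python
import pandas as pd
import pyreadstat  # assumption: used only to illustrate reading a .SAV file locally

# Hypothetical local reproduction of the Records workflow: load the
# .SAV file, run each check, and collect the records that fail.
def run_checks(sav_path: str, checks) -> pd.DataFrame:
    df, meta = pyreadstat.read_sav(sav_path)  # df is a pandas DataFrame
    failed_frames = []
    for check in checks:
        failed = check(df)
        if not failed.empty:
            print(f"{len(failed)} records failed {check.__name__}")
            failed_frames.append(failed)
    # De-duplicated view of every record that failed at least one check.
    if failed_frames:
        return pd.concat(failed_frames).drop_duplicates()
    return df.iloc[0:0]  # empty frame with the original columns

# Example usage (file name and check list are illustrative):
# failed_records = run_checks("survey_wave1.sav", [check_q5_age_in_range])
# print(failed_records)
```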
