# Validation Scripts - Python

The **Validation Script** tab helps you define, manage, and test the data validation logic for your project. It is divided into two key sections: **Checks** and **Records**.

## Checks

The **Checks** page displays all the validation checks generated by the AI, grouped by question.

* Each check is shown in a **human-readable format**, so you can easily understand what the validation is doing.
* You can click **Edit with AI** to modify the check if it doesn’t meet your expectations or needs tweaking.
* Clicking **View Code** reveals the actual **Python script** being executed under the hood for that validation check.
* Use **Run** to execute the check for that specific question — you’ll get real-time feedback on whether the data passes or fails the check.
* You can also **Add Check(s)** manually for any custom validation not already captured.
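As an illustration of the kind of Python logic **View Code** reveals, a generated check might resemble the following range check. The function name, field names, and record format here are hypothetical, not Metaforms' actual internals:

```python
# Hypothetical sketch of a validation check's underlying Python logic.
# The actual signature and record format Metaforms uses may differ.

def check_age_in_range(record, min_age=18, max_age=99):
    """Pass if the respondent's age answer falls within the allowed range."""
    age = record.get("Q1_age")
    return age is not None and min_age <= age <= max_age

records = [
    {"id": 1, "Q1_age": 34},
    {"id": 2, "Q1_age": 17},    # fails: below minimum
    {"id": 3, "Q1_age": None},  # fails: missing answer
]

# Collect the ids of records that fail the check
failed = [r["id"] for r in records if not check_age_in_range(r)]
print(failed)  # [2, 3]
```

Running such a check per question is what produces the pass/fail feedback described above.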

<figure><img src="/files/2I3rddQy0oLm48AgZ2LM" alt=""><figcaption></figcaption></figure>

If a check fails for some records, Metaforms will display a note like:

> “**X records failed this check**”\
> Clicking this takes you directly to the **Records** tab, filtered to show **only the failed records** for that specific check.

<figure><img src="/files/Yfwz3bc1UL7w1B3S4MIn" alt=""><figcaption></figcaption></figure>

If a check has a **syntax error**, you'll see a warning, and you can click **"Fix with AI"** to have the AI automatically attempt to correct the code.

<figure><img src="/files/rEZqW9d6CpUkoDhNZ3pB" alt=""><figcaption></figcaption></figure>

## Records

The **Records** section is where you can review which data rows failed validation.

* To use this tab, you'll need to **upload your `.SAV` file by clicking "Upload File"**.
* Once the checks are run, you'll see a filtered list of **failed records**, making it easy to identify and act on data issues before proceeding.
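Conceptually, the filtered view corresponds to running every check over every record and keeping only the failures. A minimal sketch, with hypothetical check functions and record fields (not Metaforms' internals):

```python
# Sketch: map each check name to the ids of records that fail it.
# Check names and record fields below are illustrative assumptions.

def check_not_empty(record):
    """Fail if the open-text answer is missing or blank."""
    return bool(record.get("Q2_comment"))

def check_valid_rating(record):
    """Fail if the rating is outside the 1-5 scale."""
    return record.get("Q3_rating") in {1, 2, 3, 4, 5}

checks = {
    "Q2 not empty": check_not_empty,
    "Q3 rating valid": check_valid_rating,
}

records = [
    {"id": 1, "Q2_comment": "Great", "Q3_rating": 5},
    {"id": 2, "Q2_comment": "",      "Q3_rating": 3},  # fails Q2 check
    {"id": 3, "Q2_comment": "OK",    "Q3_rating": 7},  # fails Q3 check
]

failures = {
    name: [r["id"] for r in records if not check(r)]
    for name, check in checks.items()
}
print(failures)  # {'Q2 not empty': [2], 'Q3 rating valid': [3]}
```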


---

# Agent Instructions: Querying This Documentation

If you need additional information that is not directly available in this page, you can query the documentation dynamically by asking a question.

Perform an HTTP GET request on the current page URL with the `ask` query parameter:

```
GET https://help.metaforms.ai/data-processing/build-page-walkthrough/validation-scripts/validation-scripts-python.md?ask=<question>
```

The question should be specific, self-contained, and written in natural language.
The response will contain a direct answer to the question and relevant excerpts and sources from the documentation.

Use this mechanism when the answer is not explicitly present in the current page, you need clarification or additional context, or you want to retrieve related documentation sections.
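From Python, such a query URL can be built with the standard library. This is only a sketch of constructing the request URL documented above; the `ask` parameter must be URL-encoded, and an actual GET (e.g. via `urllib.request`, not shown) would return the answer:

```python
from urllib.parse import quote

# The documentation endpoint from this page
BASE = ("https://help.metaforms.ai/data-processing/build-page-walkthrough/"
        "validation-scripts/validation-scripts-python.md")

def build_ask_url(question: str) -> str:
    """URL-encode the question and append it as the `ask` query parameter."""
    return f"{BASE}?ask={quote(question)}"

url = build_ask_url("How do I add a custom validation check?")
print(url)
# ...validation-scripts-python.md?ask=How%20do%20I%20add%20a%20custom%20validation%20check%3F
```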
