Vexdata.io Documentation

Executing Test Cases

Steps to execute a Test Case


Last updated 1 year ago

Different ways of executing a test case

This page outlines the three methods available for executing test cases within the system. Each method suits a different scenario.

1. Executing a New Test Case

To run a new test case immediately upon creation:

  • Navigate to the configuration step of creating a new Test Case.

  • Enter the name of the Test Case and click the Save button.

  • After saving, click the Save & Run button to execute the Test Case.

  • The execution results will be automatically added to the Test Runs tab.

Clicking Save alone (without Save & Run) stores the test case without executing it; the saved test case then appears on the "Test Case" page.

2. Batch Execution of Test Cases

For executing multiple test cases in a batch:

  • Go to the Test Case tab.

  • Select the desired Test Cases.

  • Click on the Execute Batch button to open a popup window.

  • In the popup window:

    • Batch Name: Provide a mandatory name for the batch.

    • Change Execution Order: Drag and rearrange the test cases as needed. The test case at the top of the list will execute first.

    • Scheduling:

      • By default, the batch is set to execute immediately under the option One-time (now).

      • Optionally, set the batch to execute periodically by selecting Periodically and entering the required cron expression.
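As an illustration, standard cron expressions use five fields (minute, hour, day of month, month, day of week); confirm the exact syntax the scheduler accepts in the Vexdata UI. A few common patterns:

```
# ┌ minute (0-59)
# │ ┌ hour (0-23)
# │ │ ┌ day of month (1-31)
# │ │ │ ┌ month (1-12)
# │ │ │ │ ┌ day of week (0-6, Sunday = 0)
# │ │ │ │ │
0 2 * * *      # every day at 02:00
0 */6 * * *    # every 6 hours
30 8 * * 1-5   # weekdays at 08:30
```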

3. Re-executing Previous Batches

To rerun a batch from prior executions:

  • Navigate to the Test Runs screen to view all previous batch executions.

  • Locate the batch you wish to run and click the Run Batch button to execute it immediately.

By following these steps, users can efficiently manage and execute their test cases as individual runs or in batches, with options for immediate or scheduled execution.