How to Effectively Use Parameters
In this article, we will look at how to use parameters effectively in iceDQ. Parameters let you run the same query with different sets of values, adding flexibility and efficiency to data validation. We will learn how to create parameters and apply them to a rule so that its queries can be customized at run time.
Steps
The steps below are the ones followed in the video.
Creating Parameters
To begin, navigate to the parameter tab in iceDQ and create a new parameter. Name it "date", since the objective is to parameterize queries by date. Add two keys, "source transaction date" and "target transaction date", both with a default value of "2023-05-10".
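For reference, the parameter created above can be summarized as follows. The layout is only illustrative (it is not an iceDQ export format); the key names and default values are taken from the step above.

    Parameter: date
      source transaction date = 2023-05-10   (default)
      target transaction date = 2023-05-10   (default)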
Add Parameters in Reconciliation Rule
Next, create a new reconciliation rule and select the date parameter created in the previous step. Establish the source data connection, such as the Adventure Works database, schema, and transaction history table. Customize the source SQL query with a filter condition based on the parameter so that it fetches data only for the specified transaction date.
Similarly, set up the target data connection to the Adventure Works Data Warehouse, its schema, and the transaction detail table, and modify the target query with the same parameter-based filter condition.
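As a sketch, the source and target queries might look like the ones below. The column list, the data warehouse schema, and the ${...} placeholder syntax are assumptions used only for illustration; use the parameter reference format that the iceDQ query editor provides.

    -- Source query against the Adventure Works transaction history table,
    -- filtered on the "source transaction date" parameter key
    SELECT ProductID, TransactionDate, Quantity, ActualCost
    FROM Production.TransactionHistory
    WHERE CAST(TransactionDate AS DATE) = '${source transaction date}';

    -- Target query against the data warehouse transaction detail table,
    -- filtered on the "target transaction date" parameter key
    SELECT ProductID, TransactionDate, Quantity, ActualCost
    FROM dbo.TransactionDetail
    WHERE CAST(TransactionDate AS DATE) = '${target transaction date}';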
Defining Join Conditions and Checks
Set the join conditions using the column names, since they match in both the source and target tables. Pin the primary key and add the other columns as checks.
Publish & Run
Publish the rule and execute it. Review the results to verify that the source and target data match for the specified parameter values.
Parameter Flexibility
Parameters make it easy to rerun the same rule for different scenarios. For example, change the parameter values to a new date, such as 2023-05-11, and execute the rule again. Refresh the run to see the results for the updated date.
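Continuing the example, rerunning for the next day only requires updating the parameter values; the rule and its queries stay unchanged. As before, the layout below is illustrative only:

    Parameter: date
      source transaction date = 2023-05-11
      target transaction date = 2023-05-11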
Video: How to effectively use Parameters in iceDQ
Conclusion
By using parameters effectively in iceDQ, we can reuse the same SQL queries with different sets of values, validating data for any date without building separate rules. This approach streamlines data processing and helps ensure data quality across different business requirements.