2025.3 Releases
This documentation outlines all releases for Q3 2025.
Release 7.1
Build Number: 28
Release Date: 07/17/2025
Data Testing
Connectors
We have added support for the following connectors and authentication types for use in rules.
| Connector | Authentication Type |
|---|---|
| REST API | API Key, Basic, Bearer, OAuth & No Auth |
| SAP ECC* | Username & Password |
| Excel (Azure Blob Storage)* | Azure Service Principal |
| Flat File SQL (Azure Blob Storage)* | Azure Service Principal |
| Flat File Native (Azure Blob Storage)* | Azure Service Principal |
We have added crawling support for Oracle, Teradata, SAP ECC*, Snowflake, SQL Server, Redshift, Google BigQuery, and Databricks*. Crawling provides users with faster responses when creating rules and generating SQL using LLMs.
Note: Everything marked with * is in beta.
API Testing
With the release of the API Connector, users can now test data from their GET and POST APIs in iceDQ. To support this, we have added two new Rule Template types, API Validation and API Recon. Both templates use an API Connector as the source.
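Conceptually, an API Validation rule pulls records from an API response and evaluates checks against each one. The sketch below illustrates that idea only; it is not iceDQ's internal implementation, and the sample payload and check are illustrative assumptions.

```python
import json


def run_api_validation(payload: str, checks: dict) -> list:
    """Return (check_name, record) pairs for every record that fails a check,
    mimicking a validation rule whose source is an API Connector."""
    records = json.loads(payload)
    failures = []
    for rec in records:
        for name, predicate in checks.items():
            if not predicate(rec):
                failures.append((name, rec))
    return failures


# Example: validate that every order amount in a GET response is non-negative.
sample = '[{"order_id": 1, "amount": 25.0}, {"order_id": 2, "amount": -3.5}]'
checks = {"amount_non_negative": lambda r: r["amount"] >= 0}
failures = run_api_validation(sample, checks)
# The second record fails the check and is reported as an exception.
```

An API Recon rule extends the same idea by comparing the API records against a second source (for example, a database table) instead of a standalone predicate.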
Export Import
Users can now export and import Rules and Workflows across data testing workspaces, within the same iceDQ instance or across instances. We have released the following APIs.
| Name | Method |
|---|---|
| Export Rules | POST /api/v1/exports/rules |
| Export Workflows | POST /api/v1/exports/workflows |
| Status of Export task | GET /api/v1/exports/&lt;task-instance-id&gt;/status |
| Logs of Export task | GET /api/v1/exports/&lt;task-instance-id&gt;/log |
| Download Exported File | GET /api/v1/exports/&lt;task-instance-id&gt;/download |
| Terminate Export task | POST /api/v1/exports/&lt;task-instance-id&gt;:terminate |
| Import Rules | POST /api/v1/imports/rules |
| Import Workflows | POST /api/v1/imports/workflows |
| Status of Import task | GET /api/v1/imports/&lt;task-instance-id&gt;/status |
| Logs of Import task | GET /api/v1/imports/&lt;task-instance-id&gt;/log |
| Terminate Import task | POST /api/v1/imports/&lt;task-instance-id&gt;:terminate |
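A typical export flow submits a task, polls its status endpoint, and then downloads the result. The sketch below builds the endpoint paths listed above and shows the polling loop; the status values ("PENDING", "RUNNING"), the HTTP client, and authentication details are assumptions, so the actual request is left to an injected callable.

```python
import time
from typing import Callable

BASE = "/api/v1"


def export_rules_path() -> str:
    return f"{BASE}/exports/rules"  # POST: submit an export task


def export_status_path(task_id: str) -> str:
    return f"{BASE}/exports/{task_id}/status"  # GET: task status


def export_download_path(task_id: str) -> str:
    return f"{BASE}/exports/{task_id}/download"  # GET: exported file


def terminate_export_path(task_id: str) -> str:
    return f"{BASE}/exports/{task_id}:terminate"  # POST: cancel the task


def wait_for_export(task_id: str,
                    get_status: Callable[[str], str],
                    poll_seconds: float = 5.0,
                    timeout_seconds: float = 300.0) -> str:
    """Poll the status endpoint until the task leaves a running state.

    get_status performs the authenticated GET against your iceDQ instance
    and returns the task status string (status values here are assumed).
    """
    deadline = time.monotonic() + timeout_seconds
    while time.monotonic() < deadline:
        status = get_status(export_status_path(task_id))
        if status not in ("PENDING", "RUNNING"):
            return status
        time.sleep(poll_seconds)
    raise TimeoutError(f"export task {task_id} did not finish in time")
```

The import endpoints follow the same shape; swap `exports` for `imports` in the paths (imports have no download endpoint).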
BI Report Testing
Users can now connect and test their Qlik Cloud Dashboards and Reports from iceDQ.
Key Features
- Auth Type: Users can connect to Qlik Cloud using Qlik App OAuth.
- Flexible Filtering: Users can filter data using Bookmarks or Filter Pane.
Improvements
- Users can now create checks in bulk by uploading an Excel file and a JSON schema file, enabling faster and more efficient setup of multiple checks.
- Added support for encrypted private keys in Snowflake Key Pair authentication.
- Added support for Programmatic Access Tokens in Snowflake Username and Password authentication.
- Made the Warehouse field mandatory when creating or updating Snowflake connections.
- Added support for .dat file extensions in Flat File (SQL) connectors during rule creation and file uploads.
- Enhanced the Oracle connector to display timestamp values in exception reports and preview data for `DATE` datatype columns using the `useDateAsTimestamp` property.
Bug Fixes
- Fixed an issue where the provided secret was not visible at the connection level.
- Fixed an issue with publishing Script Rules using External Libraries.
- Improved performance for Flat File (SQL) connection testing when the S3 bucket size is large.
- Added Account Name display after the workspace name in the workspace selection drop-down.
Note: More than 100 other bugs have also been resolved in this release.
Vulnerability Fixes
- 168 vulnerabilities have been resolved across multiple components, resulting in improved application security.
- The fixes span a wide range of libraries and frameworks, including but not limited to: commons-beanutils, glibc, netty, OpenJDK, libxml2, fileupload, parquet, tomcat, zookeeper, keycloak, postgresql, and spring.
- 4 vulnerabilities remain open, and 60 newly discovered vulnerabilities have been identified.
| Severity | Resolved | Not Resolved |
|---|---|---|
| High | 53 | 2 |
| Medium | 57 | 2 |
| Low | 58 | 0 |
Known Issues
- Some connectors display certain data types inconsistently with and without preview when the Use Metadata Cache option is enabled.
- Crawl jobs fail to detect database, schema, or table names if filters are wrapped in double quotes, even when the filters are valid.
- No error message is shown when a rule's check name begins with a numeric character.
- Users with group-level access are unable to view dashboard access or workspace lists.