NOTE: Releases v899 through v901 were removed.
Tonic Data Pipeline V2 for PostgreSQL ends beta
During the first half of 2023, Tonic ran a beta program for the new Data Pipeline V2 for PostgreSQL. The beta program is now ending. Thank you to everyone who provided feedback.
Starting with version v905, Tonic.ai will progressively enable Data Pipeline V2 for all customers. To ensure a smooth transition for all of our PostgreSQL customers, Tonic.ai controls the rollout remotely.
The remote rollout mechanism is controlled by an HTTPS request from your instance to https://feature-flag.tonic.ai. The request sends a JSON payload that contains a unique identifier for your deployment, and the response returns the status of Data Pipeline V2. This request happens at the start of each data generation. If your Tonic server cannot reach https://feature-flag.tonic.ai, then the check is skipped.
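The check described above can be sketched as follows. This is a minimal illustration, not Tonic's implementation: the payload and response field names (`deploymentId`, `dataPipelineV2Enabled`) are assumptions; only the endpoint URL and the skip-on-unreachable behavior come from the release notes.

```python
import json
import urllib.request

FLAG_URL = "https://feature-flag.tonic.ai"

def check_data_pipeline_v2(deployment_id: str,
                           url: str = FLAG_URL,
                           timeout: float = 5.0) -> bool:
    """Return True if Data Pipeline V2 is reported as enabled.

    If the feature-flag endpoint is unreachable, the check is skipped,
    modeled here as returning False (no change in behavior).
    """
    # Hypothetical payload shape: a unique identifier for the deployment.
    payload = json.dumps({"deploymentId": deployment_id}).encode("utf-8")
    request = urllib.request.Request(
        url,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    try:
        with urllib.request.urlopen(request, timeout=timeout) as response:
            body = json.load(response)
            # Hypothetical response field for the V2 rollout status.
            return bool(body.get("dataPipelineV2Enabled", False))
    except OSError:
        # Endpoint unreachable: skip the check.
        return False
```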
What to expect during enrollment:
For jobs that run on V2:
We will continue to improve Data Pipeline V2 as we expand coverage to other data connectors.
Expanded Graph View for subsetting
The subsetting Graph View is expanded to use more of the available screen space. The Configure Subset panel, which includes the Options and Latest Results tabs, no longer displays on Graph View. It only displays on Table View.
Other updates
Fixed an issue where, when a data generation job failed, tables that used the Preserve Destination table mode were not restored.
The generated Tonic API documentation now includes the endpoints for managing file groups for file connector workspaces.
Fixed an issue that caused jobs for some workspaces to fail with the exception "Cannot modify workspace whose schema is not the latest version."
Fixed an issue where the job details view displayed incorrect information.
Updated the /api/DataSource endpoint so that it no longer returns sensitive data such as the API key.
Updated our OpenAPI documents to ensure that all values of operationId are unique.
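As context for the fix above, duplicate `operationId` values violate the OpenAPI 3.x requirement that each operation identifier be unique across the document. A quick way to check a parsed OpenAPI (JSON) document for duplicates, as a sketch using only the Python standard library:

```python
from collections import Counter

# HTTP methods that can appear as operations under an OpenAPI path item.
HTTP_METHODS = {"get", "put", "post", "delete",
                "options", "head", "patch", "trace"}

def duplicate_operation_ids(spec: dict) -> list[str]:
    """Return the operationIds that appear more than once in the spec."""
    ids = [
        operation["operationId"]
        for path_item in spec.get("paths", {}).values()
        for method, operation in path_item.items()
        if method in HTTP_METHODS and "operationId" in operation
    ]
    return sorted(op_id for op_id, count in Counter(ids).items() if count > 1)
```

For example, a spec where two operations share the id `listItems` would be flagged, while a spec with all-unique ids returns an empty list.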
Improved error messages for failed data generation.
The Databricks Test Connection button on the workspace details view now works correctly.