Why Isn’t My Data Pipeline Syncing?
Sometimes you might find that your data pipeline doesn’t appear to be syncing data. There are a few reasons why this might happen, and this guide walks you through the common problems that can affect your pipelines and the actions you can take to fix them right away.
Initial Setup Authentication Errors
When you connect a destination database with Blendo, you need to provide the credentials Blendo uses to sync data into your data warehouse. If the initial credentials are wrong, we will do our best to show you a descriptive message so you know where the problem is. Common errors are:
- Wrong username or password supplied
- Wrong host address or port number supplied
- Wrong database name
- Network connection errors, firewall rules, or missing whitelist entries for the necessary IP addresses
Other Authentication Data Source Errors
If you have successfully authorized Blendo but receive a notification that “your pipeline from {your source} to {your destination} database has failed at {time of pipeline execution} due to an authentication error…”, it means that Blendo’s permissions to that service are no longer valid.
Reasons for this happening are:
- The credentials you originally used to authorize Blendo have changed or are no longer valid. For example, a username or password changed, or a security rule is no longer active.
- Firewall rules are no longer valid.
- The database you are trying to send data to is experiencing downtime.
- Blendo is experiencing an internal issue.
Before contacting support, here are some workarounds you can try.
Verify your Credentials
Please verify that you used the correct credentials for your data source. Things to check are:
Username / Passwords
To check this, make sure you have the correct username and password for your instance. You may find help in the following sections for Amazon Redshift, Google BigQuery (OAuth), PostgreSQL, Microsoft SQL Server, Snowflake, and Panoply.
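If your destination is PostgreSQL-compatible (for example Amazon Redshift or PostgreSQL itself), a quick way to confirm the username and password outside of Blendo is to attempt a connection yourself. The sketch below assumes the psycopg2 driver and uses placeholder connection details; substitute your own host, port, database, and credentials.

```python
# Minimal credential check for a PostgreSQL-compatible warehouse
# (PostgreSQL, Amazon Redshift). All connection values are placeholders.
import psycopg2

try:
    conn = psycopg2.connect(
        host="your-cluster.example.com",  # placeholder host
        port=5439,                        # 5439 for Redshift, 5432 for PostgreSQL
        dbname="analytics",               # placeholder database name
        user="blendo_user",               # the user you gave to Blendo
        password="your-password",
        connect_timeout=10,
    )
    print("Credentials OK: connection established.")
    conn.close()
except psycopg2.OperationalError as err:
    # Wrong username/password, unknown database, and network errors
    # all surface here with a descriptive message.
    print(f"Connection failed: {err}")
```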
Host address or Port number
To check this, make sure you have the correct host address and port number for your instance. You may find help in the following sections for Amazon Redshift, PostgreSQL, Microsoft SQL Server, Snowflake, and Panoply.
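It can also help to confirm that the host and port are reachable at all from your network before digging into credentials. This is a small sketch using Python’s standard library; the host and port shown are placeholders.

```python
# Check that the warehouse host and port are reachable from your network.
# Host and port are placeholders; use the values from your destination's
# settings (e.g. 5439 for Redshift, 5432 for PostgreSQL, 1433 for SQL Server).
import socket

host = "your-cluster.example.com"
port = 5439

try:
    with socket.create_connection((host, port), timeout=10):
        print(f"{host}:{port} is reachable.")
except OSError as err:
    # Typical causes: wrong host/port, firewall rules, or an IP whitelist
    # that does not include the machine running this check.
    print(f"Could not reach {host}:{port}: {err}")
```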
Permissions Settings
Some integrations may require certain user roles or permissions to access data. For example, see the instructions on BigQuery Roles. You may also find help in the sections for Amazon Redshift, PostgreSQL, Microsoft SQL Server, Snowflake, and Panoply.
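For PostgreSQL-compatible destinations you can also ask the warehouse directly whether the user Blendo connects with has the privileges it needs, for example permission to create tables in the target schema. The snippet below is a sketch for PostgreSQL or Amazon Redshift; the user, schema, and connection details are placeholders.

```python
# Check whether the user Blendo connects with can use and create tables
# in the target schema (PostgreSQL / Amazon Redshift). All names and
# connection details are placeholders.
import psycopg2

conn = psycopg2.connect(
    host="your-cluster.example.com",
    port=5439,
    dbname="analytics",
    user="admin_user",        # an admin user, used only to inspect privileges
    password="admin-password",
)
with conn, conn.cursor() as cur:
    cur.execute(
        "SELECT has_schema_privilege(%s, %s, 'CREATE'),"
        "       has_schema_privilege(%s, %s, 'USAGE')",
        ("blendo_user", "public", "blendo_user", "public"),
    )
    can_create, can_use = cur.fetchone()
    print(f"CREATE on schema: {can_create}, USAGE on schema: {can_use}")
conn.close()
```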
Firewall rules / Networking Settings
For Blendo to send data to your data warehouse, it needs to connect over the network, which means certain IP addresses have to be whitelisted. A scripted AWS example follows the list below. Please find the networking rules sections for:
- Amazon Redshift
- Google BigQuery
- Microsoft SQL Server (AWS, Azure, Generic Microsoft SQL Server)
- PostgreSQL (AWS RDS, Heroku, Generic PostgreSQL)
- Snowflake
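If your warehouse runs on AWS (for example Amazon Redshift or PostgreSQL on Amazon RDS), whitelisting usually means adding an inbound rule to the instance’s security group. The sketch below uses boto3 with placeholder values for the region, security group ID, port, and IP range; substitute the IP addresses listed in Blendo’s networking documentation for your destination.

```python
# Add an inbound rule to an AWS security group so a given IP address can
# reach the warehouse port. Region, security group ID, port, and CIDR are
# placeholders; use the IP addresses from Blendo's networking docs.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # placeholder region

ec2.authorize_security_group_ingress(
    GroupId="sg-0123456789abcdef0",  # your warehouse's security group
    IpPermissions=[
        {
            "IpProtocol": "tcp",
            "FromPort": 5439,        # 5439 Redshift, 5432 PostgreSQL, 1433 SQL Server
            "ToPort": 5439,
            "IpRanges": [
                {
                    "CidrIp": "203.0.113.10/32",  # placeholder for a Blendo IP
                    "Description": "Allow Blendo to connect",
                }
            ],
        }
    ],
)
print("Inbound rule added.")
```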
Re-authorize Destination Integrations
If you need to re-authorize a destination integration, please follow these steps:
- Go to the Blendo Dashboard.
- Find the destination you need to re-authorize.
- Click on Settings.
- Click Edit.
- In the new pop-up, change the credentials you need and click Validate and Save.