How to Load data from Salesforce to Snowflake

Blendo Team

This post helps you load your data from Salesforce to Snowflake. It assumes that you are going to use custom ETL scripts to move your data out of Salesforce and then model it accordingly. If you are looking to get analytics-ready data into Snowflake without the manual hassle, you can use an official Snowflake cloud-based ETL partner like Blendo, so you can focus on what matters: getting value out of your data.

Extract data from Salesforce

You can’t use a Data Warehouse without data, so the first and most important step is to extract the data you want from Salesforce.

Salesforce has many products, and it is also a pioneer in cloud computing and the API economy. This means that it offers a plethora of APIs to access its services and the underlying data. In this post, we’ll focus only on Salesforce CRM, which itself exposes a large number of APIs.

People who write their own scripts to ETL cloud data from their data sources can benefit from this excellent post from Salesforce’s Helpdesk about which API to use.

You will have the following options:

  • REST API
  • SOAP API
  • Chatter REST API
  • Bulk API
  • Metadata API
  • Streaming API
  • Apex REST API
  • Apex SOAP API
  • Tooling API

You may need more time to read this post than to actually integrate Salesforce with Snowflake.


Pull data from the Salesforce REST API

From the above list, the complexity and feature richness of the Salesforce API are more than evident. The REST API and the SOAP API expose the same functionality but over different protocols. Interacting with the REST API can be done using tools like cURL or Postman, or with an HTTP client for your favorite language or framework. A few suggestions:

  • Apache HttpClient for Java
  • Spray-client for Scala
  • Hyper for Rust
  • Ruby rest-client
  • Python http-client

The Salesforce REST API supports OAuth 2.0 authentication; more information can be found in the Understanding Authentication article. After you successfully authenticate with the REST API, you can start interacting with its resources and fetching data in order to load it into your data warehouse.
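For illustration, here is a minimal sketch of the OAuth 2.0 username-password flow using Python and the requests library. The connected-app credentials are placeholder assumptions; Salesforce also supports other flows, such as the web server flow, which may suit your setup better.

import requests

# Hypothetical connected-app credentials; replace with your own values.
TOKEN_URL = "https://login.salesforce.com/services/oauth2/token"

payload = {
    "grant_type": "password",
    "client_id": "YOUR_CONSUMER_KEY",
    "client_secret": "YOUR_CONSUMER_SECRET",
    "username": "user@example.com",
    # Salesforce expects the password concatenated with the security token.
    "password": "YOUR_PASSWORD" + "YOUR_SECURITY_TOKEN",
}

response = requests.post(TOKEN_URL, data=payload)
response.raise_for_status()

auth = response.json()
access_token = auth["access_token"]   # Bearer token for subsequent calls
instance_url = auth["instance_url"]   # e.g. https://na1.salesforce.com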

It’s easy to get a list of all the resources we have access to; for example, using cURL we can execute the following:

curl https://na1.salesforce.com/services/data/v26.0/ -H "Authorization: Bearer token"

A typical response from the server will be a list of the available resources, in JSON or XML depending on what you asked for as part of your request.

{
  "sobjects" : "/services/data/v26.0/sobjects",
  "licensing" : "/services/data/v26.0/licensing",
  "connect" : "/services/data/v26.0/connect",
  "search" : "/services/data/v26.0/search",
  "query" : "/services/data/v26.0/query",
  "tooling" : "/services/data/v26.0/tooling",
  "chatter" : "/services/data/v26.0/chatter",
  "recent" : "/services/data/v26.0/recent"
}

The Salesforce REST API is very expressive; it also supports a language called Salesforce Object Query Language (SOQL) for executing arbitrarily complex queries. For example, the following cURL command will return the Name field of your Account records:

curl https://na1.salesforce.com/services/data/v20.0/query/?q=SELECT+name+from+Account -H "Authorization: Bearer token"

and the result will look like the following:

{
  "done" : true,
  "totalSize" : 14,
  "records" :
  [
    {
      "attributes" :
      {
        ...
Again, the result can be serialized as either JSON or XML. We would recommend using JSON, as it makes the whole data-moving process easier: the most popular data warehousing solutions support it natively.

With XML you might have to transform it into JSON first, before loading any data into the repository. More information about SOQL can be found on the Salesforce Object Query Language specification page.
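As a hedged sketch of what this looks like in an ETL script, the Python function below runs the same SOQL query with the requests library and follows Salesforce’s pagination: result sets larger than one batch carry a nextRecordsUrl that you follow until done is true. The instance_url and access_token values are assumed to come from the authentication step above.

import requests

def fetch_accounts(instance_url, access_token):
    """Run a SOQL query and follow pagination until all records are fetched."""
    headers = {"Authorization": "Bearer " + access_token}
    response = requests.get(
        instance_url + "/services/data/v26.0/query/",
        headers=headers,
        params={"q": "SELECT Name FROM Account"},
    )
    response.raise_for_status()
    page = response.json()
    records = list(page["records"])

    # Large result sets are paginated: follow nextRecordsUrl until done is true.
    while not page["done"]:
        response = requests.get(instance_url + page["nextRecordsUrl"], headers=headers)
        response.raise_for_status()
        page = response.json()
        records.extend(page["records"])

    return records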

If for any reason you would prefer to use SOAP, then you should first create a SOAP client: you can, for example, use the force.com Web Service Connector (WSC) client, or generate your own from the WSDL using the information provided by this guide.

Although the protocol changes, the architecture of the API remains the same, so you will be able to access the same resources.

After you have your client ready and you are able to communicate with Salesforce, you ought to perform the following steps:

  1. Decide which resources to extract from the API
  2. Map these resources to the schema of the data warehouse repository that you will use
  3. Transform the data accordingly (see the sketch after this list), and
  4. Load the transformed data into the repository, based on the instructions below
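To illustrate step 3, here is a minimal transformation sketch: it drops the attributes metadata object that Salesforce attaches to every record and keeps the remaining keys as column values. The column list is a hypothetical example; yours will follow your own schema mapping.

def flatten_records(records, columns):
    """Turn Salesforce query records into rows matching a table schema."""
    rows = []
    for record in records:
        record = dict(record)           # copy so we can drop metadata safely
        record.pop("attributes", None)  # Salesforce attaches per-record metadata
        rows.append([record.get(column) for column in columns])
    return rows

# Hypothetical usage: the Account endpoint maps to a table with a Name column.
sample = [{"attributes": {"type": "Account"}, "Name": "Test 1"}]
rows = flatten_records(sample, ["Name"])   # [["Test 1"]]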

As you can see, accessing the API alone is not enough to ensure the operation of a pipeline that will deliver your data safely and on time to a data warehousing solution for analysis.

Pull Data using the Salesforce Streaming API

Another interesting way of interacting with Salesforce is through the Streaming API.

With it, you define queries, and every time something changes in the data registered to such a query, you get a notification. So, for example, every time a new account is created, the API will push a notification about the event to your desired service. This is an extremely powerful mechanism that can guarantee almost real-time updates in a Data Warehouse repository.

In order to implement something like that, though, you must take into consideration the limitations of both ends, while ensuring the delivery semantics that your use case requires from the data management infrastructure you will build.

For more information, you can read the documentation of the Streaming API.
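For a rough idea of the setup involved, the sketch below creates a PushTopic, the record that holds the SOQL query the Streaming API watches, through the REST API. The topic name and query are hypothetical, and you would still need a CometD client to subscribe to the resulting channel and receive the notifications.

import requests

def create_push_topic(instance_url, access_token):
    """Create a PushTopic so that new Accounts trigger streaming events."""
    headers = {
        "Authorization": "Bearer " + access_token,
        "Content-Type": "application/json",
    }
    push_topic = {
        "Name": "NewAccounts",                    # hypothetical topic name
        "Query": "SELECT Id, Name FROM Account",  # query to watch
        "ApiVersion": 26.0,
        "NotifyForOperationCreate": True,
    }
    response = requests.post(
        instance_url + "/services/data/v26.0/sobjects/PushTopic/",
        headers=headers,
        json=push_topic,
    )
    response.raise_for_status()
    return response.json()["id"]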

About Salesforce


Salesforce is the innovative company behind the world’s #1 CRM platform, which employees can access entirely over the Internet: there’s no infrastructure to buy, set up, or manage; you just log in and get to work. But Salesforce has become something much bigger than its CRM solution; it currently offers products for analytics, marketing, data, and even cloud infrastructure for IoT. It’s easy to see that if you are using its products, your company is generating an abundance of data on Salesforce.

About Snowflake

Snowflake is an analytic data warehouse delivered as a fully managed cloud service: there is no hardware or software to install, configure, or manage. It separates compute from storage so that each can scale independently, speaks standard SQL, and treats semi-structured data such as JSON as a first-class citizen, all of which makes it a natural destination for the data your company generates on Salesforce.

Salesforce Data Preparation for Snowflake

The first step, before you start ingesting your data into a Snowflake data warehouse instance, is to have a well-defined data schema.

Data in Snowflake is organized into tables, each with a well-defined set of columns, and each column has a specific data type.

Snowflake supports a rich set of data types. It is worth mentioning that a number of semi-structured data types are also supported. With Snowflake, it is possible to load data directly in JSON, Avro, ORC, Parquet, or XML format. Hierarchical data is treated as a first-class citizen, similar to what Google BigQuery offers.
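As a brief illustration of that support, a hedged sketch using the Snowflake connector for Python: a VARIANT column stores raw JSON records, and path notation queries reach inside them. The connection parameters and table name are placeholder assumptions.

import snowflake.connector

# Hypothetical connection parameters; replace with your own account details.
conn = snowflake.connector.connect(
    user="YOUR_USER", password="YOUR_PASSWORD", account="YOUR_ACCOUNT",
    database="ANALYTICS", schema="SALESFORCE", warehouse="LOAD_WH",
)
cur = conn.cursor()

# A single VARIANT column can store each raw JSON record as-is.
cur.execute("CREATE TABLE IF NOT EXISTS raw_accounts (v VARIANT)")

# Path notation reaches into the JSON; ::string casts the value to text.
cur.execute("SELECT v:Name::string FROM raw_accounts")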

There is also one notable common data type that is not supported by Snowflake: LOB, the large-object data type. You should use a BINARY or VARCHAR type instead, although these types are not that useful for data warehouse use cases.

A typical strategy for loading data from Salesforce to Snowflake is to create a schema where you will map each API endpoint to a table.

Each key inside the Salesforce API endpoint response should be mapped to a column of that table, and you should ensure the right conversion to a Snowflake data type.

Of course, since data types in the Salesforce API might change over time, you must be ready to adapt your database tables accordingly; there’s no such thing as automatic data type casting.
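For example, here is a minimal, hedged sketch of such a mapping with the Snowflake connector for Python: the Account endpoint becomes an ACCOUNTS table with one typed column per response key. The table layout is an illustrative assumption, not a fixed schema.

import snowflake.connector

# Hypothetical connection parameters; replace with your own account details.
conn = snowflake.connector.connect(
    user="YOUR_USER", password="YOUR_PASSWORD", account="YOUR_ACCOUNT",
    database="ANALYTICS", schema="SALESFORCE", warehouse="LOAD_WH",
)

# Map the Account endpoint to a table: one typed column per response key.
conn.cursor().execute("""
    CREATE TABLE IF NOT EXISTS accounts (
        id           VARCHAR(18) PRIMARY KEY,  -- Salesforce record Id
        name         VARCHAR,
        created_date TIMESTAMP_TZ
    )
""")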

After you have a complete and well-defined data model or schema for Snowflake, you can move forward and start loading any data into the database.

Load data from Salesforce to Snowflake

Usually, data is loaded into Snowflake in bulk, using the COPY INTO command. Files containing the data, usually in JSON format, are stored in a local file system or in Amazon S3 buckets. A COPY INTO command is then invoked on the Snowflake instance, and the data is copied into the data warehouse.

The files can first be pushed into a Snowflake staging area using the PUT command, before the COPY INTO command is invoked.
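A minimal sketch of that PUT-then-COPY flow, again with the Snowflake connector for Python, might look as follows; the file path and the ACCOUNTS table from the earlier sketch are assumptions to adjust to your environment.

import snowflake.connector

# Hypothetical connection parameters; replace with your own account details.
conn = snowflake.connector.connect(
    user="YOUR_USER", password="YOUR_PASSWORD", account="YOUR_ACCOUNT",
    database="ANALYTICS", schema="SALESFORCE", warehouse="LOAD_WH",
)
cur = conn.cursor()

# Stage the local JSON file in the table's stage, then copy it in,
# casting each JSON field to the target column's type.
cur.execute("PUT file:///tmp/accounts.json @%accounts")
cur.execute("""
    COPY INTO accounts
    FROM (
        SELECT $1:Id::varchar, $1:Name::varchar, $1:CreatedDate::timestamp_tz
        FROM @%accounts
    )
    FILE_FORMAT = (TYPE = 'JSON')
""")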

Another alternative is to upload the data to a service like Amazon S3, from where Snowflake can access it directly.

Finally, Snowflake offers a web interface with a data loading wizard where someone can visually set up and copy data into the warehouse. Just keep in mind that the functionality of this wizard is limited compared to the other methods.

Snowflake, in contrast to other technologies like Redshift, does not require a data schema to be packed together with the data that will be copied. Instead, the schema is declared as part of the query that copies the data into the warehouse. This simplifies the data loading process and offers more flexibility in data type management.

An easy way to ETL cloud data to Snowflake

Using a cloud ETL tool like Blendo for the cloud integration needs of your data infrastructure will save you the effort of writing custom ETL scripts, and you will get consistent, always-current, analytics-ready data in Snowflake.

Just:

  • Connect your Snowflake data warehouse as the Destination to load your Salesforce data
  • Connect your Salesforce account.

*Blendo is an official Snowflake cloud-based ETL partner.

Updating your Salesforce data on Snowflake

As you generate more data on Salesforce, you will need to update the older data on Snowflake. This includes new records as well as updates to older records that, for any reason, have changed on Salesforce.

You will have to check Salesforce for new data periodically and repeat the process described previously, updating your currently available data if needed. Updating an already existing row in a Snowflake table is achieved with UPDATE statements.

Snowflake has a great tutorial on the different ways of handling updates, especially using primary keys.
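In practice, a common pattern for such updates is a single MERGE statement that updates rows whose primary key already exists and inserts the rest. A hedged sketch, assuming the hypothetical ACCOUNTS table from earlier and new records staged in an ACCOUNTS_STAGING table:

import snowflake.connector

# Hypothetical connection parameters; replace with your own account details.
conn = snowflake.connector.connect(
    user="YOUR_USER", password="YOUR_PASSWORD", account="YOUR_ACCOUNT",
    database="ANALYTICS", schema="SALESFORCE", warehouse="LOAD_WH",
)

# Upsert: update rows with matching ids, insert everything else.
conn.cursor().execute("""
    MERGE INTO accounts AS target
    USING accounts_staging AS source
    ON target.id = source.id
    WHEN MATCHED THEN UPDATE SET
        target.name = source.name,
        target.created_date = source.created_date
    WHEN NOT MATCHED THEN INSERT (id, name, created_date)
        VALUES (source.id, source.name, source.created_date)
""")

A single MERGE keeps the upsert atomic, avoiding the window in which separate UPDATE and INSERT statements could apply the same record twice.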

Another issue that you need to take care of is the identification and removal of any duplicate records on the database. Either because Salesforce does not have a mechanism to identify new and updated records or because of errors on data pipelines, duplicate records might be introduced to your database.

In general, ensuring the quality of data that is inserted in your database is a big and difficult issue. In such cases using cloud ETL tools like Blendo can help you get valuable time back.

The best way to load data from Salesforce to Snowflake

So far we have just scratched the surface of what you can do with Snowflake and how to load data into it. Things can get even more complicated if you want to integrate data coming from different sources.

Are you striving to achieve results right now?

Instead of writing, hosting, and maintaining a flexible data infrastructure, use Blendo, which can handle everything automatically at the cloud integration level for you.

With one click, Blendo integrates with your sources or services, creates analytics-ready data, and syncs your Salesforce data to Snowflake right away.

Help your sales and executive team take ownership of the insights that live in your Salesforce CRM.

Blendo is the easiest way to automate powerful data integrations.

Try Blendo free for 14 days. No credit card required.