This post will help you sync your QuickBooks accounting data to Google BigQuery. By doing this, you will be able to perform advanced analytics on a system that is designed for exactly this kind of data workload. Alternatively, you can simplify the process of syncing data from QuickBooks to Google BigQuery by using Blendo, which handles the whole process for you so you can focus on what matters: the analysis of your accounting and financial data.
Access your data on QuickBooks
The first step in loading your QuickBooks data into any data warehouse solution is to access the data and start extracting it through the available web API.
QuickBooks has a very rich and well-defined API, reflecting the extensive development that the product has gone through. The API is designed around the following main groups of resources:
- Transaction resources.
- Name list resources.
- Report resources.
- Supporting resources.
Report resources contain all the reports that QuickBooks also offers from within the application. They have a different data model than the rest of the resources, and you need to account for these differences when extracting data from the API.
The rest of the resources contain pretty much every possible entity that QuickBooks defines, each one with a different data model that is serialized in JSON.
In addition to the above, the things that you have to keep in mind when dealing with any API like the one QuickBooks has are:
- Rate limits. Every API has rate limits that you have to respect.
- Authentication. You authenticate on the QuickBooks Online API using OAuth 2.0, so every request must carry a valid access token that you also have to refresh periodically.
- Paging and dealing with large amounts of data. Platforms like QuickBooks tend to generate a lot of data, as financial transactions and accounting involve many different events. Pulling big volumes of data out of an API can be difficult, especially when you also have to respect the rate limits that the API imposes.
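To make the paging concern concrete, here is a minimal sketch of paginated extraction using QuickBooks' SQL-like query syntax (`STARTPOSITION`/`MAXRESULTS`). The helper names (`build_query`, `fetch_all`) are our own; a real client would wrap an authenticated HTTP GET to the `/v3/company/<realmId>/query` endpoint and back off when it hits rate limits.

```python
def build_query(entity, start_position, max_results=100):
    """Build one page of QuickBooks' SQL-like query syntax."""
    return (f"SELECT * FROM {entity} "
            f"STARTPOSITION {start_position} MAXRESULTS {max_results}")

def fetch_all(fetch_page, entity, page_size=100):
    """Page through an entity until a page comes back short.

    `fetch_page` is any callable that takes a query string and returns a
    list of records -- e.g. a thin wrapper around an authenticated HTTP
    GET that also honors the API's rate limits.
    """
    records, start = [], 1
    while True:
        page = fetch_page(build_query(entity, start, page_size))
        records.extend(page)
        if len(page) < page_size:  # a short page means we reached the end
            return records
        start += page_size
```

Keeping the HTTP call behind a plain callable makes the paging logic easy to test without touching the network.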
QuickBooks is accounting software released and maintained by Intuit. It currently comes in two versions, a desktop one and a cloud-based one. This guide is about the latter, where the data can be accessed through the REST API that Intuit has built around the product.
QuickBooks is mainly used by small and medium-sized companies and covers the whole spectrum of a company's accounting-related activities, from payroll to the management and payment of bills.
Historically, QuickBooks is one of the first accounting software packages ever released; its initial release was for the DOS operating system, and it dominated the small and medium-sized business market for many years.
Transform and prepare your QuickBooks data
After you have accessed your data on QuickBooks, you will have to transform it based on two main factors:
- The limitations of the database that the data will be loaded onto
- The type of analysis that you plan to perform
Each system has specific limitations on the data types and data structures that it supports. If, for example, you want to push data into Google BigQuery, then you can send nested data like JSON directly. But when you are dealing with tabular data stores, like Microsoft SQL Server, this is not an option. Instead, you will have to flatten out your data before loading it into the database.
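To illustrate the flattening step, here is a minimal sketch that collapses a nested record into flat column names. The underscore separator and the sample field names are our own choices, not a QuickBooks convention:

```python
def flatten(record, prefix=""):
    """Collapse nested dictionaries into flat column names."""
    flat = {}
    for key, value in record.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(flatten(value, name + "_"))  # recurse into nesting
        else:
            flat[name] = value
    return flat
```

A nested reference like `{"CustomerRef": {"value": "42"}}` becomes a plain `CustomerRef_value` column that any tabular store can accept.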
Also, you have to choose the right data types. Again, depending on the system that you will send the data to and the data types that the API exposes to you, you will have to make the right choices. These choices are important because they can limit the expressivity of your queries and restrict what your analysts can do directly out of the database.
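As a rough illustration of that choice, the sketch below maps the JSON value types of a flattened record to BigQuery column types. The mapping and the fallback to `STRING` are assumptions you would adjust to your own schema:

```python
# Assumed mapping from Python/JSON value types to BigQuery column types.
BQ_TYPE = {bool: "BOOLEAN", int: "INT64", float: "FLOAT64", str: "STRING"}

def infer_schema(flat_record):
    """Derive a simple BigQuery schema from one flat record."""
    return [{"name": key, "type": BQ_TYPE.get(type(value), "STRING")}
            for key, value in flat_record.items()]
```

In practice you would infer from many records, not one, and treat monetary amounts with care (e.g. `NUMERIC` instead of a float) to avoid rounding errors in financial queries.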
QuickBooks has a very rich data model, and many of the resources that you can access might have to be flattened out and pushed into more than one table.
Also, QuickBooks has a special set of resources, the reports, which have a tabular but nested format that looks similar to a complex spreadsheet. To make these reports compatible with a database data model, you need to parse and transform them into a tabular form that can be stored in a database.
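Report responses nest rows inside rows (sections, sub-accounts, summaries). A minimal sketch of walking such a structure into plain rows might look like this; the keys (`Row`, `ColData`, `Rows`, `value`) follow the general shape of QuickBooks report responses, but verify them against the specific report you are pulling:

```python
def report_to_rows(rows):
    """Recursively collect the cell values of a nested report section."""
    out = []
    for row in rows.get("Row", []):
        if "ColData" in row:  # a data row: one value per report column
            out.append([cell.get("value", "") for cell in row["ColData"]])
        if "Rows" in row:  # a nested section, e.g. a sub-account group
            out.extend(report_to_rows(row["Rows"]))
    return out
```

A real pipeline would also keep track of the nesting depth or section headers, so that a sub-account row still records which account it belongs to once the hierarchy is flattened away.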
Load data from Quickbooks to Google BigQuery
If you want to load QuickBooks data into Google BigQuery, you have to use one of the following supported data sources:
- Google Cloud Storage
- Send data directly to BigQuery with a POST request
- Google Cloud Datastore Backup
- Streaming insert
- App Engine log files
- Cloud Storage logs
From the above list of sources, the last two (App Engine log files and Cloud Storage logs) are not applicable in our case.
For Google Cloud Storage, you first have to load your data into it. There are a few options for doing this; for example, you can use the console directly, as described in the Cloud Storage documentation, and do not forget to follow the best practices.
Another option is to post your data through the JSON API. As we see, APIs play an important role in both the extraction and the loading of data into our data warehouse. In its simplest form, it is just a matter of one HTTP POST request using a tool like curl or Postman.
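For example, a simple media upload through the Cloud Storage JSON API is a POST to an upload URL like the one built below. The bucket and object names are placeholders, and the actual request must also carry an OAuth 2.0 access token:

```python
import urllib.parse

UPLOAD_ROOT = "https://storage.googleapis.com/upload/storage/v1/b"

def media_upload_url(bucket, object_name):
    """URL for a simple (single-request) media upload to Cloud Storage."""
    query = urllib.parse.urlencode({"uploadType": "media", "name": object_name})
    return f"{UPLOAD_ROOT}/{bucket}/o?{query}"

# The request itself is then, roughly:
#   POST <media_upload_url("my-bucket", "quickbooks/invoices.json")>
#   Authorization: Bearer <access token>
#   Content-Type: application/json
#   <your newline-delimited JSON payload>
```

Simple media uploads are fine for small files; for large extracts you would switch to a resumable upload instead.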
After you have loaded your data into Google Cloud Storage, you have to create a Load Job for BigQuery to actually load the data into it. This job should point to the source data in Cloud Storage that has to be imported, which happens by providing source URIs that point to the appropriate objects.
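The load job itself is just a JSON configuration submitted to BigQuery's `jobs.insert` endpoint. A sketch of building that configuration follows; the project, dataset, table and bucket names are placeholders:

```python
def load_job_config(source_uris, project, dataset, table):
    """Build the body of a BigQuery load job (jobs.insert REST API)."""
    return {
        "configuration": {
            "load": {
                "sourceUris": source_uris,  # e.g. gs://bucket/object
                "destinationTable": {
                    "projectId": project,
                    "datasetId": dataset,
                    "tableId": table,
                },
                "sourceFormat": "NEWLINE_DELIMITED_JSON",
                "autodetect": True,  # let BigQuery infer the schema
            }
        }
    }
```

With the google-cloud-bigquery client library the same thing is usually done through `load_table_from_uri`, but the REST body above shows what is actually submitted.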
The best way to load data from QuickBooks to Google BigQuery and possible alternatives
So far we have just scratched the surface of what can be done with Google BigQuery and how to load data into it. The way to proceed depends heavily on the data you want to load, the service it is coming from, and the requirements of your use case.
Things can get even more complicated if you want to integrate data coming from different sources. A possible alternative, instead of writing, hosting and maintaining a flexible data infrastructure, is to use a product like Blendo that can handle these kinds of problems automatically for you.
Blendo integrates with multiple sources and services like databases, CRMs, email campaigns, analytics and more. Quickly and safely move all your data from QuickBooks into Google BigQuery and start generating insights from your data.