Google Cloud Storage (GCS)

Overview


Unit allows its clients access to their raw data so that they can build their own reporting solutions and make informed business decisions. This integration uploads data into the client's bucket twice a day (see the schedule below). You can pull the data into your own data warehouse easily with existing connectors, eliminating the need to build a sequence of API calls to obtain the data.

The data is structured in a way that makes it easy to do robust analysis on any part of the banking offering. Additionally, for companies that are able to store personally identifiable information, KYC information can be securely transferred.

By accessing the raw data and building your own analytics stack on top of it, decision makers in your organization can make data-driven decisions across various areas. Some questions that can be explored include:

  • Which end-customer cohorts and/or marketing campaigns created the highest revenue?
  • How does pricing (interest, payment fees and rewards programs) affect engagement?
  • Which end-customers are most engaged and deserve to be promoted to your "VIP" terms? (example: higher cash-back)
  • Which merchants do your end-customers spend the most at?
  • How does usage of ACH, wire and check deposits vary across end-customer segments?

The transfer includes the following data related to ordinary and special bank accounts:

  • Authorization Requests
  • Payments
  • Applications
  • Cards
  • Counterparties
  • Disputes
  • Authorizations
  • Customers
  • Check deposits
  • Transactions
  • Accounts

Each bucket will contain a folder for each of these data types; folders that contain no data are suffixed with _$folder$ (for example, check_deposits_$folder$).

Each of the above folders will contain subfolders named in the format year_month=yyyy-mm (for example, year_month=2021-04/). Each of these subfolders will contain one or more Parquet files, each holding a portion of the data.

Most data pipeline frameworks (e.g. Fivetran) can consume data from the bucket with minimal setup.
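If you prefer to inspect the data directly rather than through a pipeline tool, the Parquet files can be read with the standard Google Cloud client libraries. Below is a minimal sketch in Python, assuming the google-cloud-storage, pandas and pyarrow packages are installed; the bucket name is a placeholder, and the lowercase transactions/ folder name is an assumption based on the check_deposits example above.

```python
import io

import pandas as pd
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("companyname-unit-data-prod")  # placeholder name

# List every object under the April 2021 partition of the transactions folder.
frames = []
for blob in bucket.list_blobs(prefix="transactions/year_month=2021-04/"):
    if blob.name.endswith(".parquet"):
        # Each Parquet file holds a portion of the month's data.
        frames.append(pd.read_parquet(io.BytesIO(blob.download_as_bytes())))

# Combine the portions into a single table (assumes the partition is non-empty).
transactions = pd.concat(frames, ignore_index=True)
print(transactions.head())
```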

note

If you’re interested in accessing your data with this integration, contact Unit to get it enabled and set up.

note

Data transfers occur twice a day:

  1. 7:00 a.m. UTC, with data created up until 4:00 a.m. UTC.

  2. 3:00 p.m. UTC, with data created up until 12:00 p.m. UTC.

Prerequisites

  • Active Organization in Unit
  • Google Cloud Platform (GCP) Admin privileges

Configuring your GCP Project

Sign in to the Google Cloud Console and follow the steps below to grant Unit access permissions to your GCS bucket. Once this is set up, Unit will transfer your data to the chosen bucket on the schedule described above.

note

Configuration should be done per environment.

Create Service Account

  • In the Google Cloud console, go to the Service Accounts page under IAM & Admin.
  • Click CREATE SERVICE ACCOUNT.
  • Enter a name for the service account, e.g. Unit Integration Prod.
  • Enter a Service account ID, e.g. companyname-unit-integration-prod.
  • The remaining optional fields are not required for this service account and can be left empty.
  • Click DONE.
  • Find and click the new service account.
  • Go to the KEYS page.
  • Click ADD KEY and select Create new key.
  • Select JSON key type and click CREATE.
  • A JSON key file is downloaded to your computer; send this key file to Unit (a scripted alternative to these steps is sketched below).
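For teams that prefer to script their setup, here is a minimal sketch of the same steps using the IAM REST API via the google-api-python-client package. The project ID and output file name are placeholders, and the calls must run with credentials that are allowed to administer IAM in the project.

```python
import base64

from googleapiclient import discovery

PROJECT_ID = "my-gcp-project"  # placeholder

iam = discovery.build("iam", "v1")

# Create the service account (equivalent to CREATE SERVICE ACCOUNT above).
account = iam.projects().serviceAccounts().create(
    name=f"projects/{PROJECT_ID}",
    body={
        "accountId": "companyname-unit-integration-prod",
        "serviceAccount": {"displayName": "Unit Integration Prod"},
    },
).execute()

# Create a JSON key for the account (equivalent to ADD KEY > Create new key).
key = iam.projects().serviceAccounts().keys().create(
    name=account["name"],
    body={"privateKeyType": "TYPE_GOOGLE_CREDENTIALS_FILE"},
).execute()

# The key material arrives base64-encoded; decode and save the file for Unit.
with open("unit-integration-key.json", "wb") as f:
    f.write(base64.b64decode(key["privateKeyData"]))
```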

Create Role

  • In the Google Cloud console, go to the Roles page under IAM & Admin.
  • Find and select the Storage Object Admin role.
  • Click CREATE ROLE FROM SELECTION.
  • Enter a name for the new role: Unit Integration Editor.
  • Enter ID: unitintegration.editor.
  • Click ADD PERMISSIONS.
  • Add the storage.buckets.get permission.
  • Click CREATE (a scripted alternative to these steps is sketched below).
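Here is a minimal scripted sketch of the same steps, again via the google-api-python-client package. The permission list mirrors the core object permissions of the built-in Storage Object Admin role plus the storage.buckets.get permission this guide adds; verify the current permission list in the console, since built-in roles change over time. PROJECT_ID is a placeholder.

```python
from googleapiclient import discovery

PROJECT_ID = "my-gcp-project"  # placeholder

iam = discovery.build("iam", "v1")

# Create the custom role (equivalent to CREATE ROLE FROM SELECTION above).
role = iam.projects().roles().create(
    parent=f"projects/{PROJECT_ID}",
    body={
        "roleId": "unitintegration.editor",
        "role": {
            "title": "Unit Integration Editor",
            "includedPermissions": [
                # Core object permissions from the Storage Object Admin role.
                "storage.objects.create",
                "storage.objects.delete",
                "storage.objects.get",
                "storage.objects.list",
                "storage.objects.update",
                # The extra permission this guide adds.
                "storage.buckets.get",
            ],
        },
    },
).execute()
print(role["name"])
```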

Create Bucket

  • In the Google Cloud console, go to the Buckets page under Cloud Storage.
  • Click CREATE.
  • Enter a name for the bucket, e.g. companyname-unit-data-prod.
  • The remaining fields are not required for this bucket and can be left at their defaults.
  • Click CREATE.
  • Make sure that "Enforce public access prevention on this bucket" is checked.
  • Click CONFIRM.
  • Find the bucket in the list and click the bucket name.
  • Go to the PERMISSIONS page.
  • Click GRANT ACCESS.
  • Add the email for the Service Account principal.
  • Assign the Unit Integration Editor role.
  • Click SAVE (a scripted alternative to these steps is sketched below).
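Here is a minimal scripted sketch of the same steps with the google-cloud-storage package. The bucket name, project ID and service-account email are placeholders matching the earlier examples, and the custom role is the one created in the previous step.

```python
from google.cloud import storage

PROJECT_ID = "my-gcp-project"  # placeholder
SA_EMAIL = (
    "companyname-unit-integration-prod"
    "@my-gcp-project.iam.gserviceaccount.com"
)  # placeholder

client = storage.Client(project=PROJECT_ID)

# Create the bucket, then enforce public access prevention on it.
bucket = client.create_bucket("companyname-unit-data-prod")
bucket.iam_configuration.public_access_prevention = "enforced"
bucket.patch()

# Grant the custom Unit Integration Editor role to the service account.
policy = bucket.get_iam_policy(requested_policy_version=3)
policy.bindings.append({
    "role": f"projects/{PROJECT_ID}/roles/unitintegration.editor",
    "members": {f"serviceAccount:{SA_EMAIL}"},
})
bucket.set_iam_policy(policy)
```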

Send to Unit

  • Bucket name
  • Service Account JSON key file