Our purpose with Data Streams is to move data from external platforms (Adobe Analytics, Google Analytics, Google Ads, Facebook Pages, Facebook Ads, YouTube, Twitter Ads, Bing Ads, Kochava, DoubleClick for Advertisers, and other platforms to be added soon) into data warehouses such as Snowflake, Amazon Redshift, MySQL, Google BigQuery, PostgreSQL, Microsoft SQL Server, and Teradata.
Data Streams also supports adding data warehouses (Snowflake, Google BigQuery, Microsoft SQL Server, MySQL, PostgreSQL, Amazon Redshift, and Teradata) as Data Extract Sources, allowing users to transfer data between data warehouses.
This is a step-by-step guide to get you started.
DATA EXTRACT SOURCE AND DATA DESTINATION:
1. Let's create a new data stream. Under the Data Extract Source section select Connect To A New Data Source.
2. Choose a Data Connector from the drop-down list under the Data Connector tab. For this example, we will select Adobe Analytics.
3. Link your credentials to the data stream. To do so, click the Link button. This will open a new window that will prompt you to enter your Adobe credentials. Log in with your username and Adobe secret.
4. Once your credentials are linked, you can select your Adobe credential and Report Suite. Click the SUBMIT button to create the new data source.
5. Data Destination: Now let's choose the destination for your data. In the Data Destination section choose your Data Connector. For this example, we will be using Snowflake.
6. It's time to link your Snowflake credentials by clicking the green Link button. A new window will open asking you to enter your login details along with server and port information. The same applies to Amazon Redshift and other data warehouse destinations.
Once you do that, you will be able to select your credentials from the credentials list.
7. You can choose an Existing table or create a New table. For this example, I'm going to create a new table.
SCHEDULE & DATE RANGE / FILTERS & SEGMENTS:
8. Next step: Schedule & Date Range.
You can set your stream to run once or on a recurring schedule.
Date Range offers a number of predefined selections.
You can also choose to set your own Start Date and End Date.
In terms of Granularity, you can choose how finely your data is broken down.
If you want your stream to be recurring, you can also select the Frequency, Time of day, and Timezone, and make use of the "Import Data since" and "Run this report until" fields.
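To see how a date range and granularity fit together, here is a minimal Python sketch that splits a range into chunks. The granularity values shown ("daily", "weekly") are illustrative assumptions, not the product's actual option list:

```python
from datetime import date, timedelta

def split_date_range(start, end, granularity="daily"):
    """Split the inclusive range [start, end] into consecutive chunks.

    Supported here for illustration: 'daily' and 'weekly'.
    """
    step = timedelta(days=1) if granularity == "daily" else timedelta(days=7)
    chunks = []
    current = start
    while current <= end:
        # Each chunk ends either one step later or at the overall end date.
        chunk_end = min(current + step - timedelta(days=1), end)
        chunks.append((current, chunk_end))
        current = chunk_end + timedelta(days=1)
    return chunks

# A one-week range at daily granularity yields seven one-day chunks.
print(len(split_date_range(date(2021, 4, 1), date(2021, 4, 7), "daily")))  # 7
```

At weekly granularity the same range would come back as a single chunk, which is the intuition behind picking a coarser or finer granularity for your stream.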
9. The Filters & Segments section is optional. This feature allows you to filter your data by a specific metric or dimension, or to segment it. Simply use the drop-down and select the data you would like to filter or segment.
- In this example, we did not apply any filters or segments.
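Conceptually, a filter keeps only the rows whose dimension matches a chosen value. A minimal Python sketch of that idea, with hypothetical row data and column names:

```python
# Each extracted row as a dict of dimension/metric values (sample data).
rows = [
    {"country": "US", "pageviews": 120},
    {"country": "DE", "pageviews": 80},
    {"country": "US", "pageviews": 45},
]

def filter_rows(rows, dimension, value):
    """Keep only the rows whose dimension matches the chosen value."""
    return [r for r in rows if r.get(dimension) == value]

us_only = filter_rows(rows, "country", "US")
print(len(us_only))  # 2
```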
METRICS & DIMENSIONS / DATA STREAM NAME & SAVE:
10. In the next section, Metrics & Dimensions, you will see 3 tabs. The first tab is Source. Here you will set your Metric or Dimension in the drop-down. Once the Source is set, you can select your Destination. This will be the column in the table. The Data Type is selected automatically once you choose your Source and Destination.
- To have more than one Metric or Dimension simply click the green button below that says Add Metric or Dimension.
- Note: at least one metric is mandatory for your stream.
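The Source → Destination mapping can be thought of as a list of (source field, destination column, data type) entries. Here is a sketch of that structure and of the "at least one metric" rule; the field names and types are assumptions for illustration:

```python
# Each mapping: a source field from the connector, the destination column
# in the warehouse table, and the data type inferred for that column.
mappings = [
    {"source": "pageviews", "destination": "PAGEVIEWS",
     "type": "INTEGER", "kind": "metric"},
    {"source": "page", "destination": "PAGE",
     "type": "VARCHAR", "kind": "dimension"},
]

def validate_mappings(mappings):
    """A stream needs at least one metric among its mappings."""
    if not any(m["kind"] == "metric" for m in mappings):
        raise ValueError("At least one metric is mandatory for your stream.")
    return True

print(validate_mappings(mappings))  # True
```

A list with only dimensions would fail this check, which mirrors the note above that at least one metric is mandatory.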
11. Before you save your stream, you can preview your data. Click Preview Data.
12. Now I can name my stream and save it as "test0416".
Once it's saved, my stream will appear at the top of the stream list. It starts in STARTED status, and in a couple of minutes it will move to SUCCESS status.
Once it's successful, I know my Adobe Analytics data was transferred into my Snowflake data warehouse.