How to Create a Data Stream

A step-by-step guide on how to create a Data Stream

Data Streams is designed to move data from external platforms (Adobe Analytics, Google Analytics, Google Ads, Facebook Pages, Facebook Ads, YouTube, Bing Ads, Kochava, DoubleClick for Advertisers, Search Ads 360, and others to be added soon) into data warehouses such as Snowflake, Amazon Redshift, MySQL, Google BigQuery, PostgreSQL, Microsoft SQL Server, and Teradata.

Data Streams also supports adding data warehouses as Data Extract Sources (Snowflake, Google BigQuery, Microsoft SQL Server, MySQL, PostgreSQL, Amazon Redshift, and Teradata), allowing you to transfer data between data warehouses.
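To picture which source/destination pairs the two paragraphs above allow, here's a short Python sketch based on the lists above. It is purely illustrative; the function name and the rule it encodes are not part of the product.

```python
# Sources and warehouses taken from the lists above (illustrative subset).
SOURCES = {"Adobe Analytics", "Google Analytics", "Google Ads", "Facebook Ads",
           "YouTube", "Bing Ads", "Kochava", "Search Ads 360"}
WAREHOUSES = {"Snowflake", "Amazon Redshift", "MySQL", "Google BigQuery",
              "PostgreSQL", "Microsoft SQL Server", "Teradata"}

def is_valid_stream(source: str, destination: str) -> bool:
    """Warehouses can also act as Data Extract Sources, so a
    warehouse-to-warehouse transfer is allowed too."""
    return source in (SOURCES | WAREHOUSES) and destination in WAREHOUSES

print(is_valid_stream("Adobe Analytics", "Snowflake"))   # True
print(is_valid_stream("Snowflake", "Google BigQuery"))   # True
```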

Data Extract Source And Data Destination

1. Let's create a new data stream. Under the Data Extract Source section, select Connect to a New Data Source.

2. Choose a Data Connector from the drop-down list under the Data Connector tab. We'll use Adobe Analytics for this example.

3. Link your credentials. To do so, click the Link button. A new window will open and prompt you to enter your Adobe details. Log in with your username and Adobe secret. Here's more information about your Adobe details.

4. Once your credentials are linked, you can select your Adobe credential and Report Suite. Click the Submit button to create a new data source.

5. Data Destination: Now let's choose the destination for your data. In the Data Destination section, choose your Data Connector. For this example, we will be using Snowflake.

6. It's time to link your Snowflake credentials by clicking the green Link button. This will open a new window asking you to enter your login details along with the server and port information.

Once you do that, you will be able to select your credentials from the credentials list.
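If it helps to picture what this step collects, here's a minimal Python sketch of gathering the login, server, and port details into one place. The field names are illustrative assumptions, not the product's actual form fields.

```python
# Hypothetical sketch of the connection details this step asks for.
def build_connection(user: str, password: str, host: str, port: int = 443) -> dict:
    """Collect login, server, and port details into a single config."""
    if not (0 < port < 65536):
        raise ValueError(f"invalid port: {port}")
    return {"user": user, "password": password, "host": host, "port": port}

# Hostname below is a made-up example.
config = build_connection("stream_user", "s3cret",
                          "my_account.snowflakecomputing.com")
print(config["port"])  # port defaults to 443 in this sketch
```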

7. You can choose an Existing table or create a New table. For this example, we'll create a new table.

Schedule & Date Range/Filters & Segments

8. Next step: Schedule and Date Range.

You can set your stream as:

  • One-time
  • Recurring 

You have a number of predefined selections in Date Range.

You can also choose to set your own Start Date and End Date. 

In terms of Granularity, our product offers:

  • Daily
  • Weekly
  • Monthly
  • None

You can also set up a recurring stream. If this is the case, you can select the time of day when this stream should refresh, the timezone, and the frequency.
You can also change the granularity and calendar when selecting your date range. 

When setting up a recurring stream, it's mandatory to set the date range (Run this report until and Import data since).
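To make the Granularity options above concrete, here's a short Python sketch of how a date range might be bucketed by Daily, Weekly, or Monthly granularity. This is an illustration of the concept only, not the product's actual implementation.

```python
from datetime import date, timedelta

def bucket_dates(start: date, end: date, granularity: str) -> list:
    """Return the sorted, distinct bucket labels covering start..end."""
    buckets = []
    day = start
    while day <= end:
        if granularity == "Daily":
            buckets.append(day.isoformat())
        elif granularity == "Weekly":
            # Label each day by the Monday of its week.
            monday = day - timedelta(days=day.weekday())
            buckets.append(monday.isoformat())
        elif granularity == "Monthly":
            buckets.append(day.strftime("%Y-%m"))
        else:  # "None": no time breakdown, one bucket for the whole range
            buckets.append("all")
        day += timedelta(days=1)
    return sorted(set(buckets))

print(bucket_dates(date(2018, 12, 1), date(2018, 12, 11), "Weekly"))
```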

9. The Filters & Segments section is optional. This feature allows you to choose a metric or dimension to filter specific data or segment your data. Simply use the drop-down list and select the data you would like to filter or segment.

  • We did not use any filters or segment any data in this tutorial.
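To see the difference between filtering and segmenting, here's a small Python sketch. The field names and values are made up for illustration; they don't reflect any particular connector's data.

```python
# Hypothetical rows of analytics data.
rows = [
    {"country": "US", "device": "mobile",  "visits": 120},
    {"country": "US", "device": "desktop", "visits": 300},
    {"country": "DE", "device": "mobile",  "visits": 80},
]

# Filter: keep only rows matching a dimension value.
us_rows = [r for r in rows if r["country"] == "US"]

# Segment: group the remaining rows by another dimension.
segments = {}
for r in us_rows:
    segments[r["device"]] = segments.get(r["device"], 0) + r["visits"]

print(segments)  # {'mobile': 120, 'desktop': 300}
```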

Metrics & Dimensions/Data Stream Name & Save

10. In the Metrics & Dimensions section, we have 3 tabs. The first tab is Source. This is where we set our Metrics or Dimensions in the drop-down. Once the source is set, we can move on to select our Destination. This will be the column in the table. The Data Type will be selected automatically once the source and destination are chosen.

  • To have more than one metric or dimension, simply click the green button below that says Add Metric or Dimension.
  • Note: At least one metric is mandatory for your stream.

11. Before you save your stream, you can preview your data. Click Preview Data.

12. Now you can name your stream and save it.

Once it's saved, your stream will appear right at the top of the stream list. Its status will be 'Waiting in queue', and in a couple of minutes it will change to 'Success'.

Once the status is 'Success', your data has been transferred into your Snowflake data warehouse.
