With the integration of Blueshift and Snowflake, you can easily import customer data, catalogs, and customer interaction data (events and transactions) into your Blueshift account and build a 360-degree view of all your customers.
Note
Importing data via direct integration with Snowflake is currently available for customers with the 'Schemas' or 'Advanced Data' package. For more information, reach out to your CSM.
Prerequisites
Before you can import data from Snowflake you must set up integration of Blueshift with Snowflake and configure at least one adapter.
Data format
When you set up an import task, you can map the fields from the imported data to the fields in Blueshift and specify the data type for the field.
The following data types are supported in Blueshift:
- Integer
- Decimal
- String
- Boolean
- Date
- Array
- Object
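As an illustration of how a field's mapped data type determines how raw imported values are interpreted, here is a small Python sketch. This is not Blueshift's actual import logic; the coercion table and function names are hypothetical.

```python
from datetime import date

# Hypothetical coercion table for the Blueshift data types listed above.
# Illustrative only -- not Blueshift's internal implementation.
COERCERS = {
    "Integer": int,
    "Decimal": float,
    "String": str,
    "Boolean": lambda v: str(v).lower() in ("true", "1", "yes"),
    "Date": date.fromisoformat,
}

def coerce(value, blueshift_type):
    """Convert a raw imported value to the mapped Blueshift type."""
    return COERCERS[blueshift_type](value)

print(coerce("42", "Integer"))       # 42
print(coerce("19.99", "Decimal"))    # 19.99
print(coerce("TRUE", "Boolean"))     # True
print(coerce("2024-05-01", "Date"))  # 2024-05-01
```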
Importing semi-structured data
Consider the following points to understand the types of nested data that can be imported and how it can be used in Blueshift.
1. Simple objects (JSON with no hierarchy or nesting)
You can import Simple JSON data that does not have any hierarchy or nesting. This data can be used for Segmentation and Trigger filters.
{ "Key1": "Value1", "Key2": "Value2", "Key3": "Value3" }
For example: { "House_number": 2, "Street_name": "Woodbridge Road", "Locality": "Kesgrave", "City": "Ipswich" }
2. Array of primitive data types
You can import arrays of primitive data types. This data can be used for Segmentation and Trigger filters.
[ "Value1", "Value2", "Value3" ]
For example: [ "Table", "Chair", "Monitor" ]
3. Array of objects
When you import an array of objects, the array is flattened and the data is stored at each key level. If the same key is present in multiple objects, the key is stored with multiple values. As a result, segmentation or trigger filter queries might not return accurate results.
Consider the following example:
[ { "Key1": "Value1", "Key2": "Value2", "Key3": "Value3" }, { "Key1": "Value4", "Key2": "Value5", "Key3": "Value6" } ]
For example: [ { "House_number": 2, "Street_name": "Woodbridge Road", "City": "Princeton", "State": "New Jersey" }, { "House_number": 36, "Street_name": "Benedict Avenue", "City": "White Plains", "State": "New York" } ]
In this example, when the array is flattened, the data is stored as follows:
House_number: {2, 36}
Street_name: {Woodbridge Road, Benedict Avenue}
City: {Princeton, White Plains}
State: {New Jersey, New York}
During segmentation, a query such as "Street_name=Woodbridge Road AND City=White Plains" returns a match, even though no single object in the array contains both values.
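The cross-object matching described above can be sketched in Python. This is an illustrative model of flattening, not Blueshift's actual storage or query engine:

```python
# Sketch of how flattening an array of objects can yield cross-object
# matches during segmentation (illustrative, not Blueshift's code).
addresses = [
    {"House_number": 2, "Street_name": "Woodbridge Road",
     "City": "Princeton", "State": "New Jersey"},
    {"House_number": 36, "Street_name": "Benedict Avenue",
     "City": "White Plains", "State": "New York"},
]

# Flatten: collect every value seen for each key across all objects.
flattened = {}
for obj in addresses:
    for key, value in obj.items():
        flattened.setdefault(key, set()).add(value)

# "Street_name=Woodbridge Road AND City=White Plains" matches the
# flattened data, even though no single object contains both values.
match = ("Woodbridge Road" in flattened["Street_name"]
         and "White Plains" in flattened["City"])
print(match)  # True
```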
Note: You can make each object exclusively referenceable by declaring the target datatype as Nested Object. Contact support@blueshift.com for more information.
4. Nested Objects
When you import nested objects, each key is stored as one node. Objects with one level of nesting can be used in segmentation and trigger filters.
Objects with more than one level of nesting are not accessible for segmentation or trigger filter conditions.
{ "Outer Key1": { "Inner Key1": "Inner Value 1", "Inner Key2": "Inner Value 2" }, "Outer Key2": [ { "Inner Key3": "Inner Value 3", "Inner Key4": "Inner Value 4" }, { "Inner Key3": "Inner Value 5", "Inner Key4": "Inner Value 6" } ], "Outer Key3": "Outer Value 1" }
For example: [ { "Address": { "House_number": 2, "Street_name": "Woodbridge Road", "City": "Princeton", "State": "New Jersey" } }, { "Address": { "House_number": 36, "Street_name": "Benedict Avenue", "City": "White Plains", "State": "New York" } } ]
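The one-level-of-nesting rule can be sketched as follows. The traversal below is an illustrative model (the `profile` data and helper are hypothetical, and Blueshift's internal storage may differ): keys nested one level deep remain reachable, while anything deeper is dropped.

```python
# Sketch: which keys of a nested object remain usable with at most
# one level of nesting (illustrative; not Blueshift's implementation).
profile = {
    "Address": {                  # one level of nesting: usable
        "City": "Princeton",
        "Geo": {"Lat": 40.35},    # second level of nesting: not accessible
    },
    "Plan": "premium",            # top level: usable
}

def accessible_keys(obj, depth=0, prefix=""):
    """Collect dotted key paths with at most one level of nesting."""
    keys = []
    for key, value in obj.items():
        path = f"{prefix}{key}"
        if isinstance(value, dict):
            if depth < 1:
                keys.extend(accessible_keys(value, depth + 1, path + "."))
        else:
            keys.append(path)
    return keys

print(accessible_keys(profile))  # ['Address.City', 'Plan']
```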
Set up an import task
You can import catalogs, events, and customer data using Snowflake. The starting point for your import tasks depends on the type of data you are importing.
To set up a task to import data from Snowflake, complete the following steps:
- To import events, go to the Setup menu and click Events. Click the Import Events icon and then click +Event Import.
- Select Snowflake as the Source.
- Add a Name for the task. The import task form opens.
- In the Destination section, you can see the type of data being imported as Events.
- Select the type of event from the Select Event list.
- Select the Trigger Campaigns option if you want the events data to be used to trigger campaigns.
- Set up Notification Preferences to send a notification email to the specified email addresses when there is a change in the status of the task or when a certain percentage of records fail during import.
- In the Source section, select the adapter that you want to use for the import task.
- For Import From, select either Table or View. If your data is spread across multiple tables, it is recommended that you provide a View.
- Select the Table or the View from which the data is to be imported.
- Sample data consisting of 10 records is fetched from the table or view specified in the Source section. This data is displayed in the Configuration section.
- Map the fields from the imported data to the fields in Blueshift and specify the data type for the field.
- If the event is a purchase, add_to_cart, wishlist, or view event, one column from the source data must be mapped to product_ids in Blueshift.
- The Source Attribute Name is the attribute in Snowflake and the Destination Attribute Name is the attribute in Blueshift.
- If the sample data does not contain all the available fields, use Add more fields to complete the data mapping.
- Specify any additional mapping requirements for events that include multiple product IDs.
- Click Test Run to test the mapping. A maximum of 10 records are fetched during this test run.
- Verify that the data mapping is done correctly. Edit the data mapping if required. Click Test Run again after you make the changes.
- For Additional Configurations, select the Type of Import.
- Select Full Import if you are importing bulk data. For a Full import, the entire data from the selected table or view is imported from Snowflake every time you run the import task.
- Select Incremental Import to set up an incremental import task. Select the Diff Identification and the Diff Identifier.
- In the Schedule section, set the Start Time for the import task.
- To set up a recurring task, under Schedule select the Is it a recurring data import? option.
- Set the frequency using the Task executes every field. You can set the frequency in minutes, hours, days, weeks, or months. You cannot set an import frequency shorter than a day for a Full import.
- Click Save to save the task.
- Click Launch to run the import task.
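The difference between a Full and an Incremental import in the steps above can be sketched in Python. The `updated_at` column used as the Diff Identifier here is a hypothetical example; your table or view may use a different column:

```python
from datetime import datetime

# Illustrative sketch of incremental-import row selection, assuming the
# Diff Identifier is a timestamp column named "updated_at" (hypothetical).
rows = [
    {"email": "a@example.com", "updated_at": datetime(2024, 5, 1)},
    {"email": "b@example.com", "updated_at": datetime(2024, 5, 3)},
    {"email": "c@example.com", "updated_at": datetime(2024, 5, 5)},
]

last_run = datetime(2024, 5, 2)  # high-water mark from the previous run

# Incremental: only rows changed since the last run are imported.
# A Full Import would take every row from the table or view each run.
to_import = [r for r in rows if r["updated_at"] > last_run]
print([r["email"] for r in to_import])  # ['b@example.com', 'c@example.com']
```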
Import task status
The status for each Snowflake task is shown on the Events index page along with an overall status that identifies which state the task is in.
Additional Information
Watch this video for a quick overview of setting up an import task.