With the Blueshift and Snowflake integration, you can easily import customer data, catalogs, and customer interaction data (events and transactions) into your Blueshift account and build a 360-degree view of all your customers.

Note

Importing data via direct integration with Snowflake is currently available for customers with the 'Schemas' or 'Advanced Data' add-on. For more information, reach out to your CSM.

Prerequisites

Before you can import data from Snowflake, you must set up the integration of Blueshift with Snowflake and configure at least one adapter.

Data format

When you set up an import task, you can map the fields from the imported data to the fields in Blueshift and specify the data type for the field.

The following data types are supported in Blueshift:

  • Integer
  • Decimal
  • String
  • Boolean
  • Date
  • Array
  • Object

Importing semi-structured data

Consider the following points to understand the types of nested data that can be imported and how it can be used in Blueshift.

1. Simple objects (JSON with no hierarchy or nesting)

You can import simple JSON data that does not have any hierarchy or nesting. This data can be used for Segmentation and Trigger filters.

{
  "Key1": "Value1",
  "Key2": "Value2",
  "Key3": "Value3"
}
For example:

{
  "House_number": 2,
  "Street_name": "Woodbridge Road",
  "Locality": "Kesgrave",
  "City": "Ipswich"
}

2. Array of primitive data types

You can import arrays of primitive data types. This data can be used for Segmentation and Trigger filters.

[
  "Value1",
  "Value2",
  "Value3"
]
For example:

[
  "Table",
  "Chair",
  "Monitor"
]

3. Array of objects

When you import an array of objects, the array is flattened and the data is stored at each key level. If the same key is present in each object, the key is stored with multiple values. As a result, segmentation or trigger filter queries might not return accurate results.
Consider the following example:

[
  {
    "Key1": "Value1",
    "Key2": "Value2",
    "Key3": "Value3"
  },
  {
    "Key1": "Value4",
    "Key2": "Value5",
    "Key3": "Value6"
  }
]
For example:

[
  {
    "House_number": 2,
    "Street_name": "Woodbridge Road",
    "City": "Princeton",
    "State": "New Jersey"
  },
  {
    "House_number": 36,
    "Street_name": "Benedict Avenue",
    "City": "White Plains",
    "State": "New York"
  }
]

In this example, when the array is flattened, the data is stored as follows:

House_number: {2, 36}
Street_name: {Woodbridge Road, Benedict Avenue}
City: {Princeton, White Plains}
State: {New Jersey, New York}

During segmentation, if you run a query for "Street_name=Woodbridge Road AND City=White Plains", a match is returned even though no single object in the array contains both values.
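The flattening behavior described above can be sketched in Python. This is an assumed model of the behavior, not Blueshift's actual code: each key collects the values from every object, so an AND filter can match across objects.

```python
# Hypothetical sketch: flatten an array of objects into per-key value
# sets, illustrating why an AND filter can match across objects.
def flatten(objects):
    flattened = {}
    for obj in objects:
        for key, value in obj.items():
            flattened.setdefault(key, set()).add(value)
    return flattened

addresses = [
    {"House_number": 2, "Street_name": "Woodbridge Road",
     "City": "Princeton", "State": "New Jersey"},
    {"House_number": 36, "Street_name": "Benedict Avenue",
     "City": "White Plains", "State": "New York"},
]

flat = flatten(addresses)
# The flattened data matches "Street_name=Woodbridge Road AND
# City=White Plains", yet no single address contains both values.
print("Woodbridge Road" in flat["Street_name"] and "White Plains" in flat["City"])  # True
print(any(a["Street_name"] == "Woodbridge Road" and a["City"] == "White Plains"
          for a in addresses))  # False
```

The mismatch between the two results is exactly the inaccuracy described above: the flattened store loses track of which values came from the same object.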

Note: You can make each object exclusively referenceable by declaring the target datatype as Nested Object. Contact support@blueshift.com for more information.

4. Nested Objects

When you import nested objects, each key is stored as one node. Objects with one level of nesting can be used in segmentation and trigger filters.

Objects with more than one level of nesting are not accessible in segmentation or trigger filter conditions.

{
  "Outer Key1": {
    "Inner Key1": "Inner Value 1",
    "Inner Key2": "Inner Value 2"
  },
  "Outer Key2": [
    {
      "Inner Key3": "Inner Value 3",
      "Inner Key4": "Inner Value 4"
    },
    {
      "Inner Key3": "Inner Value 5",
      "Inner Key4": "Inner Value 6"
    }
  ],
  "Outer Key3": "Outer Value 1"
}
For example:

[
  {
    "Address": {
      "House_number": 2,
      "Street_name": "Woodbridge Road",
      "City": "Princeton",
      "State": "New Jersey"
    }
  },
  {
    "Address": {
      "House_number": 36,
      "Street_name": "Benedict Avenue",
      "City": "White Plains",
      "State": "New York"
    }
  }
]
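One way to picture the one-level limit is to flatten nested objects into dotted key paths. This is an assumed model for illustration, not Blueshift's actual storage; the function and the `"Address.City"` path style are hypothetical.

```python
# Hypothetical sketch: flatten nested dicts into dotted key paths.
# Keys one level deep (e.g. "Address.City") map to single values and
# can be pictured as queryable; deeper paths would not be accessible.
def flatten_paths(obj, prefix=""):
    """Return a dict mapping dotted key paths to their leaf values."""
    paths = {}
    for key, value in obj.items():
        path = f"{prefix}{key}"
        if isinstance(value, dict):
            paths.update(flatten_paths(value, prefix=f"{path}."))
        else:
            paths[path] = value
    return paths

record = {"Address": {"City": "Princeton", "State": "New Jersey"}}
paths = flatten_paths(record)
print(paths)  # {'Address.City': 'Princeton', 'Address.State': 'New Jersey'}
```

In this model, a path's depth is the number of dots in it: one dot means one level of nesting (usable), while two or more would fall outside segmentation and trigger filter conditions.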

Set up an import task

You can import catalogs, events, and customer data using Snowflake. The starting point for your import tasks depends on the type of data you are importing.

To set up a task to import data from Snowflake, complete the following steps:

  1. To import catalogs, go to the Setup menu and click Catalogs. Click +CATALOG.

  2. Select Snowflake as the Source.
  3. Add a Name for the task. The import task form opens.

  4. In the Destination section, you can see the type of data being imported as Products.
  5. Specify a name for the catalog.
  6. Set up Notification Preferences to send a notification email to the specified email addresses when there is a change in the status of the task or when a certain percentage of records fail during import.
  7. In the Source section, select the adapter that you want to use for the import task.

  8. For Import From, select either Table or View. If your data is spread across multiple tables, it is recommended that you provide a View.
  9. Select the Table or the View from which the data is to be imported.
  10. Sample data consisting of 10 records is fetched from the table or view specified in the Source section. This data is displayed in the Configuration section.

  11. Map the fields from the imported data to the fields in Blueshift and specify the data type for the field.
    • For catalog data: a column from the source data must be mapped to each of the following product attributes in Blueshift: item_id, item_title, item_url, main_image, and category.
    • The Source Attribute Name is the attribute in Snowflake and the Destination Attribute Name is the attribute in Blueshift.

  12. If the sample data does not contain all the available fields, use Add more fields to add them to the data mapping.
  13. Specify additional mapping criteria as required. For catalogs you must Map Item Category and Map Item Tags.

  14. Click Test Run to test the mapping. A maximum of 10 records are fetched during this test run.

  15. Verify that the data mapping is done correctly. Edit the data mapping if required. Click Test Run again after you make the changes.
  16. For Additional Configurations, select the Type of Import.
    • Select Full Import if you are importing bulk data. For a Full import, the entire data from the selected table or view is imported from Snowflake every time you run the import task.
    • Select Incremental Import to set up an incremental import task. Select the Diff Identification and the Diff Identifier.

  17. In the Schedule section, set the Start Time for the import task.

  18. To set up a recurring task, under Schedule select the Is it a recurring data import? option.
  19. Set the frequency using the Task executes every field. You can set the frequency in minutes, hours, days, weeks, or months. You cannot set an import frequency shorter than a day for a Full import.
  20. Click Save to save the task.
  21. Click Launch to run the import task.
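The Full vs. Incremental choice in step 16 can be sketched as follows. This is a hypothetical model for illustration only: the `updated_at` diff column, the checkpoint argument, and the function name are assumptions, not part of Blueshift's or Snowflake's API.

```python
from datetime import datetime

def select_rows_for_import(rows, import_type, last_run=None, diff_column="updated_at"):
    """Return the rows an import run would fetch (hypothetical model).

    Full import: every row from the table or view, on every run.
    Incremental import: only rows whose diff identifier column changed
    since the last run (checkpoint supplied by the caller).
    """
    if import_type == "full" or last_run is None:
        return rows
    return [row for row in rows if row[diff_column] > last_run]

rows = [
    {"item_id": "sku-1", "updated_at": datetime(2024, 1, 1)},
    {"item_id": "sku-2", "updated_at": datetime(2024, 3, 1)},
]
print(len(select_rows_for_import(rows, "full")))                               # 2
print(len(select_rows_for_import(rows, "incremental", datetime(2024, 2, 1))))  # 1
```

This also shows why a Full import cannot run more often than daily: it always re-fetches the entire table or view, whereas an incremental run touches only the changed rows.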

Import task status

The index page for catalog import indicates the status for the catalog import task as either Draft, Launched, Paused, or Completed. For more information, see View Catalog Status.

Additional Information

Watch this video for a quick overview of setting up an import task.
