With the integration of Blueshift and Google BigQuery, you can easily import customer interaction data (events and transactions) into your Blueshift account and build a 360-degree view of all your customers. For information about data types and formats, see Import events data.

Prerequisites

Before you can import data from Google BigQuery, you must set up the integration of Blueshift with Google BigQuery and configure at least one adapter.

Set up an import task

You can import catalogs, events, and customer data using Google BigQuery. The starting point for your import tasks depends on the type of data you are importing.

To set up a task to import event data from Google BigQuery, complete the following steps:

  1. Go to Customer Data > Events in the left navigation. Click the Import Events icon and then click +Event Import.
  2. Select Google BigQuery as the Source.
  3. Add a Name for the task. The import task form opens.
  4. In the Destination section, you can see the type of data being imported as Events.
  5. Select the type of event from the Select Event list.
  6. Select the Trigger Campaigns option if you want the events data to be used to trigger campaigns.
  7. Set up Notification Preferences to send a notification email to the specified email addresses when there is a change in the status of the task or when a certain percentage of records fail during import.
  8. In the Source section, select the adapter that you want to use for the import task.
  9. For Import From, select either Table or View. If your data is spread across multiple tables, we recommend that you provide a View.
  10. Select the Table or the View from which the data is to be imported.
  11. Sample data consisting of 10 records is fetched from the table or view specified in the Source section. This data is displayed in the Configuration section.
  12. Map the fields from the imported data to the fields in Blueshift and specify the data type for each field.
    • If the event is a purchase, add_to_cart, wishlist, or view event, one column from the source data must be mapped to product_ids in Blueshift.
    • The Source Attribute Name is the attribute in Google BigQuery and the Destination Attribute Name is the attribute in Blueshift.
  13. If the sample data does not contain all the available fields, use Add more fields to add them to the data mapping.
  14. Specify any additional mapping requirements for events that include multiple product IDs.
  15. Click Test Run to test the mapping. A maximum of 10 records are fetched during this test run.
  16. Verify that the data mapping is done correctly. Edit the data mapping if required. Click Test Run again after you make the changes.
  17. For Additional Configurations, select the Type of Import.
    • Select Full Import if you are importing bulk data. For a Full import, all data from the selected table or view is imported from Google BigQuery every time you run the import task.
    • Select Incremental Import to set up an incremental import task. Select the Diff Identification and the Diff Identifier.
  18. In the Schedule section, set the Start Time for the import task.
  19. To set up a recurring task, under Schedule select the Is it a recurring data import? option.
  20. Set the frequency using the Task executes every field. You can set the frequency in minutes, hours, days, weeks, or months. You cannot set an import frequency shorter than a day for a Full import.
  21. Click Save to save the task.
  22. Click Launch to run the import task.
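The data-mapping step above (mapping source attribute names to Blueshift destination attributes, each with a declared data type) can be sketched in Python. Everything in this sketch — the attribute names, the FIELD_MAP structure, and the map_record helper — is illustrative only and is not part of Blueshift's or BigQuery's actual APIs:

```python
# Illustrative sketch of the data-mapping step: rename each source
# attribute to its Blueshift destination attribute and coerce the value
# to the declared data type. All names here are hypothetical examples.
FIELD_MAP = {
    # source attribute -> (destination attribute, data type)
    "product_sku": ("product_ids", str),   # purchase events must map a column to product_ids
    "order_total": ("revenue", float),
    "event_ts":    ("timestamp", str),
}

def map_record(source_row: dict) -> dict:
    """Return a record keyed by destination attribute names."""
    mapped = {}
    for src_name, (dest_name, cast) in FIELD_MAP.items():
        if src_name in source_row:
            mapped[dest_name] = cast(source_row[src_name])
    return mapped

row = {"product_sku": "SKU-123", "order_total": "59.90",
       "event_ts": "2024-07-16T19:25:14Z"}
print(map_record(row))
```

A Test Run against sample records is effectively a dry run of this mapping over up to 10 rows, which is why it is worth verifying before launching the task.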
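The difference between the two Type of Import options above can be sketched as follows. This is a hypothetical illustration of full versus incremental behavior, not Blueshift's implementation; the updated_at column used as the diff identifier is an assumed name:

```python
# Hypothetical sketch: a Full import re-reads every row each run, while
# an Incremental import keeps a watermark for the chosen diff identifier
# (here an assumed "updated_at" column) and fetches only newer rows.
def select_rows(rows, import_type, watermark=None, diff_identifier="updated_at"):
    if import_type == "full":
        # entire table/view, every run
        return list(rows)
    # incremental: only rows whose diff identifier passed the watermark
    return [r for r in rows if watermark is None or r[diff_identifier] > watermark]

rows = [
    {"id": 1, "updated_at": "2024-07-01"},
    {"id": 2, "updated_at": "2024-07-16"},
]
print(select_rows(rows, "incremental", watermark="2024-07-10"))
```

Because a Full import re-reads the entire table or view on every run, scheduling it more often than daily is disallowed, whereas an incremental task only touches rows changed since the last run.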

Import task status

The status of each Google BigQuery import task is shown on the Events index page, along with an overall status that indicates the current state of the task.
