You can import events data into Blueshift by uploading a file to an Amazon S3 bucket and setting up an import task that loads the file into Blueshift. You can use the Amazon S3 bucket provided by Blueshift or your own custom Amazon S3 bucket.
The Events CSV file
Create a CSV file that includes a user identifier (email or customer_id) and the event attributes that you want to import. For more information, see Events Data Format.
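As a sketch, the snippet below writes such a file with Python's csv module. The customer_id identifier matches the article; the other column names (event, product_id, revenue) are illustrative assumptions, so substitute the attributes defined in Events Data Format.

```python
import csv

# Illustrative rows: one identifier column plus assumed event attributes.
rows = [
    {"customer_id": "cust_001", "event": "purchase", "product_id": "sku_42", "revenue": "19.99"},
    {"customer_id": "cust_002", "event": "add_to_cart", "product_id": "sku_7", "revenue": ""},
]

with open("events_import.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["customer_id", "event", "product_id", "revenue"])
    writer.writeheader()
    writer.writerows(rows)
```

The resulting file has a header row followed by one line per event, which is the shape the import task expects to map against Blueshift's fields.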
Upload CSV file to an S3 bucket
You can upload the events data CSV file to the S3 bucket provided by Blueshift (using the default Blueshift Import adapter) or to your custom S3 bucket (using the Custom Import adapter).
If you are using the Blueshift Import option:
- Add data files to the folder bsft-customers/<site-name>/import/event-imports/
- You can obtain the S3 credentials from the Account Settings > API Keys tab. The credentials are visible only to users with an Admin role.
If you are using the Custom Import option:
- Ensure that your S3 bucket has a CORS configuration.
- Provide your own S3 credentials and S3 file path.
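For the CORS requirement above, one possible shape of the bucket's CORS rules is sketched below as the dictionary you would pass to boto3's put_bucket_cors. The allowed origin, methods, and bucket name are assumptions, not Blueshift-confirmed values; check your account's requirements before applying anything.

```python
import json

# Hypothetical minimal CORS rule set for a custom import bucket.
cors_configuration = {
    "CORSRules": [
        {
            "AllowedMethods": ["GET", "HEAD"],        # assumption: read access is enough
            "AllowedOrigins": ["https://app.getblueshift.com"],  # assumption: Blueshift app origin
            "AllowedHeaders": ["*"],
            "MaxAgeSeconds": 3000,
        }
    ]
}

print(json.dumps(cors_configuration, indent=2))

# With boto3 installed and AWS credentials configured, this could be
# applied with (not executed here):
#   import boto3
#   boto3.client("s3").put_bucket_cors(
#       Bucket="your-bucket-name", CORSConfiguration=cors_configuration
#   )
```

The same rules can also be pasted into the bucket's CORS settings in the S3 console; see the AWS documentation on configuring CORS for the exact JSON format.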
Set up an import task
To import events data into Blueshift, complete the following steps:
- Go to the Setup menu and click Events. Click the Import Events icon and then click +Event Import.
- Select Upload via S3 bucket as the Source.
- Specify the Import Event. For example, you can select add to cart, purchase, or identify. The data for the selected event will be updated when the file is uploaded.
- Select the Trigger Campaigns option if you want the events data to be used to trigger any campaigns.
- Specify an email address in the Email Upload Summary section to receive a notification when the upload process is completed. If there are errors, an error report is also sent. In the CC field, specify the email addresses of other people who should receive the notification.
- For the Adapter, select Blueshift Import if you are using the S3 bucket provided by Blueshift. Select Custom Import to use your own S3 location.
- If you select Blueshift Import, specify the S3 Path where you upload the events data files. For example, you can create a location on the S3 bucket called /event-imports/blueshiftreads and upload files to this path.
- If you select Custom Import, provide the access credentials for the S3 bucket. For more information, see Working with Amazon S3 buckets in the AWS documentation.
- Access Key: The Access Key ID that we use to get the files that contain your events data from your S3 bucket.
- Secret Access Key: The Secret Access Key that we use to get the files from your S3 bucket.
For more information on the Access Key ID and Secret Access Key, see Programmatic access on AWS documentation.
- Bucket: Provide the URL to your S3 bucket where you store the files.
- AWS Region: Specify the AWS region where the S3 bucket resides.
- Path: Specify the path on the S3 bucket where you upload the files with your events data.
- Add the Update Frequency. You can select from Only once, Weekly, Daily, or Hourly.
- Specify the Update Time. You can upload a file Immediately or set a date and time for the upload.
- For recurring uploads, specify an End date and time.
- Click advanced to specify more details for the import task.
- Specify the character encoding for the CSV file you upload. You can choose Unicode (UTF-8), Unicode (UTF-16), or Western European (ISO-8859-1).
- Specify a Delimiter if required.
- Click Fetch Sample S3 File to fetch a sample CSV file from the location that you specified in the Adapter field.
- The specified CSV file is validated and the fields in the file are matched to the fields on the Blueshift platform. For more information about events that Blueshift can capture, see this document on our developer portal.
- Verify the mapping and map any fields that were not automatically mapped. You can create a new data type and specify its format.
- Click Import Events to run the import task. You can view the task status from the Events Import page.
- You will receive an email confirmation after the events data has been uploaded. The email includes information for both processed and failed records.
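Because failed records show up only in the post-upload report, it can save a cycle to sanity-check the file locally before scheduling the task. This hypothetical helper (not part of Blueshift) only verifies that every row carries one of the supported identifiers, email or customer_id:

```python
import csv

def has_identifier(path: str) -> bool:
    """Return True if every row has a non-empty email or customer_id."""
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            # row.get(...) returns "" when the column is absent entirely.
            if not (row.get("email", "").strip() or row.get("customer_id", "").strip()):
                return False
    return True
```

Rows without an identifier cannot be attributed to a user, so catching them before upload avoids waiting for the error report in the summary email.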