You can set up a one-time or recurring import from the Amazon S3 bucket provided by Blueshift or from your own Amazon S3 bucket.
Data Format
After you prepare a CSV file that contains the customer data, upload it to the Amazon S3 bucket. You can then import it into Blueshift by creating an import task.
Blueshift supports the following data types:
- String
- Integer
- Date
- Decimal
- Boolean
Note: Attributes with the Boolean data type cannot be set to a NULL or blank value and must be set to either true or false. If you require an attribute to allow blank values, use the String data type instead of the Boolean data type.
Prepare a .csv file with the customer data and include the user attributes to import. Here are some suggested attributes, but feel free to add additional ones as well.
When you create a task, you can click Download Sample CSV to get the sample data file.
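For example, a minimal file with a few of these attributes might look like the following. The rows and values here are purely illustrative, not part of the required schema:

```
email,customer_id,phone_number,unsubscribed,joined_at
jane@example.com,1001,+14155550123,false,2023-04-01T09:30:00-04:00
joe@example.com,1002,+14155550124,true,2023-01-15T14:00:00-05:00
```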
| Attribute | Description |
| --- | --- |
| phone_number | Cannot have more than 64 characters. |
| gender | Must be one of the allowed values. |
| unsubscribed | Value can be true or false and must be in lowercase. |
| joined_at | You can also format the date in Excel/Numbers using a custom format to convert the date to the ISO 8601 format yyyy-mm-ddThh:mm:ss+/-00:00. Replace +/-00:00 with the offset for your time zone. |
| Any dates | Dates can be converted in Excel by using a function to add hours (=A2+TIME(8,0,0)) or subtract hours (=A2-TIME(8,0,0)) so that the time is represented in UTC. Replace A2 with the cell of the date you are converting. Note that this approach is suitable for one-time uploads only, because it does not account for daylight saving time in recurring file uploads. For example, use yyyy-mm-ddThh:mm:ss-05:00 for the Eastern (EST) time zone; =A2+TIME(5,0,0) adds 5 hours to convert EST to UTC. |
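If you prefer to script the conversion rather than use Excel, a small Python sketch like the following produces ISO 8601 timestamps with the correct offset and is DST-aware, so it also suits recurring uploads. The input format and the America/New_York zone are assumptions for illustration:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

# Assumption: source timestamps are naive local Eastern time,
# e.g. "2023-04-01 09:30:00".
LOCAL_ZONE = ZoneInfo("America/New_York")

def to_iso8601(local_timestamp: str) -> str:
    """Attach the local zone (DST-aware) and format as ISO 8601 with offset."""
    naive = datetime.strptime(local_timestamp, "%Y-%m-%d %H:%M:%S")
    aware = naive.replace(tzinfo=LOCAL_ZONE)
    return aware.isoformat()

print(to_iso8601("2023-04-01 09:30:00"))  # 2023-04-01T09:30:00-04:00 (EDT)
print(to_iso8601("2023-01-15 09:30:00"))  # 2023-01-15T09:30:00-05:00 (EST)
```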
File size
Files of up to 3 GB can be uploaded via the dashboard. For larger files, coordinate with your Blueshift Success Manager to determine the optimal process.
Upload CSV file to an S3 bucket
You can upload the customer data CSV file to the S3 bucket provided by Blueshift (using the default adapter Blueshift Import) or to your custom S3 bucket (using the Custom Import adapter).
If you are using the Blueshift Import option:
- Add data files to the folder bsft-customers/<site-name>/customer_attributes (a sample upload is sketched after this list).
- You can obtain the S3 credentials from the Account Settings > API Keys tab. The credentials are visible only to users with an Admin role.
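As a rough sketch (not an official Blueshift utility), uploading a file with boto3 might look like the following. The site name demosite.com and the file name crm_feeds.csv are taken from the examples in this article, and the placeholder keys must be replaced with the credentials from the API Keys tab:

```python
import boto3

# Assumption: access keys copied from Account Settings > API Keys.
s3 = boto3.client(
    "s3",
    aws_access_key_id="YOUR_ACCESS_KEY_ID",
    aws_secret_access_key="YOUR_SECRET_ACCESS_KEY",
)

# Upload the CSV into the folder Blueshift provides for customer attributes.
s3.upload_file(
    Filename="crm_feeds.csv",
    Bucket="bsft-customers",
    Key="demosite.com/customer_attributes/crm_feeds.csv",
)
```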
If you are using the Custom Import option:
- Ensure that your S3 bucket has a CORS configuration (a sample is sketched after this list).
- Provide your own S3 credentials and S3 file path. For example,
bucket: bsft-customers
path: demosite.com/customer_attributes/import/crm_feeds.csv
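A CORS configuration can be applied with boto3's put_bucket_cors, as in this minimal sketch. The bucket name, origins, and methods shown are assumptions; adapt the rules to your own security policy:

```python
import boto3

# Assumption: credentials come from the ambient AWS configuration
# (environment variables, shared credentials file, or an IAM role).
s3 = boto3.client("s3")

# A simple rule allowing read requests; tighten origins and methods
# to match your security requirements.
cors_configuration = {
    "CORSRules": [
        {
            "AllowedOrigins": ["*"],
            "AllowedMethods": ["GET", "HEAD"],
            "AllowedHeaders": ["*"],
            "MaxAgeSeconds": 3000,
        }
    ]
}

s3.put_bucket_cors(Bucket="bsft-customers", CORSConfiguration=cors_configuration)
```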
Create an import task
Complete the following steps to set up an import task:
- To import customer data, go to Customer Data > Attributes in the left navigation. Click +CUSTOMERS > Import Customers.
- Select Upload via S3 Bucket as the Source.
- On the Customer CSV Upload page, specify the Customer Attribute Feed Name. This field is used for identification purposes only.
- Specify an email address in the Email Upload Summary section to receive a notification when the upload process is completed. If there are errors, an error report is also sent. In the CC field, specify the email addresses of other people who should receive the notification.
- For the Adapter, select Blueshift Import if you are using the S3 bucket provided by Blueshift. Select Custom Import to use your own S3 location.
- If you select Blueshift Import, specify the S3 Path where you upload the customer data files. For example, you can create a location on the S3 bucket called /customer-imports/blueshiftreads, and upload files to this path.
- If you select Custom Import, provide the access credentials for the S3 bucket. For more information, see Working with Amazon S3 buckets in the AWS documentation.
- Access Key: The Access Key ID that we use to get files that contain your customers' data from your S3 bucket.
- Secret Access Key: The Secret Access Key that we use to get the files from your S3 bucket.
For more information on the Access Key ID and Secret Access Key, see Programmatic access in the AWS documentation.
- Bucket: Provide the URL of the S3 bucket where you store the files.
- AWS Region: Specify the AWS region where the S3 bucket resides.
- Path: Specify the path on the S3 bucket where you upload the files with your customers' data.
- Set the Update Frequency. You can select Only once, Weekly, Daily, or Hourly.
- Specify the Update Time. You can upload a file Immediately or set a date and time for the upload.
- For recurring uploads, specify an End date and time.
- Click Advanced to specify the character encoding of the CSV file you upload. You can select Unicode (UTF-8), Unicode (UTF-16), or Western Europe (ISO-8859-1).
- Set whether to Update All Matching Profiles and to Allow Blank User Attributes.
Note: Attributes with the Boolean data type cannot be set to a NULL or blank value and must be set to either true or false.
- Click Fetch Sample S3 File to fetch a sample CSV file from the location that you specified in the Adapter field.
- Map the fields from the imported data to the fields in Blueshift and specify the data type for the field.
- For customer data, one column must be mapped to one of the following customer identifiers in Blueshift: email, retailer_customer_id, customer_id, or device_id.
- The Column is the attribute in the imported data and the Customer Attribute is the attribute in Blueshift.
- Click Import Customers to run the import task.
You can view the task status from the View Uploads page.