Before you go live with our platform, we ask that you do a one-time upload of your user data. Going forward, all user updates (new and existing) will occur through the events we receive.

Prepare a .csv file with as many user attributes as you would like to import. Here are some suggested attributes, but feel free to add additional ones as well.



Data Format

  • Ensure that the phone_number:
    • includes the country code
    • starts with a +
    • follows the E.164 standard, contains 6 to 14 digits, and does not start with a 0
  • An email must not have more than 64 characters
  • gender should be either male/female/unknown  
  • unsubscribed is true/false (lower case)
  • joined_at must be in ISO8601 format (for example, 1980-05-19T15:22:01-07:00). You can also use a custom format in Excel/Numbers to convert to ISO8601: yyyy-mm-ddThh:mm:ss+/-00:00. Replace +/-00:00 with the offset for your time zone. If you do not include joined_at during the upload process, we infer the upload date as the joined_at date.
  • Dates can be converted to UTC in Excel by using a function to add (=A2+TIME(8,0,0)) or subtract (=A2-TIME(8,0,0)) hours. Replace A2 with the cell of the date you are converting. Please note that this approach suits one-time uploads only, because it does not account for daylight saving time in ongoing file uploads. Example: yyyy-mm-ddThh:mm:ss-05:00 for the Eastern (EST) time zone.
    • For example, =A2+TIME(5,0,0) adds 5 hours to the time, converting EST to UTC.
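As a quick sanity check before uploading, the rules above can be sketched in Python. This is a minimal sketch, not an official validator: the field names match the attributes listed above, and the E.164 regex is our own encoding of the description (leading +, no leading 0, 6 to 14 digits).

```python
import re
from datetime import datetime

# Rough validators for the suggested attributes. The regex below is an
# assumption based on the E.164 rules above: leading +, first digit 1-9,
# 6 to 14 digits in total.
E164 = re.compile(r"^\+[1-9]\d{5,13}$")

def validate_row(row: dict) -> list:
    """Return a list of problems found in one CSV row (empty list = OK)."""
    problems = []
    if not E164.match(row.get("phone_number", "")):
        problems.append("phone_number is not E.164")
    if len(row.get("email", "")) > 64:
        problems.append("email exceeds 64 characters")
    if row.get("gender") not in ("male", "female", "unknown"):
        problems.append("gender must be male/female/unknown")
    if row.get("unsubscribed") not in ("true", "false"):
        problems.append("unsubscribed must be lower-case true/false")
    joined_at = row.get("joined_at")
    if joined_at:  # optional: the upload date is inferred when missing
        try:
            datetime.fromisoformat(joined_at)
        except ValueError:
            problems.append("joined_at is not ISO8601")
    return problems
```

Running `validate_row` over each row of your CSV before upload can catch formatting issues early, instead of in the post-upload error report.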

Upload through the Blueshift web app

After you prepare a CSV that contains your customers' data, you can upload it to our platform. To upload the data, navigate to the Customer Data > Attributes page using the left navigation panel. The Customer Attributes page appears. You can also set up recurring customer attribute upload jobs through our dashboard. On the Customer Attributes upload page, choose S3 Upload as the Upload method. The default adapter is Blueshift Import. You can also choose Custom Import and provide your own S3 credentials and S3 file path. With the default adapter, you can drop files into the folder bsft-customers/<site-name>/customer_attributes.

Sample S3 path

bucket: bsft-customers

You can find the S3 credentials in your Account profile tab. The S3 credentials are visible only to Admins.
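With the default Blueshift Import adapter, an upload could look like the following sketch. The site name "acme" and the file name "customers.csv" are hypothetical placeholders; substitute your own site name (from Account Settings) and the S3 credentials from your Account profile tab.

```python
# Sketch: upload a customers CSV to the default Blueshift Import location.
# The bucket and folder layout follow bsft-customers/<site-name>/customer_attributes.
BUCKET = "bsft-customers"

def customer_attributes_key(site_name: str, filename: str) -> str:
    """Build the object key under bsft-customers/<site-name>/customer_attributes."""
    return f"{site_name}/customer_attributes/{filename}"

if __name__ == "__main__":
    import boto3  # assumption: boto3 is installed (pip install boto3)
    s3 = boto3.client(
        "s3",
        aws_access_key_id="YOUR_ACCESS_KEY",      # from your Account profile tab
        aws_secret_access_key="YOUR_SECRET_KEY",  # visible to Admins only
    )
    # "acme" and "customers.csv" are hypothetical placeholders.
    s3.upload_file("customers.csv", BUCKET, customer_attributes_key("acme", "customers.csv"))
```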

Below is a screenshot with a sample configuration. You can click on Fetch Sample S3 File to get some sample data and add attribute mappings.



Perform the following steps on the Customer Attributes page:

  1. On the Customer Attributes page, click + CUSTOMERS > Upload CSV.
  2. On the Customer CSV Upload page, provide the following details:

    Import method 

    You can choose either of the two:

    • File upload: Choose this option if you want to upload a CSV file once. 
    • S3 upload: Choose this option if you want to set up a feed, and want our platform to pick up the CSV files automatically from an S3 location.
    Customer Attribute Feed Name (if the import method is S3 uploads)

    Specify a name for the feed. This field is used for identification purposes only. 

    Adapter (if the import method is S3 uploads)

    You can either choose to upload files to a location that we provide or you can choose an S3 location of your choice and provide us its details. Ensure that your S3 bucket has a CORS configuration. If you plan to use the S3 bucket that we provide, choose Blueshift Import. If you choose Blueshift Import, here are the additional details that we need:

      • S3 Path: Specify the path where you upload files with your customers' data. For example, you can create a location on the S3 bucket called /customer-imports/blueshiftreads, and upload files to this path. Specify this path here. You can get more information about the S3 bucket that we provide on the Account Settings page of the Blueshift app.

    If you plan to use your own S3 location, choose Custom Import. Options to provide additional details appear if you choose Custom Import. Here are the additional details that we need:

    • Access Key: The Access Key ID that we use to get files that contain your customers' data from your S3 bucket. 
    • Secret Access Key: The Secret Access Key that we use to get the files from your S3 bucket.
      For more information on the Access Key ID and Secret Access Key, see Programmatic access in the AWS documentation.
    • Bucket: Provide the URL to your S3 bucket where you store the files. 
    • AWS Region: Specify the AWS region where the S3 bucket resides. 
    • Path: Specify the path on the S3 bucket where you upload the files with your customers' data. 

    For more information, see Working with Amazon S3 buckets on AWS documentation.
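    Taken together, the Custom Import details above amount to a small configuration. The sketch below mirrors the form fields; every value is a hypothetical placeholder, not a real credential, bucket, or path.

```python
# Sketch of the Custom Import details as a config mapping. All values
# below are hypothetical placeholders for illustration only.
custom_import = {
    "access_key": "YOUR_ACCESS_KEY_ID",       # Access Key ID we use to read your bucket
    "secret_access_key": "YOUR_SECRET_KEY",   # matching Secret Access Key
    "bucket": "my-company-exports",           # your S3 bucket
    "aws_region": "us-east-1",                # region where the bucket resides
    "path": "blueshift/customer-imports",     # folder that holds the CSV files
}
```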

    Update Frequency (if the import method is S3 uploads)

    If you choose S3 uploads as the import method, you can specify how often we should pick up your customers' data and import it. For example, if you upload files to the S3 location every week, we can pick up the file weekly. Your options are:

    • Only once
    • Weekly
    • Daily
    • Hourly

    In addition, you can specify Update Time. You can use this option to let our platform know whether:

    • It should import files from the S3 location instantly
    • Or it should start the import at a date and time that you specify (for weekly, daily, and hourly imports). In addition, you can also specify when our platform should stop importing the files based on the setting that you specify in the End option. 

    You can click Advanced and specify the character encoding of the CSV file that you upload. You can choose Unicode (UTF-8), Unicode (UTF-16), or Western Europe (ISO-8859-1).
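    Before uploading, you can confirm that a file decodes cleanly in one of the supported encodings. A minimal Python sketch (the encoding names and the fail-fast check are our own convention):

```python
import csv

# The three encodings the upload form supports, per the Advanced option above.
SUPPORTED_ENCODINGS = ("utf-8", "utf-16", "iso-8859-1")

def read_rows(path: str, encoding: str = "utf-8"):
    """Yield CSV rows as dicts, failing fast on an unsupported encoding.

    A UnicodeDecodeError while iterating means the file does not actually
    use the encoding you selected.
    """
    if encoding not in SUPPORTED_ENCODINGS:
        raise ValueError(f"unsupported encoding: {encoding}")
    with open(path, newline="", encoding=encoding) as f:
        yield from csv.DictReader(f)
```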

    Fetch Sample S3 File (if the import method is S3 uploads)

    Use this button to let our platform fetch a sample CSV file from the location that you specify in the Adapter field. 

    Download Sample CSV

    You can download a sample CSV and review its format to understand the type of file our platform can ingest. 

  3. After you fetch a CSV file from the S3 location or upload a CSV file, the Map Event Attributes section is enabled. It provides options to map the data from your file to the data on our platform. The following section describes how you can use it. Specify an email address under the Email Upload Summary section to receive a notification when the upload process completes. If there are errors, we send an error report as well. You can send a copy of the notification to multiple people using the cc field.

File size 

Files up to 3 GB can be uploaded via the dashboard. For larger files, coordinate with your Blueshift Success Manager to determine the optimal process.

Map columns from .csv to standard Blueshift columns

For custom columns, create a new column in Blueshift by selecting Custom Attributes in the drop-down and providing a column name. Make sure to select the correct data type.


Data Type

We support the String, Integer, Date, Decimal, and Boolean data types.
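When choosing a data type for a custom column, it can help to check that every value in the column parses as that type. A rough sketch: the type names mirror the list above, but the parsing rules themselves are our own assumptions.

```python
from datetime import datetime
from decimal import Decimal, InvalidOperation

# Rough parsers for the supported data types. Accepting a value means the
# column could plausibly use that type; the exact rules are assumptions.
def parses_as(value: str, data_type: str) -> bool:
    try:
        if data_type == "Integer":
            int(value)
        elif data_type == "Decimal":
            Decimal(value)
        elif data_type == "Date":
            datetime.fromisoformat(value)  # ISO8601, as used for joined_at
        elif data_type == "Boolean":
            return value in ("true", "false")
        elif data_type == "String":
            return True  # any CSV value is a valid string
        else:
            return False
        return True
    except (ValueError, InvalidOperation):
        return False
```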

Import File

You will receive an email notification when the upload process completes, including an error report if there are errors. You can also cc multiple people within and outside your organization on customer or event data imports.



Validate Import

You can validate your import by searching for users from your file. Go to the Customer Attributes page and search for a sample user.

  • On the user uploads dashboard, the status column reflects the last upload. If the most recent upload finished with no errors, we mark it as success.
  • If the most recent upload finished but has some errors, we mark it as completed.


