Export reports and data into S3

Blueshift uses Amazon S3 buckets to store data, reports (campaign activity, segment, template), audit trail logs, and so on. You can import this data into your own custom workflows or data warehouse applications. If you don't want to use the S3 bucket provided by Blueshift and would rather use your own Amazon S3 bucket, you must first set up the Amazon S3 integration.

Note: You must set up a CORS configuration if you want to put data into the S3 bucket or fetch data from it.
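As an illustration, a minimal JSON CORS configuration for your own bucket might look like the following. The origin shown is a placeholder; restrict AllowedOrigins and AllowedMethods to the domains and operations that actually need access:

```json
[
  {
    "AllowedHeaders": ["*"],
    "AllowedMethods": ["GET", "PUT", "POST"],
    "AllowedOrigins": ["https://app.example.com"],
    "ExposeHeaders": []
  }
]
```

You can apply this in the Amazon S3 console under the bucket's Permissions tab, or with the `put-bucket-cors` API.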

Access S3 Folders

You can find the S3 bucket details and credentials on the Account Settings > API keys tab. Users with the account Admin role can access the S3 folder credentials from the Account Settings page.

S3 bucket: bsft-customers/<site_name>/


You can use a file transfer client that supports Amazon S3 to access the S3 folders. Examples include Cyberduck, Transmit, FileZilla, and WinSCP.
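You can also access the folders programmatically. The sketch below, which assumes the bucket layout shown above (the site name `acme` is a hypothetical example), builds a key prefix and shows, in comments, how you might list objects with an S3 SDK such as boto3 using the credentials from Account Settings > API keys:

```python
BUCKET = "bsft-customers"

def s3_prefix(site_name: str, folder: str = "") -> str:
    """Build the key prefix for a folder inside the bsft-customers bucket."""
    prefix = f"{site_name}/"
    if folder:
        prefix += f"{folder}/"
    return prefix

# Example: the campaign_activity folder for a hypothetical site "acme"
print(s3_prefix("acme", "campaign_activity"))  # acme/campaign_activity/

# With boto3 (not run here; requires the S3 credentials from your account):
#   import boto3
#   s3 = boto3.client("s3", aws_access_key_id=KEY, aws_secret_access_key=SECRET)
#   resp = s3.list_objects_v2(Bucket=BUCKET, Prefix=s3_prefix("acme"))
#   for obj in resp.get("Contents", []):
#       print(obj["Key"])
```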


Blueshift Data in S3 Folders

The following table lists the folders in the S3 bucket and the information available in each folder.

Folder Description

Audit trail records, generated once a day as CSV files in .gz format.

automatic_campaign_reports Daily campaign detail summary statistics at the trigger or experiment level.

campaign_actions_dump Campaign activity data, which can also be accessed by using webhooks. Error data for any errors that occur when you access this folder using webhooks is stored in a sub-folder. Generated daily.

campaign_activity User campaign activity reports. Blueshift captures and archives all campaign activity (sends, clicks, opens, delivered, and so on) to your S3 bucket. The data is batched and written to the folder every 5 minutes.

Path: <site_name>/campaign_activity/<year>/<month>/<day>

campaign_reports Daily exports of statistics at the campaign level. This folder is no longer recommended; use automatic_campaign_reports going forward.

campaigns Execution statistics, rendered templates for debugging/test campaigns, precomputed campaign segments, holdout users for automatic winner selection, data to be exported for segment reports, and so on.

Catalogs exported from Blueshift. The catalog is exported in CSV format and the filename is <catalog name>_<date>.csv. If all catalogs are exported, the filename is export_catalog_<date>.csv.

catalogs Catalog file imports and error reports (in a sub-folder).

computed_events Blueshift-generated events such as price_change, back_in_stock, and so on.

Click reports for creatives. The click_reports sub-folder contains click reports by date; the raw_clicks sub-folder contains raw click data.

custom_lists Any custom list files that are imported.

Any user import files that are processed. The user_updates sub-folder contains logs for user updates made using APIs.

Blueshift-generated derived events, for example, price drop, back in stock, and so on.

errors Backend processing errors that might occur during user processing.

events Any events that are sent to Blueshift using file upload or the API.

file_upload Processed file uploads for recommendations or events.

import Uploaded file imports for recommendations or events.

on_demand_campaign_reports Campaign reports that are exported from the dashboard on demand.

promotions File imports related to promotions.

segment Data for segments that are exported.

segment_export Scheduled segment exports.

Files uploaded to various syndication destinations. Also contains intermediate files that are aggregated by syndications.

useractions Raw user campaign interaction data.
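Putting the table to work: the sketch below builds a dated key prefix for the campaign_activity folder and parses one of the gzipped CSV exports. The zero-padded month/day format and the column names are assumptions for illustration; check your own bucket for the exact layout:

```python
import csv
import gzip
import io
from datetime import date

def campaign_activity_prefix(site_name: str, day: date) -> str:
    """Key prefix for one day of campaign activity exports.

    Assumes zero-padded <month>/<day> path segments; verify against
    the actual keys in your bucket.
    """
    return f"{site_name}/campaign_activity/{day.year}/{day.month:02d}/{day.day:02d}"

def read_gz_csv(raw: bytes) -> list[dict]:
    """Decompress a .csv.gz payload and parse its rows into dicts."""
    with gzip.open(io.BytesIO(raw), mode="rt", newline="") as fh:
        return list(csv.DictReader(fh))

# Demo with synthetic data; real files are downloaded from the folders above.
# The columns here are hypothetical.
payload = gzip.compress(b"email,event\nuser@example.com,open\n")
print(campaign_activity_prefix("acme", date(2024, 3, 7)))
# acme/campaign_activity/2024/03/07
print(read_gz_csv(payload))
# [{'email': 'user@example.com', 'event': 'open'}]
```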

