Starting your recommendation feed import
You can start your feed import by navigating to the import recommendations overview page and selecting the appropriate import type.
You can import files using one of the following methods:
- Direct file upload – Upload a file directly from your local system.
- Upload via S3 bucket – Import files from an AWS S3 storage bucket.
- Upload via SFTP – Transfer files securely using an SFTP server.
Choose the appropriate method based on your data source and access preferences.
Left-side panel overview
- Source: Displays the data source for the import. Example: Google BigQuery.
- Destination: Specifies where the imported data will be directed; in this case, recommendation feeds.
- Select recommendation type: (mandatory field)
- User-to-product: Imports recommendations based on user identifiers and related product IDs, ideal for personalized promotions.
- Product-to-product: Suggests related products based on user interactions, such as viewed or purchased items.
- Recommendation feed name: (mandatory field) Enter a name for the feed (minimum of 3 characters).
- Description: Add details to describe the recommendation feed.
- Notification preferences:
- To: Specify email IDs to receive notifications about the import task.
- CC: Add additional recipients to be copied on notifications.
- Settings:
- Notify - any status: Sends notifications for all import task statuses.
- Notify - If more than [X]% of records fail: Sends alerts when failures exceed the defined threshold (e.g., 70%).
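The failure-threshold setting boils down to a simple percentage check. As a minimal sketch (the function name and signature are illustrative, not a Blueshift API):

```python
# Illustrative sketch of the "Notify - If more than [X]% of records fail"
# rule. The names here are hypothetical, not part of the Blueshift API.
def should_notify(failed: int, total: int, threshold_pct: float = 70.0) -> bool:
    """Return True when the failure rate exceeds the configured threshold."""
    if total == 0:
        return False
    return (failed / total) * 100 > threshold_pct

# Example: 8 of 10 records failed -> 80% exceeds a 70% threshold.
```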
Source configuration
Direct file upload
- Click Choose File, upload the CSV file, and click Continue to proceed.
SFTP upload
- Select an adapter: Choose a configured SFTP adapter.
- Enter file path: Provide the full file path, including the file name.
- Displayed information:
- Bucket: Name of the storage location.
- Host address: The SFTP server’s address.
- Destination path: Folder path on the SFTP server.
- Click Continue to proceed.
S3 bucket upload
- Select an adapter: Choose a configured S3 adapter.
- Enter file path: Provide the full file path, including the file name.
- Displayed information:
- Bucket: The S3 bucket name.
- S3 base path: Folder path on the S3 server.
- Click Continue to proceed.
A sample file is available for download to use as a reference.
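To illustrate the general shape of such a file, the snippet below writes a minimal user-to-product CSV. The column names (`customer_id`, `product_ids`) and the pipe-separated product list are assumptions for illustration; refer to the downloadable sample file for the exact schema.

```python
import csv
import io

# Hypothetical user-to-product recommendation rows; column names and the
# pipe-separated product list are illustrative, not the official schema.
rows = [
    {"customer_id": "u_1001", "product_ids": "sku_1|sku_2|sku_3"},
    {"customer_id": "u_1002", "product_ids": "sku_4|sku_5"},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["customer_id", "product_ids"])
writer.writeheader()
writer.writerows(rows)
content = buf.getvalue()
```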
Import file settings
- Format (Mandatory): Choose from CSV or JSONL.
- Encoding (Mandatory): Options include:
- Unicode (UTF-8, UTF-16, UTF-16 Little Endian, UTF-16 Big Endian)
- Western Europe (ISO-8859-1).
- Delimiter (Mandatory): Options include:
- Comma (,), Tabs (\t), Pipe (|), Semicolon (;)
- Record Separator (ASCII 30), Unit Separator (ASCII 31), Others (customizable).
- Click Continue to proceed.
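The less common delimiters are plain control characters: Record Separator is ASCII 30 (`\x1e`) and Unit Separator is ASCII 31 (`\x1f`). A quick sketch of parsing such a file (the data and column names are illustrative):

```python
import csv
import io

# Unit Separator (ASCII 31) as the field delimiter; data is illustrative.
raw = "customer_id\x1fproduct_ids\nu_1\x1fsku_1\n"
reader = csv.reader(io.StringIO(raw), delimiter="\x1f")
records = list(reader)
# First row is the header, subsequent rows are data records.
```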
Data configuration & validation
- Sample Data: Displays 10 records fetched from the source to assist in field mapping.
- Field Mapping:
- Map the Source Attribute Name to the corresponding Destination Attribute Name and specify the Destination Data Type.
- Mapping Instructions:
- Only columns mapped to a destination attribute will be imported.
- Map floating-point numeric data types from the source to Decimal in Blueshift.
- Example: For events like purchase, add_to_cart, wishlist, or view, map a column to product_ids (required for events interacting with product catalogs).
- Ensure one column is mapped to a customer identifier, such as customer_id, email, my_custom_id, cookie, device_id, or my_external_id.
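The mapping rules above can be sketched as a small transformation: only mapped columns survive, and floating-point values are converted to decimals. Everything here (the mapping table, column names, converters) is hypothetical, not the Blueshift implementation:

```python
from decimal import Decimal

# Hypothetical field map: source column -> (destination attribute, converter).
FIELD_MAP = {
    "user": ("customer_id", str),
    "items": ("product_ids", lambda v: v.split("|")),
    "score": ("score", Decimal),  # floating-point source -> Decimal
}

def map_record(source_row: dict) -> dict:
    """Keep only mapped columns, renamed and converted per FIELD_MAP."""
    return {
        dest: convert(source_row[src])
        for src, (dest, convert) in FIELD_MAP.items()
        if src in source_row
    }

row = {"user": "u_1", "items": "sku_1|sku_2", "score": "0.87", "extra": "x"}
mapped = map_record(row)
# The unmapped "extra" column is dropped from the output.
```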
Data Quality Check:
- Use Check Data Quality to validate field mappings.
- View data quality percentages, errors, and hints for corrections in the modal.
- Adjust mappings as needed.
Test Run:
- Click Test Run to validate with up to 10 records.
- The modal displays source data alongside mapped JSON.
- Update mappings and re-run tests if necessary.
- Next Steps: Address all validation issues and click Continue to proceed with the import.
Scheduling and launching the import task
For a direct CSV file upload, select Import immediately to begin the import right away, or select a start date and time for the import.
For S3 and SFTP:
- Select the Start Date using the date picker.
- Check 'Is it a recurring data import?' to enable recurring imports.
- Choose when the task ends:
- 'Never' for an indefinite schedule.
- 'At some time' to set an End Date.
- Set the execution frequency (e.g., every 15 minutes).
- Scheduling options: Minutes, Hourly, Daily, Weekly, and Monthly.
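The recurrence options compose a start time with a fixed interval. As a purely illustrative sketch of an "every 15 minutes" schedule (not Blueshift's scheduler):

```python
from datetime import datetime, timedelta

# Illustrative recurrence: start time plus a fixed interval, repeated.
def next_runs(start: datetime, every: timedelta, count: int) -> list:
    """Return the first `count` scheduled run times."""
    return [start + every * i for i in range(count)]

runs = next_runs(datetime(2024, 1, 1, 9, 0), timedelta(minutes=15), 3)
# Runs at 09:00, 09:15, and 09:30 on the start date.
```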
Review the setup. Then, in the top-right corner of the screen:
- Click the Save button to save the task.
- Click the Launch button to start the task.
Import errors and possible causes
Some records may not be imported due to errors. Below are common errors and their possible causes:
| Error message | Possible cause |
| --- | --- |
| `Firefly resource not found error={"{\"errors\": [\"attribute_value_not_found\"]}"}` | An item in the product_ids list is not present in the catalog. |
| `Firefly resource not found error={"{\"errors\": [\"product_id_not_found\"]}"}` | The source_product_id itself is not in the catalog. |
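Since both errors stem from product IDs missing from the catalog, a pre-import check against your catalog export can catch them early. The catalog set and record shape below are hypothetical:

```python
# Pre-import sketch: flag product IDs absent from the catalog so records
# don't fail with attribute_value_not_found. Data shapes are hypothetical.
catalog = {"sku_1", "sku_2", "sku_3"}

def clean_record(record: dict) -> dict:
    """Split product_ids into catalog-valid and missing IDs."""
    valid = [p for p in record["product_ids"] if p in catalog]
    missing = [p for p in record["product_ids"] if p not in catalog]
    return {**record, "product_ids": valid, "missing": missing}

rec = clean_record({"customer_id": "u_1", "product_ids": ["sku_1", "sku_9"]})
# "sku_9" is reported as missing rather than failing at import time.
```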