Blueshift supports multiple ways of sending user data. We recommend using Recurring S3 Uploads when possible.
1. Bulk Event API
You can make a bulk API call with identify events. Bulk events give you high throughput, but processing happens asynchronously, so you are not notified of errors in the response. All errors are captured and shown on the click-stream tracking page as "Processing Errors". The maximum payload size is 1 MB or 100 events per payload; multiple events may be sent as part of an array. We recommend limiting bulk calls to 5 per second (500 events/second).
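The per-payload limits above can be enforced client-side before the request is sent. Below is a minimal batching sketch in Python; the `"events"` wrapper key and the event fields are illustrative assumptions, not a confirmed payload schema:

```python
import json

MAX_EVENTS_PER_PAYLOAD = 100   # documented limit: 100 events per payload
MAX_PAYLOAD_BYTES = 1_000_000  # documented limit: 1 MB per payload

def build_bulk_payloads(events):
    """Split identify events into payloads that respect the
    100-event / 1 MB limits. The {"events": [...]} envelope is an
    assumption for illustration."""
    payloads, batch = [], []
    for event in events:
        candidate = batch + [event]
        body = json.dumps({"events": candidate})
        if len(candidate) > MAX_EVENTS_PER_PAYLOAD or len(body.encode()) > MAX_PAYLOAD_BYTES:
            payloads.append({"events": batch})
            batch = [event]
        else:
            batch = candidate
    if batch:
        payloads.append({"events": batch})
    return payloads

# Example: 250 identify events split into payloads of at most 100 events each.
events = [{"event": "identify", "customer_id": str(i)} for i in range(250)]
payloads = build_bulk_payloads(events)
```

Each resulting payload can then be POSTed to the bulk endpoint; since errors surface asynchronously, monitor the click-stream tracking page rather than the HTTP response.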
2. User API
You can use the User API to create/update users. The call happens synchronously, and you get immediate feedback on any processing errors. To achieve high throughput you need to make parallel calls; throughput is lower than with Bulk Event API calls. Please implement error handling when using the User API. We recommend limiting User API calls to 50 requests/second.
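Because the call is synchronous, errors can be handled inline. A minimal error-handling sketch is below; the `send` callable stands in for the actual HTTP POST, and the status-code semantics (retry on 429/5xx, fail fast on other errors) are assumptions rather than documented behavior:

```python
import time

def call_user_api(send, payload, max_retries=3):
    """Call the synchronous User API via `send(payload)`, a function
    that performs the HTTP POST and returns (status_code, body).
    Retries rate-limit (429) and server (5xx) responses with
    exponential backoff; raises immediately on other errors."""
    for attempt in range(max_retries):
        status, body = send(payload)
        if status == 200:
            return body
        if status == 429 or status >= 500:
            time.sleep(2 ** attempt * 0.1)  # back off, then retry
            continue
        raise RuntimeError(f"User API error {status}: {body}")
    raise RuntimeError("User API call failed after retries")

# Example with a stand-in transport that succeeds immediately:
result = call_user_api(lambda p: (200, {"ok": True}),
                       {"email": "user@example.com"})
```

For high throughput, run calls like this in parallel workers while keeping the combined rate under 50 requests/second.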
3. Bulk User API
You can use the Bulk User API to create/update users in bulk. The bulk call happens synchronously, and you get immediate feedback on any processing errors. Throughput is better than the User API, but you need to iterate through the response to check whether any users ran into processing errors. Please implement error handling when using the Bulk User API. The maximum payload size is 50 users per call. We recommend limiting bulk calls to 5 per second (250 users/second).
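The two client-side tasks here are chunking users into batches of at most 50 and scanning each response for per-user errors. A sketch of both, assuming a response shape where a failed user entry carries an `"error"` field (illustrative, not a confirmed schema):

```python
def chunk_users(users, batch_size=50):
    """Split users into batches of at most 50, the documented
    per-call limit for the Bulk User API."""
    return [users[i:i + batch_size] for i in range(0, len(users), batch_size)]

def collect_processing_errors(response_users):
    """Iterate a bulk response and collect entries that ran into
    processing errors. The per-user {"error": ...} field is an
    assumption for illustration."""
    return [u for u in response_users if u.get("error")]

# Example: 120 users become batches of 50, 50, and 20.
batches = chunk_users([{"email": f"user{i}@example.com"} for i in range(120)])
```

Each batch is sent as one call; keep the rate under 5 calls/second and log anything `collect_processing_errors` returns.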
4. Recurring Customer Attribute Uploads
You can set up a recurring Customer Attribute upload job through our dashboard and configure it to run hourly, daily, or weekly. This method gives you high throughput: we can process around 1M users/hour, and the background job takes care of retries. When the upload is done, you will get an email with the upload status. We recommend using this method when possible.
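The recurring job consumes a flat file of customer attributes. The column names below are illustrative only; match the header row to the attributes configured in your dashboard:

```
email,customer_id,first_name,plan
jane@example.com,u_1001,Jane,pro
joe@example.com,u_1002,Joe,free
```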