You can upload order data to WorkMagic using APIs, cloud platforms, or spreadsheets. This guide covers all supported methods, including channel-specific behavior and setup steps.
❗ Important: If the same data is uploaded through multiple methods, the most recent upload will overwrite previous ones.
Supported Methods by Sales Channel
Selected Channels | Supported Upload Methods | Schema | Directory Standards
Custom Site | Orders API, Amazon S3, GCS, Snowflake, BigQuery | Orders API schema | Directory standards
Imported Online Store | S3, GCS, Snowflake, BigQuery, Google Sheets | Imported Orders API schema | Imported directory standards
Imported Marketplace / Retail | S3, GCS, Snowflake, BigQuery, Google Sheets | Imported Orders API schema | Imported directory standards
How to Upload Order Data (Step-by-Step)
Go to Settings > Sales channels and select your sales channel(s).
Scroll down to Store data settings > Upload order data.
Choose one of the supported upload methods based on your setup:
Amazon S3
Google Cloud Storage
Snowflake
BigQuery
Google Sheets (only for imported online store or marketplace)
Follow the method-specific configuration steps below to upload your data.
Verify that your upload succeeded by checking the integration status in the UI. Each channel’s upload status is tracked separately:
Completed (for custom site)
Incomplete (for imported online store)
Interruption detected (for imported marketplace)
Method 1: How to upload order data via Amazon S3
Upload files via S3 API script to WorkMagic’s or your own S3 bucket. Make sure you follow WorkMagic's Orders API schema & directory standards for custom site, and follow Imported Orders API schema & imported directory standards for imported online store / marketplace.
Option 1: Use WorkMagic's S3 Bucket
Step 1: Provide IAM User ARN
Navigate to IAM > Users in AWS Console.
Copy the IAM user ARN (e.g., arn:aws:iam::588759585297:user/huishan).
Paste it into WorkMagic and confirm access.
❗ Amazon S3 and Snowflake share the same IAM User ARN for uploading order data. Changes made here will sync to the other method. You can add ARN under either method, but can only edit or delete it from where it was originally added.
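If you are not sure which ARN to copy, one way to look it up (assuming the AWS CLI is configured with the credentials of the IAM user your upload script will use) is:

# Prints the ARN of the IAM identity behind the currently configured credentials
aws sts get-caller-identity --query Arn --output text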
Step 2: Upload Files via Script
WorkMagic displays the following account-specific values; use them to fill in the placeholders in the example script:
Amazon S3 bucket name
Assigned S3 path (for imported online store / marketplace)
Assigned S3 path (for custom site)
IAM role
External ID
Example Script
import boto3
import os
from datetime import datetime, timezone

# S3 bucket configuration
BUCKET_NAME = 'workmagic-inbound-data'  # Name of the S3 bucket where files will be uploaded
DATA_CATEGORY = "{DATA_TYPE}"
TENANT_PREFIX = f'account-{WM_ACCOUNT_ID}/{DATA_CATEGORY}'  # Tenant-specific prefix used to organize files in the bucket

# Initialize AWS session with credentials
# Note: Replace with your actual credentials
initial_session = boto3.Session(
    aws_access_key_id='{aws_access_key_id_of_client}',
    aws_secret_access_key='{aws_secret_access_key_of_client}',
    region_name='us-east-1'
)

# 1. Get temporary credentials using STS
# Create STS client to assume a role
sts_client = initial_session.client('sts')

try:
    # Assume IAM role with appropriate permissions
    # Replace with your actual role ARN, session name, and external ID
    assumed_role = sts_client.assume_role(
        RoleArn='arn:aws:iam::588759585297:role/{WM_ACCOUNT_ID}',
        RoleSessionName='{session_name}',  # Take any name
        ExternalId='{WM_EXTERNAL_ID}'
    )
except Exception as e:
    print(f"Assume role failed: {e}")
    exit()

# Extract temporary credentials from the assumed role response
credentials = assumed_role['Credentials']

# 2. Create S3 client using temporary credentials
s3 = boto3.client(
    's3',
    aws_access_key_id=credentials['AccessKeyId'],
    aws_secret_access_key=credentials['SecretAccessKey'],
    aws_session_token=credentials['SessionToken']
)

# 3. Upload local file to S3
try:
    # Path to the local file you want to upload
    # Replace with your actual file path
    local_file_path = '{/path/to/your/file}'

    # Verify if file exists
    if not os.path.exists(local_file_path):
        print(f"Error: File '{local_file_path}' does not exist")
        exit()

    # Extract filename from path
    file_name = os.path.basename(local_file_path)

    # Get current UTC time and format it for directory structure
    # Format: YYYY/MM/DD/HH
    utc_now = datetime.now(timezone.utc)
    time_path = utc_now.strftime("%Y/%m/%d/%H")

    # Build S3 object key using tenant prefix and time-based path
    s3_object_key = f"{TENANT_PREFIX}/{time_path}/{file_name}"

    print(f"Uploading {local_file_path} to s3://{BUCKET_NAME}/{s3_object_key}...")

    # Perform the file upload to S3
    s3.upload_file(
        Filename=local_file_path,
        Bucket=BUCKET_NAME,
        Key=s3_object_key
    )
    print(f"File upload successful: s3://{BUCKET_NAME}/{s3_object_key}")
except Exception as e:
    print(f"File upload failed: {e}")
Option 2: Use Your Own S3 Bucket
Step 1: Provide IAM User ARN (same as Option 1)
Step 2: Install AWS CLI
To perform the sync, install the AWS CLI: Install guide
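Once installed, you can verify the CLI and configure the credentials of the IAM user whose ARN you shared in Step 1 (a quick sanity check, not a required step):

# Confirm the AWS CLI is installed and on your PATH
aws --version

# Configure the access key, secret key, and default region for the IAM user from Step 1
aws configure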
Step 3: Upload files to your assigned S3 path
Set up the script and run it hourly (e.g., via cron; see the example after the script) using the provided IAM role. Copy the script below into a new file (example: sync_hourly_order_data_to_workmagic.sh) and update SOURCE_BUCKET_NAME to your own S3 bucket name. The following account-specific values are automatically inserted into the example script:
Amazon S3 bucket name
Assigned S3 path (for imported online store / marketplace)
Assigned S3 path (for custom site)
IAM role
External ID
Example Script
#!/bin/bash

# === Config (edit if needed) ===
# Client's internal account ID in WorkMagic
WM_ACCOUNT_ID="121226"  # e.g., 123456

# Client's source bucket (edit this to their actual bucket name)
SOURCE_BUCKET_NAME="customer-source-bucket"

# === Compute UTC timestamp for the previous hour ===
DATE_PATH=$(date -u -d "1 hour ago" "+%Y/%m/%d/%H")
SOURCE_PATH="s3://${SOURCE_BUCKET_NAME}/${DATE_PATH}/"

# Destination bucket and final path
DEST_BUCKET="workmagic-inbound-data"
DATA_CATEGORY="order-geo-date-data"
DEST_PATH="account-${WM_ACCOUNT_ID}/${DATA_CATEGORY}/${DATE_PATH}/"

# Role info (provided by your platform)
ROLE_ARN="arn:aws:iam::588759585297:role/AmazonS3FolderAccess121226"
EXTERNAL_ID="ext_id_account_121226"  # Replace with the WorkMagic-provided external ID

# Temp directory
TMP_DIR="/tmp/order-upload-${WM_ACCOUNT_ID}-${DATE_PATH}"

# AWS Region
REGION="us-east-1"

# === Execution ===
set -e

echo "[1/4] Preparing temp directory: $TMP_DIR"
rm -rf "$TMP_DIR"
mkdir -p "$TMP_DIR"

echo "[2/4] Syncing data from customer bucket: $SOURCE_PATH"
aws s3 sync "$SOURCE_PATH" "$TMP_DIR" --region "$REGION"

echo "[3/4] Assuming role: $ROLE_ARN"
ASSUME_OUTPUT=$(aws sts assume-role --role-arn "$ROLE_ARN" --role-session-name "order-upload-session-${WM_ACCOUNT_ID}" --external-id "$EXTERNAL_ID" --region "$REGION")
export AWS_ACCESS_KEY_ID=$(echo "$ASSUME_OUTPUT" | jq -r '.Credentials.AccessKeyId')
export AWS_SECRET_ACCESS_KEY=$(echo "$ASSUME_OUTPUT" | jq -r '.Credentials.SecretAccessKey')
export AWS_SESSION_TOKEN=$(echo "$ASSUME_OUTPUT" | jq -r '.Credentials.SessionToken')

echo "[4/4] Uploading to: s3://${DEST_BUCKET}/${DEST_PATH}"
aws s3 sync "$TMP_DIR" "s3://${DEST_BUCKET}/${DEST_PATH}" --region "$REGION"

echo "✅ Sync complete: ${SOURCE_PATH} → s3://${DEST_BUCKET}/${DEST_PATH}"
rm -rf "$TMP_DIR"
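The script always syncs the previous hour's folder, so it is meant to run once per hour. One way to schedule it, as a sketch assuming a Linux host with cron and the script saved at an example path, is a crontab entry like:

# Run a few minutes past every hour so the previous hour's files are complete
5 * * * * /bin/bash /opt/workmagic/sync_hourly_order_data_to_workmagic.sh >> /var/log/workmagic_order_sync.log 2>&1

Note that the script relies on jq (to parse the assume-role output) and GNU date (for the -d "1 hour ago" option), so make sure both are available on the host.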
Method 2: How to upload order data via Google Cloud Storage (GCS)
Upload files via GCS SDK script to WorkMagic’s or your own GCS bucket. Make sure you follow WorkMagic's Orders API schema & directory standards for custom site, and follow Imported Orders API schema & imported directory standards for imported online store / marketplace.
❗ Important:
Custom-site data is at the order-ID level and can be used for all features of the platform.
Imported online store / marketplace data is aggregated and can only be used for lift tests.
Option 1: Use WorkMagic's GCS Bucket
Step 1: Authorize WorkMagic
To allow WorkMagic to receive your export files, share a Google service account for authorization. You can use an existing one or create a new one. Optional: add developer emails for temporary debugging access.
Share a Google Service Account with:
[email protected]
Supports multiple accounts. Can be created here.
❗ Important:
GCS and BigQuery share the same Google Service Account for uploading order data. Changes made here will sync to the other method. You can add an account under either method, but can only edit or delete it from where it was originally added.
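If you do not already have a service account to share, a minimal sketch of creating one and downloading a key with the gcloud CLI (the account name and project ID below are examples, not requirements):

# Create a dedicated service account for order uploads (name is an example)
gcloud iam service-accounts create order-uploader --display-name="WorkMagic order uploader"

# Download a JSON key for it; the upload script below reads this file
gcloud iam service-accounts keys create order-uploader-key.json \
  --iam-account=order-uploader@YOUR_PROJECT_ID.iam.gserviceaccount.com

Then share the service account's email with WorkMagic as described above.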
Step 2: Upload Script
Replace the placeholder order-uploader-key.json in the example script with your own service account key JSON file. The provided upload script uses the Google Cloud Storage Python SDK, and you should customize the implementation to fit your pipeline.
WorkMagic displays the following account-specific values:
GCS bucket name
Assigned GCS path (for imported online store / marketplace)
Assigned GCS path (for custom site)
Example Script
from google.cloud import storage
import os

# Path to the customer's service account key file
SERVICE_ACCOUNT_JSON = "order-uploader-key.json"  # Your service account key JSON file

# Target bucket and destination path
DEST_BUCKET = "workmagic-inbound-data"
DEST_FOLDER = "account-{WM_ACCOUNT_ID}"
DATA_CATEGORY = "{DATA_TYPE}"
SUBPATH = "2025/05/27/13"  # generate current date time
FILENAME = "order_data_0001.json.gz"
DEST_OBJECT_PATH = f"{DEST_FOLDER}/{DATA_CATEGORY}/{SUBPATH}/{FILENAME}"

# Local file to upload
LOCAL_FILE = os.path.join(SUBPATH, FILENAME)


def main():
    # Authenticate with the customer's service account
    storage_client = storage.Client.from_service_account_json(SERVICE_ACCOUNT_JSON)

    # Get the bucket
    bucket = storage_client.bucket(DEST_BUCKET)

    # Create a blob and upload the file
    blob = bucket.blob(DEST_OBJECT_PATH)
    blob.upload_from_filename(LOCAL_FILE)

    print(f"✅ Uploaded {LOCAL_FILE} to gs://{DEST_BUCKET}/{DEST_OBJECT_PATH}")


if __name__ == "__main__":
    main()
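The SUBPATH above is hardcoded for illustration. In practice you would typically derive it from the current UTC time so it matches the hourly YYYY/MM/DD/HH folder convention, for example:

from datetime import datetime, timezone

# Hourly sub-path such as "2025/05/27/13", based on the current UTC time
SUBPATH = datetime.now(timezone.utc).strftime("%Y/%m/%d/%H")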
Option 2: Use Your Own GCS Bucket
Step 1: Authorize Google Service Account (same as above)
Step 2: Prepare the upload script
Schedule the script to run hourly to sync a specific folder from your bucket to WorkMagic's bucket. Replace the placeholder SOURCE_BUCKET_NAME in the example script with your own GCS bucket name.
WorkMagic displays the following account-specific values:
GCS bucket name
Assigned GCS path (for imported online store / marketplace)
Assigned GCS path (for custom site)
Example Script
#!/bin/bash

# === Config to edit ===
WM_ACCOUNT_ID="{WM_ACCOUNT_ID}"
SOURCE_BUCKET="customer-source-bucket"

# === Compute UTC timestamp for the previous hour ===
DATE_PATH=$(date -u -d "1 hour ago" "+%Y/%m/%d/%H")

# === Construct GCS paths ===
DEST_BUCKET="workmagic-inbound-data"
DEST_FOLDER="account-${WM_ACCOUNT_ID}"
DATA_CATEGORY="{DATA_TYPE}"
SOURCE_PATH="gs://${SOURCE_BUCKET}/${DATE_PATH}/"
DEST_PATH="gs://${DEST_BUCKET}/${DEST_FOLDER}/${DATA_CATEGORY}/${DATE_PATH}/"

echo "[INFO] Syncing previous hour's data:"
echo "  From: ${SOURCE_PATH}"
echo "  To:   ${DEST_PATH}"

# === Perform the sync ===
gsutil -m rsync -r "${SOURCE_PATH}" "${DEST_PATH}"
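The gsutil rsync above assumes the Google Cloud SDK is installed on the host and authenticated as an identity that can read your source bucket and write to the WorkMagic path, for example the service account shared in Step 1 (the key file path below is an example):

# Authenticate gcloud/gsutil with the shared service account's key file
gcloud auth activate-service-account --key-file=/path/to/order-uploader-key.json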
Method 3: How to upload order data via Snowflake
Export order data from Snowflake to WorkMagic using the native COPY INTO command. Secure, automated, and code-free for data teams. Make sure you follow WorkMagic's Orders API schema & directory standards for custom site, and the Imported Orders API schema & imported directory standards for imported online store / marketplace.
Step 1: Create Storage Integration and run SQL in Snowflake
Create the Storage Integration in Snowflake using the credentials provided by WorkMagic, then run the following SQL in Snowflake.
WorkMagic provides the following values for the storage integration:
STORAGE_AWS_ROLE_ARN
STORAGE_ALLOWED_LOCATIONS
STORAGE_AWS_EXTERNAL_ID
SQL
CREATE OR REPLACE STORAGE INTEGRATION s3_order_export_to_workmagic
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  ENABLED = TRUE
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::588759585297:role/AmazonS3FolderAccess121226'
  STORAGE_ALLOWED_LOCATIONS = ('s3://workmagic-inbound-data/account-121226/')
  STORAGE_AWS_EXTERNAL_ID = 'ext_id_account_121226';
Step 2: Provide WorkMagic with IAM User ARN
Run the following SQL (Describe Integration) in Snowflake, find the STORAGE_AWS_IAM_USER_ARN value in the result, and copy and paste it into the IAM User ARN field in WorkMagic.
SQL
DESC INTEGRATION s3_order_export_to_workmagic;
❗ Amazon S3 and Snowflake share the same IAM User ARN for uploading order data. Changes made here will sync to the other method. You can add ARN under either method, but can only edit or delete it from where it was originally added.
Step 3: Automate Data Export to S3 via SQL
Use the following SQL to export data hourly to your assigned S3 folder. Replace the inner SELECT (the query against your_schema.your_table) with your real order query, and ensure the resulting json_row matches WorkMagic's Orders API schema.
SQL
DECLARE
  data_category STRING DEFAULT 'order-geo-date-data';
  sql STRING;
BEGIN
  sql := '
    COPY INTO ''s3://workmagic-inbound-data/account-121226/<data_category>/<date_path>/''
    FROM (
      SELECT OBJECT_CONSTRUCT(*) AS json_row
      FROM your_schema.your_table
      WHERE order_update_time > CURRENT_TIMESTAMP() - INTERVAL ''2 hours''
    )
    STORAGE_INTEGRATION = s3_order_export_to_workmagic
    FILE_FORMAT = (TYPE = ''JSON'' COMPRESSION = ''GZIP'')
    MAX_FILE_SIZE = 1048576
    OVERWRITE = TRUE;';
  sql := REPLACE(sql, '<data_category>', data_category);
  sql := REPLACE(sql, '<date_path>', TO_VARCHAR(CURRENT_TIMESTAMP(), 'YYYY/MM/DD/HH24'));
  EXECUTE IMMEDIATE (sql);
  RETURN sql;
END;
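To actually run this every hour inside Snowflake, one option is a scheduled task. This is a sketch, assuming you have wrapped the block above in a stored procedure (export_orders_to_workmagic is an example name) and have a warehouse available:

-- Create an hourly task that calls the export procedure (names are examples)
CREATE OR REPLACE TASK hourly_order_export_to_workmagic
  WAREHOUSE = YOUR_WAREHOUSE
  SCHEDULE = '60 MINUTE'
AS
  CALL export_orders_to_workmagic();

-- Tasks are created suspended; resume the task to start the hourly schedule
ALTER TASK hourly_order_export_to_workmagic RESUME;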
Method 4: How to upload order data via BigQuery
Export order data from BigQuery to WorkMagic's GCS bucket using SQL only. Secure, automated, no engineering needed.
Step 1: Authorize Service Account
Use your BigQuery Data Transfer API service account.
Example:
[email protected]
Can be located in GCP Console > Credentials under the BigQuery Data Transfer API.
Step 2: Set up BigQuery export SQL
Use the SQL block below to set up Scheduled Queries in BigQuery to export your data to WorkMagic's bucket. This will export orders updated in the past 2 hours into a gzipped JSON file.
GCS bucket name (displayed in WorkMagic for your account)
SQL
DECLARE date_path STRING;
DECLARE export_uri STRING;
DECLARE export_sql STRING;
DECLARE timestamp_str STRING;
DECLARE data_category STRING;

-- Set data category
SET data_category = 'order-geo-date-data';

-- Get current timestamp in format YYYY/MM/DD/HH (UTC)
SET date_path = FORMAT_TIMESTAMP('%Y/%m/%d/%H', CURRENT_TIMESTAMP());

-- File timestamp string
SET timestamp_str = FORMAT_TIMESTAMP('%Y-%m-%d-%H', CURRENT_TIMESTAMP());

-- Build the export file URI
SET export_uri = CONCAT('gs://workmagic-inbound-data/account-121226/', data_category, '/', date_path, '/', timestamp_str, '-*.json.gz');

-- Construct the EXPORT DATA statement
SET export_sql = '''
EXPORT DATA OPTIONS (
  uri = "''' || export_uri || '''",
  format = "JSON",
  compression = "GZIP",
  overwrite = true
)
AS (
  SELECT *
  FROM your_order_data
  WHERE order_update_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 2 HOUR)
);
''';

-- Execute the EXPORT DATA statement
EXECUTE IMMEDIATE export_sql;
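If you prefer to trigger the export from your own scheduler instead of BigQuery Scheduled Queries, a minimal sketch using the BigQuery Python client (the file name is an example; it should contain the script above):

from google.cloud import bigquery

# Uses Application Default Credentials (or GOOGLE_APPLICATION_CREDENTIALS)
client = bigquery.Client()

# Read the export script shown above from a local file (example name)
with open("export_orders_to_workmagic.sql") as f:
    export_sql = f.read()

# Run the multi-statement script and wait for it to finish
client.query(export_sql).result()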
Method 5: How to upload order data via Google Sheets
Ideal for small to medium datasets. Only one sheet is supported, only the first tab is read, and the full sheet is synced each time. Old rows can be removed to manage file size. Make sure you follow WorkMagic's Imported Orders API schema & directory standards.
Step 1: Create a copy of the template sheet
Sign in to your Google account, open the template, and click [Make a copy] to create your own editable sheet.
Step 2: Enter your data and share view access
Fill in the sheet with your aggregated order data, and share viewing access with our service account:
Share with:
[email protected]
Step 3: Copy the created sheet link to WorkMagic
Paste link into WorkMagic
Click Verify
If valid, status updates and sync begins hourly
After Connection:
Shows sheet name and last update time
Manual sync available via Sync now
Replace sheet anytime; historical data remains stored
Implementation Tips
Always test using small datasets first.
Uploads to the same destination path will be overwritten.
Maintain consistent file naming and timestamp-based folders (YYYY/MM/DD/HH).
Use gzip-compressed JSON format when applicable (see the example after this list).
Only the first tab in Google Sheets will be processed.
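As a concrete illustration of the last few tips, here is a minimal sketch of writing records as gzip-compressed, newline-delimited JSON under an hourly YYYY/MM/DD/HH folder (the field names are placeholders; the authoritative fields come from the Orders API schema):

import gzip
import json
import os
from datetime import datetime, timezone

# Hourly folder per the YYYY/MM/DD/HH convention
time_path = datetime.now(timezone.utc).strftime("%Y/%m/%d/%H")
os.makedirs(time_path, exist_ok=True)

# Placeholder records -- real fields must follow the Orders API schema
records = [{"order_id": "1001"}, {"order_id": "1002"}]

# Write newline-delimited JSON, gzip-compressed
out_path = os.path.join(time_path, "order_data_0001.json.gz")
with gzip.open(out_path, "wt", encoding="utf-8") as f:
    for rec in records:
        f.write(json.dumps(rec) + "\n")

print(f"Wrote {out_path}")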
Troubleshooting & Known Issues
Problem | Solution |
Invalid IAM/Service Account | Double-check ARN or email format. Ensure it exists in your cloud console. |
Access Denied | Confirm IAM role assumptions and external ID correctness. |
Sheet Not Syncing | Verify link, tab placement, and view access. Use full Google Sheet URL. |
Cloud Sync Skipped | Ensure file timestamp and path match expected structure. |
Data Missing in Upload | Confirm query logic includes latest order_update_time rows. |
Frequently Asked Questions
Is there a size limitation (<1.5M), and how do I deal with it?
For Google Sheets, keep each sheet under 100,000 lines. Split the data across multiple sheets: put the first set of data on the first sheet and click Sync now, then move the second sheet's data onto the first sheet and click Sync now again, repeating until all sheets are synced.
For object storage (S3/GCS/Snowflake/BigQuery), keep each file under 1.5M; split the data into multiple files and upload them to the same folder path.
Can I upload data for multiple retail stores?
Yes. The schema includes a sales_platform field, so multiple retail stores are supported.
Can I import Amazon or other marketplaces?
Yes, any secondary sales channel (non-DTC) can be imported.
My client's retail data is aggregated weekly, not daily. Can it be supported?
Yes, we support both daily and weekly aggregation.
If weekly data is uploaded, we divide it by 7 to average the data across each day of the week (for example, a weekly total of 700 orders becomes 100 orders per day). The client can specify which date the week begins on by entering that date in the date_code field.