The File Upload feature allows Signals Gateway users to ingest events from a CSV file. It supports uploading events that are compatible with Meta servers, including website, offline (such as in-store), app, and CRM events. There are two methods for uploading events: automatic uploads using an S3 integration, or manual uploads within Signals Gateway after setting up a file data source. The automatic method is recommended because it eliminates the need for human intervention and runs seamlessly in the background without disrupting instance usage.
While creating a first-party pipeline, choose “File upload” as the data source.
Add a name for this source:
The file source will then be created:
After finishing the pipeline creation, you can start using the File Upload feature manually, or configure automatic uploads as described below. Events can be viewed by clicking the File events icon on the pipeline detail page:
To set up automatic uploads, navigate to Account settings -> Automatic file uploads.
You will be redirected to a page that provides an overview of automatic uploading (shown below). Click Start setup to continue.
A modal will pop up with resources and guidance for creating an S3 bucket. Click Continue when you’ve finished the setup process:
Provide the necessary information for the S3 bucket and IAM user created in the previous step:
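The exact permissions depend on how you created the bucket in the previous step, but an IAM policy along the following lines is a minimal sketch of the read access typically needed for an integration to pick up files from S3. The bucket name `your-bucket` and the statement ID are placeholders, and this is not necessarily the exact policy Signals Gateway requires:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowFileUploadRead",
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::your-bucket",
        "arn:aws:s3:::your-bucket/*"
      ]
    }
  ]
}
```

Attach a policy like this to the IAM user whose credentials you enter on the setup page.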
When the setup is finished, you’ll be provided with a sample file for reference.
You can now return to the same setup page from the Overview page to monitor job runs, and modify connection information by clicking the “...” button.
Below is a detailed header definition table to use when formatting your data files. Follow the format closely to help ensure the best results. You can also view this table on the Settings page.
| Key | Description | Required | Default Value |
|---|---|---|---|
|  | ID of the Signals Gateway file data source. | Yes | The event will be dropped if an ID is not included. |
|  | Event name of the record. | Yes | Purchase |
|  | Date and time when the transaction happens, formatted as “YYYY-MM-DD HH:mm:ss”. | Yes | empty |
|  | The Meta-defined action source, indicating the type of the event. See a list of sources here. | Yes | physical_store |
|  | The value of the transaction. | Required for Purchase events | 0 |
|  | The currency of the transaction. | Required for Purchase events | USD |
|  | The IDs of the contents or items in the transaction. Multiple values are supported, separated by the pipe symbol, for example ID1\|ID2. | No | empty |
|  | The transaction ID or order ID for the record. | No, but recommended | empty |
|  | Customer’s email. | No, but recommended | empty |
|  | Customer’s phone number. | No, but recommended | empty |
|  | Customer’s last name. | No, but recommended | empty |
|  | Customer’s first name. | No, but recommended | empty |
|  | Customer’s date of birth string. We accept the YYYYMMDD format. | No, but recommended | empty |
|  | Customer’s country. Use the two-letter code, for example US. | No, but recommended | empty |
|  | Customer’s state. | No, but recommended | empty |
|  | Customer’s city. | No, but recommended | empty |
|  | Customer’s zip code. | No, but recommended | empty |
|  | Identifier associated with the user’s link click (also known as fbc or fbclid). | No, but recommended | empty |
|  | Advertising ID for Apple or Android. | No, but recommended | empty |
|  | Third-party user ID. | No, but recommended | empty |
|  | The lead ID from your Lead Ads. | No, but recommended | empty |
|  | Set up this header to send custom data. Replace with the actual keys you want to use. See the special note below.﹡ | No | empty |
﹡Automatic uploads can handle top-level custom data, but the feature does not currently support nested objects. Each custom CSV column header should be labeled as "custom_data.<data-key>", where <data-key> is the name of the custom data field you wish to send, for example “custom_data.CustomField”.
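As an illustration of the custom-data convention above, the following Python sketch flattens a nested custom-data dictionary into `custom_data.<data-key>` columns when writing a CSV row. Only `custom_data.CustomField` comes from the example in this document; the other column names are hypothetical placeholders, not the required Signals Gateway keys:

```python
import csv
import io

# Hypothetical event record. "CustomField" follows the
# custom_data.CustomField example above; "event_name" is a placeholder.
event = {
    "event_name": "Purchase",
    "custom_data": {"CustomField": "loyalty-tier-gold"},
}

# Flatten the nested custom_data dict into top-level
# "custom_data.<data-key>" columns (nested objects are not supported).
row = {k: v for k, v in event.items() if k != "custom_data"}
for key, value in event["custom_data"].items():
    row[f"custom_data.{key}"] = value

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=list(row))
writer.writeheader()
writer.writerow(row)
print(buf.getvalue())
```

Each distinct custom data key becomes its own CSV column, keeping the file a flat table.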
Test Automatic Upload: To verify the S3 integration, upload sample files to the connected S3 bucket and check whether the job finishes successfully. For non-App Runner instances, you can trigger an instant job by clicking the refresh button on the settings page; otherwise, the job will run automatically within an hour.
Click the file event data source from the Pipeline page to start using the manual upload feature.
Then locate the action menu in the popup modal and click Upload event data.
Event data manually uploaded to a data source must contain a set of parameters, some mandatory and some optional. The CSV file headers use JSONPath format to define the payload keys. Check the detailed payload parameters here. Use the template available by clicking the sample file link to format your event data.
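To see how dotted, JSONPath-style headers map onto a nested event payload, the sketch below rebuilds a dictionary from flat CSV headers. The header names here are illustrative placeholders only; consult the linked payload parameters for the actual keys:

```python
import csv
import io

# Sample CSV with dotted JSONPath-style headers (placeholder names).
sample = "user_data.email,custom_data.CustomField\nuser@example.com,blue\n"

def unflatten(row):
    """Turn {'a.b': v} style keys into nested dicts {'a': {'b': v}}."""
    payload = {}
    for header, value in row.items():
        parts = header.split(".")
        node = payload
        for part in parts[:-1]:
            node = node.setdefault(part, {})
        node[parts[-1]] = value
    return payload

reader = csv.DictReader(io.StringIO(sample))
events = [unflatten(row) for row in reader]
print(events)
```

In other words, each segment of a dotted header names one level of nesting in the resulting event payload.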
Once uploaded, the name of the CSV file will appear, along with the number of Uploaded or Excluded events.