Private Computation: Setup Guide

Step 4: Private Computation Runs

Prerequisites

1: Check Config before Running Computation (5 mins)

Open the Conversions API Gateway Shell: https://<private-computation-infrastructure.instance.url>/hub/shell

Run the following commands:

a. config read Kinesis

  • Expected values:
{
  "PUBLISH_TO_KINESIS" : true,
  "BATCH_PUBLISH_PERIOD" : 1000,
  "BATCHING_ENABLED" : true,
  "FIREHOSE_DELIVERY_STREAM_NAME" : "cb-data-ingestion-stream-<TAG>",
  "AWS_REGION" : "<AWS REGION>"
}

AWS_REGION must be lowercase and hyphen-formatted, for example “us-west-2”.
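To make the region-format note concrete, here is an illustrative sketch (not part of the Gateway tooling; the helper name is hypothetical) that checks whether a string follows the lowercase, hyphen-separated AWS region pattern such as "us-west-2":

```python
import re

# AWS region codes look like "us-west-2" or "ap-southeast-1":
# a two-letter prefix, one or more lowercase segments, and a trailing digit.
REGION_RE = re.compile(r"^[a-z]{2}(-[a-z]+)+-\d$")

def is_valid_region(region: str) -> bool:
    """Return True if the string matches the expected lowercase region format."""
    return bool(REGION_RE.fullmatch(region))
```

A value like "US-WEST-2" or "us_west_2" would fail this check and should be corrected before running the computation.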


b. config read Athena

  • Expected values:
{
  "AWS_REGION" : "<AWS REGION>",
  "CATALOG_NAME" : "AwsDataCatalog",
  "DATABASE_NAME" : "mpc-events-db-<TAG>",
  "TABLE_NAME" : "fb_pc_data_<TAG WITH UNDERSCORE>",
  "QUERY_RESULTS_S3_BUCKET_PATH" :  "s3://fb-pc-data-<TAG>/query-results/",
  "ID_FIELDS" : "user_data.device_id,user_data.email",
  "USE_MULTIKEY" : false,
  "MULTIKEY_ID_FIELDS" : "user_data.device_id|id_device_id,user_data.email|id_email,user_data.processed_client_ip_address|id_ip"
}

AWS_REGION must be lowercase and hyphen-formatted, for example “us-west-2”.

For ID_FIELDS

  • If your data has only email PII, update ID_FIELDS to email only with the following command:
    • config write Athena /ID_FIELDS "user_data.email"
  • If your data has only device_id PII, update ID_FIELDS to device_id only with the following command:
    • config write Athena /ID_FIELDS "user_data.device_id"
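To make the two value formats concrete, here is an illustrative sketch (not part of the Gateway; the helper names are hypothetical) of how the comma-separated ID_FIELDS string and the pipe-delimited MULTIKEY_ID_FIELDS string shown above decompose:

```python
def parse_id_fields(value: str) -> list[str]:
    # ID_FIELDS is a comma-separated list of PII column names.
    return [f.strip() for f in value.split(",") if f.strip()]

def parse_multikey_id_fields(value: str) -> dict[str, str]:
    # MULTIKEY_ID_FIELDS maps each source column to an identifier name,
    # with "|" separating the pair and "," separating entries.
    pairs = (entry.split("|", 1) for entry in value.split(",") if entry)
    return {src.strip(): ident.strip() for src, ident in pairs}

print(parse_id_fields("user_data.device_id,user_data.email"))
print(parse_multikey_id_fields(
    "user_data.device_id|id_device_id,user_data.email|id_email"
))
```

In other words, each MULTIKEY_ID_FIELDS entry pairs a source column (left of the `|`) with its identifier name (right of the `|`).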

c. config read CloudResources

  • Expected value for a new deployment:
{
  "AWS_ACCESS_KEY" : "",
  "AWS_SECRET_KEY" : "",
  "AWS_SESSION_TOKEN" : "",
  "CONFIG_FILE_S3" : "s3://fb-pc-config-<TAG>/config.yml",
  "IMAGE_TAG" : "latest",
  "USE_IAM_USER_AUTH" : false
}
  • Expected value for an older deployment:
{
  "AWS_ACCESS_KEY" : "<YOUR AWS ACCESS KEY>",
  "AWS_SECRET_KEY" : "<YOUR AWS SECRET KEY>",
  "AWS_SESSION_TOKEN" : "",
  "CONFIG_FILE_S3" : "s3://fb-pc-config-<TAG>/config.yml",
  "IMAGE_TAG" : "latest",
  "USE_IAM_USER_AUTH" : false
}
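The two expected shapes differ only in whether the IAM user keys are filled in. As an illustrative sanity check (the field names come from the expected values above; the helper itself is hypothetical, not a Gateway feature), a config could be classified like this:

```python
def deployment_style(cfg: dict) -> str:
    """Classify a CloudResources config as 'new' (empty keys) or
    'older' (IAM user keys set), per the two expected shapes above."""
    has_keys = bool(cfg.get("AWS_ACCESS_KEY")) and bool(cfg.get("AWS_SECRET_KEY"))
    return "older" if has_keys else "new"
```

If your instance is a new deployment but the keys are non-empty (or vice versa), double-check the config before starting a run.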

2: (Optional) Automate Diagnostic Data Sharing with Meta

To help troubleshoot issues and improve the product, we highly recommend opting in to diagnostic data sharing with Meta, which automatically uploads logs to Meta within 5 minutes after a failed run.

Open the Environment tab, find the “Automatic diagnostic data sharing” setting, and click the Edit button to update it.

Note that:

  • This currently applies only to Private Lift; support for Private Attribution will be added later.
  • Log collection does not happen if the computation run fails to start, e.g., due to invalid AWS credentials in the config values or a failure during input data preparation.

More details can be found in Sharing Diagnostic Data with Meta.

Now you are ready to use Private Computation products. Follow the section below to run Private Lift, or skip ahead to the Private Attribution section.

Private Lift

Step 1: Run Private Lift Computation (15 mins)

Go to the Lift Report UI (sample URL: https://business.facebook.com/ads/lift/report/?ad_study_id=<your ad_study_id>) and select an MPC Conversion objective.

  • Replace <your ad_study_id> with your own study ID.

  • Click Update Results. A new window will pop up. Enter your Private Computation Infrastructure instance URL and click Go to Gateway.

  • You are now redirected to the Private Computation Infrastructure at a URL of the following format:
    • https://<private-computation-infrastructure.instance.url>/hub/pcs/calculation/<your ad_study_id>/<your ad_study_name>
  • To start the computation, click Update Results.

  • To proceed, click Generate report to launch a pop-up window that guides you through multiple steps.

  • After reading the instructions and basic info on the “Getting started” step, click Continue to go to the next step. Enter the Graph API token generated in Step 2 and click Validate token.

  • Once the token is validated, click Continue.

  • Click Prepare Data to confirm that enough data has been ingested for a successful run and to generate a CSV file stored in the S3 data ingestion bucket.

  • Once data preparation completes, click Continue.

  • After reviewing the Private Lift report settings, access token validation, and data preparation results, if everything looks good, click Generate report to start generating the Private Lift report. Note that the computation typically runs for approximately 3 to 6 hours (at most 24 hours) before completion.

  • Once the computation begins, logs are written to output.txt in your S3 bucket under the directory <data bucket>/query-results/fbpcs_instances_<studyId>_<postfix>. This file is a key resource for monitoring and debugging if any issue occurs.
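The log location above can be assembled programmatically. This is an illustrative sketch (the helper name is hypothetical; the bucket, study ID, and postfix placeholders come from the text):

```python
def lift_log_path(data_bucket: str, study_id: str, postfix: str) -> str:
    # Mirrors the directory layout described above:
    # <data bucket>/query-results/fbpcs_instances_<studyId>_<postfix>/output.txt
    return (f"{data_bucket}/query-results/"
            f"fbpcs_instances_{study_id}_{postfix}/output.txt")
```

You can then fetch or tail that object with your usual S3 tooling to monitor the run.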

Step 2: View Private Lift Results

After the computation is complete, click on “View Results” to navigate to Lift UI. It can take up to 2 days for the results to populate.


Private Attribution

Step 0: Preparation

Request the following information if your Meta representative hasn’t already provided it:

  • Dataset ID

Step 1: Run Private Attribution Computation (30 mins)

  • Log in to your Conversions API Gateway instance.
  • Navigate to “Private Computation” → “Private Attribution” page
  • Click the New Run button, and a dialog will appear

  • Enter the Graph API token generated in Step 2, and the Dataset ID provided by your Meta representative.

  • Click the Load Available Date button.
  • Choose any date from the Select Available Date dropdown list. If you don’t know which one to choose, use the latest one or ask your Meta representative.
  • Enter the Data Source ID and Event Type. If you don’t know what the correct values should be, ask your Meta representative.
  • Click the Start Data Preparation button.

  • Wait until it shows Succeeded.

  • Click the Submit button to proceed.
  • Once the computation begins, logs are written to output.txt in your S3 bucket under the directory <data bucket>/query-results/fbpcs_instances_<dataset id>_<dataset timestamp>_<postfix>. This file is a key resource for monitoring and debugging if any issue occurs.
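As with Private Lift, the attribution log location can be assembled programmatically. This is an illustrative sketch (the helper name is hypothetical; the placeholders come from the text):

```python
def attribution_log_path(data_bucket: str, dataset_id: str,
                         dataset_timestamp: str, postfix: str) -> str:
    # Mirrors the directory layout described above:
    # <data bucket>/query-results/
    #   fbpcs_instances_<dataset id>_<dataset timestamp>_<postfix>/output.txt
    return (f"{data_bucket}/query-results/"
            f"fbpcs_instances_{dataset_id}_{dataset_timestamp}_{postfix}/output.txt")
```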

Step 2: View Private Attribution Results

Ask your Meta representative for the results.