Set up the Openflow Connector for Meta Ads

Note

The connector is subject to the Connector Terms.

This topic describes the steps to set up the Openflow Connector for Meta Ads.

Prerequisites

  1. Ensure that you have reviewed About Openflow Connector for Meta Ads.

  2. Ensure that you have set up Openflow.

Get the credentials

As a Meta Ads administrator, perform the following actions in your Meta Ads account:

  1. Create a Meta App (https://developers.facebook.com/docs/development/create-an-app/) or ensure that you have access to one.

  2. Enable Marketing API (https://developers.facebook.com/docs/marketing-api/get-started) in the App dashboard (https://developers.facebook.com/apps).

  3. Generate a long-lived token (https://developers.facebook.com/docs/facebook-login/guides/access-tokens/get-long-lived/).

  4. Optional: Increase the rate limit by changing the app access type (https://developers.facebook.com/docs/marketing-api/overview/rate-limiting) from Standard access to Advanced access of the Ads Management Standard Access. Enable the ads_read and ads_management permissions (https://developers.facebook.com/docs/permissions/).
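The token exchange in step 3 can be sketched as follows. This only builds the Graph API request URL (no network call is made), and the app ID, app secret, and short-lived token values are hypothetical placeholders:

```python
from urllib.parse import urlencode

# Placeholder values -- substitute your own Meta App credentials.
APP_ID = "1234567890"
APP_SECRET = "your-app-secret"
SHORT_LIVED_TOKEN = "short-lived-user-token"
API_VERSION = "v22.0"

def long_lived_token_url(app_id, app_secret, short_lived_token, version=API_VERSION):
    """Build the Graph API request that exchanges a short-lived user token
    for a long-lived one. The caller is responsible for issuing the GET
    request and reading the access_token field from the JSON response."""
    params = {
        "grant_type": "fb_exchange_token",
        "client_id": app_id,
        "client_secret": app_secret,
        "fb_exchange_token": short_lived_token,
    }
    return f"https://graph.facebook.com/{version}/oauth/access_token?{urlencode(params)}"

url = long_lived_token_url(APP_ID, APP_SECRET, SHORT_LIVED_TOKEN)
print(url)
```

Sending a GET request to this URL returns JSON whose access_token field is the long-lived token to store for the connector.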

Set up Snowflake account

As a Snowflake account administrator, perform the following tasks:

  1. Create a new role or use an existing role, and grant it the required database privileges.

  2. Create a new Snowflake service user with the type set to SERVICE.

  3. Grant the Snowflake service user the role you created in step 1.

  4. Configure key-pair authentication for the Snowflake service user created in step 2.

  5. Snowflake strongly recommends this step. Configure a secrets manager supported by Openflow, for example, AWS, Azure, or HashiCorp, and store the public and private keys in the secret store.

    Note

    If, for any reason, you do not wish to use a secrets manager, you are responsible for safeguarding the public and private key files used for key-pair authentication according to the security policies of your organization.

    1. Once the secrets manager is configured, determine how you will authenticate to it. On AWS, it is recommended that you use the EC2 instance role associated with Openflow, because this way no other secrets have to be persisted.

    2. In Openflow, configure a Parameter Provider associated with this secrets manager. From the hamburger menu in the upper right, navigate to Controller Settings » Parameter Provider, and then fetch your parameter values.

    3. At this point all credentials can be referenced with the associated parameter paths and no sensitive values need to be persisted within Openflow.

  6. If any other Snowflake users require access to the raw documents and tables ingested by the connector (for example, for custom processing in Snowflake), grant those users the role created in step 1.

  7. Designate a warehouse for the connector to use. Start with the smallest warehouse size, then experiment with the size depending on the number of tables being replicated and the amount of data transferred. Large numbers of tables typically scale better with multi-cluster warehouses than with larger warehouse sizes.
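The account setup steps above can be sketched in SQL. All object names (OPENFLOW_ROLE, OPENFLOW_SERVICE_USER, OPENFLOW_WH) are hypothetical placeholders, and the truncated public key stands in for your own key contents:

```sql
-- Step 1: role for the connector (hypothetical name).
CREATE ROLE IF NOT EXISTS OPENFLOW_ROLE;

-- Steps 2 and 4: service user with key-pair authentication.
-- Paste the public key contents without the PEM header/footer lines.
CREATE USER IF NOT EXISTS OPENFLOW_SERVICE_USER
  TYPE = SERVICE
  RSA_PUBLIC_KEY = 'MIIBIjANBgkq...';

-- Step 3: grant the role to the service user.
GRANT ROLE OPENFLOW_ROLE TO USER OPENFLOW_SERVICE_USER;

-- Step 7: dedicated warehouse; start small and resize later if needed.
CREATE WAREHOUSE IF NOT EXISTS OPENFLOW_WH
  WAREHOUSE_SIZE = XSMALL
  AUTO_SUSPEND = 300
  AUTO_RESUME = TRUE;
GRANT USAGE ON WAREHOUSE OPENFLOW_WH TO ROLE OPENFLOW_ROLE;
```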

Configure the connector

As a data engineer, perform the following tasks to configure a connector:

  1. Create a database and schema in Snowflake for the connector to store ingested data.

  2. Download the connector definition file.

  3. Import the connector definition into Openflow:

    1. Open the Snowflake Openflow canvas.

    2. Add a process group. To do this, drag and drop the Process Group icon from the tool palette at the top of the page onto the canvas. Once you release your pointer, a Create Process Group dialog appears.

    3. On the Create Process Group dialog, select the connector definition file to import.

  4. Right-click on the imported process group and select Parameters.

  5. Populate the required parameter values as described in Flow parameters.
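Step 1 of the list above can be done with SQL along these lines. The database, schema, and role names are hypothetical placeholders; align the database and schema names with the Destination Database Name and Destination Schema Name flow parameters:

```sql
-- Hypothetical names; substitute your own.
CREATE DATABASE IF NOT EXISTS EXAMPLE_DB;
CREATE SCHEMA IF NOT EXISTS EXAMPLE_DB.EXAMPLE_SCHEMA;

-- Allow the connector's role to use the schema and create tables in it.
GRANT USAGE ON DATABASE EXAMPLE_DB TO ROLE OPENFLOW_ROLE;
GRANT USAGE, CREATE TABLE ON SCHEMA EXAMPLE_DB.EXAMPLE_SCHEMA TO ROLE OPENFLOW_ROLE;
```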

Flow parameters

This section describes the flow parameters that you can configure, grouped by the following parameter contexts:

MetaAdsConnectionContext

  • Access Token
    Description: Token required to request the Meta Ads Insights API
    Example: Not applicable
    Required: Yes

SnowflakeConnectionContext

  • Snowflake Account
    Description: Name of the Snowflake account to which the connection is made
    Example: example.snowflakecomputing.cn
    Required: Yes

  • Snowflake User
    Description: Username used to connect to the Snowflake instance
    Example: admin
    Required: Yes

  • Snowflake Role
    Description: Snowflake role used during query execution
    Example: Not applicable
    Required: Yes

  • Snowflake Private Key
    Description: The RSA private key used for authentication. The RSA key must be formatted according to PKCS8 standards and have standard PEM headers and footers. Either Snowflake Private Key or Snowflake Private Key File must be defined.
    Example: Not applicable
    Required: No

  • Snowflake Private Key File
    Description: The file that contains the RSA private key used for authentication to Snowflake, formatted according to PKCS8 standards and with standard PEM headers and footers. The header line starts with -----BEGIN PRIVATE.
    Example: /opt/resources/snowflake/rsa_key.p8
    Required: No

  • Snowflake Private Key Password
    Description: The password associated with the Snowflake Private Key File
    Example: Not applicable
    Required: No

  • Warehouse Name
    Description: Snowflake warehouse on which the queries are executed
    Example: APP_WAREHOUSE
    Required: Yes
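Both private-key parameters expect a PKCS8 key with standard PEM headers and footers. As a quick sanity check (a sketch, not part of the connector), you can verify the PEM framing of a key file before configuring it:

```python
def looks_like_pkcs8_pem(pem_text: str) -> bool:
    """Check only the PEM framing the connector expects; this does not
    validate the key material itself."""
    stripped = pem_text.lstrip()
    return (
        stripped.startswith("-----BEGIN PRIVATE KEY-----")
        or stripped.startswith("-----BEGIN ENCRYPTED PRIVATE KEY-----")
    ) and "-----END " in stripped

# Usage (path taken from the example above):
#   looks_like_pkcs8_pem(open("/opt/resources/snowflake/rsa_key.p8").read())
```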

GetMetaAdsReportContext

  • Report Name
    Description: Name of the report, used as the destination table name. The name must be unique within the destination schema.
    Example: MY_REPORT_NAME
    Required: Yes

  • Report Object Id
    Description: Identifier of the object downloaded from Meta Ads
    Example: 123456
    Required: Yes

  • Report Ingestion Strategy
    Description: Mode in which data is fetched, either snapshot or incremental
    Possible values: SNAPSHOT, INCREMENTAL
    Required: Yes

  • Meta Ads Version
    Description: Version of the Meta Ads API used for downloading reports
    Possible value: v22.0
    Required: Yes

  • Report Level
    Description: Aggregation level of the result
    Possible values: account, campaign, ad, adset
    Required: No

  • Report Fields
    Description: Comma-separated list of report fields
    Example: campaign_name, ad_id
    Required: No

  • Report Breakdowns
    Description: Comma-separated list of report breakdowns
    Example: publisher_platform, device
    Required: No

  • Report Time Increment
    Description: Level of aggregation based on the day count
    Possible values: 1 (daily), 3 (every 3 days), 7 (weekly), monthly (monthly), 90 (quarterly), all_days (all days; do not slice the result)
    Required: No

  • Report Action Time
    Description: Time basis for action stats
    Possible values: conversion (reports actions based on conversion date), impression (reports actions based on impression date), mixed (a mix of conversion and impression)
    Required: No

  • Report Click Attribution Window
    Description: Attribution window for the click action
    Possible values: 1d_click, 7d_click, 28d_click
    Required: No

  • Report View Attribution Window
    Description: Attribution window for the view action
    Possible values: 1d_view, 7d_view, 28d_view
    Required: No

  • Report Schedule
    Description: Schedule interval for the processor that creates reports
    Example: 1 day
    Required: Yes

  • Destination Database Name
    Description: Name of the Snowflake database where the data will be ingested
    Example: EXAMPLE_DB
    Required: Yes

  • Destination Schema Name
    Description: Name of the Snowflake schema where tables will be created
    Example: EXAMPLE_SCHEMA
    Required: Yes
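The connector issues the API calls itself; purely as an illustration of how these parameters map onto a Meta Marketing API insights request, the following sketch builds the request URL. All values are hypothetical placeholders taken from the examples above:

```python
from urllib.parse import urlencode

# Placeholder values mirroring the flow parameters in the table above.
params = {
    "level": "campaign",                       # Report Level
    "fields": "campaign_name,ad_id",           # Report Fields
    "breakdowns": "publisher_platform,device", # Report Breakdowns
    "time_increment": "1",                     # Report Time Increment (daily)
    "action_report_time": "conversion",        # Report Action Time
    "access_token": "LONG_LIVED_TOKEN",        # Access Token (placeholder)
}

report_object_id = "123456"  # Report Object Id (here, an ad account ID)
version = "v22.0"            # Meta Ads Version

# Insights endpoint for an ad account; no request is actually sent here.
url = (
    f"https://graph.facebook.com/{version}/act_{report_object_id}/insights"
    f"?{urlencode(params)}"
)
print(url)
```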

Run the flow

  1. Right-click on the canvas and select Enable all Controller Services.

  2. Right-click on the imported process group and select Start. The connector starts the data ingestion.
