
How to integrate Campaignswell with Google Analytics


Step 1. Google Cloud BigQuery Project

1. Open the Google Cloud Console.

2. Click the project menu.

3. Click New Project.

4. Name your project and click the Create button.

5. Select your newly created project in the project menu.

6. Enable the BigQuery API next: type "bigquery api" in the search bar and select the corresponding option.

7. Enable the API; you can now move on to the next step.
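If you prefer the command line, the same setup can be sketched with the gcloud CLI. The project ID below is an illustrative placeholder, not a required name:

# Create a new project (the ID is an example; choose your own) and make it the active project
gcloud projects create my-campaignswell-project
gcloud config set project my-campaignswell-project

# Enable the BigQuery API for the active project
gcloud services enable bigquery.googleapis.com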


Step 2. Google Analytics BigQuery Link

1. Open the Admin panel of your Google Analytics account and select BigQuery links.

2. Click the Link button.

3. Choose the BigQuery project you created in Step 1.

4. Select all the available checkboxes in the next step of the linking flow.

5. Congratulations, the link is created. Your data will become available in BigQuery in approximately 24 hours.


Step 3. Create Google Cloud Storage Bucket

1. Go back to the BigQuery project you created earlier and type "cloud storage" in the search bar.

2. Click the Create button.

3. Name your bucket (example: cw-data-bucket-name).

4. Follow the steps in the bucket creation dialog; once all the steps are done you will see the Create button.

5. Press it and a confirmation dialog appears.

6. Congratulations, the bucket is created. Fill in the bucket name in the table, as in the example below.

7. Paste your bucket name in the Campaignswell UI.
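Alternatively, the bucket can be created from the command line. The bucket name is the example used above and the location is an illustrative choice; pick the one you need:

# Create the Cloud Storage bucket (name and location are examples)
gcloud storage buckets create gs://cw-data-bucket-name --location=us-central1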


Step 4. BigQuery Scheduled Queries

Open BigQuery, choose the project you created in Step 1, and click CREATE SQL QUERY.

Fill in <project> and <database>* in the scripts below with your own values, and take <bucket_name> from Step 3.

*Note: the database(s) will be created automatically and may take up to 24 hours to appear; there is no need to create them manually.

ga4_events_intraday_export

Copy and paste the script below.

Script: ga4_events_intraday_export

BEGIN
  -- Bookkeeping table: stores the keys of intraday events that have already been exported
  CREATE TABLE IF NOT EXISTS `<project>.<database>.cw_increment_events_intraday`
  AS
  SELECT event_timestamp, event_name, event_bundle_sequence_id, CURRENT_TIMESTAMP() AS load_dttm
  FROM `<project>.<database>.events_intraday_*`
  WHERE 1 = 2;

  -- Collect intraday events that have not been exported yet
  CREATE TEMP TABLE ga4_events_intraday_export_temp
  AS (
    SELECT ev.*
    FROM `<project>.<database>.events_intraday_*` ev
      LEFT JOIN `<project>.<database>.cw_increment_events_intraday` incr ON
        ev.event_timestamp = incr.event_timestamp
        AND ev.event_name = incr.event_name
        AND ev.event_bundle_sequence_id = incr.event_bundle_sequence_id
    WHERE _TABLE_SUFFIX BETWEEN REPLACE(STRING(CURRENT_DATE()-1), '-', '') AND REPLACE(STRING(CURRENT_DATE()+1), '-', '')
      AND incr.event_timestamp IS NULL
  );

  -- Export the new events to Cloud Storage as Snappy-compressed Parquet files
  EXPORT DATA
    OPTIONS (
      uri = 'gs://<bucket_name>/ga_events_intraday/<database>/' || EXTRACT(YEAR FROM CURRENT_DATE) || '/' || EXTRACT(MONTH FROM CURRENT_DATE) || '/' || EXTRACT(DAY FROM CURRENT_DATE) || '/ga4_events_intraday_export_temp_' || UNIX_MICROS(CURRENT_TIMESTAMP()) || '_*.parquet',
      overwrite = true,
      format = 'PARQUET',
      compression = 'SNAPPY')
  AS (
    SELECT * FROM ga4_events_intraday_export_temp
  );

  -- Remember what was exported, and drop bookkeeping rows older than 50 hours
  INSERT INTO `<project>.<database>.cw_increment_events_intraday`
  SELECT event_timestamp, event_name, event_bundle_sequence_id, CURRENT_TIMESTAMP() AS load_dttm FROM ga4_events_intraday_export_temp;

  DELETE FROM `<project>.<database>.cw_increment_events_intraday`
  WHERE DATE_DIFF(CURRENT_TIMESTAMP(), load_dttm, HOUR) > 50;
END
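Optionally, you can test-run the script once from the command line before scheduling it. This assumes you have saved it to a local file (the file name here is illustrative) with the placeholders already substituted:

# Optional: one-off test run of the export script with the bq CLI
bq query --use_legacy_sql=false < ga4_events_intraday_export.sql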

Set up the schedule: name = ga4_events_intraday_export, every 5 minutes.

ga4_events_export

Create a new empty SQL query, then copy and paste the script below.

Script: ga4_events_export

BEGIN
  -- Run the export only if yesterday's daily events table was (re)created within the last 16 minutes
  IF EXISTS (
        SELECT *
        FROM `<project>.<database>.__TABLES__`
        WHERE table_id = 'events_' || REPLACE(STRING(CURRENT_DATE() - 1), '-', '')
          AND TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL -16 MINUTE) <= TIMESTAMP_ADD(TIMESTAMP "1970-01-01", INTERVAL creation_time MILLISECOND)
          AND CURRENT_TIMESTAMP() > TIMESTAMP_ADD(TIMESTAMP "1970-01-01", INTERVAL creation_time MILLISECOND)
      )
  THEN
    -- Collect the daily events tables for the last few days
    CREATE TEMP TABLE ga4_events_export_temp
    AS (
      SELECT *
      FROM `<project>.<database>.events_*`
      WHERE _TABLE_SUFFIX BETWEEN REPLACE(STRING(CURRENT_DATE()-3), '-', '') AND REPLACE(STRING(CURRENT_DATE()), '-', '')
    );

    -- Export them to Cloud Storage as Snappy-compressed Parquet files
    EXPORT DATA
      OPTIONS (
        uri = 'gs://<bucket_name>/ga_events/<database>/' || EXTRACT(YEAR FROM CURRENT_DATE) || '/' || EXTRACT(MONTH FROM CURRENT_DATE) || '/' || EXTRACT(DAY FROM CURRENT_DATE) || '/ga4_events_export_temp_' || UNIX_MICROS(CURRENT_TIMESTAMP()) || '_*.parquet',
        overwrite = true,
        format = 'PARQUET',
        compression = 'SNAPPY')
    AS (
      SELECT * FROM ga4_events_export_temp
    );
  END IF;
END

Set up the schedule: name = ga4_events_export, every 15 minutes.

As a result you should see two scheduled queries.
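If you work from the command line, you can also list the scheduled queries (transfer configurations) to confirm both are in place. The location below is an example; use the one where your dataset lives:

# List BigQuery scheduled queries in the given location
bq ls --transfer_config --transfer_location=us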


Step 5. Google Cloud Storage Permissions

Check the checkbox once the steps above are done. The table in your integration doc should look like the example below.

Creating a Custom IAM Role

Create a custom role that has the permissions required to access the bucket and get objects.

1. Log into the Google Cloud Platform Console as a project editor.

2. From the home dashboard, choose IAM & admin » Roles.

3. Click Create Role.

4. Enter a name and description for the custom role.

5. Click Add Permissions.

6. Filter the list of permissions and add the following: storage.buckets.get, storage.objects.get, storage.objects.list.

7. Click Create.
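Equivalently, the custom role can be created with the gcloud CLI; the role ID, title, and project ID below are example values, not required names:

# Create a custom role with only the permissions needed on the bucket
gcloud iam roles create cwStorageReader \
  --project=my-campaignswell-project \
  --title="Campaignswell Storage Reader" \
  --permissions=storage.buckets.get,storage.objects.get,storage.objects.list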


Assigning the Custom Role to the Cloud Storage Service Account


1. Log into the Google Cloud Platform Console as a project editor.

2. From the home dashboard, choose Cloud Storage » Browser:

Bucket List in Google Cloud Platform Console

3. Select the bucket created during Step 3 (example: cw-data-bucket-name).

4. Click the ADD PRINCIPAL button.


5. In the New principals field, search for the service account name vbamqhyxhc@va3-22da.iam.gserviceaccount.com (the Campaignswell service account).

Bucket Information Panel in Google Cloud Platform Console

6. From the Select a role dropdown, select Custom » <role>, where <role> is the custom Cloud Storage role you created in Creating a Custom IAM Role (in this Step).

7. Click the Save button. The service account name is added to the Storage Object Viewer role dropdown in the information panel.

Storage Object Viewer role list in Google Cloud Platform Console
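The same binding can be sketched with the gcloud CLI, assuming the example bucket, project ID, and custom role ID used above:

# Grant the custom role to the Campaignswell service account on the bucket
gcloud storage buckets add-iam-policy-binding gs://cw-data-bucket-name \
  --member="serviceAccount:vbamqhyxhc@va3-22da.iam.gserviceaccount.com" \
  --role="projects/my-campaignswell-project/roles/cwStorageReader"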


Step 6. Pub/Sub Topic/Subscription


Creating the Pub/Sub Topic

Create a Pub/Sub topic using Cloud Shell or Cloud SDK.


Execute the following command to create the topic and enable it to listen for activity in the specified GCS bucket:

gsutil notification create -t <topic> -e OBJECT_FINALIZE -f json gs://<bucket-name>

Where:

  • <topic> is the name for the topic (for example: cw-ga4-topic-name).
  • <bucket-name> is the name of the GCS bucket created in Step 3.

If the topic already exists, the command uses it; otherwise, the command creates a new topic.
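Optionally, you can create the topic explicitly before wiring up the notification; the topic name is the example used above:

# Create the Pub/Sub topic up front (the gsutil command above would otherwise create it for you)
gcloud pubsub topics create cw-ga4-topic-name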


Creating the Pub/Sub Subscription

Create a subscription with pull delivery to the Pub/Sub topic using the Cloud Console:

In the Pub/Sub service, choose the project created during Step 1, then in the Subscriptions section click Create Subscription:

1. Enter a Subscription ID, for example cw-ga4-subscription.

2. Choose the topic created earlier in this step.

3. Delivery type: Pull.

4. The rest can be left at the defaults.

5. Click Create.
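A command-line equivalent, assuming the example topic and subscription names above (pull delivery is the default when no push endpoint is specified):

# Create a pull subscription on the notification topic
gcloud pubsub subscriptions create cw-ga4-subscription --topic=cw-ga4-topic-name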

Provide us with the Pub/Sub Subscription ID:

1. Click on the subscription created above.

2. Copy the subscription name.

3. Paste it in the Campaignswell UI.
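The full resource name (the projects/<project>/subscriptions/<subscription-id> form shown in the table in Step 8) can also be read from the CLI, assuming the example subscription name above:

# Print the subscription's full resource name
gcloud pubsub subscriptions describe cw-ga4-subscription --format='value(name)'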


Step 7. Pub/Sub Permissions


1. Log into the Google Cloud Platform Console as a project editor.

2. From the home dashboard, choose Big Data » Pub/Sub » Subscriptions.

3. Select the subscription created during the previous step.

4. Click SHOW INFO PANEL in the upper-right corner. The information panel for the subscription slides out.

5. Click the ADD PRINCIPAL button.


6. In the New principals field, search for the service account name xbvddzmrms@va3-22da.iam.gserviceaccount.com (the Campaignswell service account).

7. From the Select a role dropdown, select Pub/Sub Subscriber.

8. Click the Save button. The service account name is added to the Pub/Sub Subscriber role dropdown in the information panel.

9. Navigate to the Dashboard page in the Cloud Console and select your project from the dropdown list.

10. Click the ADD PEOPLE TO THIS PROJECT button.

11. Add the service account name xbvddzmrms@va3-22da.iam.gserviceaccount.com (the Campaignswell service account).

12. From the Select a role dropdown, select Monitoring Viewer.

13. Click the Save button. The service account name is added to the Monitoring Viewer role.
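For reference, the two role assignments in this step can be sketched with the gcloud CLI, assuming the example subscription name and project ID used earlier:

# Grant Pub/Sub Subscriber on the subscription to the Campaignswell service account
gcloud pubsub subscriptions add-iam-policy-binding cw-ga4-subscription \
  --member="serviceAccount:xbvddzmrms@va3-22da.iam.gserviceaccount.com" \
  --role="roles/pubsub.subscriber"

# Grant Monitoring Viewer at the project level
gcloud projects add-iam-policy-binding my-campaignswell-project \
  --member="serviceAccount:xbvddzmrms@va3-22da.iam.gserviceaccount.com" \
  --role="roles/monitoring.viewer"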


Step 8. Complete Integration

Check the checkbox once your integration is set up.


Bucket              Subscription ID                                        Status
cw-data-<example>   projects/bigqueryproject-ga/subscriptions/<example>


Step 9 (optional). Configure data streams and events to stay within BigQuery's 1,000,000-event daily export limit

1. Go to BigQuery links in the Google Analytics Admin panel.

2. Open Configure data streams and events.

3. You can exclude an unnecessary stream.

4. Or specify events by name to exclude.

5. Apply and save the changes.
