Jun 20, 2023

Automating GA4 Cost Data Import - Part 1

Learn how to automate importing ad cost data into Google Analytics 4 (GA4)

Automating GA4 Cost Data Import - Part 1

Having a consolidated view of your advertising campaigns is essential for assessing their performance. While individual advertising platforms can provide insights into specific campaigns, a comprehensive view is often needed for a holistic understanding. This is where Google Analytics 4 (GA4) cost import comes in handy for many marketers.

While importing cost data from Google Ads to GA4 is straightforward thanks to the native integration, there's no built-in functionality for automatically importing cost data into GA4 from non-Google ad platforms such as Facebook, Microsoft, LinkedIn, and Twitter. GA4 supports two ways to import cost data:

  1. Manual CSV import via the GA4 interface
  2. Automatic import via SFTP, which is the main focus of this article

Automating Cost Data Import with SFTP

In this article, we will explore the automation of cost data import into GA4 using SFTP. I will guide you through building a complete data pipeline that retrieves advertising data from platforms such as Facebook and uploads the cost data to an SFTP server, which GA4 can access. The article will be divided into five main sections:

  1. Setting up the SFTP server on GCP
  2. Authenticating GA4 to access the SFTP server using a public key
  3. Building a Python pipeline between the SFTP server and advertising platforms (e.g. Facebook)
  4. Deploying the pipeline to Cloud Functions
  5. Scheduling the pipeline using Cloud Scheduler
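
To make the end goal concrete, here is a rough Python sketch of the pipeline these five steps build up to. Everything in it is illustrative: `fetch_facebook_costs` is a hypothetical placeholder for the real API call covered in Part 2, the column names are examples, and the SFTP upload is reduced to a stub.

```python
import csv
import io


def fetch_facebook_costs():
    """Placeholder for the ad-platform API call (built in Part 2).

    The real implementation would query the Facebook Marketing API;
    here we just return one hard-coded sample row per campaign per day.
    """
    return [
        {"date": "20230620", "source": "facebook", "medium": "cpc",
         "campaign_name": "summer_sale", "impressions": 1000,
         "clicks": 50, "cost": 12.34},
    ]


def rows_to_csv(rows):
    """Serialize the rows into a CSV string whose columns match
    the mapping configured in the GA4 UI."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(rows[0]))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()


def upload_to_sftp(csv_text, host, username, password,
                   remote_path="/home/ga4-sftp/upload.csv"):
    """Stub: in Part 2 this writes csv_text to remote_path on the VM
    using an SFTP client library."""
    raise NotImplementedError("implemented in Part 2")


if __name__ == "__main__":
    print(rows_to_csv(fetch_facebook_costs()))
```

Cloud Functions and Cloud Scheduler then simply run this script on a schedule, so GA4 always finds a fresh file at the path it polls.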

This article is divided into two parts. Part 1 covers the setup of the SFTP server, GA4 authentication, and a file upload test to verify functionality. Part 2 focuses on building the Python pipeline that automates file transfers to the SFTP server, which you can read here:

Automating GA4 Cost Data Import - Part 2

Let's begin with Part 1 and lay the groundwork for a streamlined cost data import process.

1- Setting up the SFTP server

In this article, we will be using Google Cloud as our platform of choice. However, it's important to note that you can use any cloud platform such as AWS. Now, let's dive into the process.

A- Creating the VM instance

Step 1: Navigate to the GCP Console and create a new VM instance.

Step 2: Provide a name for your instance and select the e2-micro machine type. You can choose any other machine type according to your specific requirements, but for the purpose of this exercise we don't need anything fancy, so e2-micro will suffice.

Step 3: In the advanced options, navigate to the network settings and enter 'ga4-sftp' as a network tag. The matching firewall rule will be created in the next step and will allow us to expose port 22 for the SFTP server.

Step 4: Click "Create" to create the VM instance.

B- Configuring the firewall:

Now that we have created the VM machine, it's time to configure the firewall to allow access to the SFTP server through port 22. Follow these steps:

Step 1: Navigate to the "VPC Network" section in the GCP Console and click on "Firewall".

Step 2: Click on "Create Firewall Rule".

Step 3: Provide a name for your rule, and in the "Network target tags" field, set it to 'ga4-sftp', which is the tag we previously added to the network settings of the VM.

Step 4: Set the "Source IP ranges" to "0.0.0.0/0" to allow incoming connections from any IP. This ensures that the SFTP server can accept connections from GA4.

Step 5: Set the port to 22 under the TCP protocol. This will configure the firewall rule to allow incoming connections specifically on port 22, which is the standard port for SFTP.

Step 6: Click on 'Create' to deploy your firewall rule. This will finalize the configuration and apply the firewall rule to allow incoming connections on port 22 for the SFTP server.

C- Installing the SFTP server

Next, we will connect to our VM to configure the SFTP server. On the VM instances page, locate the machine you created and click "SSH" to open a terminal.

Once you are in the terminal, execute the following commands to install the OpenSSH server:

sudo apt-get update
sudo apt-get install openssh-server

Now, let's create a new user specifically for SFTP. We'll name this user 'ga4-sftp'.
To create the user, use the following command in the terminal:

sudo adduser ga4-sftp

You'll be prompted to enter and confirm a new UNIX password, and to enter additional information about the new user.
We want the server to accept both password authentication (used by your Python script to access the SFTP server) and public key authentication (used by Google Analytics to access your server). Run the following command to open the SSHD configuration file:

sudo nano /etc/ssh/sshd_config

To allow 'PasswordAuthentication' and 'PubkeyAuthentication' access to the server, add the following three lines to the end of the SSHD configuration file:

ClientAliveInterval 120
PasswordAuthentication yes
PubkeyAuthentication yes

These lines ensure that the SSH server keeps the connection alive, allows password authentication, and enables public key authentication. Once you have added these lines, save the changes by pressing Ctrl + O and exit the Nano editor by pressing Ctrl + X. Finally, restart the SSH service so the new settings take effect:

sudo systemctl restart ssh

With these configurations in place, your server is now ready to accept connections from GA4 and your Python scripts.

2- Authenticating GA4 to access the server using the public key

A- Create the cost data source in GA4

To upload cost data from non-Google channels, you'll need to create a data source in GA4. Go to the Admin panel of your GA4 property, select 'Data Import', then click 'Create data source'.

Give your data source a name, then select 'Cost Data'.

Select "SFTP" as the import method for your data source. Here are the details you need to provide for the SFTP configuration:

SFTP server username: Enter the username that you created when setting up the server. In this case, it should be 'ga4-sftp'.

SFTP server URL: The server URL consists of three parts:

  • IP address: the external IP address of your VM
  • Home directory: //home/ga4-sftp
  • File path: /upload.csv

Putting it all together, the SFTP server URL takes this form (substitute your VM's external IP): sftp://<your-vm-ip>//home/ga4-sftp/upload.csv
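
Since the Python pipeline in Part 2 will need these same values, it can be handy to assemble the URL programmatically. Below is a small helper; the IP address shown is a documentation placeholder (203.0.113.10, from the reserved TEST-NET range), not a real server.

```python
def build_sftp_url(ip, home_dir="//home/ga4-sftp", file_path="/upload.csv"):
    """Assemble the GA4 SFTP server URL from its three parts:
    the VM's external IP, the home directory, and the file path."""
    return f"sftp://{ip}{home_dir}{file_path}"


# Placeholder IP; substitute your VM's external address.
print(build_sftp_url("203.0.113.10"))
# sftp://203.0.113.10//home/ga4-sftp/upload.csv
```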

Set the schedule for data import according to your desired frequency. In this case, we will select ‘Daily’. Click on "Next" to proceed to the next step.

You will be asked to map the Analytics fields to the columns of the CSV file that will hold your cost data. The columns in your CSV file must match this mapping exactly; I adjusted the mapping to fit the columns my pipeline produces.

Finally, click on 'Create & Generate Key' to create the data source and generate the key required for authentication. Once the key is generated, make sure to copy it and keep it in a secure location. We will use the key in the next steps to allow GA4 to access our VM.

B- Adding GA4 Public key to your SFTP server

Return to your VM terminal and execute the following commands to create an 'authorized_keys' file within the '.ssh' directory, which will store the public key:

sudo mkdir /home/ga4-sftp/.ssh
sudo touch /home/ga4-sftp/.ssh/authorized_keys

Next, it's time to add the GA4 public key to the 'authorized_keys' file. Replace "your-public-key" with the actual GA4 key that was generated in the GA4 UI. Run the following command:

echo "your-public-key" | sudo tee -a /home/ga4-sftp/.ssh/authorized_keys

Because these files were created with sudo, they are owned by root. Hand them over to the 'ga4-sftp' user and tighten their permissions, otherwise the SSH daemon will refuse public key authentication:

sudo chown -R ga4-sftp:ga4-sftp /home/ga4-sftp/.ssh
sudo chmod 700 /home/ga4-sftp/.ssh
sudo chmod 600 /home/ga4-sftp/.ssh/authorized_keys

Now, GA4 should have the necessary access to your SFTP server. Let's perform a quick test to ensure that Google Analytics can access the files on our server.
For this test, you can create a sample CSV file named 'upload.csv' that includes the columns specified in the column mapping we set up in the GA4 UI.
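
Rather than hand-writing the file, you can generate it with a short script. The column names and values below are only an example; use whatever columns you actually chose in the GA4 column mapping.

```python
import csv

# Example columns - adjust these to the mapping you configured in GA4.
FIELDNAMES = ["date", "source", "medium", "campaign_name",
              "impressions", "clicks", "cost"]

# A single illustrative row of cost data.
sample_rows = [
    {"date": "20230620", "source": "facebook", "medium": "cpc",
     "campaign_name": "summer_sale", "impressions": 1000,
     "clicks": 50, "cost": 12.34},
]

# newline="" prevents the csv module from writing blank lines on Windows.
with open("upload.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDNAMES)
    writer.writeheader()
    writer.writerows(sample_rows)
```

Running this produces an upload.csv in the current directory that you can then transfer to the server.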

I will upload the file to the home directory of the 'ga4-sftp' user. Simply click on 'Upload File' and ensure that the file is placed in the designated directory we have specified in the Google Analytics UI: /home/ga4-sftp/upload.csv

Now, let's return to the Google Analytics UI and click on 'Import Now' to initiate a test.

If everything goes smoothly, you will receive a success message indicating that the operation was successful.

Now that our test has been successful, it's time to move on to the next step: automating the process of uploading the CSV file to SFTP. You can read the second part here.