
How Visma migrated to Google Analytics 4 – and how you can too

In 2023, Google will retire their current version of Google Analytics – called Universal Analytics (UA) – and replace it with Google Analytics 4 (GA4). Visma has many websites that currently use UA, and we’ve been working hard on transitioning to GA4. Find out how we did it in this handy overview.

At Visma, we’ve been working hard to implement GA4 on all of our websites that use analytics from Google. That covers a number of web properties, including visma.com, visma.no, and many more.

During the project, we documented our learnings so that companies within Visma wanting to do the same would have a clear framework to follow. Now that we’re done, we hope this can be of help to any company that is transitioning to GA4.

We’ll take you through the key phases we went through, and the tasks related to each phase. We hope this can serve as a template for your implementation project (whether it’s done by your internal team or by an external web analytics partner).

Project goal

Make sure to spend some time defining a project goal so that you get the desired outcome. A well-formulated goal gives you a destination and provides you with success criteria so you know whether the project has been a success.     

If your setup is already implemented and you’re happy with it, the goal could be to implement GA4 while still supporting the same insights as Universal Analytics (UA).

In this project we set the following goals:

  1. Implement GA4 on all relevant web properties, ensuring that we meet the same requirements for tracking and insights as today’s UA setup
  2. Set up server-side tag management using a “Visma-owned server” (Google Cloud Platform) to comply with privacy regulations

Since GA4 is a total rebuild of UA, an additional goal was to learn more about the potential, functionality and more modern approach to web analytics that GA4 offers.

Project organisation

In most projects, it’s beneficial to have an assigned Project Manager. Even though the majority of tasks were executed by specialists, the Project Manager was there to help the teams break down the project into more manageable pieces, by assigning tasks, milestones, and deadlines.

A designated Project Manager can direct teams more efficiently, allowing your team to focus on the work that matters, free from the distractions caused by tasks going off track or budgets spinning out of control.

Project Phases

If you divide your project into phases, you can ensure that the deliverables produced at the end of each phase meet their purpose, and that project team members are properly prepared for the next phase.

We therefore chose to split the project into phases, making it easier to track progress while working on different tasks and/or phases simultaneously.

Phase 0: Preparation

In the preparation phase, you should communicate with stakeholders to find out what the scope of your project should be. 

  • Is it crucial to bring everything from UA to GA4?
  • Is something missing in today’s setup? 
  • Are there other questions or requirements specific to your organisation?

We think that the time spent in the preparation phase was really important for the outcome. We would highly recommend spending enough time on preparation before starting a similar project.

Here are the tasks we set up for this phase:

Phase 1: Configuration

The focus of this phase was to find out what needed to be prioritised, and what could be skipped or removed in the new implementation.

As we prepared the interview questions, we had the users of Google Analytics in mind. Hence, it was important to reach out to these stakeholders and pinpoint their current use of GA, what they were missing, what they expected from a new setup, and what was less important to them. By doing this, we gained really valuable insights for translating the old setup into the new one. This is reflected in the feedback analysis in this task overview:

Phase 2: Integrations

Phase 2 revolved around creating documentation in a way that would support us during the project. But, more importantly, it had to make it easy for others to reproduce the work.

That translated to writing out all the events along with the data we wanted to send with each event, known as event parameters. As an example, take the event form_submit. Sent to GA4 on its own, it is not very valuable. But if we populate the event with a timestamp, the form_type, the business_unit, and so on, it becomes much more valuable for users. Remember though that more data isn’t always better, so it is important to find a balance here.
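To make this concrete, here is a minimal sketch of what pushing such an event into the data layer could look like in a Google Tag Manager setup. The parameter names follow the form_submit example above; the values and the corresponding GA4 event tag and variables in GTM are assumptions you would adapt to your own event documentation.

```typescript
// A minimal sketch, assuming Google Tag Manager is installed on the page and a
// GA4 event tag is configured to pick up these data layer values as event
// parameters. Names and values are illustrative, following the example above.

declare global {
  interface Window {
    dataLayer: Record<string, unknown>[];
  }
}

window.dataLayer = window.dataLayer || [];

// Push the form_submit event together with the event parameters we documented.
window.dataLayer.push({
  event: 'form_submit',
  form_type: 'contact',       // which kind of form was submitted
  business_unit: 'visma.com', // which property/unit the form belongs to
});

export {};
```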

Here’s an overview of all the tasks we set up for this phase:

Phase 3: Data Management

In this phase, the goal is to have a test environment so you can test your new setup on the fly. In the case of Google Tag Manager, you can use workspaces to isolate changes before they are published, and environments to try out a container version before it goes live. This is Google’s own version control, allowing you to test changes without affecting the production environment until necessary.

Here’s an overview of all the tasks we set up for this phase:

Click through the website and try to trigger all the tags to see if values are being collected as they should. Use the debug feature in the GA4 interface.

Phase 4: Testing

The main goal of the test phase is to map and evaluate whether the proposed setup works as intended. This is an iterative process, meaning we used trial and error to adjust our work on the fly and inform the next step in the process.

We did testing on two different levels. The first was through Google Tag Manager, and the second was in BigQuery, where we collected and queried data on a more granular level.

We recommend double-checking the data in BigQuery. However, if this is not an option for you, there is a built-in module called DebugView in GA4, which you find under Configure in the left-side menu. Here, it’s possible to debug your data flow live. For a thorough guide to this feature, we recommend this resource.
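If you load your tags through Google Tag Manager, preview mode will surface your events in DebugView automatically. If you use gtag.js directly, one way to flag your own test traffic is the debug_mode parameter. Below is a minimal sketch under those assumptions; the measurement ID and the hostname check are placeholders for your own setup.

```typescript
// A sketch of enabling GA4 DebugView for test traffic via gtag.js.
// 'G-XXXXXXXXXX' is a placeholder measurement ID; adjust the hostname check
// to however you identify your own test environment.

declare function gtag(...args: unknown[]): void; // provided by the gtag.js snippet

const MEASUREMENT_ID = 'G-XXXXXXXXXX';
const isTestEnvironment = window.location.hostname !== 'www.visma.com';

if (isTestEnvironment) {
  // debug_mode routes these events to DebugView. Note that setting it to false
  // does not switch it off; omit the parameter entirely in production.
  gtag('config', MEASUREMENT_ID, { debug_mode: true });
}
```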

After publishing the first version, data will start to flow into GA4 as well as BigQuery (if the link has been set up). This is when testing the data flow using BigQuery comes in handy. This is probably the most important quality assurance of the data we are collecting.

By using BigQuery, we tested whether we were:

  • sending the right data
  • actually collecting the data
  • tracking anything twice (double tracking)
  • collecting any personally identifiable information (PII)
Here’s an overview of all the tasks we set up for this phase:

Phase 5: Go Live Preparations

In this phase, you prepare for going live with your new GA4 setup. 

Just like phase 4, this phase consists of a lot of testing. The setup does not have to be perfect before you publish it, but we recommend you have a good idea of what the output looks like. 

Once the flaws are identified and fixed, it’s time to publish. Only when the setup is live does the connection to BigQuery become possible, which is why we already covered it in the previous phase to emphasise the importance of testing there.

Further, we worked with the zones, or country-specific areas. When focusing on GA data, the heavy lifting is done in the base container, but there are some country-specific tags in the various zones that need to be accounted for. 

Finally, we made the link to Google Search Console once the zones were “live”. This is a new feature where you can get your Search Console data directly in the interface of GA4.

Here’s an overview of all the tasks we set up for this phase:

Phase 6: Documentation and Training

In this phase, we finalised documentation and planning of training sessions for anyone in Visma who might want to implement our setup on their properties. 

As previously mentioned, a big part of our project scope was to make sure our implementation was well documented and that our experiences could be used as “best practice” for Visma companies wanting to implement GA4 on their sites. 

Here’s an overview of all the tasks we set up for this phase:

While preparing, implementing and testing, you should remember to document the technical implementation as you go. This will make it easier to replicate and it will be a great tool for you to learn even more.

So, what did we learn?

The time spent on preparation had a big impact on the outcome. We highly recommend spending enough time on preparation before starting a similar project.

We found the support and documentation provided by Google really helpful, and we think Google has done a great job in this regard.

As we now close the project, we are really looking forward to further familiarising ourselves with the new and exciting features in Google Analytics 4 and learning more about its potential, functionality and its more modern approach to web analytics.

Tips, tricks and things to think about

Before we wrap up this article, we would like to share some insight that we gained during the course of the project. 

Here are some more things you should think about: 

  • Remember: the closure date for UA is July 1st, 2023
  • Plan the migration of events before you start. The structure is different, so “translating” events one-to-one might not be the best approach.
  • Within the GA4 property, the data collection revolves around data streams. In Google’s own words, a data stream is “a flow of data from a customer touchpoint” (e.g., app, website) to Analytics.
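In practice, each web data stream is identified by its own measurement ID, and that ID is what the GA4 configuration tag (or a direct gtag.js config call) points at. Here is a minimal sketch with placeholder IDs and an illustrative hostname lookup; in a GTM setup the same mapping would typically live in the container instead.

```typescript
// Each web data stream has its own measurement ID (placeholders below).
// The configuration call ties the page's traffic to the right stream.

declare function gtag(...args: unknown[]): void; // provided by the gtag.js snippet

const STREAM_IDS: Record<string, string> = {
  'www.visma.com': 'G-XXXXXXXXXX',
  'www.visma.no': 'G-YYYYYYYYYY',
};

const streamId = STREAM_IDS[window.location.hostname];
if (streamId) {
  gtag('config', streamId);
}
```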
