
Utilizing Google Ads Experiments: How Testing Can Drive Consistent and Efficient Growth


Over the last few years, Google has made dramatic improvements to its Ads platform as a whole. Automation has been pushed to the forefront, new campaign types are helping advertisers reach more users than ever, and efficient account optimization is now at your fingertips. One place where we are seeing some of the biggest changes is Google Ads’ Experiments feature. This tool has become a necessity for optimization across all of Tuff’s partners.

When we think about what it means to be a growth marketing agency, testing is arguably the most important aspect of any strategy. In order to scale, you need to learn what works best for you and your business, and learn it quickly. If you’re capitalizing on PPC channels, this is where Google Ads Experiments come into play.

What kind of testing is available?

The Google Ads experiment tool is split into two areas: Custom Experiments and Ad Variations.

Both are incredibly useful for split testing different components of an ad account, allocating spend evenly across the variants you want to compare. Let’s start by tackling Ad Variations.

Ad Variations

An Ad Variation is a split test of a single variable at the ad level. These default to a 50/50 traffic split, but Google lets you adjust that ratio if a different one is preferable. This function is perfect for testing any variation in ad copy or final URLs for your search ads. You have the ability to set a start and end date, but our recommendation when running an ad variation is to keep the test live for at least two weeks.
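Since an ad variation is ultimately a split test, it’s worth checking whether the gap you see after those two weeks is real or just noise. Here is a minimal Python sketch of a two-proportion z-test on conversion counts; the figures are invented for illustration, not pulled from any account.

```python
import math

def two_proportion_z_test(conv_a, clicks_a, conv_b, clicks_b):
    """Two-sided z-test comparing two conversion rates.

    conv_*: conversions (successes), clicks_*: clicks (trials).
    Returns (z statistic, two-sided p-value).
    """
    p_a, p_b = conv_a / clicks_a, conv_b / clicks_b
    p_pool = (conv_a + conv_b) / (clicks_a + clicks_b)  # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / clicks_a + 1 / clicks_b))
    z = (p_a - p_b) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail
    return z, p_value

# Hypothetical totals: original ad vs. variation after two weeks
z, p = two_proportion_z_test(conv_a=120, clicks_a=2400, conv_b=160, clicks_b=2400)
print(f"z = {z:.2f}, p = {p:.3f}")  # z = -2.46, p = 0.014 -> likely a real difference
```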

Custom Experiments

This is where you can test much more than just copy and final URLs in Google Ads. While Ad Variations are limited to Search, Custom Experiments can be applied to Video and Display campaigns as well. 

Testing any variable about a campaign is possible when creating a Custom Experiment for Search or Display. From bidding strategy to keyword targeting, you can quickly figure out which settings work best for your marketing goals. These experiments show up as a new campaign in your account, which makes reporting a breeze. Any budget changes you make are applied automatically to the test campaign, ensuring that spend stays split evenly across the two. This feature is incredibly handy when budgets fluctuate during testing.
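If you manage several experiments at once, you can also pull their status programmatically. Below is a minimal sketch using the google-ads Python client; the config path and customer ID are placeholders, and the experiment resource in GAQL assumes a recent Google Ads API version.

```python
from google.ads.googleads.client import GoogleAdsClient

# Placeholder credentials path -- substitute your own google-ads.yaml.
client = GoogleAdsClient.load_from_storage("google-ads.yaml")
ga_service = client.get_service("GoogleAdsService")

# GAQL query against the experiment resource (recent API versions).
query = """
    SELECT experiment.name, experiment.status, experiment.type
    FROM experiment
"""

# Placeholder customer ID (digits only, no dashes).
for batch in ga_service.search_stream(customer_id="1234567890", query=query):
    for row in batch.results:
        print(row.experiment.name, row.experiment.status, row.experiment.type)
```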

Hank Bidding Strategy Experiment

For Tuff’s partner Hank, a NYC-based startup creating a thriving community for adults 55 and older, testing bidding strategies early in our growth marketing plan was imperative. With a shift in business objectives, we wanted to see which bidding strategy brought in the lowest CPL: Maximize Clicks or Maximize Conversions. Max Clicks was our Base campaign, while Max Conversions was our Trial.

We set this up to run only in our highest-spending campaign. This way, Google has as much data as possible to gather insights from, while we keep the test small in scale and a close eye on early performance.

After three weeks of data, results between the two bidding strategies were pretty similar. The Base campaign spent 20% more over the time period, which resulted in 19% more conversions; that ratio makes sense, as more spend brought in a proportional number of leads. Click-through rate and CPC were about the same between the two, with no significant difference. Where we started to see a clear winner was in the efficiency metrics.

Conversion rate for our Trial campaign was 12% higher, with a CPL 8% lower than our Base. These are exactly the kind of metrics we wanted to see. A Max Conversions strategy, at its core, is meant to bring in higher-quality traffic that converts more often and at a cheaper price. We were able to prove just that with this experiment.
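If you want to run the same efficiency math on your own experiment, CPL and conversion rate are simple ratios. The sketch below uses invented totals for illustration, not Hank’s actual data.

```python
def cpl(spend, conversions):
    """Cost per lead: spend divided by conversions."""
    return spend / conversions

def conversion_rate(conversions, clicks):
    """Share of clicks that turn into conversions."""
    return conversions / clicks

# Invented totals for a Base (Max Clicks) vs. Trial (Max Conversions) test
base  = {"spend": 6000.0, "clicks": 3000, "conversions": 150}
trial = {"spend": 5000.0, "clicks": 2250, "conversions": 126}

for name, c in (("Base", base), ("Trial", trial)):
    print(name,
          f"CPL=${cpl(c['spend'], c['conversions']):.2f}",
          f"CVR={conversion_rate(c['conversions'], c['clicks']):.1%}")
```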

Exploring Broad Match Keywords With Mental Health Startup MyWellbeing

Earlier this year, we were looking to scale our PPC volume to bring in new users for our mental health partner, MyWellbeing, a startup with a mission of connecting people to their perfect therapist. One way to do this is to incorporate Broad Match keywords. This can be tricky, though: Broad Match keywords tend to bring unwanted search terms into an account. An easy way to figure out whether this match type will find success is to set up an experiment, and that’s exactly what we did.

After two weeks of running the experiment, we had clear results in favor of Phrase and Exact Match keywords (Base) over Broad Match (Trial). Here are the results, with the delta math sketched just after the list:

  • Trial conversions: 16% fewer than Base
  • Trial CPL: 22% higher than Base
  • Trial conversion rate: 23% lower than Base
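Relative deltas like these are straightforward to compute. A quick sketch, using placeholder raw numbers chosen only to illustrate two of the reported percentages:

```python
def pct_delta(trial, base):
    """Relative change of Trial vs. Base, as a signed fraction."""
    return (trial - base) / base

# Placeholder raw numbers, not MyWellbeing's actual data
base_conversions, trial_conversions = 100, 84
base_cpl, trial_cpl = 50.0, 61.0

print(f"Trial conversions: {pct_delta(trial_conversions, base_conversions):+.0%}")  # -16%
print(f"Trial CPL: {pct_delta(trial_cpl, base_cpl):+.0%}")                          # +22%
```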

Broad Match performed worse across the board on every conversion-related metric in Google, the exact metrics we were looking to improve. That made it an easy decision not to move forward with this keyword match type. Without an experiment like this, we would have seen performance drop on a much larger scale had we implemented the change across campaigns. Another reason why we love to test, learn, and retest here at Tuff!

Split Testing Landing Pages with Hustle

For our partner Hustle, a peer-to-peer text message marketing platform, Google’s Experiments were the perfect way for us to learn which landing page worked best for our new campaign structure. To generate quality leads, it’s important to send users not just to your site, but to the right areas of your site. We wanted to find out whether our Homepage or our more streamlined demo request page performed better within our campaigns. Here’s what we found:

We set up two different experiments, one for each audience group we were targeting. The first targeted more general keywords: lower-intent searches that naturally come with some extra question marks around them. The second targeted our higher-intent, business-line-related audience, whose keywords are more focused and come with some baseline knowledge of the product.

What we found was that for general keywords, the Homepage was the right page to send traffic to. This landing page has more information, giving users a learning experience. CPL was 15% lower on our Homepage, and it brought in 20% more conversions than the Demo page we tested it against. This was a clear sign that the more content-heavy landing page worked better for this audience.

On the flip side, we found the opposite for our higher-intent audience. The streamlined Demo page brought in a CPL that was 40% lower than the Homepage’s. That alone was grounds for us to call it a successful experiment after a few weeks.

Regardless of the optimizations you want to test in Google Ads, Experiments go a long way in helping you make those decisions quickly and with actionable data. Tests like these help you understand what works best for YOU when it comes to Google Ads. No matter the industry or the business goals being measured, you can trust that Experiments will push you in the right direction.