Why You Need Call Tracking for a Full View of the Customer Journey

More and more interactions are shifting online, but for businesses with longer sales cycles or high-dollar goods and services, phone calls are still an essential part of doing business. The source of these calls remained a huge blind spot for many marketers until the invention of call tracking. Read on for four reasons why you need call tracking to get a full view of the customer journey.

Get Granular Insights on Paid Media Efforts

Many people understand the benefit of using call tracking numbers on your website to see how many people called from desktop, as opposed to click-to-call tracking, which, as a general rule of thumb, only captures mobile phone calls. But what you may not realize is that call tracking software can capture more than just the number of calls you get.

Call tracking software can capture your full URL, including UTM codes. This means that if you’ve put the name of your ad creative or copy in the utm_content field and the keyword in the utm_term field, you can tell not only which campaigns drove the highest number of calls, but also which specific ads resonated most with your users. Feel free to get creative with your UTM fields for more granular insights on targeting options, ad placements, and ad sizes (remember that competitors can see your UTM codes, though). The best part of call tracking software that uses UTMs? It’s platform agnostic, so even if you’re not using Google Analytics, you don’t have to miss out on these insights.
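If your call tracking software exports the full landing-page URL, pulling the UTM fields out of that URL is straightforward. Here's a minimal Python sketch; the URL and field values are invented for illustration:

```python
from urllib.parse import urlparse, parse_qs

def extract_utm_fields(landing_page_url):
    """Return just the utm_* parameters from a full landing-page URL."""
    query = parse_qs(urlparse(landing_page_url).query)
    # parse_qs returns a list per key; take the first value for each UTM key
    return {key: values[0] for key, values in query.items()
            if key.startswith("utm_")}

url = ("https://example.com/pricing?utm_source=google&utm_medium=cpc"
       "&utm_campaign=spring_sale&utm_content=blue_banner&utm_term=call+tracking")
utm = extract_utm_fields(url)
```

With the fields in a dictionary, you can group calls by utm_content or utm_term in whatever reporting tool you prefer, no Google Analytics required.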

Track All Calls from Your Google Listing (Google Business Profile)

Your website isn’t the only place online where people can find your phone number. If you run a local business, you likely have local online listings, such as on Yelp, Facebook, and especially Google Business Profile (formerly Google My Business). With its prominence on the search engine results page, your Google listing is likely the online listing that gets the most traffic, but did you know its call reporting only covers mobile calls? Enter your static call tracking number.

By accurately capturing the number of calls you receive from your Google listings, you can better understand the ROI of your online profiles. From there, you may decide it’s worth engaging a listing management partner to ensure your listing is always up to date and takes advantage of the latest feature releases. Or you may decide that you need to venture into paid search for your local listings.

Take Credit for Offline Conversions

Understanding how traditional media impact your bottom line is just as important as understanding how digital media do. If you’re running any form of direct mail, billboard advertising (OOH), or even partnering with another company on a sponsorship opportunity, I highly recommend you invest in call tracking.

Without call tracking, you are forced to rely on modeling and third-party attribution tools to estimate what impact your offline advertising might have had. Instead, redirect a small part of that budget to call tracking so you know definitively what impact you had. Now, let me address an elephant in the room: this idea isn’t new, but few people act on it because they assume call tracking carries a high cost. I’m personally partnered with CallRail, which has plans starting at $45 for 250 minutes. With such a low cost of entry, call tracking is a no-brainer for measuring the effectiveness of your traditional media.

Understand the Full Impact of A/B Testing

Call tracking with A/B testing isn’t new, but it’s rarely used to its fullest extent. If you think you should only use call tracking when you’re testing the placement or color of a phone number, think again. I recommend you implement call tracking whenever you’re testing any on-page element so you get the full story of user interactions on the page.

For example, when you changed the color of your button from red to blue, did you really lose any leads, or were people just more inclined to call? When they called, did they convert at a higher percentage? When you implement call tracking with all your A/B tests, you can answer these questions and so many more. 

Summary

Getting a holistic view of the customer journey requires you to collect data points about the online and offline behavior of potential customers. Call tracking software gives you unprecedented insight into the ROI of your traditional and digital advertising. By showing you how to better allocate your media spend and optimize your creative assets, call tracking software is the one tool in your toolbox that will pay for itself. I’ve used CallRail for over five years now, and I’m happy to now be a CallRail affiliate. To sign up for a trial, visit their website here. Please note, by clicking this link, I may get a small commission at no cost to you.

3 Ways to Use First Party Data in Paid Social

With the deprecation of third-party cookies and other tracking technologies, marketers are looking more toward their first party data to optimize ads and spend more efficiently. When it comes to Facebook, here are three ways you can leverage your first party data.

Please consult your legal counsel before you use any first party data in marketing. This is not an article recommending how you should use first party data; rather, it outlines the ways that you could use it.

Offline Conversions

Offline Conversions for Facebook, released in 2016 as ‘Offline Events,’ is a feature that lets you better measure the impact of your Facebook campaigns by importing – you guessed it – offline conversions.

There are two ways to take advantage of offline conversions. First, you can manually upload your first party data as a new data source in the Events Manager section of Facebook. 

The second way to employ Offline Conversions is to automate them through an API or a partner integration, such as Square, LiveRamp, Marketo, or Segment/Twilio, among many others. Facebook originally recommended the Offline Conversions API, but now recommends the Conversions API.
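Whichever upload route you choose, identifier fields such as email and phone number are generally expected in normalized, SHA-256-hashed form before matching (check Facebook's current documentation for the exact formatting rules per field). A minimal Python sketch of that preparation step, with a made-up customer record:

```python
import hashlib

def normalize_and_hash(value):
    """Trim whitespace, lowercase, and SHA-256 hash a first-party value.
    Match keys are typically sent in this hashed form, not as raw PII."""
    normalized = value.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# Hypothetical customer record; real uploads have per-field normalization rules
row = {"email": " Jane.Doe@Example.com ", "phone": "15551234567"}
hashed_row = {field: normalize_and_hash(v) for field, v in row.items()}
```

Partner integrations and the upload UI usually handle this hashing for you; the sketch just shows what happens under the hood so you know why raw spreadsheets of emails should never leave your systems unhashed.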

Facebook provides slightly more transparency than Google’s Offline Conversions in that Facebook allows you to see the split between online and offline conversions generated by your campaign. In contrast, Google groups both sets of conversions into one bucket.

First Party Data You Can Use

  • Email
  • Phone
  • First Name
  • Last Name
  • External ID
  • Gender
  • Birthdate
  • Geo Information (City, State, Zip Code) 

Examples of How to Use This Data: 

  • Understand Offline ROAS
  • Use data with audience targeting to expand your reach
  • Use data with audience targeting to exclude existing customers from your campaigns
  • Use data with audience targeting to upsell or cross-sell existing customers
  • Optimize ads and campaigns

Source: https://www.facebook.com/business/learn/facebook-offline-conversions

Advanced Matching

In 2016 Facebook released Advanced Matching, a new way to better understand the customers you’re reaching with your Facebook ads. Facebook’s Advanced Matching has some overlap with Offline Conversions in that they both work off first party data to better identify which users converted from your ads, but it’s important to note that they’re not dependent on each other. Whereas offline conversions are additional events you send to Facebook, advanced matching simply augments the events you already send with first party data. 

There are two ways to implement Advanced Matching. The first option is manual. This doesn’t require you to adjust any settings in Facebook; rather, you modify the pixel itself. When you modify the pixel, you can include any of Facebook’s first party fields, such as email or phone number, and map user inputs from forms to automatically populate those fields. Manual Advanced Matching is appealing to some marketers because it doesn’t come with the same industry restrictions as Automatic Advanced Matching.

Much like Offline Conversions, you control Automatic Advanced Matching in Facebook’s Events Manager. Since you’re not adding additional data sources, you’ll click on the name and ID of your existing data source. Name and ID usually refer to your pixel, since that’s the primary data source in your Facebook business account. From there, visit the Settings tab (again, because this is just an additional feature applied to already collected events) and turn on Automatic Advanced Matching.

Automatic Advanced Matching is appealing to some advertisers because it doesn’t require any code modification. Instead, Facebook automatically detects form fields on your website, such as those associated with log-in, registration, or purchase activity. From there, Facebook collects the field inputs and includes them with the events you’re already sending to Facebook. Facebook promises to not collect sensitive information, such as passwords or health data.

Ultimately, Facebook recommends using both methods of Advanced Matching: Automatic Advanced Matching can collect more data, but Manual Advanced Matching doesn’t encounter the same limitations with technology (pixels deployed in an iframe or via an IMG pixel) or industries.

First Party Data You Can Use

  • Email
  • Phone
  • First Name
  • Last Name
  • External ID
  • Gender
  • Birthdate
  • Geo Information (City, State, Zip Code) 

Examples of How to Use This Data: 

  • Better understand cross-device conversions (if someone sees your ad on one device and fills out a form from another)
  • Better understand user behavior even when said user is logged out on Facebook.
  • Optimize ads and campaigns
  • More accurate targeting – increase your Custom Audience size by matching website visitors to Meta users

Source: https://www.facebook.com/business/help/611774685654668?id=1205376682832142

Audiences

To improve targeting, Facebook has allowed advertisers to create audiences since 2012. Facebook offers three types of audiences: Custom Audiences, Lookalike Audiences, and Saved Audiences.

Custom Audiences include people who have previously interacted with your business in one of many ways. The most common sources of Custom Audiences are below:

  1. Website: People who have interacted with your website, as determined by the Meta pixel installed on your website.
  2. Customer List: A list of your existing customers; this can include first party data as listed below.
  3. Offline Activity: A list of people who have interacted with your business through offline channels (this was mentioned in the above section).
  4. App Activity: People who have interacted with your app.
  5. Meta Activity: People who have interacted with your assets on Meta, such as videos, Facebook or Instagram pages, or lead forms.

From any of those Custom Audiences, you can build a Lookalike Audience. Unlike Custom Audiences, Lookalike Audiences include people who aren’t yet your customers, but whom you want to be. By mirroring the interests, demographics, and behaviors of the people in your Custom Audience, Facebook can find people who are more likely to convert. This feature can be amplified with Facebook’s newer “Advantage Custom Audience,” “Advantage Lookalikes,” and “Advantage Detailed Targeting” options.

Please note, Facebook has contradictory information on which Custom Audiences can be used as source audiences for Lookalike Audiences. In the instructions for creating a Lookalike Audience in Ads Manager, they mention that a Lookalike Audience cannot be created with pixel, app, or Meta engagement data: https://www.facebook.com/business/help/465262276878947?id=401668390442328. In their article detailing what a Source Audience is, they list no such limitation: https://www.facebook.com/business/help/475669712534537.

As with any advertising platform, the more first party data you upload, the better your targeting will be. Facebook mentions that the source audience must contain at least 100 people, but 1,000 to 50,000 is recommended. People in these lists are excluded from being targeted in your Lookalike Audiences.

First Party Data You Can Use

  • Email
  • Phone
  • Mobile App Identifier (IDFA, AAID)
  • First Name
  • Last Name
  • External ID
  • Gender
  • Birthdate
  • Geo Information (City, State, Zip Code) 

Examples of How to Use This Data: 

  • Expand your reach with Lookalike Audiences
  • Exclude existing customers from your campaigns
  • Upsell or cross-sell existing customers

Summary

If you’re considering using first party data with Facebook Ads, you can take advantage of Facebook’s Advanced Matching, Offline Conversions, and Audience features.

Comparing GA3 and GA4 Metrics in Reporting

Google Analytics 4 (GA4) is replacing Universal Analytics (GA3). To prepare everyone for the change, I’ve created a series of blog posts to help users understand the differences between the two versions. Today’s post is a guide to comparing your GA3 reporting with your GA4 reporting. You’ll learn how increases and decreases in certain metrics may reflect your shift to GA4 rather than optimizations you made as a marketer.

Sessions

Sessions are calculated differently in GA4 than in GA3. Your sessions may be lower in GA4 because:

  1. Session Timeout: While the session timeout defaults to 30 minutes in both versions of Google Analytics, GA4 allows you to extend it to 7 hours and 55 minutes, whereas GA3 had a maximum of only 4 hours. That means if you extended your session timeout to GA4’s maximum, someone who stepped away from your website for five hours would be counted as two sessions in GA3 but only one session in GA4.
  2. Midnight Restart: In GA3, all sessions break at midnight; in GA4, they don’t. The midnight cutoff is based on the time zone you set within your Google Analytics settings. For example, imagine you’re an ecommerce vendor whose property is set to Eastern time and who gets a lot of traffic from across the U.S. on Thanksgiving. While your fellow East Coast residents have likely gone to bed by midnight, it’s only 9:00 p.m. on the West Coast, and people may still be up shopping. Any session that spans 8:59 p.m. and 9:00 p.m. PST (11:59 p.m. and 12:00 a.m. Eastern) will appear as two sessions in GA3. In GA4, the session persists and more accurately reflects the single session.
  3. New Campaign Parameters: If someone encountered two of your campaigns within a short time frame while browsing the web, each campaign would start a new session in GA3. In GA4, the session won’t break. For example, if someone first clicks on a paid ad, goes to your website, immediately goes back to Google, and clicks on your Google Business Profile, GA3 would count that as two sessions; GA4 would count it as one.
  4. Automatic Bot Filter: GA4 automatically removes known bot and spider traffic from your reporting. Google populates this list based on its own research and the International Spiders and Bots List. GA3 offered bot filtering only as an opt-in view setting, not automatically.
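To make the session timeout rule in point 1 concrete, here is a small Python sketch of sessionization by inactivity gap. This is a simplified model for illustration, not GA's actual implementation; the timestamps are invented:

```python
from datetime import datetime, timedelta

def count_sessions(hit_times, timeout_minutes=30):
    """Group a list of hit timestamps into sessions: any gap longer
    than the timeout starts a new session."""
    sessions = 0
    last_hit = None
    timeout = timedelta(minutes=timeout_minutes)
    for hit in sorted(hit_times):
        if last_hit is None or hit - last_hit > timeout:
            sessions += 1
        last_hit = hit
    return sessions

hits = [datetime(2023, 11, 23, 18, 0),
        datetime(2023, 11, 23, 18, 10),
        datetime(2023, 11, 23, 23, 10)]  # a five-hour gap before the last hit
```

With a 30-minute (or even GA3's 4-hour maximum) timeout, these hits form two sessions; with GA4's 7-hour-55-minute (475-minute) maximum, they collapse into one. Same user behavior, different session counts.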

There’s one reason, however, that your GA4 numbers may be higher than your GA3 numbers:

  1. Manual Filters: At the time of this writing, GA4 does not allow for filters beyond IP-based filters. This means that if you were filtering out unwanted hostname traffic in your GA3 view settings instead of configuring your Google Analytics tag within Google Tag Manager, those unwanted hostnames will appear in GA4.

Conversions

GA3 groups all website activity within a session, and in doing so, it also deduplicates conversions within that session. If someone filled out a form 8 times, Google Analytics counted that as one goal completion. GA4’s data model, by contrast, isn’t based on sessions, so those 8 form fills would be counted as 8 conversions. This difference means you will likely see an uptick in GA4 conversions compared to GA3, whether or not you make any website optimizations.
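A simplified Python sketch of that counting difference, using made-up event records (illustrative only, not how either GA version is actually implemented):

```python
def ga3_style_conversions(events, goal="form_submit"):
    """GA3-style counting: at most one counted conversion per session."""
    return len({e["session_id"] for e in events if e["name"] == goal})

def ga4_style_conversions(events, goal="form_submit"):
    """GA4-style counting: every conversion event counts."""
    return sum(1 for e in events if e["name"] == goal)

# One user submits the same form 8 times within a single session
events = [{"session_id": "s1", "name": "form_submit"}] * 8
```

Running both counters over the same events yields 1 conversion the GA3 way and 8 the GA4 way, which is exactly the uptick described above.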

It’s also important to point out that because of this lack of deduplication, the transition to GA4 is a great time to test the validation on your forms. If your conversion event fires when someone clicks a “submit” button without ever filling out the form, you likely need to tighten up your event logic to more accurately reflect your lead count.

Average Session Duration

To calculate the average time on page metric in GA3, Google would subtract the time you visited page 1 from the time you visited page 2. The inherent flaw was that Google Analytics had no way of measuring the time someone spent on your exit page because there was no subsequent pageview from which to subtract. 

Because Google could never determine the length of time someone spent on an exit page, GA3 always undercounted average session duration. Sometimes by mere seconds; other times by 10 minutes or more.

GA4 measures time differently by sending a timestamp with every event. I dive deeper into the comparison in another blog post, but the bottom line remains the same: when you compare average session duration in GA3 to GA4, GA4 will likely show the higher number.
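Here is a simplified numeric sketch of the difference; the timestamps are invented for illustration:

```python
# Pageview timestamps for one session, in seconds since session start.
pageviews = [0, 40, 130]  # three pages; the third is the exit page

# GA3-style: time on page = next pageview's time minus this pageview's time.
# The exit page has no subsequent pageview, so its time is lost entirely.
ga3_durations = [b - a for a, b in zip(pageviews, pageviews[1:])]
ga3_session_duration = sum(ga3_durations)

# GA4-style (simplified): every event carries a timestamp, so engagement on
# the exit page still moves the clock. Suppose the user's last engagement
# event arrives 430 seconds in.
last_event_timestamp = 430
ga4_session_duration = last_event_timestamp
```

The GA3-style figure comes out to 130 seconds while the GA4-style figure is 430 seconds: same visit, five extra minutes of exit-page reading that only the event-timestamp model can see.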

Bounce Rate

GA3’s bounce rate metric was largely influenced by the amount of event tracking you added to your Google Analytics account. The more events you had, the fewer bounces you likely had. As a result, there is a good chance anyone with lots of event tracking would naturally have a lower bounce rate in GA3. In contrast, account owners who did not implement any custom event tracking would naturally have a higher bounce rate.

In another blog post, I explain why GA4 bounce rate is a little harder to inflate. With GA4, event tracking will only decrease your bounce rate if you count that event as a conversion. The result is that if you had a super low bounce rate in Universal Analytics due to lots of event tracking, your GA4 bounce rate will likely be higher. Inversely, if your bounce rate was incredibly high in Universal Analytics, don’t be surprised if it lowers in GA4.

Summary

Below is an easy takeaway of what you should expect when you compare your GA3 to your GA4 reporting.

| Metric | Results in GA4 Reporting |
| --- | --- |
| Sessions | Likely lower than GA3, but it depends |
| Conversions | Likely higher than GA3 |
| Average Session Duration | Likely higher than GA3 |
| Bounce Rate | It depends |

Make sure to take this new measurement methodology into consideration when doing analysis so you don’t inadvertently make an optimization decision based on incorrect assumptions. GA4 is continuously rolling out new features. Check back for updates or dive into other comparisons of Universal Analytics and GA4.

4 Reasons You Shouldn’t Use Google Analytics UX Metrics

I heart Google Analytics. Really. There is so much benefit in using it across departments to understand gaps in your content, design, development, and user experience. However, as versatile as Google Analytics may be, in some cases it’s better used as a supplemental diagnostic tool rather than your primary one. Given the overabundance of articles that highlight “X Ways You Can Use Google Analytics to Improve UX,” I wanted to provide a different viewpoint: 4 reasons you shouldn’t use Google Analytics UX metrics.

You Can’t Accurately Tell Demographics

Google Analytics launched the Demographics and Interests reports in 2013. These reports allow you to see user demographics once you enable the advertising reporting features for your Google Analytics property. Google collects demographic information in three ways: the third-party DoubleClick cookie, the Android Advertising ID, or the IDFA (for users who specifically opt in or haven’t yet upgraded to iOS 14). With the continuing deprecation of third-party cookies and the increasing privacy restrictions around mobile device identifiers, it’s important to note that this reporting covers only a subset of your users. Even in Google’s demo Google Analytics property, demographic data is reported for only about 40% of all users.

Receiving only a portion of demographic information may not seem detrimental to your UX analysis, but that extra demographic information can be valuable. As personalization comes to the forefront of every marketer’s roadmap, understanding how user behavior differs between segments will allow you to build more meaningful personalization and a more meaningful A/B testing roadmap. A tool specifically designed to measure UX may let users self-identify, giving you that granular segmentation data.

Scroll Depth Doesn’t Allow for Analysis At Scale 

Scroll depth is one of GA4’s enhanced measurement events. Enhanced measurement means this particular reporting feature can be enabled with the click of a button. This event gives marketers a layer of insight they never had before with no additional coding, but it has limitations. With enhanced measurement, the scroll depth event only fires when a user has reached 90% of the way down a page. While this doesn’t render the feature useless, it gives you no indication of people actively engaged and willing to scroll 60% or 80% of the way through your content.

The 90% limitation also makes UX analysis at scale a little tricky. For example, 90% of a page scrolled on desktop may correspond to only 50% of the same page scrolled on mobile. Even among mobile devices, content on an iPhone Pro Max will look a little different than on an iPhone mini. This issue gets compounded by your varying content lengths: while some pieces of your content are lengthy 1,000+ word blog articles, others could be your short and succinct 250-word “Contact” page.

Before I go further, I’d like to demonstrate the context word count adds with a chart. Below are four blog posts and the percentage of users who scrolled 90% of the way down the page.

| Page Name | % of Users Who Scrolled 90% (Desktop) | % of Users Who Scrolled 90% (Mobile) |
| --- | --- | --- |
| /blog-post-1 | 85% | 70% |
| /blog-post-2 | 30% | 15% |
| /blog-post-3 | 40% | 20% |
| /blog-post-4 | 75% | 65% |

Based on the data, you’d say that blog posts 2 and 3 are uninteresting to users. This may not necessarily be the case, though. Let’s look at the same data set with a “Word Count” column.

| Page Name | Word Count | % of Users Who Scrolled 90% (Desktop) | % of Users Who Scrolled 90% (Mobile) |
| --- | --- | --- | --- |
| /blog-post-1 | 250 words | 85% | 70% |
| /blog-post-2 | 1,000 words | 30% | 15% |
| /blog-post-3 | 900 words | 40% | 20% |
| /blog-post-4 | 400 words | 75% | 65% |

Unlike with the first chart, it’s now clear that users rarely finish posts longer than about 400 words. The word count gave us additional context on the user experience that isn’t built into Google Analytics UX metrics.

Now, to get around the first two issues with the scroll depth event, it’s possible to build your own custom scroll event (you’d have to anyway if you were using a single-page application built with a framework like React). It’s also possible to add a custom dimension that t-shirt sizes your articles (small, medium, large, etc.). These custom events will get you closer to understanding how people consume your content, but they still leave a gap when analyzing the UX on your page.
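As an illustration, the t-shirt-sizing dimension could be computed as simply as the sketch below. The thresholds are arbitrary placeholders you'd tune to your own content, and the page paths and word counts reuse the example table above:

```python
def tshirt_size(word_count):
    """Bucket an article's word count into a coarse content-length dimension
    so scroll-depth numbers can be compared across article lengths.
    Thresholds are illustrative, not a standard."""
    if word_count <= 400:
        return "small"
    if word_count <= 900:
        return "medium"
    return "large"

posts = {"/blog-post-1": 250, "/blog-post-2": 1000,
         "/blog-post-3": 900, "/blog-post-4": 400}
sizes = {page: tshirt_size(words) for page, words in posts.items()}
```

You'd send the resulting bucket as a custom dimension alongside each pageview or scroll event, then segment your scroll-depth report by it.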

What if people spend 4 minutes on paragraph five and 6 seconds on paragraph two? Where did people stop reading your content? Do they always leave the page when your Newsletter Subscription pop up appears? Are the ads in the middle of your blog article a deterrent? Unfortunately, these aggregated UX insights can’t be built into Google Analytics UX metrics.

Average Time on Page Is Unusable

I wrote another blog post comparing the measurement of time in GA3 vs. GA4. To calculate the average time on page metric in GA3, Google would subtract the time you visited page 1 from the time you visited page 2. The inherent flaw was that Google Analytics had no way of measuring the time someone spent on your exit page because there was no subsequent pageview from which to subtract. This flaw meant average time on page was always underreported or not reported at all.

While GA4 stepped up its game in how it measures time, average time on page, at the time of this writing, is still inaccessible to the average user in the GA4 interface. Advanced users who know BigQuery can calculate the difference between pageview timestamps, but those without SQL knowledge have to rely on built-in GA4 metrics, like engagement rate.

In GA4, an engaged session is one that a) lasted longer than 10 seconds, b) included 2+ pageviews, or c) included a conversion event. This metric doesn’t allow for granular UX analysis, though: you’re either engaged or you’re not. A user who commits to reading your blog for 11 seconds is considered just as “engaged” as a user who reads it for 10 minutes. Instead of relying on time metrics in Google Analytics, heatmapping software can provide more accurate UX metrics.
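The engaged-session rule fits in a few lines of code, which is exactly why it is so coarse. A sketch, simplified from the definition above:

```python
def is_engaged(duration_seconds, pageviews, converted):
    """GA4's engaged-session definition (simplified): longer than 10 seconds,
    OR 2+ pageviews, OR at least one conversion event."""
    return duration_seconds > 10 or pageviews >= 2 or converted

# An 11-second skim and a 10-minute read are indistinguishable:
skim = is_engaged(11, 1, False)
deep_read = is_engaged(600, 1, False)
```

Both `skim` and `deep_read` come out True, collapsing two very different reading experiences into one binary label.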

Unable to Determine Where Clicks Occurred

Within Google Tag Manager, you can fire a Google Analytics event on every single click that happens on your website. This is powerful for analysts who get the question, “How many people clicked this random button on this specific page?” The feature is supercharged when developers standardize their code and you insert dynamic variables, such as link text, button classes, IDs, and form elements, into your event names.

As much data as an all-clicks event tag collects, it has limitations when you try to perform aggregated reporting on UX metrics. The main limitation is that users don’t always click within specific buttons or areas of your website. This is best illustrated with an example: if you have a large image on your homepage (your website hero), it might carry a standardized class such as class="background". Because your hero is such a large image, though, you won’t be able to tell the difference between a click in the top left of the hero and one in the bottom right.

Another example of the limitations of click tracking is when users try to click on text on your website. If someone tries to click on text within a paragraph that isn’t clickable, your event tracking would only show that someone clicked on the paragraph, not that a specific word is misleading people into thinking it’s a link. In contrast, a heatmapping tool will give you a better picture of where people are clicking, faster, with less work.

Summary

Google Analytics is a fantastic tool, but take caution when using it to analyze your user experience. The lack of segmentation, aggregation, context, and granular insights can mislead you into thinking something is working, when it’s actually hurting the customer experience and your website conversion rate. I recommend always supplementing your Google Analytics data with a heatmapping tool like Hotjar to resolve the gaps in data mentioned above and help give you a more holistic view of user experience on your website. I’ve used both the free and business plans for over 6 years now, and I’m proud to now be part of their partner program. Sign up for Hotjar using this link and get an extended business trial. By signing up using this link, I may receive a commission at no cost to you.

Adding a Time Zone Filter in Google Data Studio

Heatmaps are a great way to visualize the time of day and day of the week when conversions occur on your website. They can help you plan social posts, optimize bids on Google Ads, and schedule your email marketing campaigns.

However, for many companies that serve customers nationwide, the standard heatmap report in Google Data Studio could lead you to draw incorrect conclusions on when those conversions occur. Luckily, by adding a time zone filter to your reports, you can still gain accurate conversion insights to help you better understand your users.

How Time Is Reported in Google Analytics

Time reported in Google Analytics is based on the time zone you select in your property and view settings. That means that if your time zone in Google Analytics is set to Central Standard Time (CST) and a user converts in Seattle at 7:00 p.m. Pacific Time, your Google Analytics account will show a conversion at 9:00 p.m.

This large discrepancy between the time on the East Coast and the time on the West Coast is the reason why aggregating the hour of day metric within your reporting can lead to inaccurate assumptions about your target audience.
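Python's zoneinfo module makes the two-hour offset from the example above easy to demonstrate. The date and time zone IDs are chosen for illustration; note that daylight saving time applies on this date:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# A conversion your Central-time Google Analytics property reports at 9:00 p.m.
reported = datetime(2023, 5, 1, 21, 0, tzinfo=ZoneInfo("America/Chicago"))

# ...actually happened at 7:00 p.m. local time for the user in Seattle.
local = reported.astimezone(ZoneInfo("America/Los_Angeles"))
```

Aggregating the raw reported hour would tell you this user converts at 9:00 p.m., two hours later than their real behavior, which is exactly the skew the time zone filter below is designed to correct.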

How To Set Up A Time Zone Filter in Google Data Studio

To set up time zones in Google Data Studio, you’ll need to create a custom field within your Google Analytics data source. Once you’ve added the data source to your report, go to Resource > Manage Added Data Sources.

Click “Edit” once your data source appears.

Then click “Add A Field.”

Enter a field name – I chose “Time Zone.” Then use a CASE/WHEN formula to categorize each metro in Google Analytics into a specific time zone.

To make it easy, you can copy and paste the formula below:

CASE 
 WHEN REGEXP_MATCH(Metro, '.*(Anchorage AK|Fairbanks AK|Juneau AK).*') THEN 'Alaska Standard Time'
 WHEN REGEXP_MATCH(Metro, '.*(Dothan AL|Birmingham (.*) AL|Huntsville-Decatur (.*) AL|Montgomery-Selma, AL|Little Rock-Pine Bluff AR|Monroe LA-El Dorado AR|Ft. Smith-Fayetteville-Springdale-Rogers AR|Jonesboro AR|Panama City FL|Des Moines-Ames IA|Cedar Rapids-Waterloo-Iowa City & Dubuque IA|Sioux City IA|Rochester-Mason City-Austin,IA|Quincy IL-Hannibal MO-Keokuk IA|Chicago IL|Champaign & Springfield-Decatur IL|Peoria-Bloomington IL|Paducah KY-Cape Girardeau MO-Harrisburg-Mount Vernon IL|Davenport IA-Rock Island-Moline IL|Rockford IL|Shreveport LA|New Orleans LA|Lake Charles LA|Baton Rouge LA|Lafayette LA|Alexandria LA|Minneapolis-St. Paul MN|Mankato MN|St. Louis MO|Kansas City MO|Columbia-Jefferson City MO|Springfield MO|Ottumwa IA-Kirksville MO|St. Joseph MO|Jackson MS|Biloxi-Gulfport MS|Hattiesburg-Laurel MS|Columbus-Tupelo-West Point MS|Meridian MS|Greenwood-Greenville MS|Fargo-Valley City ND|Omaha NE|Lincoln & Hastings-Kearney NE|North Platte NE|Oklahoma City OK|Tulsa OK|Sherman-Ada, OK|Wichita Falls TX & Lawton OK|Memphis TN|Nashville TN|Houston TX|Dallas-Ft. Worth TX|Tyler-Longview(.*) TX|Austin TX|San Antonio TX|Harlingen-Weslaco-Brownsville-McAllen TX|Waco-Temple-Bryan TX|Lubbock TX|Corpus Christi TX|Laredo TX|Beaumont-Port Arthur TX|Amarillo TX|Abilene-Sweetwater TX|San Angelo TX|Victoria TX|Odessa-Midland TX|Milwaukee WI|Green Bay-Appleton WI|Madison WI|La Crosse-Eau Claire WI|Wausau-Rhinelander WI|Duluth MN-Superior WI|Joplin MO-Pittsburg KS|Topeka KS|Evansville IN|Wichita-Hutchinson KS|Minot-Bismarck-Dickinson(.*) ND|Jackson TN|Mobile AL-Pensacola (.*) FL|Bowling Green KY|Sioux Falls(.*) SD).*') THEN 'Central Standard Time'
 WHEN REGEXP_MATCH(Metro, '.*(Washington DC (.*)|Hartford & New Haven CT|Miami-Ft. Lauderdale FL|Tampa-St. Petersburg (.*) FL|Orlando-Daytona Beach-Melbourne FL|West Palm Beach-Ft. Pierce FL|Jacksonville FL|Ft. Myers-Naples FL|Gainesville FL|Atlanta GA|Macon GA|Savannah GA|Augusta GA|Columbus GA|Tallahassee FL-Thomasville GA|Albany GA|Indianapolis IN|South Bend-Elkhart IN|Louisville KY|Lexington KY|Providence-New Bedford,MA|Springfield-Holyoke MA|Baltimore MD|Salisbury MD|Portland-Auburn ME|Bangor ME|Presque Isle ME|Detroit MI|Lansing MI|Grand Rapids-Kalamazoo-Battle Creek MI|Charlotte NC|Greenville-New Bern-Washington NC|Raleigh-Durham (.*) NC|Greenville-Spartanburg-Asheville-Anderson|Greensboro-High Point-Winston Salem NC|Wilmington NC|Boston MA-Manchester NH|New York NY|Utica NY|Rochester NY|Albany-Schenectady-Troy NY|Buffalo NY|Syracuse NY|Burlington VT-Plattsburgh NY|Binghamton NY|Elmira (.*) NY|Watertown NY|Cleveland-Akron (.*) OH|Dayton OH|Cincinnati OH|Columbus OH|Toledo OH|Youngstown OH|Zanesville OH|Wheeling WV-Steubenville OH|Lima OH|Philadelphia PA|Harrisburg-Lancaster-Lebanon-York PA|Johnstown-Altoona-State College PA|Wilkes Barre-Scranton PA|Pittsburgh PA|Erie PA|Charleston SC|Florence-Myrtle Beach SC|Columbia SC|Knoxville TN|Richmond-Petersburg VA|Norfolk-Portsmouth-Newport News VA|Roanoke-Lynchburg VA|Charlottesville VA|Tri-Cities TN-VA|Harrisonburg VA|Charleston-Huntington WV|Clarksburg-Weston WV|Bluefield-Beckley-Oak Hill WV|Parkersburg WV|Ft. Wayne IN|Terre Haute IN|Marquette MI|Alpena MI|Chattanooga TN|Lafayette IN|Flint-Saginaw-Bay City MI|Traverse City-Cadillac MI).*') THEN 'Eastern Standard Time'
 WHEN REGEXP_MATCH(Metro, '.*(Honolulu HI).*') THEN 'Hawaii Standard Time'
 WHEN REGEXP_MATCH(Metro, '.*(Phoenix AZ|Denver CO|Grand Junction-Montrose CO|Colorado Springs-Pueblo CO|Boise ID|Missoula MT|Helena MT|Butte-Bozeman MT|Billings, MT|Great Falls MT|Glendive MT|Cheyenne WY-Scottsbluff NE|Albuquerque-Santa Fe NM|Rapid City SD|El Paso TX|Salt Lake City UT|Casper-Riverton WY|Tucson (.*) AZ|Idaho Falls-Pocatello ID|Twin Falls ID).*') THEN 'Mountain Standard Time'
 WHEN REGEXP_MATCH(Metro, '.*(Los Angeles CA|San Francisco-Oakland-San Jose CA|San Diego CA|Monterey-Salinas CA|Santa Barbara-Santa Maria-San Luis Obispo CA|Sacramento-Stockton-Modesto CA|Chico-Redding CA|Palm Springs CA|Fresno-Visalia CA|Yuma AZ-El Centro CA|Bakersfield CA|Eureka CA|Portland OR|Seattle-Tacoma WA|Spokane WA|Yakima-Pasco-Richland-Kennewick WA|Reno NV|Las Vegas NV|Eugene OR|Medford-Klamath Falls OR|Bend OR).*') THEN 'Pacific Standard Time'
 ELSE 'Not Set/Outside of US'
END

Once you’ve set up your calculated field, you’ll need to add a drop-down control to your Data Studio report.

Within the drop-down control’s settings, choose the custom dimension you created, and you’re all set!

Summary

Bad data leads to bad decisions. By adding a time zone filter to your Google Data Studio reports, you can view your data through a more accurate lens and make better decisions.


Transforming Google Data Studio Reports: Using Conditional Formatting

One of the most difficult parts of creating digital marketing reports is effectively and efficiently communicating the results of your online efforts. To help you create more intuitive Google Data Studio reports, I have compiled three great suggestions for you to use. Since these posts are written in a longer “how-to” format, I have split them up into three different blogs, making them more digestible. This suggestion is all about using conditional formatting to focus your reader’s attention on specific attributes of your report.

When Conditional Formatting Can Provide Extra Value to Your Reports

Traditionally, conditional formatting is used to highlight different thresholds in your reporting. For example, you could highlight in green any landing page that received more than 200 sessions in the past month and had a conversion rate greater than 5%. Likewise, you could highlight in red any landing page that received more than 200 sessions in the past month but had a conversion rate of less than 0.25%.
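The threshold rules above boil down to simple boolean logic. Here is a minimal Python sketch of that logic (the function name and the session/conversion-rate figures are hypothetical, purely for illustration):

```python
def highlight_color(sessions, conversion_rate):
    """Mimic the conditional-formatting rules described above:
    green for high-traffic pages that convert well, red for
    high-traffic pages that convert poorly, no highlight otherwise."""
    if sessions > 200 and conversion_rate > 0.05:
        return "green"
    if sessions > 200 and conversion_rate < 0.0025:
        return "red"
    return None

# A strong page, a weak page, and a page with too little traffic to judge:
print(highlight_color(350, 0.08))
print(highlight_color(500, 0.001))
print(highlight_color(50, 0.10))
```

Note that both rules require the traffic threshold as well as the conversion-rate threshold, so low-traffic pages are never highlighted either way.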

You can even use conditional formatting in less orthodox ways, such as highlighting:

  1. Pages where you are performing A/B testing
  2. Pages with a certain version of your contact form
  3. Pages that highlight a certain brand message
  4. Locations with more than four review stars on Google
  5. Blogs written by certain authors

How To Employ Conditional Formatting in Data Studio Reports

To add conditional formatting to an existing chart, mouse over the right-hand side of your screen and select the Style tab.

Once you select “Add,” a bar will appear at the bottom giving you the option to create a rule. For a simple rule, select your condition and an input value, then click “Save.”

For a more complex rule, you’ll follow the same process, but select REGEX as the condition. If you are not familiar with regex, read my regex 101 blog. Although this is a much more manual process, it will help you highlight data that doesn’t fit within traditional threshold rules.
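To make the REGEX condition concrete, here is a small Python illustration of the “blogs written by certain authors” idea from the list above. The author names and page paths are made up for the example; Data Studio’s regex engine (RE2) behaves the same way as Python’s `re` module for a simple pattern like this:

```python
import re

# Hypothetical rule: highlight any blog page written by one of two authors.
pattern = r".*/blog/(jane-doe|john-smith)/.*"

pages = [
    "/blog/jane-doe/data-studio-tips/",
    "/blog/john-smith/regex-101/",
    "/blog/another-author/seo-basics/",
]

# Only the pages matching the pattern would receive the highlight.
highlighted = [p for p in pages if re.match(pattern, p)]
print(highlighted)
```

The alternation `(jane-doe|john-smith)` is what lets one rule cover several authors at once, which is exactly the kind of case a plain threshold condition can’t express.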

Summary

The easier you make it for people to read your reports, the quicker and more likely they are to understand your key takeaways. Highlighting aspects of your reports with conditional formatting will draw your reader’s attention to key takeaways in a visually stimulating way. For other ways to make your Data Studio reports more intuitive, read my other two blogs about cleaning up your URLs and renaming metrics and dimensions.

Transforming Google Data Studio Reports: Renaming Metrics & Dimensions

One of the most difficult parts of creating digital marketing reports is effectively and efficiently communicating the results of your online efforts. To help you create better Google Data Studio reports, I have compiled three great suggestions for you to use. Since these posts are written in a longer “how-to” format, I have split them up into three different blogs, making them more digestible. This suggestion is all about making your metric and dimension names more intuitive.

When Metric & Dimension Names Aren’t Intuitive In Your Reports

As you build your Data Studio reports, you will notice that the names of your metrics and dimensions import as well (as they should). However, sometimes these names are too long, or they no longer match your data after you have applied filters. When that happens, you will want to rename your metrics and dimensions.

I first found this tip useful with my goal metrics. By default, Data Studio imports not only the goal name but also the goal number. As a result, instead of your goal reading simply “Form Submissions,” you would have “Form Submissions (Goal 3 Completions).” To anyone who does not regularly use Google Analytics, this is not only irrelevant information, but it can also be confusing.

Another time this is helpful is if you have decided to take your Google Analytics reports to the next level and transform your URLs. One example I gave in my previous blog post is transforming your location URLs to match the standard naming conventions for your different locations when you feature them in your report. For example, instead of having your location URL read “/locations/dallas-main-street/”, it could read “Store #152”.

If you do that, however, it may also be beneficial to rename the dimension to “Locations.” This way it is easier for your team to quickly understand the main takeaways, such as “Here are the main locations that people visited online” or “Here are the locations where the most people filled out a contact form.”

How To Rename Metrics & Dimensions in Google Data Studio

The process of changing the names of your metrics and dimensions in Google Data Studio is quite simple. After you set up your data visualization in a scorecard, table, chart, or other format, mouse over your metrics and dimensions on the right-hand side. A pencil will appear over the “AUT” portion of your metric and the “ABC” portion of your dimension.

After that, fill in the “name” portion of the metric or dimension with the name of your choosing, and you are done!

Summary

The easier you make it for people to read your reports, the quicker and more likely they are to understand your key takeaways. Update your metric and dimension names so that they are shorter and easier to decipher for those outside of the digital marketing world. For other ways to make your Data Studio reports more intuitive, read my other two blogs about cleaning up your URLs and employing conditional formatting.

Transforming Google Data Studio Reports: Cleaning Up URLs

One of the most difficult parts of creating digital marketing reports is effectively and efficiently communicating the results of your online efforts. To help you create more intuitive Google Data Studio reports, I have compiled three great suggestions for you to use. Since these posts are written in a longer “how-to” format, I have split them up into three different blogs, making them more digestible. This suggestion is all about making your URLs easier to read and understand.

When URLs Aren’t Intuitive In Your Reports

In a perfect world, your URLs never change and they match your page title to a tee. Unfortunately, this perfect world is far, far away for many analytics practitioners. Furthermore, even when your URL meets both of these conditions, it can be hard to read for people who are not knee-deep in Google Analytics data every day. Below are common cases when you might need to give your URL a different name to make it easier to comprehend for report recipients:

  1. Your Homepage: Often your homepage URL is a simple forward slash. When not accompanied by the rest of your domain, it is less intuitive that the symbol “/” is the front page of your website. Make it easier for your reader by changing the name in your reports to “Homepage.”
  2. Your Location Pages: For multi-location businesses, interpreting URLs is not impossible, just inconvenient. Try changing the page path “/locations/dallas-main-street” to “Dallas – Main Street Location.” Even better, stick to the standard naming conventions used throughout your organization. For example, if the Dallas Main Street location is referred to as Store #152, use that nomenclature in your reporting.
  3. Grouping Pages: Sometimes landing pages are more user-friendly without grouping them under a subfolder. For example, you may want to use the URL “example.com/50-percent-off” on your Google Ads campaigns instead of “example.com/landing-page/50-percent-off” to get a higher click-through rate. While this is a smart move by your PPC team, remembering all your landing page URLs when looking at a report can be a pain. Try adding a prefix in your reporting to add more clarity, such as “PPC Landing Page: 50% Off.”

How To Rename URLs In Data Studio

After you have added your data source (in this case, Google Analytics) into Google Data Studio, go to Resource > Manage Added Data Sources.

Click “Edit” once your data source appears.

Name your new field – I chose the name “Modified URL.” Then you will use a CASE/WHEN formula to categorize your URLs.

In the screenshot, I have given two types of REGEXP_MATCH formulas to produce our desired end result. The first one is

WHEN REGEXP_MATCH(Landing Page, "^/$") THEN "Homepage"

This formula will change the page path “/” to the word “Homepage.” If you are not familiar with regex and haven’t read my regex 101 post, the formula may seem like a foreign language. Translated, the formula says, “Whenever you see a landing page that exactly matches ‘/’, make it say ‘Homepage’ instead.”

This formula will only work if you have not prepended the hostname to the URL in Google Analytics. If you have prepended your hostname, make sure to add it right after the caret (^).

The second formula is much simpler:

WHEN REGEXP_MATCH(Landing Page, '.*(/locations/dallas-main-street/).*') THEN "Dallas - Main Street"

Translated, this says, “Whenever you see the URL ‘/locations/dallas-main-street/’, use ‘Dallas – Main Street’ instead.” When you use this formula on your own data, replace my fake URL with the URL you want to rename. Even though this formula uses regular expressions, you will notice there is no need to escape characters such as dashes or forward slashes in either one of these examples.

Note: If you have a question mark in your URL, you won’t be able to simply escape the character with a backslash (\). To make this work, I’ve found you’ll need to put brackets around the question mark instead. For example, if my URL was /locations?dallas-main-street/ I would instead write /locations[?]dallas-main-street/
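The question-mark issue is easy to see in Python, where `re` treats the character the same way. A bare ? is a quantifier that makes the preceding character optional, so the pattern silently stops matching your URL; wrapping it in a character class matches the literal ?:

```python
import re

url = "/locations?dallas-main-street/"

# The bare "?" makes the preceding "s" optional, so this pattern does
# NOT match the URL containing a literal question mark:
assert re.fullmatch(r".*/locations?dallas-main-street/.*", url) is None

# The character class "[?]" matches the literal "?" as intended:
assert re.fullmatch(r".*/locations[?]dallas-main-street/.*", url)
print("bracket workaround matches the literal question mark")
```

In plain Python a backslash escape (\?) would also work; the bracket form mirrors the workaround described above for Data Studio, where the backslash doesn’t survive.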

Lastly, when you are transforming URLs, make sure to end your formula with ELSE Landing Page; otherwise, URLs that do not fit the specific cases you defined will not show up at all, leaving you with a lot of missing data.
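To see why the ELSE matters, here is a small Python stand-in for the finished CASE statement: each (pattern, label) rule is tried in order, and anything that matches no rule falls through to the original landing page, just as ELSE Landing Page does:

```python
import re

# The two rules from this post, tried top to bottom like WHEN clauses.
rules = [
    (r"^/$", "Homepage"),
    (r".*(/locations/dallas-main-street/).*", "Dallas - Main Street"),
]

def modified_url(landing_page):
    for pattern, label in rules:
        if re.fullmatch(pattern, landing_page):
            return label
    # ELSE Landing Page: unmatched URLs pass through unchanged
    # instead of disappearing from the report.
    return landing_page

print(modified_url("/"))
print(modified_url("/locations/dallas-main-street/"))
print(modified_url("/blog/regex-101/"))
```

Delete the final return and every page outside your defined cases vanishes, which is exactly the missing-data problem the ELSE clause prevents.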

Once you have finished writing your formulas, click Update and then Done!

Summary

The easier you make it for people to read your reports, the quicker and more likely they are to understand your key takeaways. Transform your URLs so that they can be comprehended quicker by those outside of the digital marketing world. For other ways to make your Data Studio reports more intuitive, read my other two blogs about changing your metric and dimension names and employing conditional formatting.