Source: Juris Digital, by Casey Meraz: https://jurisdigital.com/local-finder-where-searchers-are-clicking/
Local search results have seen a lot of changes recently, the most significant being the addition of the local finder and, later, paid ads within it. These changes got me interested in finding out where users are most likely to click and what may influence their click behavior. This is a follow-up to our 3-pack click test study. This study focuses solely on the local finder in desktop results, since phone and tablet results likely carry more local intent by proximity. A mobile study will come at a later time.
For this study we had 300 total participants contribute across three different click tests. We used law firms for all three examples. Keep in mind that many variables go into a clicking or transactional decision; in this study we will mainly look at the impact of reviews and ads.
Knowing where users click in search results and understanding the customer journey will help you focus on the efforts that will deliver the strongest ROI for your business.
As a search marketer I’m biased and trained not to click on paid ads. As a matter of habit I skip right over them, as I explained in my most recent post on Moz about my search behavior journey.
My hypothesis was that paid ads, position, and star ratings would have the most impact on where a user clicks. I believed that many users would bypass the ads, and even the #1-2 ranking positions, if other business listings above the fold displayed more stars than the top listings.
Let’s see what the data showed…
Our Testing Method
Since I didn’t want to gear these results toward a specific demographic, I used UsabilityHub.com’s click testing software with randomized demographics. The respondents were mostly US-based users who were given a simple question alongside a screenshot of the results, and were then asked why they clicked where they did.
Unfortunately we were not able to test the emotional response of users actually making the search in a time of need, which is of course another variable.
Test #1: Denver Personal Injury Lawyer
In this test we served our users with a simple question: “Imagine you’re looking for a lawyer on Google. Which result are you most likely to click?”
They were then served this image of the local finder, which contained 5.5 total results. Out of those results we had one ad in the #1 position and essentially two organic results with varying review scores. We served this to 100 survey participants to get an idea of where they would click. Although you will find that the user base is diverse, the biggest demographic was males between the ages of 25 and 29.
The results were definitely different than I expected. As you can see by reviewing the heatmap below, we saw many clicks to the ad, the #2-ranking law firm, and the #5-ranking law firm.
Breaking it down into an easier-to-read format, you can see the results of the first click test we did. In the end, based on the randomized demographics, we saw 34 clicks to the ad, 33 clicks to the #5 listing, and 21 clicks to the #2 listing.
Based on these findings, it appears the ad in the #1 position carried a lot of weight, as did the local (free) listing with the highest number of reviews.
After looking at this data, I wanted to find out whether, and how, these results would vary if I looked only at the survey participants in the United States. Looking at the screenshot below, you will see that the order remained the same, with the ad winning the most clicks, the listing with the most reviews in second place, and the #2 listing in third.
So what made users want to click this result the most? My hypothesis, since I hate ads, was that it was the stars. During the survey we asked our participants to tell us why they chose the result they did. Look at the responses below. What are your thoughts?
The participants who provided feedback were clear: the most recurring answers were that it was first and that it had star ratings.
This is further supported by the word cloud of the 100 responses below. In a word cloud, the most frequently repeated words appear bigger:
Although “first” is one of the prominent words, the biggest seem to be “rating,” “reviews,” and “good,” which made me wonder whether reviews are more heavily weighted in users’ eyes than positioning.
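The word clouds in this study are just frequency counts of the survey answers. A minimal sketch of that tally, using a few hypothetical stand-in responses (the real 100 answers aren’t reproduced here) and an assumed stopword list:

```python
from collections import Counter
import re

# Hypothetical survey responses, standing in for the 100 real answers
responses = [
    "It was first and had a good star rating",
    "Most reviews and a good rating",
    "Highest rating of the reviews shown",
]

# Small assumed stopword list; a real cloud tool would use a larger one
STOPWORDS = {"it", "was", "and", "had", "a", "the", "of", "shown"}

def word_frequencies(texts):
    """Tokenize responses, drop stopwords, and count word occurrences."""
    words = []
    for text in texts:
        words.extend(w for w in re.findall(r"[a-z]+", text.lower())
                     if w not in STOPWORDS)
    return Counter(words)

freqs = word_frequencies(responses)
# The most common words would be drawn largest in the cloud
print(freqs.most_common(3))
```

A word cloud renderer then simply scales each word’s font size by its count, which is why “rating” and “reviews” dominate the images above.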
Key Takeaways from Study 1:
- The ad in the #1 position got the most clicks
- The ad also had the highest review score
- The second most clicked listing was above the fold but lower on the page. It had the highest number of reviews.
- The response data suggests that people were not really looking at the name of the firm when making a click decision, although in this example there were no keyword-stuffed business names.
- Reviews seemed to matter the most
- Most users clicked on the business listing rather than the website directly, meaning they would still have to take another step before contacting the firm. This is where a GREAT, professionally optimized listing comes into play.
Test #2: San Diego Divorce Attorney
In this test we served our users with a simple question: “If you were looking for an attorney, which result are you most likely to click?”
This search result was chosen specifically because it contained five search results, four of which had reviews, with review counts ranging from 8 to 45. Since reviews were a prominent fixture in the first click test, I thought it was important to get more data. Just like study #1, we got 100 click responses from a wide variety of users.
With this result we have a few more variables than last time. We see various listings with star ratings, but also results with spammy business names that SEOs would be sure to avoid. Let’s see what decisions users made with these diverse examples.
You can see below that these results differed from the last study. The majority of the clicks went to the #2 position in this example.
If you look at the number breakdown below, you will find that the #2 listing had 46 clicks, more than double the clicks the advertisement received. Looking at the example above, you will notice that the #2 listing had the most reviews of any listing on that page.
Those who were surveyed provided some intriguing insights this time. These are below:
Notice how stars are once again prominent. Interestingly enough, we have two examples of people who think like me and skipped the ad just because it was an ad. Another interesting note was, of course, the business name factor, where a user indicated that the company “619 Divorce” looked like a scam. I can relate to this: when I want to work with someone, I want to work with a company that appears legitimate.
The people we surveyed also provided enough data for us to create a word cloud. The words that stuck out to me most were Reviews, Rating, Star, Ad, Number, High, and Best. Take a look at the word cloud below.
Key Takeaways from Study 2:
- The listing in the #2 position got the most clicks
- The listing also had the most reviews, but it did not have the highest review score. (Does a perfect 5.0 look like spam?)
- The second most clicked listing was the ad
- The ad mentions “Certified family law…”, which resonated with some participants.
- Some participants said they would not click on a result that looked spammy.
Test #3: NOLA Car Accident Lawyer
In this test we served our users with a simple question: “Which result are you most likely to click on when looking for a lawyer?”
This was also a random demographics based survey. I picked this search result because I was able to get 5 business listings above the fold, two of which were ads WITHOUT reviews, and three free listings which had reviews. You can see what this search result page looked like below:
Looking at the results above, you can see that the non-paid listings all have review counts ranging from 5 to 13. Although the scores vary from 4.3 to 5.0, you will notice that the stars basically look all filled in at a quick glance. To me there is no major visual distinction between the ratings in how the stars display.
It will be interesting to see where the participants clicked. Drumroll please…
As you can see above, users’ clicks in this example went mostly to the #4 and #3 positions in the local finder, with the majority going to listing #4. In this case the #4 listing had the most reviews and the highest aggregate review rating as well.
The breakdown of this data is rather interesting. The majority of clicks from all users went to the #4 result, which won with 47 total clicks. This is over three times the number of clicks the top ad, which had no reviews, received.
Another interesting takeaway is that the listing that got the most clicks had also added a city modifier to its business name (which is against Google’s guidelines). Most of the comments about why users clicked the results they chose were again related to reviews, but there were also some interesting ones I want to share:
- “This law firm has a 5.0 rating – I generally don’t click on ads either in these contexts, so Brandner was the way to go.” – Another searcher who, like me, skips ads in these contexts.
- “I like seeing the list of lawyers, the ratings especially and the links to their websites”
- “many reviews compared to the others + it has “law firm” in the name”
- “With 5 ratings and not perfect score, the recommendation seems more believable.”
- “Has the most reviews and is not open 24/7”
It’s clear that every searcher is unique and has their own opinions when deciding which result to click on. It’s interesting, however, that most of the feedback for this result was again review- and authority-based, as you can see in the word cloud below:
Again we see words like rating, best, highest, rated, etc. as popular keywords used by the survey respondents. So what are the key takeaways here?
Key Takeaways from Study 3:
- The listing position did not appear to matter as long as it was above the fold. The reviews mattered most.
- Although some were deterred by the 5.0 rating, the fact remains that most of the clicks went to the listing with the most reviews and the highest review score.
- Paid ads may not make sense for those who don’t have at least 5 reviews. Ads paired with strong review counts might be a better choice.
- The listing that got the second most clicks also had the city name in the business name. This is against Google’s guidelines, specifically: “Adding unnecessary information to your name (e.g., ‘Google Inc. – Mountain View Corporate Headquarters’ instead of ‘Google’) by including marketing taglines, store codes, special characters, hours or closed/open status, phone numbers, website URLs, service/product information, location/address or directions, or containment information (e.g., ‘Chase ATM in Duane Reade’) is not permitted.”
Overall Study Conclusion
To me this study is just the tip of the iceberg in learning what impacts user behavior. Knowing where users click can really help you focus in on the strategy that’s going to have the most impact for your business. Here are some of the key takeaways from the three tests we conducted:
- Reviews correlated highly with the top click-through rates. Having the most reviews and the most positive aggregate rating appears to result in more clicks.
- Showing above the fold is still very important, but it won’t help you as much if you don’t have a good review strategy in place.
- A user clicking on your listing is not the end of the customer journey. Having a sexy optimized listing is going to help convert that user.
- Ads did get clicks but the ads with reviews seemed to get the most clicks.
Although I’m still not a big fan of ads, it’s clear that there are many people out there who do make that click journey. If paid is within your budget, it may make sense to try it after you’ve achieved your desired star ratings for your results.