Twitter Analytics: Your Best Free Gift

When posting on Twitter, you can't just read a few how-to's on what to tweet and expect to become an overnight Twitter sensation. That happens to some people, but when it does, it's mostly very good luck! To be successful on Twitter, you can't simply post some content and hope for the best. You need to be smart about it, and in order to be smart about it you need to look at your Twitter analytics.

There are a ton of tools out there for looking at Twitter analytics and statistics, but lucky for you Twitter actually has a pretty comprehensive analytics tool. And as an added bonus, it’s pretty easy to use.

When you get to your analytics page (found under your profile menu) there are four main tabs that break down your analytics.

 


Home

This is the overview of how your Twitter page is doing based on a monthly breakdown.

The top bar gives you a 28-day summary showing how your stats have changed over the last 28 days, giving you a quick visual of how you're doing.


From there you can look at each month in a little more depth. The monthly view shows you your top tweet and your top mention. The top tweet is the one that earned the most impressions, and the top mention is the tweet someone mentioned you in that got the most engagements. Twitter also shows you your top follower and your top media tweet.


Then there’s a summary of how many times you posted, how many people interacted with you (impressions, visits, and mentions), and how many followers you gained.

 


Why is this helpful?

This overview gives you a quick glance at how you're doing. Are you improving since last month, or is there something that's bringing you down? In addition, showing you your top posts each month lets you see what's really working. And finding out who your top follower of the month is might help you make connections (or you can show them some extra love).

Tweet Activity

This part of the analytics gives you a closer look at how your tweets are doing once you release them to the world.

The first thing you're going to see is a graph that illustrates how many tweets you posted (the grey bars) and how many organic impressions those tweets received. Organic impressions count the times people saw your tweets; these viewers can be followers, but "organic" means they found your tweets without the help of your advertisements.


From there you can look at the stats of all of your tweets as well as your top tweets for your selected time frame. Twitter’s default is 28 day segments, but you can look at specific months, or other unique frames of time.


Analytics also has a breakdown of people’s engagements with your tweets for the selected time period. They are each broken down into separate graphs that give you a quick visual of how you’re doing.


Why is this important to me?

Looking at which tweets get the most impressions and engagements gives you real evidence when trying to figure out what you should be posting on Twitter. There are two ways to look at it:

  1. If you’re stuck and don’t know what’s working for you, look at these analytics to see when there are spikes of engagement. If your chart is pretty consistent at 5k impressions a day, but then one day you have 12k impressions, look and see what was so special on that day. And from there you can post it again! Don’t be afraid to repost quality content.
  2. If you’re posting new content (for example, a new ad campaign with different visuals) you can look at your analytics to see how people are engaging with the new content.
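To make the first approach concrete, here's a small Python sketch of spike detection. The impression counts below are hypothetical; in practice you'd fill the dictionary from Twitter's tweet activity export:

```python
from statistics import median

def spike_days(daily_impressions, factor=2.0):
    """Return the days whose impressions are at least `factor` times
    the median day -- a rough way to spot content worth reposting."""
    baseline = median(daily_impressions.values())
    return sorted(day for day, count in daily_impressions.items()
                  if count >= factor * baseline)

# Hypothetical export: steady ~5k days with one 12k spike
impressions = {
    "2016-04-01": 5100,
    "2016-04-02": 4900,
    "2016-04-03": 5050,
    "2016-04-04": 12000,
    "2016-04-05": 5200,
}
print(spike_days(impressions))  # → ['2016-04-04']
```

Using the median rather than the mean as the baseline keeps one huge day from hiding its own spike.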

Audience

Audience insights allows you to get a picture of who your followers are and what they’re interested in. In my opinion this isn’t as foolproof as the other tools, but it does allow you to get to know your audience, which is always beneficial.

When you open this tab, there is an overview of the total number of followers you have, broken down by what your followers are interested in. Twitter also lets you know their occupations, favorite TV genres, and other demographic information.

 


Why is this important to me?

Like I said, this allows you to form a picture of who your followers are and what they're interested in. You can create and curate content that is more applicable to their interests, as long as it also applies to your brand! You can also look at your organic audience, so seeing what they're interested in lets you take steps toward making them part of your regular audience.

Events

One last note is the events tab. This shows you past, current, and upcoming events that people are going to be tweeting about. Not only does it show you the event, it also shows you the demographics of who’s tweeting, as well as the most popular tweets, and hashtags for that event.


Why is this important to me?

Events allows you to stay current, and it also gives you an opportunity to engage with your audience.

Conclusion: Use your Analytics!

The tool is free, and for the most part, pretty easy to understand. It helps take the guesswork out of what's working for you on Twitter, and it allows you to get a better understanding of your audience.

Rock the Moz Bar

Here at Searchable, we have already talked about Moz a number of times, including their blog and their On-Page Grader. Today we are going to take a look at the Moz SEO Toolbar, an in-browser extension for Chrome and Firefox that lets you track SEO while browsing the web. While there are some upgraded features available with Moz Pro, we will be focusing on getting the most out of the free tools in the Moz SEO Toolbar.

The Basics

Being an in-browser tool, the Moz SEO Toolbar is always there to provide background information on your webpage or what makes a competitor’s website rank so high. The toolbar has two main features, the SERP overlay and the web page analysis.

SERP Overlay

The SERP overlay is a feature that helps explain why different web pages rank so highly on Google, Bing or Yahoo search. In this overlay, each result is shown with their page and domain authority scores. Page authority serves as an indicator of how strong the individual page is, while domain authority shows the strength of the website as a whole. For example, let us say you want to start a boutique cat clothing store. The Google SERP if you search the term “cats” looks like this:

Cats Search Engine Result Page
Each of the top 3 results has a mid-range page authority and a domain authority of 100. It would be very hard to break into this SERP.

On the other hand, if you search cat boutique, this is what the SERP looks like:

Cat Boutique Results Page
These pages have a much lower page and domain authority score on average. As a result, ranking high on the cat boutique SERP would be a much more attainable goal and something that would take a lot less time and effort.
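If you jot down the authority scores MozBar shows for each result, a quick comparison makes the difficulty gap obvious. This is just a sketch; the (page authority, domain authority) pairs below are hypothetical stand-ins for what you'd read off the overlay:

```python
from statistics import mean

def serp_difficulty(results):
    """Average page authority (PA) and domain authority (DA)
    for a list of (pa, da) pairs read off the MozBar overlay."""
    return (mean(pa for pa, _ in results), mean(da for _, da in results))

# Hypothetical scores for the two example searches
cats = [(55, 100), (60, 100), (58, 100)]
cat_boutique = [(30, 25), (22, 18), (28, 35)]

print(serp_difficulty(cats))          # high DA across the board
print(serp_difficulty(cat_boutique))  # far lower: easier to break into
```

The lower both averages are, the more realistic it is to compete on that SERP.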

Web Page Analysis

In addition to the SERP overlay, the Moz toolbar also lets you gain insight into what makes certain pages rank so highly. Keeping with the cat boutique example, we can look at the top-ranking result when you search for cat boutique.

Hemingway Web Analysis

Hemingway's has low page and domain authority, but a low spam score and decent Facebook activity. The low authority scores mean the page does not have much clout and is susceptible to being passed on the SERP. However, the low spam score means search engines are not penalizing the page for black hat SEO, and the Facebook activity shows the company is active on social media.

The two tools on the left of the toolbar help dig a little bit deeper into the page. The magnifying glass over the web page is the page analysis tool. This tool pulls up information about a website that is important for SEO, such as headers and meta descriptions.

Page Analysis Tool

The other tool is the highlighter tool, which highlights links on the page.

Link Highlighter

In addition to looking into competitors, the web page analysis feature can also be used on your own web page. When used on your site, this tool can help you identify your strengths and weaknesses or point out holes in your website’s SEO.

As pointed out earlier in this article, and as can be seen throughout the images, there is a full version of this toolbar. The full version will give you access to more metrics and analytics and can be tried out for 30 days free of charge. The free version of the Moz toolbar should be more than enough to get you started on your SEO journey.

For the Chrome version of the Moz Toolbar click here. For the Firefox version, click here. For more posts about free SEO analytics tools, read Ana’s article on 4 Free SEO Tools You Didn’t Know You Needed.

SEO blogs to add to your RSS feed

Right now, you’re probably thinking, “why would this amazingly helpful SEO blog promote other SEO blogs?” But if you think back to our post about link building campaigns, you will remember that being friendly, even with competitors, is helpful for a website’s authority. So, here are some pretty great SEO blogs that answer some more in-depth questions you might have about SEO and optimizing your content marketing.

The Moz Blog

Moz is already a leader in subscribable and downloadable SEO tools, so it is no surprise that their blog is a leader in search as well. When visiting the Moz Blog, you won’t find shameless plugs for their costly tools and devices. Instead your reading will open SEO doors that you didn’t know existed.

Much of the blog is devoted to how-to's that are fairly straightforward if you are looking to improve some aspect of your site. Even better are Moz's thoughtful posts on content marketing in general. These posts are great for those who are looking to expand their website topics after being in the content marketing game for some time.

keyword map
[Source: Moz]
One of the best things about the Moz Blog is the abundance of visuals. It's sometimes difficult to include relevant images or graphics in a post about SEO, which is not necessarily a topic made for visuals. Moz does an excellent job of creating visuals out of their data to help readers take in information more easily, like the keyword map above detailing keyword usage on a website.

The Orbiter

The Orbiter is the web marketing blog of Orbit Media Studios, a web design and development company led by web expert Andy Crestodina. The Orbit team takes turns writing posts, usually centered on their specialty topics. This is where to go when you want to know what the experts are thinking.

The blog’s voice is extremely easy to follow and makes the reader feel like an equal, which is important when it comes to the sometimes confusing and frustrating task of optimizing online content. Each post is broken down for easy-reading and, even when a post is long, does not leave the reader exhausted.

The blog’s topics are mostly focused on SEO, website optimization, and content creation, but there are also more general topics, like instructions for making a simple content mission statement. Not only does this blog want to help your business’ online presence, but the people behind it want to help your brand as a whole, which is comforting.   

Search Engine Land

If you are a part of a business that just needs to know what’s happening in the realm of SEO innovations, then Search Engine Land has to be number 1 on your list of sources. Be warned: this is not a blog to be visited by SEO beginners. If you have become comfortable with SEO jargon and carrying out web optimization tasks is a breeze, then you can give this blog a shot.

Search Engine Land blog home page
[Source: Search Engine Land home page screenshot]
A simple scroll through Search Engine Land's home page makes it obvious that this blog cannot get enough of Google. Just look at the screenshot of their current home page above, with four references to the big G before you even have to scroll.

And why should they shy away from the number one search engine that essentially dictates all rules and regulations for SEO? Google’s algorithms, tools, and ranking systems are changing almost as quickly as a marketer can learn them, so why not stay updated?  

Search Engine Land posts about 4-5 times a day. This can seem a little daunting when you don't have much time to skim through thousands of words for some practical information, but the deep content that this blog provides could be worth it for marketers wanting to take their brand to the next level.

kissmetrics

infographic on e-commerce sites
[Source: kissmetrics’ infographic]
kissmetrics does not waste time on flashy layouts or a witty blog voice. Instead, the writers behind this web marketing and tracking blog get right down to business. It's easy to tell that this blog is managed by experts in SEO analytics because of the varied and detail-oriented content.

Not only does kissmetrics offer helpful, step-by-step posts on measuring your SEO’s success, but there are also webinars, infographics, and marketing guides to skim through when you are looking for something more than words. Some of this content requires a subscription to kissmetrics, which will cost you. If you spend a few months with the kissmetrics blog and find it extremely useful for your business and marketing goals, then maybe throwing some money their way wouldn’t be the worst idea. In the meantime you can marvel at one of kissmetrics’ many thorough infographics. This one is all about making your ecommerce site trustworthy.

 

Hopefully these alternative SEO blog options didn’t make you completely jump ship on Searchable. We know we can’t cover everything, so we hope you can find something useful for your business’ SEO needs from other web marketing fanatics.     

Why and How to Track Your Webpage

By now you are well on your way on your SEO journey. So far you have helped Google find your website and avoided some early pitfalls that many businesses encounter when starting to optimize their websites. This is a good time to look at the progress you have made and how to track your SEO progress.

Why should I track my website's rankings?

Tracking your website's page rankings and traffic flow may seem like an unnecessary addition to your already busy schedule. Good SEO will work whether or not it is tracked, right? Not quite: real data gets real results. Tracking can make your optimization more efficient and effective, and ultimately lets you spend less time on SEO.

How do I start tracking?

Establish a baseline:

The first step to tracking your website is building a baseline of your website’s current traffic. Using Google Analytics or any number of other online tools, many of which are free, you can establish your webpage’s current metrics, such as Search Engine Results Page (SERP) ranking or daily number of views. These baseline statistics will help point out areas that need improvement on your website.

Set SMART goals:

Once your baseline information is established, you can set targets to work towards for improving your website. SMART goals (Specific, Measurable, Attainable, Relevant, and Time-based) give trackability and structure to your objectives, which makes it much easier to tell whether the targeted metrics are actually improving. Instead of setting a goal to "improve daily pageviews", set a SMART goal like "increase daily pageviews by 20% in three months".
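The arithmetic behind a SMART goal like that one is simple enough to script. A minimal sketch, with hypothetical baseline and current pageview numbers:

```python
def smart_target(baseline, pct_increase):
    """Target value for a goal like 'increase X by 20%'."""
    return baseline * (1 + pct_increase / 100)

def goal_progress(baseline, current, pct_increase):
    """Fraction of the way from baseline to target (1.0 = goal met)."""
    target = smart_target(baseline, pct_increase)
    return (current - baseline) / (target - baseline)

# Goal: increase daily pageviews by 20% in three months
print(smart_target(5000, 20))         # → 6000.0
print(goal_progress(5000, 5600, 20))  # → 0.6 (60% of the way there)
```

Checking `goal_progress` against elapsed time (60% of the gain with 50% of the three months gone, say) tells you whether you're on pace.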

What should I track?

Different metrics help show your website’s strengths and weaknesses in various areas. Here are a few basic metrics to start looking into:

Bounce Rate:

Bounce rate measures the share of people who visit your website without interacting at all. High bounce rates can indicate problems with your website, such as slow page loads or a lack of interesting and engaging content. Lower your bounce rate by helping people find the content they are looking for on your page. You can also use tools such as GTmetrix, recently featured in Ana's 4 SEO Tools You Didn't Know You Needed article, which helps optimize your webpage's loading.
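The calculation itself is just a ratio. A quick sketch, with made-up session counts standing in for an analytics export:

```python
def bounce_rate(single_page_sessions, total_sessions):
    """Percentage of sessions that left after viewing a single page."""
    if total_sessions == 0:
        return 0.0
    return 100.0 * single_page_sessions / total_sessions

# Hypothetical month: 420 of 1000 sessions bounced
print(bounce_rate(420, 1000))  # → 42.0, i.e. a 42% bounce rate
```

Tracked against your baseline over time, this one number tells you whether landing-page changes are keeping people around.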

Referring Pages/Keyword Rankings:

Knowing how people get to your pages is key to knowing your audience and spending your time and resources properly. Referring pages are the outside websites that link to your pages, along with how many people followed those links. Knowing who linked to you can point out new websites to build connections with. Keyword rankings show how high on the SERP your website appears for different keywords. Looking at which keywords score well with search engines can determine what words to highlight in advertising. Additionally, some sites can help identify underused keywords that your website could target in the future.
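If you export the referrer field from your analytics or server logs, tallying your top referring domains is a few lines of Python. The log below is made up for illustration:

```python
from collections import Counter

def top_referrers(referrer_log, n=3):
    """Most common referring domains and how often each sent a visitor."""
    return Counter(referrer_log).most_common(n)

# Hypothetical referrer column pulled from a log export
log = ["moz.com", "twitter.com", "moz.com",
       "orbitmedia.com", "moz.com", "twitter.com"]
print(top_referrers(log))
# → [('moz.com', 3), ('twitter.com', 2), ('orbitmedia.com', 1)]
```

The domains at the top of the tally are the sites worth building relationships with first.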

Link Strength:

Link strength is a metric that shows how strong the links on your website are. While links to other websites are good, too many links, or links to low-ranking websites, will actually hurt your website's credibility, causing it to drop in SERP ranking. The Ecreative Link Juice Calculator, another one of Ana's 4 SEO Tools You Didn't Know You Needed, can help you make sure your links are strong and relevant.

 

Tracking your website can help optimize your page and the way your SEO flows in general. This will ultimately provide you with more leads and better connections all with less time spent. When it comes to tracking, a little set-up can go a long way.

 

For more information on analytics, read Becca's article on Robots.txt. If you want to read my previous work, check out my post on SEO Ethics.

Robots.txt Guide


A quick guide on the basics and origins of the Robots Exclusion Protocol

Along your SEO journey you may have come across the acronym REP or the term robots.txt. These are two ways of describing the Robots Exclusion Protocol (REP) or Robots Exclusion Standard. This blog post will serve to educate you on robots.txt and inform you of how it could help your small business.

[Image created using Jason Davies' Word Cloud Generator]
What the heck is it?

1994 was a big year for SEO. Not only was the first blog created, but the original REP was also formulated. The Robots Exclusion Protocol is a standard, expressed as a text file, that communicates with website crawlers. Think of it like a compass that points the crawlers in the right direction when it comes to which parts of a website they should scan and index. Search engines are greedy. They want to scan and index as much information as they possibly can. What this means is that they will assume everything on your blog or website is available to scan unless you tell them otherwise. That's where the Robots Exclusion Standard comes in. While this standard can be very helpful to anyone, including small business owners, it must also be used with great care.

Why would I want to use robots.txt?

The Robots Exclusion Protocol essentially allows you to control the crawler traffic on your website. This would come in handy if you don’t want Google crawling two very similar pages on your site and wasting what Google terms your “crawl budget”. Basically, a crawl budget is the number of times a search engine will crawl one of your web pages each time it visits the site. As you can imagine, it’s imperative to understand this concept if you want to develop and maintain a successful SEO strategy.

How do I Use it? 

To create a robots.txt file, you'll have to put it in the top-level directory of your server. If you have Windows, robotstxt.org recommends using notepad.exe to create your REP; likewise, if you have a Mac, use TextEdit. When a robot is looking for the robots.txt file, it will strip down the URL in a particular way. For example, if the URL for this blog were http://www.searchable.com/shop/index.html, the robot crawling the page would remove the "/shop/index.html" and replace it with "/robots.txt", so the URL now looks like this: http://www.searchable.com/robots.txt. What this means for you is that you will want to put the robots.txt file where you would put your website's main index.html.

As for what exactly to put in the file, here's a robots.txt cheat sheet from Moz covering some of the more common REP language. I've also added some additional helpful text:

Block all web crawlers from ALL content:

User-agent: *
Disallow: /

Block a specific crawler from a specific folder:

User-agent: Googlebot
Disallow: /no-google/

Block a specific crawler from a specific webpage:

User-agent: Googlebot
Disallow: /no-google/blocked-page.html

Sitemap parameter:

User-agent: *
Disallow:
Sitemap: http://www.example.com/none-standard-location/sitemap.xml

Allow all web crawlers to access all files (an empty Disallow means nothing is blocked):

User-agent: *
Disallow:
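You can sanity-check rules like these before deploying them with Python's built-in robots.txt parser. A small sketch; the crawler name and example URLs are just illustrations:

```python
from urllib import robotparser
from urllib.parse import urlsplit, urlunsplit

def robots_url(page_url):
    """Where a crawler looks for robots.txt: scheme + host + /robots.txt."""
    parts = urlsplit(page_url)
    return urlunsplit((parts.scheme, parts.netloc, "/robots.txt", "", ""))

print(robots_url("http://www.searchable.com/shop/index.html"))
# → http://www.searchable.com/robots.txt

# Check the "block a specific crawler from a specific folder" rule
rules = """\
User-agent: Googlebot
Disallow: /no-google/
"""
rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())
print(rp.can_fetch("Googlebot", "http://example.com/no-google/blocked-page.html"))  # → False
print(rp.can_fetch("Googlebot", "http://example.com/shop/"))                        # → True
```

Running your draft rules through `can_fetch` for the URLs you care about is a cheap way to catch an overly broad Disallow before it hides half your site from Google.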

The Downside

It's important to keep in mind that denying robots the ability to crawl a webpage denies the links to that page their value. In other words, using the Robots Exclusion Standard could mean a decrease in the effects of your current search engine optimization. Say someone links to a page from your website that you've hidden from Google using REP. Now Google has a way around your robots.txt: it can index your web page through that third-party link. According to Yoast, if you have a section on your website that you do not want to show in Google's search results, but that still generates a lot of links, you should not use the REP. Yoast suggests that instead of using REP, you should use a "noindex, follow" robots meta tag. This way, search engines like Google will still be able to properly distribute the link value for that page across your website. Another way to avoid this situation would be to password protect a file rather than excluding it from search results with robots.txt.

As you can see, robots.txt can be very useful to someone who is in the process of optimizing their website. On the other hand, it can be tricky to use and could hurt your SEO if not used properly. In my opinion, robots.txt is a very useful tool for optimizing your website, and so long as you follow the guidelines outlined above, I'm confident that the Robots Exclusion Protocol will improve your SEO. As always, Searchable is here to help. Leave any questions, comments, or suggestions regarding robots.txt in the comments below.

 

SEO Good Example: Nike Golf

 

A Case Study on what we can Learn from Nike Golf and their SEO.  

Nike, Inc. was founded in 1964. Since then it has dominated the sports equipment and activewear industry. Nike Golf is a sub-brand of Nike that, as the name suggests, is specific to the sport of Golf. Swellpath, now a part of 6D Analytics, is a digital marketing company that conducted a case study on Nike Golf in which the objective was to boost product awareness and drive website traffic.

Nike Golf: Before

According to Swellpath, Nike Golf’s biggest hindrance was the lack of a focused keyword strategy combined with a website that was hard for search engines to crawl for data. As we know, search engines crawl through content in order to ascertain what is relevant, thereby boosting certain websites’ content to the top of search results. 

Take a look at this great graphic from Swellpath’s case study:

What users were seeing (left) and what search engines were seeing (right)

Nike Golf Before

[Image courtesy of Swellpath]

The Game Plan

Taking into account the many options that Nike Golf had to optimize their website, Swellpath settled on using SWFObject2, an open-source JavaScript library. A JavaScript library is a set of pre-written JavaScript that makes it easier to develop other JavaScript-based applications. Think of it like a toolkit that gives you all the instruments you need to run a JavaScript program. This library was appealing to Swellpath because it could more effectively provide content for search engine spiders. The library does this by storing an HTML-based version of the website behind the scenes that can be presented whenever a user visits the site. This also makes the mobile website much more user friendly. Additionally, SWFObject2 allows administrators to embed Flash content without relying on a specific scripting language, which makes the content accessible to a larger audience: even users who have JavaScript disabled in their browsers will still be able to see the Flash content. Read more on the benefits of SWFObject2 here.

Results of the case study

According to 6D Analytics and Swellpath, organic search traffic on the Nike Golf website increased by a staggering 348% in two years. This is important because organic search traffic includes both branded ("Nike") and non-branded searches. In other words, between the 2010 PGA golf season and the 2012 PGA golf season, organic website traffic increased by 348%, and non-branded traffic alone increased by 250%.

Going back to the previous infographic from Swellpath: on the left is what users see now, and on the right is what search engines see now. As you can see, search engines like Google are now picking up on keywords from the page, ultimately driving more traffic to the Nike Golf website.

Nike Golf After

[Image Courtesy of Swellpath]

What we can learn from this

The main objective of any Search Engine Optimization is to put your brand ahead of others in search results. What we can learn from Nike Golf is that organic search matters. Small businesses (as well as large corporations) cannot sacrifice accessibility in the name of a memorable visual experience. Therefore, when building a website and optimizing it for search engines, make sure to factor in a good keyword strategy that will both drive traffic and boost product awareness.