301 Redirects: How They Affect Your SEO

A 301 redirect sends visitors from an old URL to its current, updated replacement. Redirects are necessary when a URL is broken or no longer active. They matter for user experience, and user experience in turn strongly affects the SEO of your website.

301 or 302?

To clarify first: there are two kinds of redirects. A 301 is permanent, and it carries over PageRank, MozRank, Page Authority, and traffic value. The previous page is removed from Google’s index and replaced by the new page, though some small percentage of PageRank and “link juice” is lost in the transfer.

302 redirects, by contrast, are temporary. They are best used when testing a new web page, or as a short detour while you fix a page. Because a 302 signals that the move is not permanent, it may not pass any traffic value to the new page: all PageRank, MozRank, and Page Authority stay with the original URL, and the detour page gains nothing. After working hard on your website’s SEO, you don’t want to sacrifice that accumulated value to an accidental 302, so use them sparingly and with care. Google may eventually treat a long-standing 302 the same as a 301, but it’s best to be specific and stick to 301 redirects for permanent moves.
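The difference shows up in the HTTP response itself. Below is a minimal sketch in Python of how a permanent redirect is resolved; the old-to-new URL map is hypothetical, and a real site would usually configure this in its web server rather than in application code.

```python
# A hypothetical map of retired URLs to their replacements.
REDIRECTS = {
    "/old-menu.html": "/menu/",
}

def resolve(path):
    """Return (status code, Location header) for a requested path.

    301 marks the move as permanent, so search engines transfer the
    old page's value; a 302 in its place would mark it as temporary.
    """
    target = REDIRECTS.get(path)
    if target is not None:
        return 301, target
    return 404, None
```

Here `resolve("/old-menu.html")` returns `(301, "/menu/")`, while an unmapped path falls through to a 404.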

But my SEO?

Search engines like Google will quickly notice a new redirect, and your SEO is affected almost immediately. If your website is large and complex, however, it can take a while for Google to completely forget the original page. If your website is new, you shouldn’t need many redirects yet, but it is still important to understand how quickly they affect your site’s SEO.

301 redirects also give you a natural experiment for your SEO. Moz put Google’s guidance to the test to see how much a 301 actually changes rankings, and the results matched Google’s predictions: on average, a redirected page ends up about 15 percent below where it would be without any redirect. If your SEO is well established, you may fare better than that despite the 301; if your website is brand new and undeveloped, the redirect could cost you even more. While 15 percent is only a benchmark, you can take your own measurements using MozBar. Testing your redirects lets you confirm both that visitors have an easy path and that traffic keeps flowing where it should.

Still, Watch Out

While it is possible to add or delete 301 redirects at any time, it is best to keep them in place permanently. If you remove a 301 redirect, Google’s connection to your web page breaks, and your domain’s reputation suffers.

It is also important not to redirect all of your old pages to the homepage. While this may seem like a convenient catch-all, it actually causes Google to treat those pages as soft 404s. Instead, redirect each old page to the most relevant internal page. Your site and content are likely to change, so point old pages at newer, more up-to-date content to help both your visitors and your SEO.

In Conclusion

301 redirects can multiply quickly as a website gains content and pages. While they are necessary for broken or outdated pages, it is best to build your website carefully so you need as few as possible. If you don’t redirect a dead page at all, visitors hit a 404 error page, which should be avoided at all costs. Even if your website is relatively new, pay attention to your URLs early on so your structure stays logical and leaves room for growth.

Head Towards Better SEO With Headers

Headers are more than just an exciting soccer move; they also play a very important role in search engine optimization. Incorporating headers into your website can help improve web traffic and SERP rankings.

The Basics

Headers are the part of a web page’s architecture that separates titles and section headings from the main text of the page. There are six levels of headers, starting with header 1 (h1) and going all the way down to header 6 (h6). The h1 is the most important: it is usually the title of the page and gives a very broad description of the topic at hand. As the header numbers increase, the text they hold gets more specific but less important. For example, let’s look at the headings for a hypothetical ice cream shop.

 

<h1>Scoops Ice Cream Shop Menu</h1>
  <h2>Soft Serve</h2>
    <h3>Chocolate</h3>
    <h3>Vanilla</h3>
    <h4>Toppings</h4>
      <h5>Nuts</h5>
      <h5>Sprinkles</h5>
  <h2>Milkshakes</h2>
    <h3>Chocolate</h3>
    <h3>Vanilla</h3>
    <h3>Strawberry</h3>

 

As you can see, as the header numbers increase, the topics get more specific: from the title of the page, to the types of ice cream, to the individual flavors, to the toppings, before stepping back up to move on to a different menu section. It is important not to skip levels in your headings. Jumping from an h1 straight to an h3 won’t break your HTML, but it muddles the page’s structure for search engines and screen readers alike. Additionally, every page should have exactly one h1 heading. The h1 is the most important heading, and having more than one creates confusion both for customers and for search engines.
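Those two rules are easy to check mechanically. The sketch below, using only Python’s standard library, scans a page’s HTML for more than one h1 and for skipped heading levels; the class and function names are our own, not from any particular tool.

```python
from html.parser import HTMLParser

class HeadingAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.levels = []  # heading levels in document order

    def handle_starttag(self, tag, attrs):
        # Tag names arrive lowercased; h1..h6 are "h" plus a digit.
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            self.levels.append(int(tag[1]))

def audit(html):
    """Return a list of heading problems found in an HTML snippet."""
    parser = HeadingAudit()
    parser.feed(html)
    issues = []
    if parser.levels.count(1) != 1:
        issues.append("page should have exactly one h1")
    for prev, cur in zip(parser.levels, parser.levels[1:]):
        if cur > prev + 1:
            issues.append(f"skipped from h{prev} to h{cur}")
    return issues
```

Run against the ice cream menu above, `audit` returns an empty list; feed it a page that jumps from an h1 to an h3 and it flags the skip.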

 

Headers and SEO

So how exactly do headers help optimize your webpage?

Importance

Headers distinguish important title text from the rest of the information on the page. Google’s Hummingbird algorithm uses headers, especially the h1s and h2s, to determine the pertinent information on a page and to put relevant sites on its results pages. Looking back at our example from earlier, the ice cream shop would rank well for “local soft serve” (an h2) but be ignored for “local sprinkles” (an h5).

Organization

Organization, both for you and for the visitors of your web page, is another benefit of headers. Headers give pages structure, make them look better, and help users find content more easily.

Consistency

Search engine algorithms compare headers against the body text of a page when determining what to put on a results page. Headers that match the body text get ranked higher, while mismatched headers and text will fall in the rankings.

 

Pitfalls of Headers

While headers are an easy thing to start with when optimizing your website, there are a few things to avoid.

Too Many or Too Few h1s

Each page should have exactly one h1 header. The h1 is a vital part of the page layout and SEO, but multiple h1s usually confuse both readers and algorithms.

Spamming Text into Headers

Headers help identify keywords and other important information on a page. However, piling on tons of headers, or stuffing paragraphs of text into them, is viewed as spam by some algorithms and will hurt your search engine standing.

Hiding Text

One older method of getting higher rankings on search engines was to put keywords unrelated to your product in headers on your page, but coloring them the same as your background. This prevented everyone but the search engine algorithms from seeing the text and would cause irrelevant pages to pop up on SERPs. Now this practice is considered unethical and your page will be penalized.

 

Although they do not have the biggest impact on your SEO, headers are an important part of your optimization process and an easy way to score some goals early on.
For more basic SEO, check out our Basics of the Basics article. For more pitfalls to look out for as you begin your journey, check out our 5 Pitfalls to Avoid When Beginning Your SEO Journey.

Rock the Moz Bar

Here at Searchable, we have already talked about Moz a number of times, including their blog and their On-Page Grader. Today we are going to take a look at the Moz SEO Toolbar, an in-browser extension for Chrome and Firefox that lets you track SEO while browsing the web. While there are some upgraded features available with Moz Pro, we will be focusing on getting the most out of the free tools in the Moz SEO Toolbar.

The Basics

Being an in-browser tool, the Moz SEO Toolbar is always there to provide background information on your web page, or on what makes a competitor’s website rank so high. The toolbar has two main features: the SERP overlay and the web page analysis.

SERP Overlay

The SERP overlay helps explain why different web pages rank so highly on Google, Bing, or Yahoo. In the overlay, each result is shown with its page and domain authority scores. Page authority indicates how strong the individual page is, while domain authority shows the strength of the website as a whole. For example, let us say you want to start a boutique cat clothing store. The Google SERP for the term “cats” looks like this:

Cats Search Engine Result Page
Each of the top three results has a mid-range page authority score and a domain authority of 100. It would be very hard to break into this SERP.

On the other hand, if you search “cat boutique,” this is what the SERP looks like:

Cat Boutique Results Page
These pages have much lower page and domain authority scores on average. As a result, ranking high on the “cat boutique” SERP would be a much more attainable goal, and one that would take far less time and effort.
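The judgment the overlay supports can be sketched as a toy function: average the page authority scores (1–100) of the current top results and bucket the keyword’s difficulty. The cutoffs below are illustrative guesses of ours, not Moz’s.

```python
def serp_difficulty(page_authorities):
    """Rough difficulty estimate from the top results' page authority scores."""
    avg = sum(page_authorities) / len(page_authorities)
    if avg >= 60:
        return "hard"        # entrenched, high-authority pages
    if avg >= 30:
        return "moderate"
    return "attainable"      # weak competition, worth targeting
```

A “cats”-style SERP full of high-authority pages comes back as hard, while a niche SERP like “cat boutique” lands in attainable territory.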

Web Page Analysis

In addition to the SERP overlay, the Moz toolbar also lets you gain insight into what makes certain pages rank so highly. Keeping with the cat boutique example, we can look at the top-ranking result for “cat boutique.”

Hemingway Web Analysis

Hemingway’s has low page and domain authority, but also a low spam score and decent Facebook activity. The low authority scores mean the page does not have much clout and could be passed on the SERP. However, the low spam score means search engines are not penalizing the page for black hat SEO, and the Facebook activity shows the company is active on social media.

The two tools on the left of the toolbar help you dig a little deeper into the page. The magnifying-glass icon over the web page is the page analysis tool. It pulls up information about a website that is important for SEO, such as headers and meta descriptions.

Page Analysis Tool
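To get a feel for what a page analysis tool reads, here is a minimal sketch that pulls the page title and meta description out of raw HTML with Python’s standard library; it is our own illustration, not Moz’s code.

```python
from html.parser import HTMLParser

class PageInfo(HTMLParser):
    """Collect the <title> text and the meta description of a page."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "meta":
            a = dict(attrs)
            if a.get("name") == "description":
                self.description = a.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data
```

Feed it a page’s HTML and read off `title` and `description`, the two fields most worth checking for keywords.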

The other tool is the highlighter tool, which highlights links on the page.

Link Highlighter

In addition to looking into competitors, the web page analysis feature can also be used on your own web page. When used on your site, this tool can help you identify your strengths and weaknesses or point out holes in your website’s SEO.

As pointed out earlier in this article, and as you can see throughout the images, there is a full version of this toolbar. The full version gives you access to more metrics and analytics and can be tried for 30 days free of charge. Even so, the free version of the Moz toolbar should be more than enough to get you started on your SEO journey.

For the Chrome version of the Moz Toolbar click here. For the Firefox version, click here. For more posts about free SEO analytics tools, read Ana’s article on 4 Free SEO Tools You Didn’t Know You Needed.

Google Algorithms: Panda, Penguin, and Hummingbird

[Image Source, Edited by me on Picmonkey]

Algorithms are intended to make internet searches easier and more accurate: they give you what you’re looking for without making you comb through five or ten pages of search engine results. You may have heard a lot of talk recently about Instagram’s new algorithm; however, the algorithms we’ll be discussing today are slightly different. In this blog post I will explain Google’s algorithms so that you understand how they affect PageRank and search engine results. The three algorithms I’ll be discussing are Google’s Panda, Penguin, and Hummingbird.

Algorithm #1: Panda

Panda Update [Image Source]

This program was first released in February of 2011. According to Search Engine Land, this Google update is meant to stop poor-quality websites from reaching the top of the search engine results page. Essentially, Panda is a spam-fighting algorithm: Google’s goal in creating it was to push down websites full of spam and raise websites with quality content back to the top of the results. Search Engine Land also reported that Google very possibly “baked” the Panda algorithm into its core ranking algorithm in late 2015. Like the rest of the algorithms we’ll be talking about, Panda did not replace Google’s principal ranking algorithm; it only replaced an outdated part of it.

 

Algorithm #2: Penguin

Penguin Update

[Image Source]

This algorithm was first introduced in April 2012. Like the Panda update, it was meant to filter out spam so search engine results would not be polluted by low-quality websites. More specifically, this update targets websites that buy links or obtain them unethically from link networks designed to inflate Google rankings. When Penguin was first released, there was a small uproar from website owners whose site traffic had decreased, likely because Penguin had hit their sites and deemed them spam. To remedy this, we suggest removing any spam from your website. If your site traffic has not climbed back to normal levels even after purging your site of anything spam-related, we suggest using this form to alert Google to your problem.

Algorithm #3: Hummingbird

Anna’s Hummingbird [Image Source]

This Google update was rolled out in September of 2013, and its purpose was to be “fast and precise” … like a hummingbird! According to Search Engine Land, conversational search is one of the things the Hummingbird update addressed. In other words, rather than focusing on a keyword by itself, this update takes an entire sentence into account when ranking pages. This means that, theoretically, Hummingbird allows Google to assess the meaning behind words, rather than just the words themselves. This was a smart move considering the introduction of communication technology like Siri and Google’s version, Google Now. Siri was introduced in April of 2010, and Google Now followed in July of 2012. Less than a year later, Google rolled out the Hummingbird update, which made this technology easier to use. So when you ask Siri or Google Now, “Where’s the closest brunch place to my home?”, Google is now better equipped to provide the answer. If you’ve given Google your address, it understands what you mean by “home,” and it understands that by “place” you mean a physical location.

This was just a brief introduction to some of the most popular Google Algorithms that have been rolled out in the past few years. Personally, I think algorithms can be highly beneficial to website owners and internet users alike. I would love to hear your thoughts in the comments.

4 Free SEO Tools You Didn’t Know You Needed

If you are feeling fairly confident in your mastery of basic optimization for your business’s website, then it’s time to take your SEO tactics to a new level. There is a wide world of easy-to-use, free online tools that can help your website’s content thrive, as well as keep an eye on the competition. These free tools have received some love from SEO experts, so give them a go.

1. Ecreative Link Juice Calculator

This first tool is for those pages on your website that you take pride in because they have a lot of outbound links directing traffic to your internal pages and other sites. But, believe it or not, too many links can actually harm your homepage’s authority, as well as your interior pages’. As web marketing expert Andy Crestodina notes, even an enormous site such as Amazon has only around 100 links on its homepage.

Ecreative link juice tool
[Source: Ecreative results page for LUC.edu]
To use Ecreative’s tool, simply paste your homepage URL into the bar and press enter. The “juice calculator” will let you know if you have too many links and will normally suggest how many links you should remove. For example, Loyola University Chicago’s homepage (LUC.edu) has 294 links, and this tool suggests getting that number down to around 100 to increase its link juice.
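The idea behind the calculator can be sketched in a few lines: count the anchor tags on a page that carry an href and compare the total against a budget. The 100-link budget below just echoes the rule of thumb above; it is our illustration, not Ecreative’s code.

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Count <a> tags that actually link somewhere (have an href)."""

    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and dict(attrs).get("href"):
            self.count += 1

def over_budget(html, budget=100):
    """Return (link count, True if the page exceeds the budget)."""
    counter = LinkCounter()
    counter.feed(html)
    return counter.count, counter.count > budget
```

A homepage like LUC.edu’s, with 294 links, would come back well over budget.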

 

2. Moz On-Page Grader

Moz is a leader in SEO tools and expertise. They have a large number of great tools to choose from. One of the simplest is their On-Page Grader, which scans a page of your website and measures your keyword optimization. Moz uses 30 different criteria to calculate where your best optimization is coming from and suggests how to focus on the positive SEO areas of your page. The tool will also perform weekly page audits to make sure you are keeping up with the ongoing process of keyword optimization.

This tool is unfortunately only available for free during a 30-day trial. However, if you are a Moz Pro member, you can enjoy this tool as much as your heart desires.

3. Linkstant

Remember that link building campaign that you started six months ago? What happened with that? We know you don’t have an intern lying around who can answer these questions for you (and if you do, you are one lucky business owner). That’s where Linkstant comes in.

If you have lost track of where your inbound links are and aren’t coming from, Linkstant can wrangle your incoming links into one place. Just create an account and you can get all of your links in order. This tool offers real-time information on which links are performing well and which are duds. It allows you not only to be satisfied with the number of external links you have built, but to have confidence that those links are doing some good for your site’s traffic.

4. GTmetrix

When trying to optimize every little nook and cranny of your website, the simple things can get lost in the SEO shuffle. One of those “little things” is page speed. How fast or slowly a page of your site loads can make your bounce rate (the percentage of visitors who land on a page and then leave the site altogether) skyrocket, driving your traffic into the ground.

Luckily, GTmetrix can offer you an analysis of loading speed on every page of your site, along with suggestions on how to make slow-loading pages faster. Google actually uses page speed as a ranking signal, so it is best not to overlook this on your own website.

 

These tools don’t have all of the bells and whistles that paid tools have, but they get the job done. For those marketers and business owners who have limited time and money, test the tools out before you commit to any paid subscriptions or software. Start with these tools and work your way up if you really want to work with the Internet’s most high-tech SEO robots in the future.

The Present & Future of Google Algorithms

Google is constantly revising and improving its algorithms in order to keep the search engine experience as efficient and authentic as possible. Here are some of the most recent improvements as well as what changes we can expect in the future:

Mobile-Optimization over Desktop

Mobile platforms are gaining a significant presence over traditional desktop search. Whether through mobile devices or mobile apps, people are searching for answers at their fingertips. Last year, mobile searches surpassed desktop searches, and Google’s Mobilegeddon algorithm update put more weight on mobile accessibility, providing more relevant search results. 70% of mobile searches lead to online action, so mobile matters not only for SEO but for your business. While some say desktop-specific sites are no longer necessary, you should still focus on a template that is easy for visitors to use and recognizable from every platform. User experience is essential.

Considering that 89% of mobile time is spent in apps, Google is beginning to pay more attention to them. Including important keywords in your app makes your business more searchable and easier to access. This trend is only going to grow, so it is essential for small business owners to stay on top of their SEO and optimize the mobile experience. Again, aim to optimize the user’s experience over the search engine results. The lines between the web and social media are beginning to blur. Don’t fret: this just provides more opportunities to improve your SEO.

Social over Factual

Social media is also increasing its presence in search results. Now, when you search for a certain business, its social media posts will be indexed in the results. This allows your customers’ experience to improve your SEO without discrediting the actual content from your website. All it means is that you need to connect your content to your social media platforms in order to create a more cohesive brand. Creating quality content for your site as well as your social platforms will only increase your search engine presence and overall visitor satisfaction.

Aggregated Content over News

Twitter is always changing its algorithm, which in turn changes the way search engines aggregate live coverage and events. Content aggregation can be tricky, but it can also provide more specific results for target visitors. Twitter offers “Moments,” which explain an event through third-party coverage and opinions rather than typical news and media coverage. The standard advice is to create or curate content, not aggregate it; even so, this trend is likely to shape Google’s algorithm in the future. Be prepared and open to change.

Video over Text in B2C Content

Video will continue to be important in business-to-consumer content. While written content is the standard, video apps like Vine, Snapchat, and Periscope are revolutionizing the way we consume visual content. By some estimates, video content is around 50 times more likely than plain text to earn an organic first-page ranking on Google, and that gap will only grow. Consumers want exciting videos in their feeds that let them connect with your social media content. While searching, they want specific, vivid examples that visually show what your business has to offer. Proper keywords and tags will promote your video content on Google, but keep in mind that quality content is still what matters most.

Conclusion

While the many Google algorithm updates like Hummingbird, Panda, Penguin, Pigeon, Payday, and Pirate may sound like a weird zoo, don’t let them intimidate you. As always, never try to manipulate the algorithms. It is important to remain on top of the latest trends. Be sure to create the best website and content possible for your visitors. Remember, the goal is to optimize their experience. Google is constantly searching for ways to help determine what searchers really want.
What Google algorithm updates would you like to see in the future?

Go-To Glossary for SEO

The SEO world is full of phrases and terms that can be difficult to understand as a beginner. The below glossary is a brief source for those looking to begin a smooth transition into improving SEO for their small businesses. The terms on our list will help you to navigate our blog more easily, as well as others.

  • 301 redirect: When a URL is no longer in use or a web page is deleted, visitors usually hit a 404 error (“page not found”). A 301 redirect instead sends users from the abandoned or deleted page to a new page that the web manager specifies. Having a 301 redirect can lower bounce rates and increase page views on the content users are redirected to.
  • Alt text: A piece of text that appears to some users in place of an image. Some visitors browse with screen readers or text-only browsers, so alt text helps them better understand the content of your page.
  • Anchor text: the text that holds a hyperlink to another source. Anchor text should not be vague, such as “this article” or “click here.” Anchor text needs to describe what users will find when they click on the link. Keep the linked text between one and five words.    
  • Authority: The trustworthiness, knowledge, reliability, and respect of your website. Authority is one of the key attributes of a website that Google uses to rank a site, and many metrics contribute to authority, such as external links, age of site, and popularity among searchers.   
  • Black Hat SEO: The opposite of white hat SEO, a type of SEO strategy that does not abide by best practices of guidelines established by Google and other search engine leaders. Tactics are usually unethical and can result in a website being penalized by search engines that will give the site implementing black hat SEO a poor ranking.
    Mac n' Cheese recipes on AllRecipes.com
    [Source: AllRecipes.com]
  • The Fold: The spot where a web page becomes cut off by the bottom of a screen or computer monitor. The most eye-catching, user-oriented content on a website’s homepage should be above this point to avoid a high bounce rate due to difficult navigability. On the right, you will see the fold for this All Recipes page is just below the header images. 

 

  • Information architecture: The organization and structure of digital content. Good information architecture makes content findable for search engine spiders and usable for the people it is organized for. Bad information architecture can include unclear page titles, meta descriptions that don’t use keywords, and poorly designed site navigation.
  • Keyword: A term or phrase used in a web page’s title, content, meta description, and tags that can earn the page a higher ranking for that term or phrase in search. Strategically choosing keywords that are searched often, or sought out by your audience, is key.
  • Link building: An effort by a website manager to attract external links to one or more pages of their website. Building links is desirable because inbound links to a website can boost its page rank on Google. Link building is carried out through traditional public relations techniques and mutual linking to relevant websites. Link building helps to increase a website’s authority.    
    Meta description screen grab
    [Source: Google screen grab for search “Chicago mac n’ cheese”]
  • Meta description: The text that appears below the title and URL of a web page in a search engine. Meta descriptions usually include keywords and the terms that are being searched. Creating your own meta description to highlight key phrases is essential for building SEO. Below, you can see that the meta description for the page is “Mac and cheese is one of nearly everyone’s favorite comfort foods and here is the recognition it…”

 

  • Page rank: A system that determines a website’s ranking for certain keywords or phrases. Google’s PageRank algorithm scores sites based on external links, relevance of content, and site authority. If your website has made it onto the first Google SERP for a certain keyword, then you’ve made it, but there is always room for improvement.
    SERP
    [Source: Google screen grab for search “Chicago mac n’ cheese”]
  • Search Engine Results Page (SERP): The list of search engine results that appears when you search a specific keyword or phrase. The order of results on a Google SERP is determined by Google PageRank (discussed above). Often the first one or two results are advertisements placed through AdWords, which do not necessarily appear based on PageRank.

 

 

 

  • Web 2.0: Contemporary computer technologies that are web-based (run over the internet rather than in PC software) and that let users interact with content more freely and easily. An example of this is the shift from Microsoft Excel to Google Sheets, which allows multiple people to edit and control a document remotely.
  • White Hat SEO: In contrast to black hat SEO, a strategy for SEO that follows the ethical guidelines and best practices put forth by search engines. Playing by the rules through white hat SEO is more likely to improve your PageRank on Google.  

While this list is by no means an all-inclusive dictionary for your SEO adventures, it is a good start for a beginner. If you don’t understand some SEO jargon next time you are watching a tutorial or reading one of our blog posts, your first step should be to look it up. No SEO stone should go unturned!

 

SEO Good Example: Nike Golf

 

A Case Study on What We Can Learn from Nike Golf and Their SEO

Nike, Inc. was founded in 1964 and has since dominated the sports equipment and activewear industry. Nike Golf is a sub-brand of Nike that, as the name suggests, is specific to the sport of golf. Swellpath, now part of 6D Analytics, is a digital marketing company that conducted a case study on Nike Golf with the objective of boosting product awareness and driving website traffic.

Nike Golf: Before

According to Swellpath, Nike Golf’s biggest hindrance was the lack of a focused keyword strategy combined with a website that was hard for search engines to crawl for data. As we know, search engines crawl through content in order to ascertain what is relevant, thereby boosting certain websites’ content to the top of search results. 

Take a look at this great graphic from Swellpath’s case study:

What users were seeing (left) and what search engines were seeing (right)

Nike Golf Before

[Image courtesy of Swellpath]

The Game Plan

Taking into account the many options Nike Golf had for optimizing its website, Swellpath settled on SWFObject2, an open-source JavaScript library. A JavaScript library is a set of pre-written JavaScript that makes it easier to develop other JavaScript-based applications: think of it like a toolkit that gives you all the instruments you need to run a JavaScript program. This library appealed to Swellpath because it could more effectively provide content for search engine spiders. It does this by storing an HTML-based version of the website behind the scenes that can be presented whenever a user visits the site, which also makes the mobile website much more user-friendly. Additionally, SWFObject2 lets administrators embed Flash content without relying on a specific scripting language, making the content accessible to a larger audience: even users who have JavaScript disabled in their browsers will still be able to see the Flash content. Read more on the benefits of SWFObject2 here.

Results of the case study

According to 6D Analytics, in conjunction with Swellpath, organic search traffic to the Nike Golf website increased by a staggering 348% in two years. This is important because organic search traffic includes both branded (“Nike”) and non-branded searches. In other words, between the 2010 and 2012 PGA golf seasons, website traffic increased a total of 348%, and non-branded traffic alone increased by 250%.

Going back to the earlier infographic from Swellpath, on the left is what users see now, and on the right is what search engines see now. As you can see, search engines like Google are now picking up on keywords from the page, ultimately driving more traffic to the Nike Golf website.

Nike Golf After

[Image Courtesy of Swellpath]

What we can learn from this

The main objective of any Search Engine Optimization is to put your brand ahead of others in search results. What we can learn from Nike Golf is that organic search matters. Small businesses (as well as large corporations) cannot sacrifice accessibility in the name of a memorable visual experience. Therefore, when building a website and optimizing it for search engines, make sure to factor in a good keyword strategy that will both drive traffic and boost product awareness.

What is SEO? The Basics of the Basics

Search Engine Optimization, or SEO, matters to companies because it brings more people to their websites. Search engines lead visitors to your website from relevant searches, and SEO helps you maximize those connections.

An SEO Timeline

How did SEO even begin? It originated in the early 1990s with the world’s first website and took shape over the following decade. Excite was the first search platform, followed by Yahoo and later Google. Each engine further tamed the big bad world of data. It was easy to manipulate search engines at the time, but Google began to see opportunities for the future.

In 2003, Google began optimizing its search results by improving their value and relevancy. Local SEO came into practice, providing users accurate information about maps, locations, and more. Even in the early days, Google encouraged ethical practices.

After a few years, Google began promoting real-time results and rich media in search with Universal Search. In 2008, Google Suggest made search even more targeted for users. By 2010, Google, now the clear leader among search engines, continued to become stricter. Social media began to alter search results, prompting the creation of Google+.

Currently, privacy and personalization are coming into conflict around SEO. Digital marketers want to create the most customized experience for visitors, while users want to maintain their privacy and not have search engines read their minds completely. Still, content must be both personalized and high-quality to be competitive.

While Google is not the only search engine, it dominates the industry: about 65% of searches are Googled, followed by 20% with Microsoft and 13% with Yahoo. Google likes to keep it clean, though, so it constantly changes its algorithm to keep searching as authentic as possible. This encourages ethical practices within digital marketing and keeps things fair among sites that actually deserve the highest rankings.

Basic Optimization

In addition to high-quality content, search engines rank sites by how visitors engage with them, loading speed and mobile ease, and the amount of unique content. Sites with higher retention rates are ranked higher than sites that send users right back to their search query.

Keywords are crucial in optimizing your site. What will your targeted visitors be searching to find you? It is also important to research your selected keywords and ensure they are delivering the desired results. Beyond just comparing against competitors, you want to check the search volume and relevance of your target keywords. Keywords reside in more than just tags, too. On-page optimization includes title tags, meta descriptions, body content, alt attributes, URL structure, and schema markup.
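As a rough illustration of auditing those on-page elements, the sketch below uses Python’s standard-library HTML parser to pull the title tag and meta description from a page and flag images missing alt text. The page snippet and its contents are hypothetical:

```python
from html.parser import HTMLParser

class OnPageAuditor(HTMLParser):
    """Collects a few on-page SEO elements: title, meta description, img alt text."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta_description = ""
        self.missing_alt = []  # src of <img> tags that lack alt text
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.meta_description = attrs.get("content", "")
        elif tag == "img" and not attrs.get("alt"):
            self.missing_alt.append(attrs.get("src", "(unknown)"))

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

# Hypothetical page snippet for illustration
html = ('<html><head>'
        '<title>Trail Running Shoes | Example Store</title>'
        '<meta name="description" content="Lightweight trail shoes for every terrain.">'
        '</head><body><img src="shoe.jpg"></body></html>')

auditor = OnPageAuditor()
auditor.feed(html)
print(auditor.title)             # the page's title tag
print(auditor.meta_description)  # the meta description
print(auditor.missing_alt)       # images missing alt text -> ['shoe.jpg']
```

A real audit would fetch live pages and also check URL structure and schema markup, but the same parsing idea applies.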

Information architecture is also crucial in SEO. Structure your site deliberately: your most important pages should be the most linked-to. By placing your most important search pages high in your own information architecture, search engines will rank them higher. Make sure you also avoid error responses, such as 404 errors. If your pages have been relocated, do not hold on to the old URLs. You want to help your visitors find desired content, not frustrate them. It is easy to overlook issues like broken pages, missing redirects, and duplicate content, but you want to prevent any difficulty in accessibility. Even just a few unnecessary clicks can prod users to return to their original search, losing your audience. After all, your website is for others.
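To see what a link audit actually checks, here is a minimal sketch of following redirect chains and spotting dead ends. The site map below is hypothetical, mapping each URL to its HTTP status and, for redirects, a destination:

```python
# Hypothetical site map: URL -> (HTTP status code, redirect target or None)
SITE = {
    "/old-shoes":    (301, "/shoes"),   # permanent redirect
    "/shoes":        (200, None),       # live page
    "/sale":         (302, "/shoes"),   # temporary redirect
    "/discontinued": (404, None),       # dead page
}

def audit(url, site, max_hops=5):
    """Follow redirect chains and report where a URL ultimately lands."""
    hops = 0
    while hops < max_hops:
        status, target = site.get(url, (404, None))
        if status in (301, 302) and target:
            url, hops = target, hops + 1  # follow the redirect
            continue
        return url, status, hops
    return url, None, hops  # likely a redirect loop

print(audit("/old-shoes", SITE))     # -> ('/shoes', 200, 1)
print(audit("/discontinued", SITE))  # -> ('/discontinued', 404, 0)
```

Running a check like this over every internal link surfaces the 404s and redirect chains mentioned above before visitors (or crawlers) hit them.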

Once you begin to figure out your own SEO, how do you measure your results? Track your keyword rankings and record your organic traffic and leads. Web analytics are never perfect, so expect some flaws in your tracking. Lifetime value metrics can be tricky, so focus on your organic users.

Beyond traditional SEO, there are also international and local variants, as well as search engines within app stores. These all provide important insights, depending on your particular industry and target audience. Be open-minded and think outside the box. Reverse engineering can help improve your users’ experiences and continue to improve your rankings.

SEO is constantly evolving, and this is only an overview. Continue to follow our blog for the basics, the latest, and more.
Spiders on the Web: A brief guide on the inner workings of search engines

We often take the ability to use a search engine for granted. Google has made search so easy that we do not have to worry about finding information. Googling something we do not know has become second nature to us. We trust Google’s results to be relevant and accurate. In order to best optimize your website for Google, it is helpful to know how the search engine works.

Pre-Google

This article contains a detailed account of the rise of Google; if dense material isn’t your thing, continue reading this summary. Early search engines, like AltaVista, would compare search terms to their database, and whichever page had the most matching terms would be considered the most relevant. This was problematic because the results weren’t necessarily the most useful. For example, if you searched for Columbia Sportswear, Dick’s Sporting Goods might be the first result, rather than the corporate web page, because it lists a lot of Columbia products. The number of times a keyword was mentioned on the page outweighed other information in determining relevance. Search results weren’t as helpful because users had to filter through them on their own. Yahoo was different: it used human judgment to aid its search engine’s results, hiring people to read through webpages, pull keywords, and write summaries. In 1997, Google changed the game by offering better results, without clutter. Instead of looking only at the text of a website, Google observed the patterns of hyperlinks, specifically the number and type of incoming and outgoing links.

Search Now

Well before you type a search query into Google, the search engine sends out little programs called crawlers or spiders. These crawlers jump from page to page by following links. As the spiders travel, they send a copy of each page back to the search engine, and Google creates an index of the words on that page. When you search, algorithms, which are complex mathematical formulas designed to find hints about what you are looking for, match your query against that index. There are plenty of great videos on YouTube, like this one, that will help you visualize the search process. Going back to our example, when you type “Columbia Sportswear” into Google, the first search result will be the manufacturer’s website, not a retailer’s (such as Dick’s Sporting Goods). This is because of Google’s algorithm, which gives more weight to a company’s homepage.
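The crawl-then-index loop described above can be sketched in a few lines. This is a toy model, not Google’s actual system: the pages, their text, and their links below are all hypothetical, held in a simple dictionary instead of fetched from the web:

```python
import re
from collections import defaultdict

# Hypothetical web: URL -> (page text, list of outgoing links)
PAGES = {
    "/":        ("Columbia Sportswear official home", ["/jackets", "/about"]),
    "/jackets": ("Waterproof jackets by Columbia",    ["/"]),
    "/about":   ("About Columbia Sportswear",         []),
}

def crawl_and_index(start, pages):
    """Follow links from a start page and build a word -> set-of-URLs index."""
    index = defaultdict(set)
    seen, queue = set(), [start]
    while queue:
        url = queue.pop(0)
        if url in seen or url not in pages:
            continue
        seen.add(url)
        text, links = pages[url]
        for word in re.findall(r"[a-z]+", text.lower()):
            index[word].add(url)   # the "copy" the spider sends back home
        queue.extend(links)        # the spider jumps to linked pages
    return index

index = crawl_and_index("/", PAGES)
print(sorted(index["columbia"]))   # every crawled page mentioning "columbia"
```

Answering a query then amounts to looking words up in that index; the hard part, which this sketch skips entirely, is ranking the matching pages, which is where Google’s link analysis came in.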

Why it all matters

According to this article, less than 6% of users click on a link on the second page of Google results. As a small business owner, it is crucial to maximize the chances of your website being on that first page. The process of making your website search-engine-friendly is called Search Engine Optimization, or, as it’s commonly referred to, SEO: optimizing your website’s content for Google’s crawlers. How often a website is crawled depends on how often its content changes, so frequent updates to your website are essential to driving traffic.